  • Everyone wants to run everything like a business these days. They want to save on payroll, so rather than paying actual police to do the paperwork, they want to use Copilot or whatever to do it for them. Of course, because AI models are so crappy and error-prone, they'd need to spend the same amount on payroll to verify the accuracy of the AI output. But they don't do that. They just run with whatever the AI spits out and figure it'll be close enough to accurate. After all, big tech keeps telling everyone that AI is wonderful and can do anything. That is far from the truth, though.

    A lawyer in California got in trouble last year for using ChatGPT to generate briefs for a trial. He wound up filing those briefs with the court even though 21 of the 23 case citations in them were complete fabrications. In another incident, a police department in Utah used an AI to generate a report from a traffic stop. That report claimed an officer shape-shifted into a frog during the incident.

    There are endless reports of AI making shit up, demonstrating just how error-prone those tools are. Yet people who should know better keep trusting AI with these important jobs, just to save money on payroll, when AI is clearly far from ready for prime time.