Dutch lawyers increasingly have to convince clients that they can’t rely on AI-generated legal advice because chatbots are often inaccurate, the Financieele Dagblad (FD) found when speaking to several law firms. A recent survey by Deloitte showed that 60 percent of law firms see clients trying to perform simple legal tasks with AI tools, hoping to achieve a faster turnaround or lower fees.

I find it useless for even basic tasks. The fact that some people follow it blindly like a god is so concerning.
It’s great when I need a one-off PowerShell command or something basic like that.
I work in a health-care-adjacent industry and you’d be surprised how many people blindly follow LLMs for medical advice
It’s been doing wonders helping me improve materials I produce so that they better fit certain audiences. I can also use it to spot missing points / inconsistencies against the ton of documents we have in my shop when writing something. It’s quite useful as a sparring partner so far.
Rule of thumb is that AI can be useful if you use it for things you already know.
They can save time and if they produce shit, you’ll notice.
Don’t use them for things you know nothing about.
LLMs specifically, because AI as a range of practices encompasses a lot of things where the user can get away with knowing less.
You’re spot on in my opinion.
It’s great when you have basic critical thinking skills and can use it like a tool.
Unfortunately, many people don’t have those and just use AI as a substitute for their own brain.
Yeah well, the same applies to a lot of tools… I’m not certified to fly a plane, and look at me not flying one either… but I’m not shitting on planes…
But planes don’t routinely spit out false information.
If you can’t fly a plane, chances are you’ll crash it. If you can’t use LLMs, chances are you’ll get shit out of them… the outcome of using a tool is directly correlated to one’s ability?
Sounds logical enough to me.
Sure. However, the output of an LLM always looks plausible. And if you aren’t a subject matter expert, that plausible result looks right. That’s the difference: the wrong parts are hard to spot (even for experts).
So are a speedometer and an altimeter, until you reaaaaaaaaly need to understand them.
I mean, it all boils down to the proper tool with the proper knowledge and ability. It’s slightly exacerbated by the apparent simplicity, but if you look at it as a tool, it’s no different.
Rearranging text is a vastly different use case from diagnosis and relevant information retrieval.