I would have no problem with AI if it could be useful.
The problem is that, no matter how many times I’m promised otherwise, it cannot automate my job and talk to the idiots for me. It just hallucinates random gibberish, which is obviously unhelpful.
I’ve found it useful for a few things. I’d had a song intermittently stuck in my head for a few years and had googled it unsuccessfully a few times. I couldn’t remember the artist, the title, or the lyrics (it was in a language I don’t speak), and ChatGPT got it in a couple of tries. It’s also good for things I’m too vague about to construct a search query for and just want to explore. Stuff like that. I just don’t trust it with anything I want actual facts for.
Yep, preparing a proper search is my use case. Like “what is this special screw called?”. I can describe the screw and tell the model to give me a list of names that screw might go by.
Then I can search for the terms on that list, and one of them is the correct one. It’s way better than hoping that somebody described the screw in the same words on some obscure forum.
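If you wanted to script that instead of typing it into the chat window, here’s a rough sketch of the same idea. I’m assuming the OpenAI Python SDK here, and the model name, prompt wording, and screw description are just placeholders, not anything I’d vouch for:

```python
# Rough sketch: describe the part, ask the model for candidate names,
# then feed each suggested term into an ordinary web search yourself.
# Assumes the OpenAI Python SDK and OPENAI_API_KEY in the environment;
# model name and prompt wording are placeholders.
from openai import OpenAI

client = OpenAI()

description = (
    "Short screw with a very wide, flat, low-profile head, "
    "coarse thread, and a square drive socket."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You suggest likely technical names for hardware parts."},
        {"role": "user", "content": f"List 10 possible names or search terms for this screw: {description}"},
    ],
)

# Print the candidate terms; searching them is still a manual step.
print(response.choices[0].message.content)
```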
But is it worth burning the planet, making RAM, GPUs, and hard drives unaffordable for everybody, and probably crashing the world economy for a better screw search? I doubt it.
The thing is, Google was already pretty good at that before things like ChatGPT came out.
I remember many years ago I searched for “that movie where they steal a painting by changing the temperature” and Google got it on the first hit. That was way before AI.
It’s really good at answering customer questions for me, to be honest.
But I still have to okay it, just in case. There’s no trust.
However, it still takes a lot less bandwidth for me, because I’m not good at the customer-facing aspects of my business.
I still would, as the increased productivity, once again, does not lead to reduced hours. Always more productive, always locked into a bullshit schedule.
Prompt or model issue my dude
Or you’re one of the few who have a pretty niche job
Just things like different words or vocabulary, help with some code-related knowledge or Linux issues… or even random general knowledge that you happen not to know
I’m sorry but why the absolute flaming fuck does everyone assume that programming is the only job in the universe?
There are entire industries that don’t revolve around Linux. It’s amazing, but not everyone in the world has to care about programming. Everyone needs to stop telling me that AI is good at programming. Firstly, because it isn’t; it’s goddamn awful. Secondly, because I’m not a programmer, so I don’t care.
I’m just giving examples I relate to and describing my own use of it
That explains why you think it’s awful at it, then. You just believe the haters because of confirmation bias. Fact is, companies and people wouldn’t use it to code if it weren’t good at it
It’s perfect for summaries, known general knowledge, correcting text, roleplay, repetitive tasks and some logic problems…
Well if by that you mean I’ve used it and it’s crap, then yeah, I’ve confirmed it.
I think LLMs are also just genuinely not as universally useful as expected. Everyone thinks they can automate every job except their own, not because everyone thinks their job is special, but because they know all the intricate parts of that job that LLMs are still really bad at.
For instance, AI could totally do my job at a surface level, but it quickly devolves into deal-breaking caveats, which I’m lumping into very broad categories to save time:
Output would look good (great, even) but not actually be usable in most applications
Output cannot even begin to be optimized in the same way humans can optimize it
It would take more effort for me to fix these than to just do the whole thing on my own to begin with
Case in point for me. AI cannot write documentation for new technical procedures because by definition they are new procedures. The information is not in its knowledge base, because I haven’t written it yet.