• 0 Posts
  • 3 Comments
Joined 3 years ago
Cake day: June 15th, 2023

  • That’s pretty interesting, and I totally agree with your last part. One counterpoint I’d make is that local models are often more efficient, and there’s very little checking you can do on how much your query actually costs in the cloud, whereas at home you can monitor your GPU usage and your power bill. That feedback creates a sense of responsibility if you overuse it, like counting the gas station stops a 10-hour joyride would take. But yeah, at the end of the day, using it as little as possible is a good habit.
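
    For anyone who actually wants to do that monitoring, here’s a minimal sketch of what it can look like, assuming an NVIDIA card with nvidia-smi on the PATH. The polling interval and electricity price below are made-up illustration values, not numbers from this thread.

    ```python
    import subprocess
    import time

    # Assumption: NVIDIA GPU with nvidia-smi available on the PATH.
    # Polls the instantaneous power draw and keeps a rough running total.
    PRICE_PER_KWH = 0.30   # hypothetical electricity price; use your own bill's rate
    POLL_SECONDS = 5

    total_wh = 0.0
    try:
        while True:
            result = subprocess.run(
                ["nvidia-smi", "--query-gpu=power.draw",
                 "--format=csv,noheader,nounits"],
                capture_output=True, text=True, check=True,
            )
            watts = float(result.stdout.strip().splitlines()[0])  # first GPU only
            total_wh += watts * POLL_SECONDS / 3600.0  # watt-hours this interval
            print(f"now: {watts:.0f} W | session: {total_wh:.2f} Wh "
                  f"(~${total_wh / 1000 * PRICE_PER_KWH:.4f})")
            time.sleep(POLL_SECONDS)
    except KeyboardInterrupt:
        print(f"total this session: {total_wh:.2f} Wh")
    ```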


  • ClamDrinker@lemmy.world to Lemmy Shitpost@lemmy.world · Tankie
    2 days ago

    It does not, but that wasn’t my point. My point was that not all forms of AI usage are the same, in the same way that someone driving an EV they charge with solar power isn’t the same as someone driving a 1969 gas guzzler (or something equivalent). Local usage more often than not means efficient models, low energy consumption, and little difference from other computer tasks like gaming or video editing. But when the conversation is about AI, there is always the implicit assumption that it’s the worst of the worst.