• brucethemoose@lemmy.world · 3 days ago

    I am late to this argument, but data center imagegen is typically batched, so many images are generated in parallel on the same hardware. And (at least from the providers that aren’t idiots) the models likely use sparsity and other tricks to reduce compute.
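    To make “batched” concrete, here’s a rough sketch of what that looks like with a diffusers-style pipeline. The model ID, prompt, and batch size are just placeholders, and real providers run far more optimized serving stacks than this:

    ```python
    # Rough sketch of batched image generation; model ID, prompt, and batch size are placeholders.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-2-1",  # placeholder model
        torch_dtype=torch.float16,
    )
    pipe = pipe.to("cuda")

    # The denoising loop runs once over the whole batch, so the GPU's
    # fixed overhead is amortized across every image in it.
    prompts = ["a watercolor fox"] * 8   # batch of 8 prompts processed in parallel
    images = pipe(prompts).images        # list of 8 PIL images
    ```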

    Energy per image is waaay less than generating the same image on a desktop GPU. We probably burnt more energy arguing in this thread than it takes to make an image, or a few.
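    Some napkin math, where every number is an assumed placeholder rather than a measurement: a datacenter accelerator drawing ~700 W that finishes a batch of 16 images in ~10 s works out to roughly a tenth of a watt-hour per image.

    ```python
    # Napkin math only -- every number here is an assumed placeholder, not a measurement.
    ACCELERATOR_WATTS = 700    # assumed draw of one datacenter GPU under load
    BATCH_SIZE = 16            # assumed images generated in parallel per batch
    SECONDS_PER_BATCH = 10     # assumed wall-clock time for that batch

    joules_per_image = ACCELERATOR_WATTS * SECONDS_PER_BATCH / BATCH_SIZE
    wh_per_image = joules_per_image / 3600  # 1 Wh = 3600 J

    print(f"~{wh_per_image:.2f} Wh per image")  # ~0.12 Wh with these placeholder numbers
    ```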

    And this is only getting more efficient over time, in spite of what morons like Sam Altman preach.


    There are about a billion reasons image slop is awful, but the “energy use” one is way overblown.