• Prove_your_argument@piefed.social
    2 days ago

    Better search results than google though.

    EDIT: DO NOT LOOK AT THE CHAT. LOOK AT THE SOURCE LINKS THEY GIVE YOU.

    Unless it’s a handful of official pages or discussion forums… Google is practically unusable for me now. The slop absolutely exploded once ChatGPT came on the scene, and SEO has been perfected to the point that slop is almost all the results you get.

    I wish we had some kind of downvote or report system to clear out all the slop, but more clicks means more referral revenue… better for them to make people click more.

    Almost all recipe sites now hit me with “We see you’re using an adblocker!” until I turn on reader mode on my phone. Pretty soon that will get blocked too, and I guess I’ll go back to cookbooks or something?

    • Honse@lemmy.dbzer0.com
      2 days ago

      No TF it’s not. The AI can only output the hallucinations that are most statistically likely, and there’s no way to sort the bad answers from the good. Google at least supplies a wide range of content to sort through to find the best result.
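
      In code terms, “most statistically likely” works roughly like this (toy vocabulary and made-up numbers, not a real model): the model samples the next token from a probability distribution, and nothing in that step checks whether the output is true.

      ```python
      import math
      import random

      # Toy next-token distribution. A real LLM does this over ~100k tokens,
      # billions of times, with probabilities produced by the network; the
      # values here are made up purely for illustration.
      vocab = ["Paris", "London", "Berlin", "banana"]
      logits = [4.2, 2.1, 1.9, -3.0]  # higher = more statistically likely

      # Softmax turns the scores into probabilities.
      exps = [math.exp(x) for x in logits]
      probs = [e / sum(exps) for e in exps]

      # Sampling: pick a token in proportion to its probability. Nothing here
      # verifies truth, which is why confident wrong answers come out of the
      # same mechanism as correct ones.
      next_token = random.choices(vocab, weights=probs, k=1)[0]
      print(dict(zip(vocab, [round(p, 3) for p in probs])), "->", next_token)
      ```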

      • Prove_your_argument@piefed.social
        2 days ago

        You misunderstand. I’m not saying AI chat answers are better than a quality article. I’m saying the search results they surface are often better.

        Don’t look at what they SAY. Look at the links they provide as sources. Many are bad, but I find I get much better info overall. On Google I might go through 10+ links before I find one that really says what I want; with a chatbot I typically get a relevant link within one or two clicks.

        There is no shortcut for intelligence… but AI “SEO” has not been perfected yet.

        • Honse@lemmy.dbzer0.com
          1 day ago

          Interesting. LLMs can’t directly do anything but output text, so the tooling around the LLM is what’s actually searching. They probably use some API from Bing or something. Have you compared the results with Bing’s? I’d be interested to see how similar they are, and how much extra tooling sits on top for search. I can’t imagine they want to burn a lot of cycles generating only three or so search queries per request, unless they have a smaller dedicated model for that. Would be interested to see the architecture behind it and what’s different from normal search engines.
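
          For what it’s worth, the usual pattern (as far as I understand it; no idea what OpenAI or Google actually run internally) is a tool loop: the model emits a search query as plain text, ordinary non-LLM code calls a search API, and the results get pasted back into the prompt so the model can answer and cite the URLs it was handed. Rough sketch only; call_llm and web_search below are stand-ins, not any real API:

          ```python
          # Hypothetical sketch of the tool loop behind "LLM search".
          # call_llm and web_search are stubs; in a real system they would be
          # an LLM API and a search API (Bing, Brave, an internal index, ...).

          def call_llm(prompt: str) -> str:
              # Stub: a real call would hit a model endpoint and return generated text.
              return f"[model output for: {prompt[:40]}...]"

          def web_search(query: str, top_k: int = 5) -> list[dict]:
              # Stub: a real call would hit a search API and return ranked results.
              return [{"title": "Example", "url": "https://example.com", "snippet": "..."}][:top_k]

          def answer_with_search(question: str) -> str:
              # 1. The model can only emit text, so first ask it to emit a query.
              query = call_llm(f"Write one web search query for: {question}")

              # 2. Ordinary non-LLM tooling does the actual searching.
              results = web_search(query)

              # 3. Titles, URLs and snippets get pasted back into the prompt; the
              #    model writes an answer and cites the URLs it was handed (those
              #    are the "source links" people mean upthread).
              context = "\n".join(f"- {r['title']} ({r['url']}): {r['snippet']}" for r in results)
              return call_llm(f"Question: {question}\nResults:\n{context}\nAnswer citing the URLs.")

          print(answer_with_search("best way to descale an espresso machine"))
          ```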

        • gustofwind@lemmy.world
          2 days ago

          Yep this has happened to me too

          I used to always get the results I was looking for; now it’s just pure garbage, but Gemini will have all the expected results as sources.

          Obviously deliberate, to force us onto Gemini and make free searches useless. Maybe it hasn’t been rolled out to everyone yet, but it’s certainly got us.

          • AmbitiousProcess (they/them)@piefed.social
            2 days ago

            I’m honestly not even sure it’s deliberate.

            If you give a probability-guessing machine like an LLM the ability to actually review content, it’s probably just gonna be more likely to rank things the way you expect for your specific search than an algorithm built to pull the most relevant links extremely quickly… based on only part of the page as keywords, with no understanding of how the context of your search relates to each page.

            The downside is, of course, that LLMs use way more energy than regular search algorithms, take longer to provide all their citations, etc.
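
            To make the contrast concrete, here’s a rough sketch: a deliberately naive keyword scorer on one side and a stubbed-out “LLM judgment” on the other. Everything in it (URLs, page text, scores) is invented; a real re-ranker would be an actual model call.

            ```python
            # Sketch: keyword ranking vs. LLM re-ranking. llm_relevance is a stub
            # with made-up scores; a real system would ask a model how well each
            # page answers the query.

            CANDIDATES = [
                {"url": "https://example.com/seo-listicle",
                 "text": "sourdough sourdough sourdough best sourdough recipes top 10 sourdough"},
                {"url": "https://example.com/forum-answer",
                 "text": "my starter was too cold; keeping the dough warmer fixed the rise"},
            ]

            def keyword_score(query: str, text: str) -> int:
                # Classic-search stand-in: count query terms on the page. Very fast,
                # but easy to game by repeating keywords, i.e. SEO slop wins.
                return sum(text.count(term) for term in query.lower().split())

            def llm_relevance(query: str, text: str) -> float:
                # Stub for an LLM judgment ("0-1: does this page answer the query?").
                # Slower and far more energy per result, but it reads the page in
                # context instead of counting words. The scores are invented.
                return 0.9 if "fixed the rise" in text else 0.2

            query = "why does my sourdough not rise"
            by_keywords = sorted(CANDIDATES, key=lambda c: keyword_score(query, c["text"]), reverse=True)
            by_llm = sorted(CANDIDATES, key=lambda c: llm_relevance(query, c["text"]), reverse=True)
            print("keyword ranking:", [c["url"] for c in by_keywords])
            print("llm re-ranking: ", [c["url"] for c in by_llm])
            ```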

            • Prove_your_argument@piefed.social
              1 day ago

              A ton of factors have increased energy costs on the web over the years. It’s insignificant per person, but bandwidth use is vastly higher because all websites ship miles of cruft nowadays. Memory usage is out of control. Transmitting and storing all the metadata your web browser reports in real time as you move your mouse around a page costs far more than anything we had in the early days of the web.

              The energy cost of ML will come down as chips progress, but I think the financial reality will come crashing down on the AI industry sooner rather than later and keep it out of reach for most people anyway due to cost. I don’t see much ROI in AI. Right now it’s treated as a capital investment that helps inflate company valuations while it lasts, but after a few years the investment is worthless, and a giant money sink from energy costs if it’s actually used.

    • leftzero@lemmy.dbzer0.com
      2 days ago

      Because they intentionally broke the search engines in order to make LLMs look better.

      Search engines used to produce much more useful results than LLMs ever will (even if you ignore the answers LLMs just make up), before Google and Microsoft started pushing this garbage.

      • fristislurper@piefed.social
        1 day ago

        Nahh, even before LLMs became big, search results were getting worse and worse because of all the SEO spam.

        Now you can generate coherent-sounding articles using LLMs, making the amount of trash even bigger.

        Google making their search worse would be dumb, since all these LLMs also rely on it to some degree.

      • Prove_your_argument@piefed.social
        1 day ago

        I don’t think so, man. The ad industry had only just started mass training on SEO 15 years ago; now it’s practically a household term.

        Google had very different goals and ambitions early on. Things changed; now they’re like any other giant soulless corpo. Their goal is revenue, and as the platform has grown we’ve watched it morph from a focus on interesting things into a never-ending troupe of professional jesters on every endpoint, all still making pennies on the dollar compared to what Google earns from advertisers. Google is the middleman and should be making next to nothing, since they produce next to nothing of value, but advertising sells.

        Google Search revenue was something like $175bn in 2024, and only a tiny fraction of that gets paid out to websites for clicks. Someone with an LLM properly tuned for SEO can churn out hot garbage nonstop and fill up the results in perpetuity, with a guaranteed revenue stream far beyond what an ordinary job pays in most of the world. There’s just more garbage than ever… and people have found exactly the right formula to rise to the top with content that’s useless to humans. Google doesn’t think of that as a bad thing: $175bn in revenue from search! lol

      • Prove_your_argument@piefed.social
        1 day ago

        The genie is out of the bottle for good, unfortunately.

        They could have built a system of genuine human reviewers who manually curate content so that your search results are great, but that would mean fewer clicks, and fewer clicks means less revenue. They’re financially incentivized to make you click as much as you’re willing to.