• hexaflexagonbear [he/him]@hexbear.net · 12 points · 6 months ago

      A lot of this content was already auto-generated, in the sense that many sites operate on the business model of: generic template, scrape some data, generate an article about the release date of a popular movie or game. As a replacement for those sites this might actually be a mild improvement… well, until it starts hallucinating release dates and performs even worse than the human scrapers, confusing new movies with the ones they’re remakes of, or ones that just have similar titles.

      • blobjim [he/him]@hexbear.net · 2 points · 6 months ago

        I wish Google would just start having a policy of immediately delisting those websites from search results, or at least deprioritizing or graying them out. If it falls under a certain category of website, it should meet certain quality standards. Like I assume they already do for medical information.

        If they had a more general search version of Google Scholar where it was all stuff that Google reasonably thinks was actually made by other humans, that would improve things a lot.

    • Justice@lemmygrad.ml · 7 points · 6 months ago

      Also add “features that exist, are now shittier, and btw, you now have to pay, and oh also btw yes, you will have ads even with payment. Problem? Go fuck yourself.”

      Great trend

    • Sasuke [comrade/them]@hexbear.net · 7 points · 6 months ago

      i think this is a genuine possibility in schools now, at least in my country. maybe not removing access to the internet entirely, but at least severely restricting it, and going back to analog teaching methods wherever possible.

      we were very early on digitalizing education btw, giving every elementary school kid their own tablet, implementing digital tools in almost every subject, and in recent years, replacing physical books with digital copies.

      and surprise surprise, it’s been a fucking disaster. i can’t even imagine what it’ll be like as AI gets more widespread

        • CptKrkIsClmbngThMntn [any]@hexbear.net · 2 points · 6 months ago

          I like them, but a) I tend to ruin them if I’m ever carting them around (especially if they’re paperbacks) and b) I’ve noticed that when I have a book on my phone, it’s much easier to dip into it and read a few paragraphs in the in between moments when I would otherwise open social media or something. With a physical book it’s much less likely I’ll dig it out of my bag if I’m waiting for the bus for five minutes and my hands are full.

      • blobjim [he/him]@hexbear.net · 3 points · 6 months ago

        How much of it has to do with the fact that adults are just as clueless about how to use computers as kids?

        Is it really that hard to just have people use computers the same way they’d use books? Not everything needs to start with a Google search.

    • blobjim [he/him]@hexbear.net · 4 points · 6 months ago

      You can still use the internet to look at books and documents. Random websites and Google search have never been a replacement for actual research.

      • optissima@lemmy.ml · 1 point · 6 months ago

        How do you find your research without any search engines and/or do you think the engine you’re using will never try to implement an llm-based search? Corporate has its fist deep in academia these days…

        • blobjim [he/him]@hexbear.net · 1 point · 6 months ago

          You find things through real-world networking, or hyperlinks from other websites. Of course it’s much easier to use a general search engine like Google. The point is that a search engine becoming bad doesn’t suddenly mean “the internet” is unusable and you have to resort to going to a library or something.

          I just don’t like it when people conflate a couple large websites with the “internet” itself, which is really just an evolution of telephone and telegraph systems that connect the world together. The utility of accessing remote data doesn’t go away. And pretending it does is hyperbole.

          The internet is also so much more open and easy to access than any system before it. Telephone and telegraph systems were so much more limited. Like a phone number is attached to your identity as a real person in a way that an IP address or network interface isn’t. That’s a really powerful thing.

    • TheDoctor [they/them]@hexbear.net · 27 points · 6 months ago

      Yeah, the main sites are somehow managing to squander their infinite content factories. It’s like the procedurally generated video games where everything is theoretically unique, but you learn to recognize the patterns and everything feels the same.

        • TheDoctor [they/them]@hexbear.net · 9 points · 6 months ago

          I think that if game companies can find good corpuses to train on and train their own LLMs with less strict respectability filters, using AI for NPC dialogue could be a legitimate boon to immersion. It’s one of the few cases where I could see LLMs being ethically sourced (for example using the massive amounts of text in The Elder Scrolls universe to train on) and not displacing too many jobs since there would still need to be writers for main quests and for guiding plot points on random encounters.

  • Lerios [hy/hym]@hexbear.net · 29 points · 6 months ago

    why? genuinely who does this help and how does it make google money? it seems like they’re paying for the energy for ai content in exchange for absolutely nothing

    • Awoo [she/her]@hexbear.net · 30 points · 6 months ago

      The people internally at Google are techbro true believers. If it’s new technology it is inherently good and an improvement.

      • TheDoctor [they/them]@hexbear.net · 20 points · 6 months ago

        God, it’s sad but you’re probably right. We had to implement something AI-related at work because the board all had massive hard ons for the buzzwords. They literally could not have given less of a shit what we used it for. We had full autonomy as long as ChatGPT ended up in our dependency tree somewhere.

        • homhom9000 [she/her]@hexbear.net · 10 points · 6 months ago

          Same here. Every all hands at work emphasizes the need to use AI. Except they have no clue what to do with it yet beyond chatbots but we need to use it right now or else.

        • Lerios [hy/hym]@hexbear.net · 4 points · 6 months ago

          yeah same. we have an AI assistant now and every meeting has a ‘gentle reminder’ that the sales people and devs and tech support etc etc should be using it. they’re never specific about what we should be using it for, and the one time i touched it, it didn’t seem like it even had access to our documentation.

          is it really that simple? this is a massive capitalist company, surely they have to understand that they should be acting to improve their material conditions? random libs not understanding shit is fine, but i thought the actual capitalists themselves understood capitalism. exchanging material wealth for like cyberpunk vibes or whatever is genuinely insane.

          • TheDoctor [they/them]@hexbear.net · 4 points · 6 months ago

            surely they have to understand that they should be acting to improve their material conditions?

            In my experience with execs, they find a guiding principle from a book or a conference speaker and treat it like a personal religion. Everything outside of that is very much vibes based. There are a lot of conference talks that try to summarize new tech stuff for execs but it’s very much a short overview followed by practical applications. They don’t understand the stuff experientially unless they happen to do a deep dive on their own.

    • alexandra_kollontai [she/her]@hexbear.net · 3 points · 6 months ago

      The theory is that people don’t want to click through blue links trying to find a source (or sources) they can trust; they want an instant summarised answer to any question instead. Google already does instant summarised answers for things like “when is the next public holiday” - generative AI content would expand these instant answers to any question, at the cost of accuracy. Google thinks ChatGPT is taking their market share (which it kinda is, and kinda was a year ago when they started developing this). The big idea of this new feature is to retain market share, which is a prerequisite to making money.

    • blobjim [he/him]@hexbear.net · 2 points · 6 months ago

      I thought Google was already incorporating some machine learning stuff into the core search algorithm anyways, which would be a much better use than directly making up sentences.

  • InevitableSwing [none/use name]@hexbear.net (OP) · 28 points · 6 months ago

    Bluesky threads are already full of people laughing at this “pivot to video” moment. I’m pretty sure they didn’t even bother to read the article. It’s a typical social media site. Everybody is like-insane. Minutes - even seconds - count. Post first - read later if at all.

    I think this is awful. Aggressive plagiarism by Google could (will?) make it a big success.

    Google calls its AI answers “overviews” but they often just paraphrase directly from websites.

    […]

    Jake Boly, a strength coach based in Austin, has spent three years building up his website of workout shoe reviews. But last year, his traffic from Google dropped 96 percent. Google still seems to find value in his work, citing his page on AI-generated answers about shoes. The problem is, people read Google’s summary and don’t visit his site anymore, Boly said.

    ---

    Edit

    To be clear - my main point is that I think Google is going to plagiarize as much as they want to try to get this shit to work. They won’t be stopped by congress and they won’t be stopped by the courts. Will plagiarizing work well enough to generate aiShittyText that Joe Schmo, who shops at Walmart and isn’t tech savvy, will happily consume? It might be a ginormous flop. But my gut says Google’s plan might work.

    Rant: Holy mother of fuck. I haven’t had a pointed online convo outside of Hexbear in a very long time. I totally forgot how annoying the net can be. Reddit is bad enough, but Bluesky can be like trying to yell an argument through a keyhole due to the 300 character limit.

    • Kereru [he/him]@hexbear.net · 7 points · 6 months ago

      I agree, I think this could work. Google already has featured snippets, this just feels like an extension to that. I’m pretty sure those snippets often screwed over the sites they were taken from too, because people read them but don’t click through. But the AI summary ensures they get even less credit/ad revenue.

      For any high-value search terms, Google hides the summary. So you either get ads or AI slop for every search.

    • blobjim [he/him]@hexbear.net · 1 point · 6 months ago

      It seems like it’s going to force people to make their websites less accessible or something, to prevent Google from getting the full answer. Like they’ll have some leading information indexable by Google and the rest of the answer will be in a video. Or maybe websites already do this in some way?

      It seems like trying to monetize publicly available content on the internet is a crapshoot anyways. Hence why websites have paywalls, or additional stuff that you pay for to go along with the content, like merchandise.
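
      (An aside, not something the comment itself mentions: one concrete lever site owners do have is Google’s separate crawler token, Google-Extended, which controls whether content can be used for Google’s AI products without delisting the site from ordinary search. A minimal robots.txt sketch:)

      ```text
      # Opt content out of Google's AI product use (Google-Extended token);
      # ordinary Googlebot search indexing is not affected by this rule.
      User-agent: Google-Extended
      Disallow: /

      # Keep the rest of the site open to all other crawlers
      User-agent: *
      Allow: /
      ```

      (Caveat: Google has described AI Overviews as drawing on the regular Search index, so this token may govern model training more than the summaries themselves - which is exactly the kind of bind described above.)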

      • InevitableSwing [none/use name]@hexbear.net (OP) · 1 point · 6 months ago

        It seems like it’s going to force people to make their websites less accessible or something, to prevent Google from getting the full answer.

        AI projects are laughable in many ways and google has a long track record of failure in most of its projects. But this to me feels different. Google is the monster in the room when it comes to search. I think google will give small web publishers a horrible choice.

        • Block us? We will fuck you over by never showing your sites in search results. Of course - we’ll lie and say everything was aboveboard. The algos made the choices all by themselves!

        • Allowing us to suck up your data is the better and more intelligent choice. We’ll give you some crumbs, peasants. Something is better than starving, right?

  • RyanGosling [none/use name]@hexbear.net · 23 points · 6 months ago

    Sometimes when I Google the legality of certain things in my state, it brings up the laws of other states in the very top box lol. Can’t wait for AI to make search results “good” by completely making shit up instead of just giving me inaccurate answers

  • JayTwo [any]@hexbear.net · 18 points · 6 months ago

    In one of the forums for the niche hobby I’m into, Google snippets have already been causing chaos for years.

    E.g.: “You’re wrong, because Google says I need to do this.” Well, Google is wrong, and doing that is the entire reason you’re having so many problems.
    When you investigate how the snippet was made, it’s either from a review or forum comment by a newbie that somehow got traction, or often from someone saying DON’T do it that way - but their algorithm doesn’t pick up the nuance and gets it twisted.
    More recently it’s been taking snippets of entirely AI-generated pages that write absolute gibberish, which sounds impressive to people with only a passing familiarity.

    This is gonna make things sooooooo much worse.

    • InevitableSwing [none/use name]@hexbear.net (OP) · 5 points · 6 months ago

      I wonder if this will happen: “To make a chocolate milkshake: 1. Thank you kind stranger!..” And then news will break that Google is claiming a malformed algorithm caused their AI to suck up the entirety of Reddit. And after that Google will be forced to admit “Oh, oops!” the wonky algorithm caused Google to suck up ginormous amounts of data from 10,000s of sites on the net. Then they’ll say they’re “untraining” which is another big lie. All they’ll do as fast as they can is smooth out plagiarism so they can have deniability.

      I wouldn’t be surprised if a few years from now Google’s legal team is at the supreme court claiming something ridiculous. Some of the best legal minds in the US are pushing the bullshit idea that AI cannot plagiarize because it doesn’t know what plagiarism actually is. And the GOP majority seems to love the idea.

    • blobjim [he/him]@hexbear.net · 1 point · 6 months ago

      Three quarters of that is idiots not only believing what they see on the internet, but believing the first thing they see at the top of search results, without context.

      We should be using computers as a means of interfacing with things generated by other humans, not expecting some simple algorithm to read our minds.

  • ZWQbpkzl [none/use name]@hexbear.net · 15 points · 6 months ago

    Given the current state of Google search results, this really just sounds like cutting out the middleman. Complaints from SEO-powered garbage like The Spruce fall on deaf ears.