• binom@lemmy.world · 11 months ago

        does this actually help the internet archive in any way? as in, are your local resources used or ad revenue generated? i fail to see how telling them to archive everything you visit is of any help to them. other than you basically being a crawler, i guess

          • binom@lemmy.world · 11 months ago

            i still think this kind of shotgun approach is not ideal, and the extension seems more like a service to me than a way to help the ia. so “help contribute” is not the wording i would choose. but i very well might be missing the point. i do love the internet archive and their fight for information freedom, don’t get me wrong. this was more of a nitpick.

  • GeekFTW@kbin.social · 11 months ago

    The Internet is not forever after all

    Lmao never was. Shit you don’t want on the Internet will never leave. Shit you do want on the Internet fucking disappears all the goddamned time.

  • slipperydippery@lemmy.world · 11 months ago (edited)

    It looks like they misunderstand how to improve their SEO ranking.

    In fact, on Tuesday, Google’s SearchLiaison X account tweeted, “Are you deleting content from your site because you somehow believe Google doesn’t like ‘old’ content? That’s not a thing! Our guidance doesn’t encourage this. Older content can still be helpful, too. Learn more about creating helpful content.”

    • SpaghettiYeti@lemmy.world · 11 months ago

      They really don’t. They’re going to hurt their domain authority and backlinks.

      It’s more valuable to make an update to past pages because Google sees it as useful content that is being maintained.

      You’re supposed to make tweaks once a year so it’s not stale, not nuke yourself.

    • body_by_make@lemmy.dbzer0.com · 11 months ago

      TBH this doesn’t make me certain this tactic won’t work; Google hardly seems to know how its own ranking works. They sorta intentionally keep it that way so they can blame anything suspicious on their black box, “AI”.

  • AutoTL;DR@lemmings.world [bot] · 11 months ago

    This is the best summary I could come up with:


    “Unfortunately, we are penalized by the modern Internet for leaving all previously published content live on our site,” Taylor Canada, CNET’s senior director of marketing and communications, told Gizmodo.

    Proponents of SEO techniques believe that a higher rank in Google search results can significantly affect visitor count, product sales, or ad revenue.

    However, before deleting an article, CNET reportedly maintains a local copy, sends the story to The Internet Archive’s Wayback Machine, and notifies any currently employed authors that might be affected at least 10 days in advance.

    It is perhaps another sign of how bad things have become with Google’s search results—full of algorithmically generated junk sites—that publications like CNET are driven to such extremes to stay above the sea of noise.

    From time immemorial, the protection of historical content has required making many copies without authorization, regardless of the cultural or business forces at play, and that has not changed with the Internet.

    Archivists operate in a parallel IP universe, borrowing scraps of reality and keeping them safe until shortsighted business decisions and copyright protectionism die down.


    I’m a bot and I’m open source!

  • skankhunt42@lemmy.ca · 11 months ago

    Unfortunately, we are penalized by the modern Internet for leaving all previously published content live on our site

    Even if this is true, which I doubt, why not edit your robots.txt to keep the old pages out of the index and leave the content up?
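
    A minimal sketch of that robots.txt approach (the /2004/ path is made up for illustration; strictly speaking, Disallow only stops crawlers from fetching pages, so fully removing live pages from the index takes a noindex directive instead):

      # Hypothetical robots.txt: keep old articles live, but ask
      # crawlers not to fetch anything under the made-up /2004/ path.
      User-agent: *
      Disallow: /2004/

      # Note: Disallow blocks crawling, not indexing. To de-index a
      # page that stays live, serve a noindex directive instead, e.g.
      # the HTTP header:  X-Robots-Tag: noindex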

  • 6xpipe_@lemmy.world · 11 months ago

    However, before deleting an article, CNET reportedly maintains a local copy, sends the story to The Internet Archive’s Wayback Machine, and notifies any currently employed authors that might be affected at least 10 days in advance.

    People are freaking out so bad about this story. They’re doing the right thing and archiving it before deletion. Settle down.

    How many CNET articles from 2004 are you reading that you’re getting this angry about it?
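
    For anyone who wants to do the same for pages they care about, the Wayback Machine has a public Save Page Now endpoint at web.archive.org/save/; a minimal sketch in Python (the article URL is a placeholder, and the redirect behavior described in the comments is an assumption about the endpoint):

      import urllib.request

      # Ask the Wayback Machine to capture a page via its public
      # Save Page Now endpoint. The article URL below is a placeholder.
      target = "https://www.cnet.com/some-old-article/"
      req = urllib.request.Request(
          "https://web.archive.org/save/" + target,
          headers={"User-Agent": "wayback-save-example/0.1"},
      )
      with urllib.request.urlopen(req) as resp:
          # On success, the request redirects to the fresh snapshot,
          # so the final URL should point at the archived copy.
          print(resp.status, resp.geturl())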

    • HakFoo@lemmy.sdf.org · 11 months ago

      Storage and bandwidth have never been cheaper. If you’re not doing some grand replacement of the CMS, it’s less effort NOT to remove old content.

      I love the argument they’re trying to make: if they prune enough content, everything looks fresh and new. So you’re effectively discarding one of the most valuable assets you have (the fact that you’ve been doing the same thing for 25 years and have some established credibility) for a perception of “fast” that could be imitated by any number of content mills or AI services.

      If you’re looking at a review of an RTX 4090, it says a lot when the same site also scored the Radeon VII, GeForce 3 Ti, and S3 Savage.

  • Gyoza Power@discuss.tchncs.de · 11 months ago

    Jesus. I long for the day we get rid of these cancerous companies that just ruin the internet with every day that passes.

    • poppy@lemm.ee · 11 months ago

      However, before deleting an article, CNET reportedly maintains a local copy, sends the story to The Internet Archive’s Wayback Machine, and notifies any currently employed authors that might be affected at least 10 days in advance

      From the article, CNET is archiving it on Wayback themselves.

  • TurnItOff_OnAgain@lemmy.world · 11 months ago

    All of the GeoCities websites I used to go to proved that the internet wasn’t forever. Did anyone really think it was?

  • FriendlyBeagleDog@lemmy.blahaj.zone · 11 months ago (edited)

    It’s fairly silly that this course of action is the consequence of a desire to manipulate search engine results, but at least they’re archiving the articles before taking them down.

    To address the headline, though, I don’t think that anybody reputable ever seriously claimed that the internet was forever in a literal sense - we’ve been dealing with ephemerality and issues like link rot from the beginning.

    It was only ever commonplace to say the internet was forever in the sense that fully retracting anything once posted could range from difficult to impossible after it’d been shared a few times.

    Only in the modern era, dominated by corporations offering a platform in perpetuity, have we been afforded even the illusion of dependable permanence. Honestly, I’m much more comfortable with less widely distributed content being able to entropy out of existence than with a permanent record of everything ever made public.