AI’s voracious need for computing power is threatening to overwhelm energy sources, requiring the industry to change its approach to the technology, according to Arm Holdings Plc Chief Executive Officer Rene Haas.

  • crispyflagstones@sh.itjust.works · 2 months ago

    There’s a whole resurgence of research into alternative computing architectures right now, led by some of the biggest names in computing, because of the limits the von Neumann architecture has hit with ML workloads. I don’t see any reason to assume all of that research is guaranteed to fail.

    • AlotOfReading@lemmy.world · 2 months ago

      I’m not assuming it’s going to fail; I’m just saying that the exponential gains seen in early computing are going to be much harder to come by, because we’re not starting from the same grossly inefficient place.

      As an FYI, most modern computers are modified Harvard architectures, not von Neumann machines. There are other, even more exotic architectures being explored, but I’m not aware of any that are massively better on the power side (vs. simply being faster). The acceleration approaches I’m aware of that are more promising on power (e.g. analog or optical accelerators) are also totally compatible with traditional Harvard/von Neumann architectures.

      • crispyflagstones@sh.itjust.works · 2 months ago

        I don’t know that by comparing it to ENIAC I meant to suggest the exponential gains would be identical, but we are currently in a period of exponential gains in AI, and it’s not exactly slowing down. It just seems unthoughtful and uncritical to measure the overall efficiency of a technology by its very earliest iterations when the field it’s based on is moving as fast as AI is.