Running AI models without matrix math means far less power consumption—and fewer GPUs?

  • Pennomi@lemmy.world · 2 points · 5 days ago

    Only for maximum efficiency. LLMs already run tolerably well on normal CPUs and this technique would make it much more efficient there as well.
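    The efficiency gain comes from the technique the headline refers to: constraining weights to ternary values so a matrix-vector product needs no multiplications. This is a minimal sketch of that idea, not the paper's actual implementation; the function name and data are illustrative.

    ```python
    def ternary_matvec(weights, x):
        """Matrix-vector product with ternary weights in {-1, 0, +1}.

        Each multiply is replaced by an add, a subtract, or a skip,
        which is the core of the matmul-free approach.
        """
        out = []
        for row in weights:
            acc = 0.0
            for w, xi in zip(row, x):
                if w == 1:
                    acc += xi   # +1 weight: add instead of multiply
                elif w == -1:
                    acc -= xi   # -1 weight: subtract instead of multiply
                # 0 weight: contributes nothing, skip entirely
            out.append(acc)
        return out

    W = [[1, -1, 0],
         [0,  1, 1]]
    x = [2.0, 3.0, 5.0]
    print(ternary_matvec(W, x))  # [-1.0, 8.0]
    ```

    On a plain CPU, adds and subtracts like these are cheaper than floating-point multiply-accumulates, which is why the approach could help even outside GPU deployments.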