One million Blackwell GPUs would suck down an astonishing 1.875 gigawatts of power. For context, a typical nuclear reactor produces only about 1 gigawatt.
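The arithmetic behind that figure is simple to check. A minimal sketch, assuming roughly 1,875 W per GPU (a per-GPU draw inferred from the article's stated total, not an official NVIDIA spec):

```python
# Back-of-the-envelope check of the article's 1.875 GW figure.
# WATTS_PER_GPU is an assumption inferred from the total, not a spec.
GPUS = 1_000_000
WATTS_PER_GPU = 1_875

total_gw = GPUS * WATTS_PER_GPU / 1e9
print(f"Total draw: {total_gw:.3f} GW")                 # 1.875 GW
print(f"~1 GW nuclear reactors needed: {total_gw:.1f}")  # about 2
```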

Fossil fuel-burning plants, whether natural gas, coal, or oil, typically produce even less. There is no way to ramp up nuclear capacity on the timescale it will take to ship these millions of chips, so much, if not all, of that extra demand will be met by carbon-emitting sources.

  • Fermion · 4 months ago

    Resistive heating is not the dominant energy loss mechanism in modern computing. Since the advent of field effect transistors, switching losses dominate. Room temperature superconductors could be relevant in power generation, distribution, and manufacturing, but would not radically alter the power requirements for computing.
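    The commenter's point can be illustrated with the standard dynamic power relation for CMOS, P ≈ α·C·V²·f: resistance does not appear in it at all, so a zero-resistance wire leaves it untouched. A sketch with purely hypothetical numbers chosen only to show the scaling:

    ```python
    # CMOS dynamic (switching) power: P = alpha * C * V^2 * f.
    # Note that wire resistance appears nowhere in this formula, which is
    # why superconductors would not eliminate this loss. All values below
    # are hypothetical, for illustration only.
    def switching_power(alpha, c_farads, v_volts, f_hz):
        """Power spent charging/discharging switched gate capacitance."""
        return alpha * c_farads * v_volts**2 * f_hz

    # e.g. activity factor 1.0, 0.5 uF effective switched capacitance,
    # 0.8 V supply, 2 GHz clock:
    p = switching_power(1.0, 5e-7, 0.8, 2e9)
    print(f"{p:.1f} W")  # 640.0 W

    # Halving the supply voltage quarters the power; removing
    # resistance changes nothing here.
    print(switching_power(1.0, 5e-7, 0.4, 2e9) / p)  # 0.25
    ```

    This is also why real power savings in chips have come from voltage scaling and smaller switched capacitance, not lower-resistance interconnect.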

    I personally don’t think any possible room temperature superconductor would be economical to produce at a large enough scale to make a meaningful difference in energy demands. Researchers have pretty thoroughly investigated the classes of materials that are easy to manufacture, which suggests a room temperature superconductor would be prohibitively expensive to produce.