Is PoE more efficient than plugging in adapters for each network device?

And at what scale does it start to matter?

For context: I’m planning a 3-node mesh router setup plus 2 switches, and I was wondering whether, over 5 years, the difference in electricity cost would come to less than the extra upfront cost of PoE. The absolute maximum cable length would probably be around 30 m.
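
For scale, here’s a rough back-of-the-envelope sketch in Python (the electricity price and the wattage differences are assumptions, not measurements) of what a constant difference in draw would cost over 5 years:

```python
# Rough 5-year cost of a constant power-draw difference between PoE and
# individual adapters. All figures are assumptions for illustration only.

HOURS_PER_YEAR = 24 * 365
YEARS = 5
PRICE_PER_KWH = 0.15  # assumed electricity price in $/kWh

def five_year_cost(extra_watts: float) -> float:
    """Cost of drawing `extra_watts` continuously for YEARS years."""
    kwh = extra_watts * HOURS_PER_YEAR * YEARS / 1000
    return kwh * PRICE_PER_KWH

# Plug in whatever difference a power meter actually shows for your setup.
for delta_w in (1, 3, 5, 10):
    print(f"{delta_w:>2} W difference -> ${five_year_cost(delta_w):.2f} over {YEARS} years")
```

Even a 5 W difference works out to only about $33 over 5 years at that assumed rate.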

  • DJArbz@lemmy.notmy.cloud

    I think the efficiency greatly depends on your PoE switch’s efficiency and how many devices you are going to power, versus the efficiency of individual PoE injectors or wall warts.

    There is also a convenience factor of not needing to plug something in locally.

    There’s also security: someone can’t just unplug the power supply for, say, your camera.

  • MangoPenguin@lemmy.blahaj.zone

    My thoughts are:

    • With PoE you’re doing 2 conversions, which could waste more power: AC to 48 V at the switch, and then 48 V down to whatever the device needs via its internal buck converter. You also have slightly more loss on the longer run of low-voltage 48 V DC through the Ethernet cable, versus AC.

    On the other side of things:

    • With PoE you only have 1 AC-DC conversion happening. Every wall wart power adapter has an idle power draw even with no load attached, whereas with PoE only the single switch power supply is wasting power.

    Overall I doubt the difference will be large enough to matter, and some PoE switches are quite power hungry even with nothing plugged in for some reason, so PoE could end up costing more.
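
    To make that trade-off concrete, here’s a minimal sketch that tallies wall draw both ways; every efficiency and idle figure below is an assumption picked for illustration, not taken from any datasheet:

```python
# Compare total wall draw: one PoE switch vs. per-device wall warts.
# All efficiencies and idle overheads are assumptions, not measurements.

device_loads_w = [6, 6, 6, 4, 4]  # hypothetical loads: 3 mesh nodes + 2 switches

def poe_wall_draw(loads, psu_eff=0.88, buck_eff=0.92, poe_idle_w=5.0):
    """One AC->48V conversion at the switch, then a 48V buck in each device.
    poe_idle_w is the assumed extra idle draw of the PoE switch itself."""
    dc_out = sum(load / buck_eff for load in loads)  # power the switch must deliver
    return dc_out / psu_eff + poe_idle_w

def wall_wart_draw(loads, wart_eff=0.80, wart_idle_w=0.3):
    """One AC-DC wall wart per device, each with its own idle overhead."""
    return sum(load / wart_eff + wart_idle_w for load in loads)

print(f"PoE:        {poe_wall_draw(device_loads_w):.1f} W at the wall")
print(f"Wall warts: {wall_wart_draw(device_loads_w):.1f} W at the wall")
```

    With these made-up numbers PoE comes out a few watts behind, mostly because of the switch’s own idle draw; swap in measured figures and the result can easily flip.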

    • gramathy@lemmy.ml

      DC-DC conversion is pretty efficient; I wouldn’t worry about conversion after the initial 48 V, but I would potentially worry about losses in poor-quality home wiring on longer runs in bigger homes.

  • swicano@kbin.social

    I don’t think I would trust anyone’s answer here. The only way to know is to test it. Theoretical talk about ‘more conversions’ kind of discounts the entire field of power supply design. We need someone to slap a Kill A Watt on a system using PoE, and then do it again on that system using external adapters.
    I tried Googling to see if anyone had done that and didn’t find any real testing (on the first page of Google at least).
    I do have these findings to report: 1) PoE is marketed as cost-saving largely on install and maintenance costs: fewer cable runs for awkward AP locations, less electrical work, etc. That means we can’t assume that PoE’s wide usage is due to electricity cost savings. And 2) increasing the efficiency of newer PoE power supplies is an active area of development, meaning that a particularly old set of PoE hardware might be less efficient than expected.

  • litchralee@sh.itjust.works

    As other posters have remarked, it’s difficult to offer a generalized statement on PoE efficiency. One thing I will point out that hasn’t been mentioned yet is that PoE switches tend to have poor “wall to device” efficiency when lightly loaded. Certifications like 80 Plus only assess efficiency at specific loading levels.

    Hypothetically, a 400 W PoE switch serving only a 5 W load may cause an additional 10 W to be drawn from the wall, which is pretty horrific. But when serving loads totalling 350 W, it might draw 390 W from the wall, which could be acceptable efficiency.
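
    Putting numbers on those two hypothetical operating points (both taken from the example above, not from any real switch):

```python
# Wall-to-device efficiency for the two hypothetical loads above.
def wall_efficiency(load_w: float, wall_w: float) -> float:
    return load_w / wall_w

light = wall_efficiency(5, 5 + 10)   # 5 W of PoE load, 10 W extra drawn from the wall
heavy = wall_efficiency(350, 390)    # 350 W of PoE load, 390 W drawn from the wall

print(f"Lightly loaded: {light:.0%}")  # ~33%
print(f"Heavily loaded: {heavy:.0%}")  # ~90%
```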

    Your best bet is to test your specific configuration and see how the wall efficiency looks, with something like a Kill A Watt. Note that even a change from 120 V to 240 V input voltage can affect your efficiency results.

  • poVoq@slrpnk.net

    Probably not, but it is nice to still have WiFi in the house when the power cuts out, as you can easily run the PoE switch on a UPS.

  • Boring@lemmy.ml

    Surely there is a switch out there that can detect the voltage the device needs and only deliver that amount.

  • Max-P@lemmy.max-p.me

    I’ll add that it also depends on the efficiency of the local power supplies if those devices were using wall warts. Those are often pretty generic and may only be loaded at 25%, which for some wall warts is outside their best efficiency range. A single power supply in the form of PoE can be more efficient if it lets both the switch and the PoE regulator on the device operate at a better efficiency point.

    In some ways, stepping 48 V DC down to 3.3/5 V is a bit easier than stepping down the ~170 V that results from rectifying 120 V AC to DC. But the wart could be stepping the 120 V down to 5 V first with a simple AC transformer, which is nearly always more efficient (95%+) than a DC/DC buck converter, though those can still reach 90% efficiency as well.

    In terms of cabling, power loss is a function of current and length (resistance). AC is nice because we can step it up easily and efficiently to extremely high voltages to minimize the current flowing through the wire, and then step it back down to a manageable voltage. In that sense, American 120 V has more loss than rest-of-the-world 240 V, although it only matters for higher-power devices. That also means the location of the step-down matters: if you’re going to run 30 m of Ethernet and a parallel 30 m run of 5 V power, there will be more loss than if you just ran PoE. But again, you need to account for the efficiency of the system as a whole. Maybe you’d have a wart that’s 5% more efficient, but you lose that 5% in the cable and it’s a wash. Maybe the wart is super efficient and it’s still way better. Maybe the switch is more efficient.
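
    To get a feel for how much the supply voltage matters over a run like the 30 m mentioned above, here’s a rough I²R sketch; the wire gauge and the 10 W load are assumptions, and it ignores the small extra current needed to cover the loss itself:

```python
# Approximate I^2*R cable loss for the same delivered power at different voltages.
# Assumes 30 m of 24 AWG copper (~0.084 ohm per metre per conductor) with two
# conductors paralleled in each direction, as 802.3af/at PoE does.

OHM_PER_M = 0.084
LENGTH_M = 30
LOOP_OHMS = 2 * (OHM_PER_M * LENGTH_M) / 2  # out and back, 2 conductors in parallel

def cable_loss(delivered_w: float, volts: float) -> float:
    """Loss from the current needed to deliver `delivered_w` at `volts`."""
    current = delivered_w / volts
    return current ** 2 * LOOP_OHMS

for v in (48, 24, 5):
    print(f"10 W delivered at {v:>2} V over {LENGTH_M} m: ~{cable_loss(10, v):.2f} W lost")
```

    At 48 V the 30 m run loses about a tenth of a watt; at 5 V the same run would lose roughly as much as the load itself, which is why where the step-down happens matters.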

    It’s going to be highly implementation-dependent, based on how well tuned all the power supplies are across the whole system. You’d either need the exact specs of what you’ll run, or to measure both options and see which has the lowest power usage.

    I would just run PoE for the convenience of not having to also have an outlet near the device, especially for APs, which typically work best installed on ceilings. Technically, if you run the heat at all during the winter, the loss from the power supplies will contribute to your heating ever so slightly, but it will also work against your AC during summers. In the end, I’d still expect the losses to amount to pennies or at most a few dollars. The wall-wart option may end up more expensive just in wiring if some devices are far from an outlet.

  • ChaoticNeutralCzech@feddit.de

    Depends on the voltage.

    Most IT devices’ key components run at 3-5 V nowadays and the voltage needs to be converted down if the PoE voltage is higher. This introduces losses, especially if a linear regulator is used as opposed to a buck converter. On the other hand, one powerful PSU is slightly more efficient than a lot of smaller ones.

    I don’t think there is a huge difference. If you run a lot of tiny devices (too small/cheap to use a buck converter) off a significantly higher voltage like 24 V or 48 V and/or the cabling is very long, PoE will be less efficient. If the PoE voltage matches the devices’ adapters’ voltage and the cables are reasonably short (<30 m) with few connectors, PoE may be more efficient.
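
    As a rough illustration of the linear-regulator vs. buck-converter point (the 90% buck efficiency and the 5 V / 1 A load are assumed ballpark figures):

```python
# Loss of stepping a higher voltage down to a 5 V, 1 A load.
# The 90% buck efficiency and the load figures are assumptions.

LOAD_V, LOAD_A = 5.0, 1.0
LOAD_W = LOAD_V * LOAD_A

def linear_loss(v_in: float) -> float:
    """A linear regulator burns the entire voltage drop at the load current."""
    return (v_in - LOAD_V) * LOAD_A

def buck_loss(eff: float = 0.90) -> float:
    """A buck converter's loss depends mostly on its efficiency, not the drop."""
    return LOAD_W / eff - LOAD_W

print(f"Linear from 48 V: {linear_loss(48):.1f} W wasted on a {LOAD_W:.0f} W load")  # ~43 W
print(f"Linear from 9 V:  {linear_loss(9):.1f} W wasted")                            # ~4 W
print(f"Buck, any input:  ~{buck_loss():.1f} W wasted")                              # ~0.6 W
```

    That’s why tiny devices fed a much higher voltage without a buck converter are the worst case.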

  • antlion@lemmy.dbzer0.com

    Considering the PoE switch is also powered by an adapter, I’d say no. You have the same efficiency hit from the AC-DC conversion, plus line losses.

      • JustEnoughDucks

        Really depends. If you have 1 device that draws 10 W through a 90% efficient supply, you lose 1 W. If you have ten 1 W devices, each at 90% efficiency, you also lose 1 W in total.

        It is less efficient if a given device has a lower-efficiency supply or if you have DC/DC converters in series (which is likely with PoE), but the difference is likely negligible.
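
        The arithmetic behind that, spelled out (90% efficiency as assumed above, treating the quoted wattage as what each device draws):

```python
# Same total draw at the same efficiency -> same total conversion loss.
def conversion_loss(draw_w: float, eff: float = 0.90) -> float:
    return draw_w * (1 - eff)

one_big = conversion_loss(10)        # one device drawing 10 W
ten_small = 10 * conversion_loss(1)  # ten devices drawing 1 W each
print(f"One 10 W device: {one_big:.1f} W lost")
print(f"Ten 1 W devices: {ten_small:.1f} W lost")
```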

  • slazer2au@lemmy.world

    It depends on whether your PoE device can communicate back to the switch to lower the power output to match its requirements.

    A switch will generally push the full power over the wire unless the remote device can talk back with the LLDP power management TLV to lower it, while with a wall wart the device will only pull what it needs.

    • MangoPenguin@lemmy.blahaj.zone

      Devices pull power; it doesn’t matter how much the PoE switch supplies, as each device will only pull what it needs.

      • DJArbz@lemmy.notmy.cloud

        A switch will recognize a PoE device and push the full 15.4 W by default. Then the device has the ability to communicate back to the switch to specify how much power is actually needed.

        • Max-P@lemmy.max-p.me

          The switch can put out 15.4 W, but it doesn’t control how much power flows. The device can draw 15.4 W if it wants to, but it won’t necessarily do so. The switch can lower the voltage it supplies, and by doing so cap the power output, but it can’t push a certain amount of power. That would violate the fundamental physics of electronics.

          Put a 2.4 kΩ resistor in as the “device”, and at 48 V the absolute maximum that will flow is ~1 W. The switch would have to push roughly 192 V to force that resistor to dissipate 15.4 W, which would put it way out of spec. And there’s nothing preventing the device from being smart enough to adjust that resistance to maintain 1 W either. That’s basic Ohm’s law.
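
          Worked through (same hypothetical 2.4 kΩ resistor as above):

```python
# Ohm's law check of the hypothetical 2.4 kOhm "device" above.
R_OHMS = 2400.0

def power_at(volts: float) -> float:
    return volts ** 2 / R_OHMS

print(f"At 48 V it dissipates {power_at(48):.2f} W")   # ~0.96 W, about 1 W

# Voltage needed to force 15.4 W into that fixed resistance:
v_needed = (15.4 * R_OHMS) ** 0.5
print(f"Forcing 15.4 W would take ~{v_needed:.0f} V")  # ~192 V, far out of spec
```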

          The device must negotiate if it’s going to use more than the default 15.4 W, or it can advertise that it’s low power so the switch can allocate the power budget to other devices as needed. But the switch can only act as a limiter; it can have the capacity to provide more than the device takes, but it simply can’t force the device to take more.

        • MangoPenguin@lemmy.blahaj.zone

          It will allow up to 15.4 W to be drawn from the port by default; if the device only uses 2 W then it only supplies 2 W.

          You can’t push 15.4 W and have it go nowhere; that’s not how electricity works.