Information doesn’t hold energy itself, so yes, the CPU produces 80W of heat no matter what computation it does.
(Also, it’s pretty hard to define what counts as information; people generally talk about information entropy, which is different from thermodynamic entropy. This can be confusing.)
But your instinct is correct: one cannot reduce entropy without spending energy. And the kind of computation computers do today does change entropy, so some energy must be dissipated as heat to do it. This is Landauer’s principle: https://en.wikipedia.org/wiki/Landauer’s_principle
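To get a feel for the scale involved, here is a quick back-of-the-envelope calculation of the Landauer limit at room temperature (the 1e18 bits/s figure is just an illustrative assumption, not a measured CPU rate):

```python
import math

# Landauer's principle: erasing one bit of information dissipates at least
# k_B * T * ln(2) joules of heat at temperature T.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

e_bit = k_B * T * math.log(2)  # minimum energy per erased bit, joules
print(f"Landauer limit per bit: {e_bit:.3e} J")  # ~2.87e-21 J

# A hypothetical chip erasing 1e18 bits per second would dissipate only
# about 3 mW at this limit -- real 80W CPUs run many orders of magnitude
# above the theoretical minimum.
p_min = e_bit * 1e18
print(f"Minimum power at 1e18 bits/s: {p_min:.3e} W")
```

So the heat a CPU actually produces is dominated by engineering inefficiency (switching losses, leakage), not by the thermodynamic cost of the computation itself.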
Also, you need to remember that energy cannot be created or destroyed. Expending energy to do something doesn’t mean there will be less energy in the end.
and that’s a no-no.
Not sure why you think that? If information holds no energy, then creating it without converting energy isn’t against any physical laws.

Well yeah but quantum computers are naturally reversible.
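A classical toy example of why reversibility matters here: a reversible gate is a bijection on its inputs, so no bits are ever erased and Landauer's bound imposes no minimum heat. The sketch below checks this for the CNOT gate (a purely classical truth-table illustration, not a quantum simulation):

```python
# CNOT maps (a, b) -> (a, a XOR b). It is a bijection on two bits and is
# its own inverse, so no information is destroyed by applying it -- which
# is why reversible (and quantum) gates evade Landauer's erasure cost.

def cnot(a: int, b: int) -> tuple[int, int]:
    return a, a ^ b

states = [(a, b) for a in (0, 1) for b in (0, 1)]
images = [cnot(*s) for s in states]

assert sorted(images) == sorted(states)                        # bijection
assert all(cnot(*cnot(a, b)) == (a, b) for a, b in states)     # self-inverse
print("CNOT is reversible")
```

By contrast, an irreversible gate like AND maps four input states onto two outputs, so running it necessarily discards information.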