Let me clarify what I mean:
If you put 80W of electrical energy into an ideal electrical heater, it will put out 80W of heat energy.
But what if you put 80W of electrical energy into a CPU and let it calculate things, creating new information?
Will it also output exactly 80W of heat?
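For a sense of scale (this is my own back-of-envelope addition, assuming Landauer's principle applies, i.e. that the thermodynamic minimum cost is k_B·T·ln 2 per irreversibly processed bit at temperature T — the 300 K figure is an assumed room temperature):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Landauer's bound: minimum heat dissipated per irreversibly erased bit
e_bit = k_B * T * math.log(2)   # roughly 2.9e-21 J per bit

power_in = 80.0                 # W, i.e. J/s fed into the CPU
# If all 80 W were converted into "information" at the Landauer limit,
# the CPU would need to handle this many bits every second:
bits_per_s = power_in / e_bit

print(f"energy per bit: {e_bit:.3e} J")
print(f"bits needed to absorb 80 W: {bits_per_s:.3e} bits/s")
```

If the arithmetic above is right, soaking up a meaningful fraction of 80 W as "information" would take on the order of 10²² bit operations per second, vastly more than any real CPU performs, which suggests the heat deficit would be immeasurably small even in principle.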

Or is some energy transformed into “information”, so the CPU will radiate less heat?

My instinct is that if information isn't energy, then you could theoretically create it (thereby reducing entropy) without expending any energy, and that's a thermodynamic no-no.

But if it is energy, then a CPU running a random number generator (producing no useful information) at max load would get hotter than one doing actual calculations, which also sounds wrong.

(I’m neither a physicist nor a computer scientist, in case that wasn’t obvious)