Ok so I don’t quite understand why self hosting makes the situation better:
Turns out once you’ve played enough, the database on the host just gets too big and chokes the CPU threads, since it can’t use more than 2 cores
Ok so you will have the same problem when hosting it on a dedicated server…
Wife is hosting and pulling 10 fps with an Nvidia 3070 Ti
I have an old HP SFF i5 with 16 GB RAM and an SSD
Are you telling me that the old HP SFF has more performance than your wife’s PC?
I also don’t quite understand what using a dedicated server has to do with FPS, because server calculations should usually not impact it, except when the game’s code is utter garbage and it performs server calculations on the rendering thread…
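To make that concrete, here’s a minimal sketch in plain Python threads (nothing to do with this particular game’s actual engine or code): when the simulation tick runs on its own thread, an expensive tick only delays world updates, it doesn’t block the render loop.

```python
import threading
import time

# Minimal sketch, not this game's real architecture: the "server" tick runs on
# its own thread, so an expensive world update delays state changes but never
# blocks the render loop below.

def server_tick_loop(stop_event):
    while not stop_event.is_set():
        time.sleep(0.5)  # pretend the world/database update is expensive

def render_loop(frames=120):
    for _ in range(frames):
        time.sleep(1 / 60)  # rendering keeps its own ~60 FPS pace regardless

stop_event = threading.Event()
threading.Thread(target=server_tick_loop, args=(stop_event,), daemon=True).start()
render_loop()
stop_event.set()
```

Only if the update and the rendering share one thread does a slow tick directly eat into frame time.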
I used to run an old gaming computer as a home server and it felt like $30 a month in electricity.
Let’s assume that server somehow requires 1 kWh per day (which is quite a lot). That would be around 30 ct per day here in Europe, which is like $10 a month. So I’m not sure whether your server was constantly running at 200 W or your power plugs are coated with gold…
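Back-of-the-envelope, assuming 30 ct/kWh (the wattages below are made-up examples, not measurements of anyone’s actual machine):

```python
# Rough electricity cost estimate; the 0.30 EUR/kWh price and the example
# wattages are assumptions, not numbers from this thread.
PRICE_EUR_PER_KWH = 0.30

def monthly_cost_eur(watts, days=30):
    kwh_per_day = watts * 24 / 1000
    return kwh_per_day * days * PRICE_EUR_PER_KWH

for watts in (40, 100, 200):
    print(f"{watts:>3} W continuous -> {monthly_cost_eur(watts):5.2f} EUR/month")
```

So a box idling around 40 W lands near the $10/month figure, and you only get to $30+ if it really does sit near 150-200 W around the clock.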

That’s still a 5W difference.
I have other devices like a TV that require 1-2W when they are in standby/shut off.
OP’s number is way too low. The power supply’s conversion loss alone will already be a few watts.
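For scale, with the same assumed 30 ct/kWh as above, even a constant 5 W draw adds up over a year:

```python
# Same assumed 0.30 EUR/kWh as above: what a constant 5 W difference costs per year.
watts = 5
kwh_per_year = watts * 24 * 365 / 1000    # about 43.8 kWh
print(f"{kwh_per_year * 0.30:.2f} EUR/year")  # about 13 EUR
```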