A PC consuming X watts of power will release heat into the room at a rate of X watts - all that energy has to go somewhere. The RTX 4090 GPU that some people are salivating over draws 450W, and a Threadripper/Xeon-class CPU can draw a couple of hundred watts, so a powerful PC can easily end up releasing heat at a rate of ~1000W. The effect on your room is the same as running a 1kW electric heater - probably not enough to single-handedly heat a large room on a cold night, but certainly enough to help.
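To put rough numbers on that, here's a back-of-the-envelope sketch; the component wattages below are illustrative assumptions, not measurements of any particular build:

```python
# Rough heat output of a high-end PC: electrical power in ~= heat out.
# All wattages are illustrative assumptions, not measured values.
components_w = {
    "RTX 4090 GPU": 450,           # board power under load
    "Threadripper/Xeon CPU": 280,  # "a couple of hundred watts"
    "motherboard/RAM/drives": 100,
    "PSU conversion losses": 90,   # assuming a ~90%-efficient supply
}

total_w = sum(components_w.values())
print(f"Total draw ~= heat output: {total_w} W ({total_w / 1000:.2f} kW)")
# -> in the region of 1 kW under full load, i.e. comparable to a small
#    electric space heater
```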
Now, if you have really efficient thermostatically controlled heating, with sensors in every room and no unheatable cold spots that keep it running indefinitely, the "help" from your computer should mean the heating doesn't have to run for as long at a time, reducing the amount of fuel it uses.
As for "saving" anything - that's complicated. You could see that as a saving if you regard the electricity bill for running your computer as a "sunk cost" - however, your heating may be running off a cheaper fuel (don't know about elsewhere, but in the UK gas is still significantly cheaper per kWh than electricity - let alone solar/heat pump etc.) so you'd still save money by using a lower powered computer.
I checked last year and my Mac Studio was using 15W at idle... which (at the current sky-high UK fuel prices) added up to £40/year (or would, if I left it running 24/7). Even a relatively modest "regular" PC could use 5-10x that much, so we're not talking negligible sums of money here.
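For what it's worth, the annual-cost arithmetic is just watts × hours × price per kWh; here's a minimal sketch, assuming a 30p/kWh tariff and a ~100W figure for the "regular" PC (both assumptions, not measurements):

```python
def annual_cost_gbp(idle_watts: float, price_per_kwh: float = 0.30,
                    hours_per_year: float = 24 * 365) -> float:
    """Annual electricity cost of a device left running 24/7 (assumed tariff)."""
    kwh_per_year = idle_watts * hours_per_year / 1000
    return kwh_per_year * price_per_kwh

print(f"Mac Studio at 15 W idle:  £{annual_cost_gbp(15):.0f}/year")   # ~£39
print(f"'Regular' PC at ~100 W:   £{annual_cost_gbp(100):.0f}/year")  # ~£263
```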