I use a 5600G on a B450 ITX board with 4x 8TB Seagate drives and see about 35W idle and about 40W average. It used to be 45W because I was forced to use a GPU in addition to a 3600 just to boot (even though it's headless, just a bad BIOS setup that I can't fix), and switching to a CPU with integrated graphics dropped my idle consumption quite a bit. I suspect the extra wattage for your machine comes mostly from the bigger motherboard and the less efficient CPU.
It is possible to get the machine part down into single-digit wattage, and then about 5W per drive is the floor without spinning them down, so the minimum you could likely see with a much less powerful CPU is about 30-35W.
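As a rough sketch of that arithmetic (the base-system and per-drive numbers below are assumptions taken from the figures above, not measurements):

```python
# Rough idle-power estimate for a small always-on NAS build.
BASE_SYSTEM_W = 8      # assumed "single digit" floor for board + CPU + PSU overhead
WATTS_PER_DRIVE = 5    # typical idle draw of a spinning 3.5" HDD
NUM_DRIVES = 4

idle_estimate = BASE_SYSTEM_W + WATTS_PER_DRIVE * NUM_DRIVES
print(f"Estimated idle draw: {idle_estimate} W")  # ~28 W, i.e. roughly 30-35 W in practice
```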
Initially a lot of the AI was getting trained on lower-class GPUs, and none of these AI-specific cards/blades existed. The problem is that the models are quite large and hence require a lot of VRAM to work on, or you split them up and pay enormous latency penalties going across the network. Putting it all into one giant package costs a lot more, but it also performs a lot better, because AI is not an embarrassingly parallel problem that can be easily split across many GPUs without penalty. So the goal is often to reduce the number of GPUs you need to get a result quickly enough, which brings its own set of problems with power density in server racks.
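To give a feel for the VRAM side of it, here's a back-of-the-envelope sketch; the model size and precision are hypothetical, and real training needs several times more memory again for gradients, optimizer state and activations:

```python
# Memory needed just to hold the weights of a large model in fp16/bf16.
params_billion = 70      # assumed model size in billions of parameters
bytes_per_param = 2      # 16-bit weights

weights_gb = params_billion * 1e9 * bytes_per_param / 1e9
print(f"Weights alone: {weights_gb:.0f} GB")  # ~140 GB, already more than any single consumer GPU
```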