Nvidia is killing it lately.
Data centers, last decade, huge emphasis on serving up content. Streaming video. Needs lots of CPU cores. I think of it as one core per stream, though it's not actually 1:1. Take bits off a hard drive, serve them out to someone's iPad in chunks. Do this for 45 minutes.
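Roughly what that per-stream loop might look like, as a toy Python sketch (the chunk size, file path, and socket handling are all made up for illustration, not how a real CDN does it):

```python
# Hypothetical sketch of "serve it out in chunks": read a media file off disk
# and push fixed-size chunks to a connected client socket, over and over.
import socket

CHUNK_SIZE = 64 * 1024  # 64 KB per chunk; the real size is a tuning choice

def stream_file(path: str, conn: socket.socket) -> None:
    """Read a media file in chunks and send each one to a connected client."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break               # end of file: the 45-minute episode is done
            conn.sendall(chunk)     # a little CPU + network work, per chunk
```

Multiply that loop by thousands of concurrent streams and you can see why the box wants a lot of cores.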
There's other stuff too, like "run business applications" for one company or another. Constant CPU work.
That's not stopping, but over the next decade, generative AI models will run mostly on GPUs. Each query needs a thread active on a GPU.
So for that emerging workload, you really only need a CPU to route tasks to all the GPUs that are busy generating letters (ChatGPT), pixels (generative art), or frequencies (generative music), and to pass the output back. The ratio of CPU to GPU work is probably way different. And the CPU matters less.
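A minimal sketch of that routing role, assuming a shared request queue and one worker thread per GPU. Here generate_on_gpu is a hypothetical stand-in for the actual inference call, not a real API:

```python
# Toy sketch of "the CPU just routes work to GPUs": workers pop queries off a
# shared queue, run them on their GPU, and hand the result back.
import queue
import threading

NUM_GPUS = 8  # hypothetical box with 8 GPUs
requests: queue.Queue = queue.Queue()

def generate_on_gpu(gpu_id: int, prompt: str) -> str:
    # Stand-in for the real model call pinned to this GPU.
    return f"[gpu {gpu_id}] output for: {prompt}"

def gpu_worker(gpu_id: int) -> None:
    # Each worker pulls a query, runs it on its GPU, and drops the result
    # on the caller's reply queue. That's about all the CPU has to do.
    while True:
        prompt, reply_to = requests.get()
        reply_to.put(generate_on_gpu(gpu_id, prompt))

for gpu_id in range(NUM_GPUS):
    threading.Thread(target=gpu_worker, args=(gpu_id,), daemon=True).start()

# Submit one query and wait for whichever GPU picks it up.
reply: queue.Queue = queue.Queue()
requests.put(("write a haiku about GPUs", reply))
print(reply.get())
```

The CPU-side work here is just queue pops and handoffs; everything expensive happens inside generate_on_gpu.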
My current take ..