Take Adobe apps, for example: they benefit more from the CPU once the GPU reaches a certain level. After a user has a mid-range GPU, extra GPU power matters less than RAM and CPU (and real-world RAM requirements have actually decreased with Apple Silicon).
Another example is ZBrush, which is purely CPU-based. Even in other 3D applications the CPU is more important most of the time, as people working in 3D spend far more time not rendering than rendering, and the machine can render while you put the kettle on.
It’s gamers and some 3D renderers that lean more on the GPU - but CPU 3D rendering is more accurate, so CPU rendering (on render farms, obviously) is the default in Hollywood, while we mortals have to use what’s available on our budgets - typically a desktop GPU rather than a cloud render. When thinking only of rendering for games or lower-end 3D work, the usual options are GPU (cheap and fast on PC) or CPU (slower, more accurate, and generally slightly better on Mac).
When/if Apple release an M4 Ultra with twice the GPU performance of the M4 Max, it should be roughly equivalent to an Nvidia 4090 and set the cat amongst the pigeons. 2025 could be the start of Apple desktop disruption.