I was trying to figure out how far I can push ZBrush, and it seems that ZBrush is not using the resources available to it.
This is a minor problem, but I’m more curious whether I’ve set something up wrong and what is causing the limitation.
I’ve got ZBrush up to 32.5 million active points, and it won’t let me subdivide any further. So I duplicated the subtool, and ZBrush crashed hard with an out-of-memory error.
It says it’s only using 4 GB of memory, and according to ZBrush there are 112 GB currently free. (The rest is being used by Firefox, which is a memory hog.) The CPU idles under 3% usage most of the time; I’ve momentarily gotten it up to around 80% (on all 12 cores), but I wouldn’t even notice if I weren’t watching the graph, because it drops back below 5% immediately.
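To double-check ZBrush’s own readout, I could watch the process from the outside. Here’s the rough sketch I’d use (assuming Python with psutil installed, and that the process is named ZBrush.exe — both are my assumptions, not anything from ZBrush itself):

```python
# Rough sketch: poll the ZBrush process's actual RAM and CPU use
# from outside the app. Assumes psutil is installed (pip install psutil)
# and that the process name is "ZBrush.exe" (my assumption -- check
# whatever name Task Manager shows).
import time
import psutil

PROCESS_NAME = "ZBrush.exe"  # assumed process name

def find_zbrush():
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == PROCESS_NAME:
            return proc
    return None

proc = find_zbrush()
if proc is None:
    print(f"{PROCESS_NAME} not found")
else:
    proc.cpu_percent(None)  # prime the CPU counter; first call is meaningless
    while True:
        mem_gb = proc.memory_info().rss / 1024**3          # resident set size
        cpu = proc.cpu_percent(None) / psutil.cpu_count()  # normalize across cores
        print(f"RAM: {mem_gb:5.1f} GB   CPU: {cpu:5.1f}%")
        time.sleep(1)
```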
I think most users would be more than excited to get this performance, but I’m a little disappointed that I was able to crash it to the desktop while most system resources were sitting idle.
I’ve set Compact Mem to the highest setting, I’ve raised the undo levels, and MaxPolyPerMesh is set to 100 (million).
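Incidentally, that cap might be exactly why it refuses to subdivide further: each subdivision level roughly quadruples the count, and 32.5 million × 4 overshoots 100 million. A quick sanity check (the ×4 factor is the standard quad-subdivision rule; treating points ≈ polys for an all-quad mesh is my simplification):

```python
# Why another subdivision level may be blocked: each level of
# quad subdivision roughly quadruples the poly count.
active_points = 32.5e6        # my current count, treating points ~ polys
max_poly_per_mesh = 100e6     # my MaxPolyPerMesh setting
next_level = active_points * 4
print(f"next level: ~{next_level / 1e6:.0f}M, cap: {max_poly_per_mesh / 1e6:.0f}M")
print("blocked" if next_level > max_poly_per_mesh else "allowed")
```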
Are there any other optimizations I can make to get it to use the resources available?
And what is the bottleneck here? Is it my 2080 Ti? Did it crash from a lack of video RAM or system RAM? There are about 112 GB of unused system RAM, but with that many polygons I could definitely imagine it choking on video RAM, even with a liquid-cooled 2080 Ti. (Although it says it’s only using 4 GB of memory, which is well under the card’s video memory.)
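For what it’s worth, here’s my back-of-envelope math on the system-RAM side (the bytes-per-point figure is purely my guess; ZBrush doesn’t document how it stores meshes internally):

```python
# Back-of-envelope estimate of what a 32.5M-point subtool might cost in RAM.
# 100 bytes/point is a guessed ballpark (position + normal + mask +
# polypaint + overhead); ZBrush's actual internal format is not documented.
points = 32.5e6
bytes_per_point = 100  # assumed figure
one_copy_gb = points * bytes_per_point / 1024**3
print(f"one subtool:       ~{one_copy_gb:.1f} GB")
print(f"after duplicating: ~{2 * one_copy_gb:.1f} GB")
# ~3 GB per copy, ~6 GB for two -- nowhere near 112 GB of free RAM,
# which is why the out-of-memory crash seems odd to me.
```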
I’m pretty certain this is the 64-bit version of ZBrush. For one thing, it recognizes the 112 GB of unused memory, and it’s installed in the regular Program Files folder, not Program Files (x86). Still, it’s suspicious that it never reports using more than 4 GB.
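If anyone wants to verify the binary directly, here’s a quick way to read the PE header and check the machine field (this is the standard Windows executable format, nothing ZBrush-specific; the path is just where I’d guess my copy lives):

```python
# Check whether a Windows .exe is 32- or 64-bit by reading its PE header.
# The path below is my assumed install location -- adjust as needed.
import struct

EXE_PATH = r"C:\Program Files\Pixologic\ZBrush 2019\ZBrush.exe"  # assumed path

with open(EXE_PATH, "rb") as f:
    assert f.read(2) == b"MZ", "not a Windows executable"
    f.seek(0x3C)                                   # location of the PE header offset
    pe_offset = struct.unpack("<I", f.read(4))[0]
    f.seek(pe_offset)
    assert f.read(4) == b"PE\x00\x00", "PE signature missing"
    machine = struct.unpack("<H", f.read(2))[0]    # IMAGE_FILE_HEADER.Machine

print({0x014C: "32-bit (x86)", 0x8664: "64-bit (x64)"}.get(machine, hex(machine)))
```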
ZBrush 2019.1.2