
Decimation Master seems wonky.

Okay, I will try to be concise. I am working on a file with several subtools and have followed the Pixologic video tutorial to the letter, but I seem to be hitting memory issues. When I try to decimate a 9 million-poly subtool (with a unique name), I either get a memory crash during the Decimate Current step or the computer becomes unresponsive during the Pre-process Current step. If I drop down subdivision levels to about 2 million polys, I have managed to decimate the subtool once, and only just barely: it was only after I restarted the computer, went straight to Decimation Master, and ran through the steps several times that I finally succeeded.

I have not changed ZBrush's default settings other than pushing Compact Mem up to 4096. I am running Mac OS X 10.6.4 Snow Leopard on a dual 2.8GHz Quad-Core Intel Xeon with 14GB of RAM and 430GB of free disk space. I run disk utilities regularly and defragment once a week, and there are no other apps running in the background. So I think my system should be able to handle this with no problems, but maybe I'm missing something here.
In fact, over at CGBootcamp I watched Leo Covarrubis decimate a 14.8 million-poly subtool in his video tutorial with only 8GB of RAM on his system. He did say it takes quite a bit of time, so I'm aware of that and have walked away for over an hour, only to come back to a message saying "please wait, computing." In other words, it hadn't even reached the yellow progress bar. Also, my default maximum poly count is 15 million polys; shouldn't I be getting more out of my system than that? Anyway, any help would be greatly appreciated. I'm stuck with a model I can't decimate in order to remesh, and it would be very disappointing if I could only decimate objects up to 2 million polys. Thank you for your time. :cry:

Darn, I put this in the wrong thread… sorry.