
insufficient memory error?

Not being able to use ZBrush with all this RAM and hardware is pretty frustrating. Pixolator, where are you?

Have you set your memory prefs to allow for higher poly counts? I'm not sure if you actually have to, or if ZBrush just won't let you subdivide to higher levels if the memory options aren't set. If you look up a couple of posts you can see the memory prefs that I have set in the screen grab; mainly, the max polys per mesh is the one you need to make sure is set to 10. Maybe that will help, if you don't already have that set?

Also, if anyone who knows is reading this: why is 10 the maximum number under max polys in the prefs menu? It seems odd to have a locked-down number that cannot be exceeded. And what is it doing? Is it some kind of multiplier for the number of polys in your base mesh? Like, what's the difference between, say, 9 and 10; how many more polys can a mesh have if it's set to 10, and why?

Hmm, the post before this one somehow got grey as the letter color, so if you highlight the bottom of it you will see there is more text. I have no idea how that happened! Haha, it was definitely not meant to be a secret message, but you could totally do that if you wanted! :slight_smile:

Yes, I have the max polys set to 10, and it makes no difference. Even though I have plenty of disk space and RAM (3+ gigs), it will not subdivide to a level over 5. And even at a level of 5 I can only do one or two operations on the model before ZBrush chokes and locks up. Is there another forum we can post in that ZBrush staff read more frequently?

My guess would be that if you tried to subdivide a 10 million poly object, it would become 40 million polys, which is probably way beyond what at least Windows' 4 gig memory limit can handle. Don't know about Macs, though.

You could always cut up your model so you can subdivide just the parts you need more polys on. Skycastle has posted his workarounds for such things.

ktaylor… I see from Fast Defrag that you have less than 1 gig of free memory. On my old computer, with that much available, I could never get past about 1.5 million polys before running out of memory like in this screenshot. You could try hitting Clean to free some more memory, and make sure you have a ton of free disk space; less than 3 megs free probably won't cut it.

I forget, but how many polys are you at in that snapshot, before you try to subdivide with only 972 MB left?

Thanks aminuts, but my problem isn't subdividing the mesh; all of that is fine. The mesh in the pic is around 2 million polys, which is plenty for this particular mesh. My problem is that I am actually trying to EXPORT the mesh as an OBJ at this res, and that is what I can't do. I can export it at a lower level, around 200,000, but I want to export the 2 million poly version as an OBJ. It seems like I should be able to do that with 2 gigs of RAM and 134 gigs of free hard disk space. Like I was saying earlier, when I try to export the mesh my memory only peaks at about 1.3 gigs, so why is it crashing and not using the other 700 megs of memory?

Not sure if you are referring to my post or Ktaylor's, but I'm working with a ~1.5 million poly object at a division level of 5. At no point was I ever trying to subdivide a 10 million poly object, and neither was Ktaylor. I would like to subdivide the ~1.5 million poly object to ~6 million polys, but to do that I would need to raise the polys-per-mesh limit to >6, as I understand it. Maybe I understand it wrong?

ktaylor… I am moving my stuff to this computer. As soon as element 5 gives me my new access, I will try to export a 2 million poly file and see if I get the same problem. I will also watch various RAM programs at the same time and see what happens while ZBrush is processing the save.

g3d… is that a ZSphere model turned polymesh? I ask because somewhere along the way I remember reading that you need to set the density to the highest level you plan on subdividing to. I don't know if that means that if I set my density to level 6 and created the skin, I cannot subdivide higher; I haven't tried that. But maybe that's the problem?

The model in question was modeled in Lightwave so no ZSpheres are involved.

Here are more details about my situation:

The model is 6,242 polys, all quads. That is the OBJ file's poly count upon initial import as a “tool”.

I divided to level 5, at which point the model (the tool) contains approx. 1.5 million polys.

When I divide one more time to level 6, the model (tool) should be about 6 million polys. I wrote “should be” because this is not what happens; instead I get the same “insufficient memory” message that Ktaylor got. With the RAM and disk space I have, ZB should be able to handle a 6 million poly count without fail.
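A quick sanity check on those numbers, sketched in Python (assuming the imported mesh counts as level 1 and each subdivision is a clean 4x split of every quad):

```python
base = 6242  # quads in the OBJ at initial import (level 1)

# Each subdivision level splits every quad into four,
# so level n has base * 4**(n - 1) polys.
for level in range(1, 7):
    print(f"level {level}: {base * 4 ** (level - 1):,} polys")

# level 5: 1,597,952 polys  -> the ~1.5 million observed
# level 6: 6,391,808 polys  -> the ~6 million step that fails
```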

What I meant was that a division level of 5 is not enough for the kind of detail I need to put into the model/tool. When I use a brush to paint on details at that mesh density, the strokes are too blocky to be usable. This makes me think that a higher density is needed. Would that not be correct?

Another problem is that even at a division level of 5, the 1.5 million poly tool is still too much for ZB to handle. Hiding parts of the model is awkward at best, painting a stroke is too sluggish, and more than one or two strokes makes ZB lock up with the same “insufficient memory” error.

I have 7 gigs of space on my C drive and 3.5 gigs of RAM in my system.

G3D

Hmm, that is odd; it seems like your system should be able to handle that.

It is recommended to work in Projection Master at higher res levels. Usually I work at levels 1-4 for just getting all the forms/muscle structure down, and the mesh is only maybe 200,000 polys at that point, which is easy to manage and work with. By the time you get into the millions of polys you are pretty much doing detail stuff anyway, which is also way easier with Projection Master since you have all the brushes and tools available. Hope this helps some.

Adding a bit more info… ZBrush 2.03 (currently available for Mac systems) has new memory and virtual memory managers capable of managing a higher number of polygons in the same amount of system memory, and the export routines are able to export higher resolution meshes faster and with less strain on memory resources.

These will be included with the next free-update for Windows systems :slight_smile:

Any idea when that might be? Both Ktaylor and I are PC users, although I do have a couple of Macs. But I think that if this model is giving a well-equipped PC trouble, it would choke even a Mac with the same amount of RAM.

In the meantime, is what Ktaylor wrote correct? Shouldn't the PC I'm using, with over 3 gigs of RAM and even more disk space, be able to work with a model that is 6 million polys? Shouldn't I also be able to paint details on it at the lower poly count of 1.5 million?

Hey Aurick, that is great to hear. Thanks!!

G3D, I have a dual 3.4 GHz machine with 2 gigs of RAM, an ATI 9800 card, and a 140 gig hard drive at work, and it handles 6 million polys with no problem. I have a 2.8 GHz machine with 1.5 gigs of RAM as my main home machine, and it can deal with meshes of at least 3 million polys.

I’d really like to move ahead with ZB. I know that the problem isn't my system or hardware; I've done RAM tests till the cows come home, and the RAM passed every test I threw at it. What about my other questions:

Could the trouble be related to available disk space? (I have approx. 7 gigs free)

Could it be XP service pack 2?

Video drivers?

Something else?

:qu:

TIA

I hate that damned error, hehe.

The problem is the 32-bit OS and the 4 gig limit thing.
In reality, any app is only able to utilize 2 gigs of memory, whether it be RAM, the paging file, or a combo of the two.
Win XP tends to split the memory management in half, giving half to RAM and half to the paging file.
I have been doing a lot of exporting of high-res ZB files, and this pushes ZB to the max. As soon as it gets to 2 gigs, boom. That damn error pops up.
And when I say 2 gigs, I mean 2 gigs of total memory usage, typically half RAM and half paging file.
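A minimal way to see that per-process ceiling for yourself, sketched in Python; this assumes a 32-bit interpreter (a 64-bit build will sail far past 2 gigs, so it only demonstrates the point on 32-bit):

```python
# Keep allocating 64 MB chunks until the process hits its
# address-space ceiling. In a 32-bit Windows process that
# ceiling is ~2 GB regardless of how much RAM is installed.
CHUNK = 64 * 2**20  # 64 MB
chunks = []
try:
    while True:
        chunks.append(bytearray(CHUNK))
except MemoryError:
    total = len(chunks) * CHUNK / 2**30
    print(f"Gave up after ~{total:.1f} GB allocated")
```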
If you want to test this, make a 3 million poly model in ZB.
Then open the Task Manager, go to the Processes tab, and in the View menu up top choose “Select Columns” and pick “Virtual Memory Size.”
Then click the Mem Usage column header so that ZB stays at the top and is easier to view.
Then export the model you just made as an OBJ.
As soon as both chunks get to about 2 gigs, boom.
It looks like it blows up right before that, but I believe what is happening is that the memory is being used in chunks, so when ZB adds another chunk it tries to go from about 1.9 gigs to something over 2. You don't see that in the Task Manager, because that is when ZB blows up.
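If you'd rather log it than eyeball the Task Manager, here is a rough sketch that polls a process's working set and virtual size from Python. It assumes the third-party psutil package, and the process name “ZBrush.exe” is my guess, so adjust it for your install:

```python
import time
import psutil  # third-party: pip install psutil

TARGET = "ZBrush.exe"  # guessed process name; check Task Manager

def find_proc(name):
    for p in psutil.process_iter(["name"]):
        if p.info["name"] == name:
            return p
    return None

proc = find_proc(TARGET)
while proc is not None and proc.is_running():
    mem = proc.memory_info()
    # rss ~ Task Manager's "Mem Usage", vms ~ "Virtual Memory Size"
    print(f"RSS: {mem.rss / 2**20:,.0f} MB   VM: {mem.vms / 2**20:,.0f} MB")
    time.sleep(1)
```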

This will be fixed when it is compiled for Win64, I believe.

-zee

Oh, that is interesting; I didn't know any of that. Good thing I didn't buy more RAM then. :slight_smile: Thanks Zeebit!

Hey Zeebit, I really don't have any need to export such a high-res mesh, but for the hell of it I gave it a shot. The model I exported was a little over 4 million polys. I have 2.5 gigs of RAM. Nice little hang you found; first time I have ever had to kill the app. :cry:

Ktaylor what is the destination app for your high poly model? Are you planning to reconstruct the surface?

sirquadalot: “Ktaylor what is the destination app for your high poly model”

I was going to take it into Nvidia Melody (freeware from Nvidia; I'm trying to see how high-res a mesh it will accept with my system specs) to generate a normal map. Something about the way ZBrush compares the meshes for normal maps doesn't appeal to me, so I have been exporting the high-res meshes and using other apps to generate them. This is all for real-time stuff.

Yeah, I use it for exactly the same thing, ktaylor.
The best method I have found while we wait for Win64 is to just break the mesh up using polygroups and the Delete Hidden function.
Then generate the normal maps a chunk at a time and comp them in PS (a sketch of that compositing step is below).
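If you get tired of comping the chunks by hand in PS, here is a rough sketch of the same step with the Pillow library. The filenames are made up, and it assumes each chunk's map was rendered over the same UV layout with the untouched areas left as the flat normal color:

```python
from PIL import Image  # third-party: pip install Pillow

# Hypothetical filenames: one normal map per polygroup chunk,
# all rendered at the same size over the same UV layout.
tiles = ["chunk_head.png", "chunk_torso.png", "chunk_legs.png"]

FLAT = (128, 128, 255)  # the "no detail" flat-normal background color

out = Image.new("RGB", Image.open(tiles[0]).size, FLAT)
for path in tiles:
    tile = Image.open(path).convert("RGB")
    px_out, px_in = out.load(), tile.load()
    w, h = tile.size
    for y in range(h):
        for x in range(w):
            # Copy any pixel that isn't the flat background,
            # i.e. the region this chunk actually covered.
            if px_in[x, y] != FLAT:
                px_out[x, y] = px_in[x, y]
out.save("combined_normal_map.png")
```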

hope that helps

-pete

Cool, that's good to hear, because that is what we are doing as well. :slight_smile: Thanks for all the help!