ZBrushCentral

Error has been encountered while trying to load a Tool. Loading has been aborted.

Can you help?

I’ve checked other posts and found similar issues, but this one is rather nebulous to pinpoint. It started after I had invested several hours into a new ZTL, saving regularly and versioning up; when I tried to load the subTool after some extensive sculpt work, it failed with the error above.

I’ve performed different tests with the base mesh to try to get to the source of the problem, as you can see below the specs list. I’ve isolated different features or aspects such as poly count, file size, morph targets, and layers, but I’ve not been able to nail it down to any one of those as particularly responsible.

System Specs
CPU- Intel Xeon Pentium Extreme Edition 2.8GHz Dual Core
Cache- L1 Data: 2x16 KB, L1 Trace: 2x12 Kuops, L2: 2x2048 KB
RAM- 3GB
Graphics card- NVIDIA Quadro FX 3450/4000 SDI
OS- Windows XP Pro v.2002 SP2
Hard drive- 6GB free

ZBrush version 3.1

ZBrush MEM settings
compact mem- 2000
DU- 4
TU- 4
MPPM- 20
HDPMP- 20

Model Specs
1 mesh
Maya Tris- 20,372
Zbrush ZTL subDiv 1- 10,186 polys

subDiv test
opens up to- subDiv 5 @ 2,607,616 polys @ file size 22,881 KB
fails at- subDiv 6 @ 10,430,464 polys @ file size 81,506 KB
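
The failing level lines up with plain subdivision arithmetic: each ZBrush subdivision level quadruples the polygon count, so the figures above can be reproduced from the 10,186-poly base. A quick sketch (Python is used here purely for the arithmetic):

```python
# Each subdivision level quadruples the polygon count.
# Base count (subDiv 1) is taken from the model specs above.
base_polys = 10_186

for level in range(1, 7):
    polys = base_polys * 4 ** (level - 1)
    print(f"subDiv {level}: {polys:,} polys")

# subDiv 5 -> 2,607,616 polys (opens)
# subDiv 6 -> 10,430,464 polys (fails to load)
```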

Morph Target test
fails after saving subDiv 5 with a morph target @ file size 22,670 KB

…this struck me as strange… the plain subDiv 5 file (22.9 MB) is larger than the subDiv 5 + morph target file (22.7 MB), yet the larger file opens while the smaller one does not. One might then suspect the morph target, but I also can’t open anything above subDiv 5, even with no sculpting work or layers at all, and lower subDiv levels open fine with a morph target. The point is that the failure isn’t necessarily tied to that feature, and it doesn’t appear to be file size or poly count specifically.

Conversely, my supervisor was able to open a much larger ZTL (a 3.8-million-poly mesh) without issue, and a co-worker is working on a subDiv 5 file that is 123 MB, which dwarfs even my largest ZTL… so neither poly count nor file size correlates with the failure. I’m at a loss; I’m trying to isolate what seems to be a random issue. I’ve seen much more complex assets on average machines, and yet I’m unable to load something I was working on not 5 minutes earlier. This is a production asset. Can anyone help, please?

Max polys per mesh and the HD value are a little high for your system. That’s probably not the problem, though.

Your drive has 6 GB of free space. I have some questions:


  1. How large is the drive?
  2. Is it also the drive where Windows is installed?
  3. When is the last time you defragmented the drive?

The reason I’m asking is that 6 GB is getting to be a bit small in the free-space department. That’s especially true if it’s the OS drive, because your OS writes and deletes a lot of files on a regular basis. This rapidly leads to disk fragmentation, which in turn makes it hard for ZBrush to find big enough memory blocks to work with. That can result in corruption when saving files.

You should defragment your drives often. But you also need to keep a MINIMUM of 15% of the disk empty or the disk defragmenter will not be able to work right.
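
The 15% rule of thumb is easy to check programmatically. Here is a minimal sketch using Python’s standard library (the function name and threshold are illustrative, nothing ZBrush-specific):

```python
import shutil

def free_space_ok(path=".", min_free_fraction=0.15):
    """Return True if the drive holding `path` has at least
    `min_free_fraction` of its capacity free (the defragmenter
    rule of thumb mentioned above)."""
    usage = shutil.disk_usage(path)
    return usage.free / usage.total >= min_free_fraction

# Check the current drive before a long sculpting session.
if not free_space_ok():
    print("Warning: under 15% free space; the defragmenter may not work properly.")
```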

One other thing that may assist you is to use SubTool Master’s save feature rather than Tool>Save As. It has specific routines in it that help avoid file corruption problems.

Hey Aurick,

Thanks for responding to this, I’m told that you are the super guru around here, and I’m flattered that you would pick up the ball and run with it. :slight_smile:

Xenon posted this thread for me, as I was having issues with my registered former account… but now that I’ve setup a new one, I can respond.


  1. 80GB
  2. Yes
  3. A long time ago; I’ll do this immediately.

My Windows XP Pro OS is in fact on the same drive. I spoke with a tech last night, and they mentioned some of what you did. My 80GB drive was bogged down by some very large Photoshop temp files, which have now been blown away along with some personal files. I have 20GB free instead of 6GB, so I’m hoping this will head off any further problems while saving.

The troubling part of this is that I never received any error messages while saving, so if the files did become corrupted because of some virtual memory issue, I would hope the program would recognize that and warn the user about a risky save. As it turns out, the work was not totally lost: because I suspected I might have trouble opening the saved file (it had happened before), I generated an 8K, 32-bit displacement map before closing ZBrush. I plan to re-apply the map (as 16-bit) and hopefully continue from there.

I’m going to defrag, thank you for your suggestions, I will try them out and post an update. :cool:

I think I may have finally discovered the root issue. I was pulling this asset from our local pipeline server. As an experiment, I copied the troublesome files to my local hard drive and opened each of them without issue. Perhaps it was a network latency issue, I’m not sure, but either way, I’m much happier now! :slight_smile: :+1:
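
For anyone who hits the same thing, the fix generalizes: copy the ZTL to local disk before opening it, rather than reading it straight off the network share. A minimal sketch (the function name and paths are my own placeholders, not part of any pipeline tool):

```python
import shutil
import tempfile
from pathlib import Path

def fetch_local_copy(server_path, scratch_dir=None):
    """Copy a file from the pipeline server to local disk and return
    the local path, so ZBrush never reads across the network."""
    scratch = Path(scratch_dir or tempfile.gettempdir())
    local = scratch / Path(server_path).name
    shutil.copy2(server_path, local)  # copy2 also preserves timestamps
    return local
```

Open the returned local path in ZBrush, then copy the saved file back to the server when you’re done.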