ZBrushCentral

Multi Map Exporter for Low Poly Models?

Hi all, just wanted to ask a quick question about exporting maps for a retopologised low poly model, more about workflow than anything. To keep it brief, my workflow is: create a high poly mesh in ZBrush, retopologise in another app such as 3D Coat, then bring the retopologised mesh back into ZBrush and subdivide to the same number of levels as the original high poly sculpt, then project the details of the high poly onto the low poly for map extraction.

The only issue is that I find the low poly loses a lot of its original form, and I'd like it to stay exactly the same rather than having changes made to it. A few people have said it's better to bring a decimated mesh back into Maya for normal map generation. Is there any reason why this would be more suitable than extracting maps in ZBrush? Thanks in advance :)

Torch, it all really comes down to workflow. Which one would you feel more comfortable with? As for the mesh losing its shape: store a morph target before you divide the mesh and project the details. When you go back to your lowest subD level, switch the morph target and then create your maps. This will bring back the exact shape of your low-poly mesh, so the tangents match the mesh you actually built.

The best thing to do is bake in a program that lets you use the same tangent space as the engine you're going to be rendering in. I.e., if you're building a mesh to work in Maya, then you should bake in Maya; if you're using Unity, UDK, or CryEngine, then you should bake in a program that lets you match the tangents of those engines. xNormal allows you to do this and is what we use for production work.
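To illustrate why the baker and the engine have to agree on tangents: a normal map stores each normal relative to a per-triangle tangent basis (TBN), so the same world-space normal encodes to different texel colours under different tangent conventions. A minimal sketch below (the triangle and UV data are made up for illustration; real bakers derive the basis per-vertex, average it, and often orthogonalize it differently, which is exactly where engines diverge):

```python
import numpy as np

def tangent_basis(p0, p1, p2, uv0, uv1, uv2):
    """Derive an (un-orthogonalized) tangent and bitangent for one
    triangle from its positions and UVs -- the standard derivation."""
    e1, e2 = p1 - p0, p2 - p0
    du1, dv1 = uv1 - uv0
    du2, dv2 = uv2 - uv0
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    tangent = r * (dv2 * e1 - dv1 * e2)
    bitangent = r * (du1 * e2 - du2 * e1)
    return tangent, bitangent

def encode(normal_ws, t, b, n):
    """Project a world-space normal into the TBN basis and remap it
    from [-1, 1] to the [0, 1] range stored in a normal map texel."""
    tbn = np.stack([t, b, n])        # rows: tangent, bitangent, normal
    local = tbn @ normal_ws
    local = local / np.linalg.norm(local)
    return local * 0.5 + 0.5

# A flat triangle in the XY plane whose surface normal points up +Z.
p = [np.array(v, float) for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0)]]
uv = [np.array(v, float) for v in [(0, 0), (1, 0), (0, 1)]]
t, b = tangent_basis(*p, *uv)
n = np.array([0.0, 0.0, 1.0])

# An undisturbed normal encodes to the familiar flat-blue (0.5, 0.5, 1.0).
print(encode(n, t, b, n))
```

If the engine rebuilds the tangent with a different handedness or smoothing, it decodes those texels against a different basis than the one they were encoded in, and you get the classic shading seams; that's why baking in the target package (or in xNormal with the matching tangent plugin) matters.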

We also prefer to bake outside of ZBrush because ZBrush won't let you project across multiple subtools, or at least it doesn't do a good job of it. Other baking algorithms handle this much better. Baking your diffuse, normal, and other maps in a traditional baker also ensures that your maps all align correctly. ZBrush has a tendency to have the diffuse and normal textures not line up, because it uses two different types of projection when creating the maps, at least if you store a morph target and project the way I mentioned above. While that ensures your normals line up correctly with the low-poly mesh, the diffuse can mess up. So basically, if you're generating maps inside ZBrush, it all depends on what you're willing to sacrifice.

ZBrush also doesn't have realtime lights or a normal map viewer, which to me is a real turn-off. Maybe soon, or another nice plugin like ZMapper, even if only as a visual tool for checking normals, would be nice.

Eh, my 2 cents.