ZBrushCentral

Projecting detail into UV space of topologically dissimilar mesh..?

The short version of my question is, can it be done?

If you want more detail…

I bought ZB a while ago, but I’ve rarely used it, so I’m still quite the noob. Right now, I’m working on a project that involves hard surfaces, and I need to go between Lightwave and ZB. I don’t know what the typical workflow is, but in Lightwave I like to keep all my surfaces as subdivided meshes, and use edge loops or “holding lines” to crease corners.

Of course, when in ZBrush, using extra geometry to hold corners gets disastrous a few subdivision levels in, since we end up packing tons of unnecessary polys into the creased edges.
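To put rough numbers on that: Catmull-Clark subdivision turns every quad into four quads per level, so any extra holding-loop geometry multiplies the base count before the exponential growth even starts. A quick illustrative sketch (the face counts are made up for the example, not taken from any real model):

```python
# Why holding loops explode under subdivision: each Catmull-Clark level
# quadruples the quad count of an all-quad mesh, so loop geometry added
# to hold corners multiplies the count *before* the 4^n growth kicks in.
# Face counts below are illustrative only.

def quads_after_subdivision(base_quads, levels):
    """Quad count after `levels` Catmull-Clark subdivisions of an all-quad mesh."""
    return base_quads * 4 ** levels

plain_cube = quads_after_subdivision(6, 6)     # cube, no holding loops -> 24576
creased_cube = quads_after_subdivision(54, 6)  # each face pre-split 3x3 by loops -> 221184

print(plain_cube, creased_cube)
```

Same six subdivision levels, nine times the polys, and most of the extra density sits uselessly along the creased edges.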

In short, my “ideal topology” is different in Lightwave and ZBrush.

This is what I would like to accomplish:

  1. Create a mesh in Lightwave, using the topology I like, including extra loops to keep those corners sharp.

  2. Import a high-ish resolution “frozen” (baked subdivision) mesh into ZB. The extra geometry along corners will make this bad for sculpting, though.

  3. Create a high-resolution detailing mesh using skinning and projection, all inside ZB. This has bad topology for maintaining a subdivided mesh in Lightwave, but is much better for sculpting.

  4. Sculpt fine detail, polypaint, etc.

  5. Bring in my low-res base mesh with UVs (or make UVs in ZB).

  6. Bake the detail I painted/sculpted in step (4) into color, bump, and/or normal maps for the low-res base mesh (obviously, these would be in the UV space of the base mesh).

  7. Bring the whole mess back into LW, activate subdivision, light, animate, render, etc.

Everything in this workflow seems straightforward, except for step (6). Can it be done?
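For reference, here is a minimal sketch of what step (6) amounts to under the hood, independent of any particular package (all names here are my own, not a real API): for each texel of the low-poly mesh's UV map, find the matching point on the low-poly surface, cast a ray along the surface normal, and record the distance to the high-poly mesh as displacement. The hit triangle's color or normal could be sampled the same way for color and normal maps.

```python
# Toy "transfer maps" bake: ray-cast from the low-poly surface into the
# high-poly mesh and store the hit distance per texel as displacement.

def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle test; returns hit distance t, or None."""
    def sub(a, b):   return tuple(a[i] - b[i] for i in range(3))
    def dot(a, b):   return sum(a[i] * b[i] for i in range(3))
    def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0])
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(dirn, e2)
    a = dot(e1, h)
    if abs(a) < eps:
        return None                      # ray parallel to triangle plane
    f = 1.0 / a
    s = sub(orig, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(dirn, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None

def bake_displacement(point_at_uv, normal, high_tris, size):
    """size x size displacement image over the low-poly UV square [0,1]^2."""
    image = []
    for j in range(size):
        row = []
        for i in range(size):
            u, v = (i + 0.5) / size, (j + 0.5) / size
            hits = [t for tri in high_tris
                    if (t := ray_triangle(point_at_uv(u, v), normal, *tri)) is not None]
            row.append(min(hits) if hits else 0.0)   # nearest hit wins
        image.append(row)
    return image

# Toy scene: the low-poly surface is the unit plane z = 0; the "high-poly"
# detail is a plane floating at z = 0.25, so every texel bakes to 0.25.
high = [((0, 0, 0.25), (1, 0, 0.25), (1, 1, 0.25)),
        ((0, 0, 0.25), (1, 1, 0.25), (0, 1, 0.25))]
disp = bake_displacement(lambda u, v: (u, v, 0.0), (0, 0, 1), high, 4)
```

Note that nothing here requires the two meshes to share topology — only that the ray from each low-poly texel can find the high-poly surface. That is exactly why tools that implement this (Maya's Transfer Maps, xNormal) can bake between topologically dissimilar meshes.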

If not, what workflow would you recommend? I’m running an old version of Lightwave on an old machine, so I’d like to avoid bringing tons of polys into my Lightwave scene. Also, I think this would just be an amazing feature.

If the question is whether you can create UV maps with bump, displacement, and normal information from the high-poly model detailed in ZBrush and apply those to a low-poly copy of that same model in another program, the answer is yes. Check out this page:
http://www.pixologic.com/docs/index.php/Bump%2C_Displacement%2C_and_Normal_Maps

Thank you for your reply. However, the provided link does not seem to cover what I’m looking for. I am well aware of what displacement, bump, and normal maps are and the principles on which they function.

I am also aware of the workflow in which one subdivides a low-resolution mesh, sculpts detail, and then carries that detail over into another app as displacement, bump, and normal maps applied to the low-resolution mesh. In this case, the UVs of the high-resolution mesh and those of the low-resolution mesh are effectively the same; as I understand it, ZB simply subdivides the UVs of the base mesh along with the geometry.

I am speaking of a situation in which not only the resolutions of the two meshes differ, but the topologies as well.

I know that one might use the SubTool projection techniques to transfer information from a mesh of one topology to another mesh of a different topology. However, if I understand correctly, the new mesh would have to be heavily subdivided as well, because projection only affects the mesh’s geometry and polypaint, not its texture/displacement/normal maps. This is a bad situation if the new mesh uses extra geometry to hold creased lines during subdivision, as the creased edges will get packed with tons of polys and likely bog the system down before the flat areas get enough polys to resolve the details.
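For what it's worth, the reason projection needs so much target density is that projected detail lives per-vertex. Reduced to its simplest form (my own toy simplification, not ZBrush's actual algorithm, which samples the source surface rather than its vertices), SubTool-style polypaint projection is a nearest-neighbor attribute transfer:

```python
# Toy sketch of projection as nearest-neighbor transfer: each target vertex
# copies the polypaint color of the nearest source vertex. Detail can only be
# as fine as the target's vertex spacing -- hence the need to subdivide.

def project_colors(target_verts, source_verts, source_colors):
    """Return one color per target vertex, taken from its nearest source vertex."""
    def dist2(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3))
    out = []
    for tv in target_verts:
        nearest = min(range(len(source_verts)),
                      key=lambda k: dist2(tv, source_verts[k]))
        out.append(source_colors[nearest])
    return out

src  = [(0, 0, 0), (1, 0, 0)]
cols = [(255, 0, 0), (0, 0, 255)]          # red on the left, blue on the right
tgt  = [(0.1, 0, 0), (0.9, 0, 0)]
print(project_colors(tgt, src, cols))      # left target picks red, right picks blue
```

A texture map, by contrast, stores detail per-texel, which is why baking into UV space sidesteps the subdivision problem entirely.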

Otherwise, if there were some way I could skip polypaint and paint directly into the UV maps, I think that would solve my problems. So far, my searching has indicated that this cannot be done.

Does any of this make any sense at all?

Does Lightwave have a feature akin to Maya’s Transfer Maps or XSI’s Ultimapper? I’m a new ZBrush user myself, and my solution to this problem (aside from just biting the bullet on extra edge geometry and dividing models into subtools to get density where I need it) is to use ZBrush’s automatic UVs and Decimation Master to export the sculpt and maps, then bake them onto the final model with Maya’s Transfer Maps. You’ll have to temporarily bring a very high-poly model into your 3D app, which you said you didn’t want to do, but it’s the simplest solution I know of. Once the map transfer is done, the decimated model can be deleted.

The other idea I had, but didn’t really explore, was taking a UVed model without the extra edge loops, creasing those edges instead, and using ZBrush’s Crease Levels setting so the creases would only be applied in the first two or three subdivisions, hopefully giving a more even topology. Even if this works, it’d mean more manual labor creasing everything.

As for painting directly to the maps, that’s the approach taken by Mudbox and Photoshop Extended’s 3D tools. I don’t believe ZBrush supports such a thing.

I’m new to 3D work in general so sorry if I’m off base on your needs.

Hm, this Transfer Maps feature sounds like just the thing I need! I think you understood my question exactly.

Sadly, I don’t think Lightwave does this, at least not the old version I’m running. I might just have to deal with the polygon overload in the edges… Right now I’m going to see if I can use the polygroups/creasing in ZB alongside edge weights in LW… hopefully they’ll play nice together.

Thanks for the replies!

Try using xNormal; it’s a free program that can do what I was talking about and more.

I swear I posted this the other day, but it disappeared. Hopefully this doesn’t come through as a double post.