ZBrushCentral

Scan data workflow

Hi all!

I have some really nice scan data with a lot of surface detail (both model & color), consisting of an OBJ file with 1.3 million active points and an 8K texture map.

I would like to perform some basic cleanup on this, namely:
a) remove a section of the mesh and close the hole
b) smooth some of the artefacts & sculpt a bit of extra detail in

Then I would like to create a low-poly version and bake the color and the sculpt detail from the high-res mesh onto this low poly, which I can then take into Substance Painter for cleaning up the color & ultimately use within a game engine.


I’m confused as to best practice workflow.
I think I need to:
  • duplicate the mesh WITH the color texture
  • dynamesh it
  • subdivide it
  • and then project the color detail from the original onto it somehow?

Or do I need to somehow convert the color texture to a polypaint of the actual mesh, then dynamesh it or something?

I have watched a number of good videos but they appear to have different workflows in mind -> https://www.youtube.com/watch?v=PrFQXjs_6_w

Any help greatly appreciated!

When converting high poly into low poly with a texture involved, you will first have to convert the texture into polypaint, duplicate your tool and Zremesh (or otherwise retopologize) the duplicate, create UVs for the new object, subdivide it sufficiently to hold the polypaint detail, project the detail (both sculpting and polypaint) onto the new tool, then generate a texture from that polypaint based on the new UVs.
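
All of those steps are driven from ZBrush's interface (Polypaint From Texture, ZRemesher, UV Master, Project All, New From Polypaint), so none of it requires code. Purely to illustrate the first step, converting a UV-mapped texture into per-vertex color, here is a minimal Python sketch using the trimesh library; the file name and the library choice are assumptions for illustration, not part of the ZBrush workflow:

```python
# Sketch only: ZBrush's "Polypaint From Texture" is a UI operation.
# This shows the equivalent idea outside ZBrush: sampling a UV-mapped texture
# into one color per vertex, using the trimesh library.
import trimesh

# Load the scanned OBJ; trimesh picks up the 8K map via the accompanying .mtl
scan = trimesh.load("scan.obj", force="mesh", process=False)

# TextureVisuals.to_color() samples the texture at each vertex's UV coordinate,
# so the result can only hold as much color detail as there are vertices.
scan.visual = scan.visual.to_color()
print(scan.visual.vertex_colors.shape)   # (point_count, 4): one RGBA value per point
```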

Your original mesh, even at 1.3 million polygons, may need to be subdivided first before converting the texture into polypaint, depending on detail. Polypaint resolution is dependent on mesh resolution.
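
A rough back-of-envelope number illustrates that dependence: an 8K map holds roughly 67 million color samples, while the scan has 1.3 million points, so a few subdivision levels are needed before the point count is even in the same ballpark (assuming each subdivision level roughly quadruples the point count, and ignoring how much of the UV space the texture actually uses):

```python
# Back-of-envelope only: how many subdivision levels before the point count of
# a 1.3 million point mesh is comparable to the texel count of an 8K texture?
import math

texels = 8192 * 8192      # ~67 million samples in an 8K map
points = 1_300_000        # active points in the scan

levels = math.ceil(math.log(texels / points, 4))   # each level ~quadruples the points
print(levels)             # -> 3
```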

Copied and pasted from a similar inquiry that contains links to the relevant concepts in Zbrush:

Zremesher completely changes topology and reduces polycount, which will obliterate any UVs or polypaint, as well as high res sculpting detail. In order to reclaim this detail onto your new mesh, you will need to project sculpting or polypaint detail from the original mesh onto the new mesh (which will need to be sufficiently subdivided to hold the former detail). In the case of textures, this means first converting a texture into Polypaint, and then projecting it onto a mesh (which will require new UVs), and then back into a texture on the new mesh.
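
ZBrush handles the projection itself through Project All, but the underlying idea is simple enough to sketch: for every point of the subdivided, retopologized mesh, find the nearest point on the original scan and pull its color (and, for sculpt detail, its surface position) across. Below is a hedged illustration of just the color transfer, again using trimesh, with file names invented for the example:

```python
# Conceptual sketch of projection (not ZBrush's actual implementation):
# copy the vertex color of the nearest original-scan vertex onto every
# vertex of the ZRemeshed and subdivided mesh.
import trimesh

source = trimesh.load("scan_with_vertex_colors.ply", force="mesh", process=False)
target = trimesh.load("zremeshed_subdivided.ply", force="mesh", process=False)

# For each target vertex, find the closest vertex on the source scan
distances, vertex_ids = trimesh.proximity.ProximityQuery(source).vertex(target.vertices)

# Transfer the color of that nearest source vertex
target.visual.vertex_colors = source.visual.vertex_colors[vertex_ids]
target.export("zremeshed_projected.ply")
```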

Thank you greatly Scott. I have been playing with all the different parts of your workflow and am familiar now with how to do all of them. A few follow-up questions if I may:


  • convert the texture into polypaint
  • duplicate your tool
  • Zremesh the duplicate
  • create UVs for the new object (I can use UV Master for this) -> then I have my low-poly, UV'd model, right? I can export this ready for a game engine
  • subdivide the low-poly model sufficiently to hold the polypaint detail
  • project the detail (both sculpting and polypaint) onto the new tool (I have read that it is best to perform this step and the previous step one subdivision level at a time?)
  • generate a texture from that polypaint based on the new UVs (this will be my color texture - I can then export this model as the high-res model and bake a normal map in Marmoset or another program, correct?)

If I need to clean up areas of the original model - cut away some sections, smooth out some sections, add further fine sculpt details and close some holes - I need to use dynamesh, correct? I take it I should do this here:


  • convert the texture into polypaint
  • duplicate your tool
  • dynamesh to a resolution similar to the original model & perform model alterations
  • Zremesh the dynameshed version
  • create UVs for the new object
  • etc.

Does that sound right?

This video (https://youtu.be/n8t-SV8rU8o) suggests I should just

  • use dynamesh for removing geometry after duplicating the model
  • do the sculpting / smoothing refinement AFTER Zremeshing, at the different subdivision levels as I am projecting


  • create UVs for the new object (I can use UV Master for this) -> then I have my low-poly, UV'd model, right? I can export this ready for a game engine

You can create the UVs with UV master, Zbrush Tile mapping, or any external tool you wish.
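
If the external-tool route is more convenient, a minimal sketch of unwrapping the low-poly mesh with the xatlas Python binding might look like the following (file names are assumptions; UV Master inside ZBrush needs no code at all):

```python
# One possible external route for UVs: the xatlas Python binding.
import trimesh
import xatlas

lowpoly = trimesh.load("zremeshed_lowpoly.obj", force="mesh", process=False)

# parametrize() returns a vertex remapping, re-indexed faces, and per-vertex UVs
vmapping, faces, uvs = xatlas.parametrize(lowpoly.vertices, lowpoly.faces)

# Write the unwrapped mesh back out with its new UV layout
xatlas.export("zremeshed_lowpoly_uv.obj", lowpoly.vertices[vmapping], faces, uvs)
```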


  • project the detail (both sculpting and polypaint) onto the new tool (I have read that it is best to perform this step and the previous step one subdivision level at a time?)

That is the prescribed workflow, though sometimes I get fine results just projecting at the highest SubD level. Sometimes you need to tweak settings in the ProjectAll menu to get good results; notably, increasing the Dist slider is often necessary.
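
The Dist slider is essentially a cutoff on how far a point may travel to find its match on the source mesh: too small and areas fail to project, too large and the projection can grab the wrong surface. Continuing the earlier trimesh sketch, the same idea looks roughly like this (the cutoff value is arbitrary, not a ZBrush setting):

```python
# Rough analogue of ProjectAll's Dist cutoff: only accept a projected match
# when the nearest point on the source scan is within a maximum distance.
import trimesh

source = trimesh.load("scan_with_vertex_colors.ply", force="mesh", process=False)
target = trimesh.load("zremeshed_subdivided.ply", force="mesh", process=False)

distances, vertex_ids = trimesh.proximity.ProximityQuery(source).vertex(target.vertices)

dist_cutoff = 0.01 * source.scale    # raise this if regions fail to project
accepted = distances < dist_cutoff
print(f"{accepted.sum()} of {len(accepted)} vertices found a match within the cutoff")
```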


  • generate a texture from that polypaint based on the new UVs (this will be my color texture - I can then export this model as the high-res model and bake a normal map in Marmoset or another program, correct?)

Yes, the polypaint on a high-res object can be converted to a color map if the mesh has UVs at its lowest subdivision level. You can also generate a normal map in Zbrush for export based on those UVs. The workflow for creating textures is here; note the storing of a morph target.
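
For reference, here is a deliberately simplified sketch of what "texture from polypaint" amounts to: writing each vertex's color into the image at its UV location. A real bake (ZBrush's New From Polypaint included) rasterizes whole triangles and pads the UV seams; the function name and array inputs below are hypothetical and only show the direction of the data flow:

```python
# Greatly simplified "texture from polypaint": splat each vertex color at its
# UV location. Real bakes fill whole triangles and pad seams; this does not.
import numpy as np
from PIL import Image

def splat_vertex_colors(uv, colors, size=8192):
    """uv: (n, 2) floats in 0..1; colors: (n, 3) uint8 RGB values (the polypaint)."""
    image = np.zeros((size, size, 3), dtype=np.uint8)
    px = np.clip((uv[:, 0] * (size - 1)).astype(int), 0, size - 1)
    py = np.clip(((1.0 - uv[:, 1]) * (size - 1)).astype(int), 0, size - 1)  # flip V: image rows run downward
    image[py, px] = colors
    return Image.fromarray(image)

# Hypothetical usage with UVs and colors exported from your own pipeline:
# splat_vertex_colors(uv_array, color_array).save("baked_color_splat.png")
```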

If I need to clean up areas of the original model - cut away some sections, smooth out some sections, add further fine sculpt details and close some holes - I need to use dynamesh, correct? I take it I should do this here:

Your inquiry here is too broad to answer, especially without looking at what you're trying to do. Your sequence is broadly correct for some situations, but dynamesh is just one tool in the toolbox. It's more a matter of being as familiar as possible with all the tools in Zbrush, and knowing when to employ each. Dynamesh will close simple small holes very easily, but large open sections of mesh may require more complex reconstruction. Scanned meshes can have really awful geometry.
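
The same limitation shows up outside ZBrush, for what it's worth. As a small, hedged illustration (file names assumed), automated repair utilities such as trimesh's fill_holes only cope with simple, small holes, and anything larger has to be rebuilt by hand, which is exactly the situation with large open sections of scan data:

```python
# Illustration of the limitation: automatic hole filling copes with small,
# simple holes only; large missing regions need manual reconstruction.
import trimesh

scan = trimesh.load("scan_cleanup_test.obj", force="mesh")

print("watertight before:", scan.is_watertight)
trimesh.repair.fill_holes(scan)                     # fills only small, simple holes
print("watertight after:", scan.is_watertight)

# A gentle smoothing pass, loosely analogous to a broad Smooth-brush cleanup
trimesh.smoothing.filter_laplacian(scan, iterations=5)
scan.export("scan_cleanup_smoothed.obj")
```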

See the thread here by a user in a similar situation trying to clean up geometry from photogrammetry.