ZBrushCentral

Multidirectional Displacement (Moved To Q&T Forum)

I have noticed that when you are sculpting away in ZBrush, it is VERY easy to produce results that just won't convert into a displacement (heightfield) map, due to the nature of how dispMaps work. It gets especially broken when you use the Inflate brush on detail to "snug" it up together; those results become invalid, and it's very frustrating to end up with some really nice work that can't be rendered anywhere other than ZBrush.
So my question is this: I have heard of multidirectional displacement coming online (with mental ray, I thought), in which an actual RGB map is used instead of a greyscale map, and the colors correspond to displacement in different directions. This would fix the current problem with ZBrush.

Has anyone else heard of this or know anything about it??

…also, is there a way to clamp how detail is created in ZBrush that would allow you to “lock” to one displacement axis, so that you couldn't accidentally sculpt something that will not evaluate correctly when generating the dispMap?

Thanks! -Kris

I think this kind of error might be related to the initial density of your mesh. Also, try a proven render engine such as MR or PRMan. I know Turtle is nice, but it's still beta.

The problem is that displacement will only move a point along the surface normal, up or down, so anything that has a mushroom shape cannot be displaced with a greyscale image. Yet it's very easy, and often desirable, to sculpt shapes that bulge at the top. So, back to the original question: is anyone working with renderers that take advantage of multidirectional displacement? I know some in-house renderers will do this using an RGB image. If ZBrush could calculate a map with this info, and a renderer came online that supported it, we would have a winner. Right now it's just not possible to get a lot of the results you get in ZBrush into anything usable. Making models that look great in ZBrush is great, but if you can't reproduce the results in a production environment, then you've got nothing. Now, I know you can produce 95% of the stuff with dispMaps, but I'm talking about the other 5%!
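The underlying math here is simple to sketch. A minimal Python illustration (the function names are mine, not from ZBrush or any renderer) of why a greyscale heightfield can only push along the normal, while an RGB (vector) map can also push sideways and so represent an overhang:

```python
def scalar_displace(p, n, h):
    """Greyscale (heightfield) displacement: the point can only move
    along its surface normal n, scaled by the scalar height h."""
    return tuple(pi + h * ni for pi, ni in zip(p, n))

def vector_displace(p, t, b, n, rgb):
    """RGB (vector) displacement: each channel drives one axis of the
    local frame (tangent, bitangent, normal), so the point can also
    move sideways -- enough to form a mushroom-style overhang."""
    r, g, bl = rgb
    return tuple(pi + r * ti + g * bi + bl * ni
                 for pi, ti, bi, ni in zip(p, t, b, n))

# A point on a flat cage with the normal pointing up (+Z).
p = (0.0, 0.0, 0.0)
n = (0.0, 0.0, 1.0)
t = (1.0, 0.0, 0.0)
bt = (0.0, 1.0, 0.0)

# A scalar map can only push straight along the normal...
print(scalar_displace(p, n, 0.5))                     # (0.0, 0.0, 0.5)
# ...while a vector map can push up AND sideways at once.
print(vector_displace(p, t, bt, n, (0.3, 0.0, 0.5)))  # (0.3, 0.0, 0.5)
```

No single height value per surface point can produce that sideways offset, which is exactly why the inflated, bulging detail fails to bake into a heightfield.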

No, the problem lies in the relation between the cage (low-res mesh) and the DispMap. In this case, you would need SOME poly modeling under those ridges for it to read right. That's why it is recommended not to overdo the sculpting if export with DMaps is required. I recommend dropping down to the lowest level and generating a cage based on the upper levels; this should produce better results. See, the map is a grayscale image, so aside from the actual cage normal, the displacement is driven by the image and its falloffs, regardless of whether the detail actually runs parallel to the normals. Now, normal maps, that's a whole different story.
ZBrush does make normal maps too, from what I have seen. Go through the program's help system…it's in there. As for industry usage, well, I don't know whether Weta had a problem or not, but ROTK looked awesome to me. :wink:
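For anyone comparing the two map types: a tangent-space normal map stores a direction rather than a height. A sketch of the conventional decoding (an illustration of the standard RGB packing, not ZBrush's actual code), where each channel in [0, 1] maps to a vector component in [-1, 1]:

```python
def decode_normal(rgb):
    """Unpack a tangent-space normal-map texel: each colour channel
    in [0, 1] maps to a vector component in [-1, 1]."""
    return tuple(2.0 * c - 1.0 for c in rgb)

# The typical light-blue normal-map colour decodes to "straight up",
# i.e. an undisturbed, flat surface.
print(decode_normal((0.5, 0.5, 1.0)))  # (0.0, 0.0, 1.0)
```

Note that this changes how the surface is *shaded*, not where it actually sits, so normal maps can fake overhang lighting but still can't move geometry the way displacement does.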