ZBrushCentral

Seams in normalmap?

I’m trying to create a normal map for my low-res mesh, but wherever there is a seam in the UV coordinates there is a seam in the normal map, where there shouldn’t be one. I don’t understand why. Can’t ZBrush handle UV seams when generating a normal map?
FYI, I’m using local tangent coordinates.

Here you can see the seams I’m getting. It doesn’t make sense that there should be such different colors.

Can anybody help? I really need to know, because if I can’t solve this we might not be able to use ZBrush in our production, and that would be too bad. Painting out the seams is not an option.

When I generate a normal map with world coordinates there isn’t a problem, so if anyone knows a way to convert a normal map from world to local coordinates, that would really help too.

CM

Hello, I have now tried generating a normal map with the ATI NormalMapper and I get the same seams there, so it must be something else. Could it have anything to do with the model having mirrored UVs, i.e. the same UVs on both sides?

EDIT: I tried to import only half the mesh, but there is still a seam, so I’m back to thinking that the normal map generator can’t handle UV seams…

Feel free to jump in at any time :wink: It’s getting lonely here in this thread.

Try running Tool>Texture>UV Check. If you have any overlapping coordinates, they will appear in red. Overlapping coordinates will definitely cause a problem.

Someone I know solved this by manually UVing both sides, and not using mirrored UVs.

  • Well… UVs should be as seamless as possible when using normal maps

  • Make a separate UV channel with as few seams as possible in your 3D app, then use it for normal maps, and maybe for color textures too

  • Don’t mirror UVs if you can avoid it, and use tangent space!

-S-S

Maybe those seams aren’t as harmful as they look. Here are two snaps I took from a CySlice demo.

Those seams are perfectly normal and will render correctly if your game engine is well designed.
They have nothing to do with the mirrored texture.
To understand why there are seams, you must understand how tangent space works.

Tangent space is calculated for each vertex; it is a special coordinate system that won’t change even if the mesh is deformed by bones or by morphs. “Z” is easy to find: it is always the direction of the vertex’s normal. However, the “X” and “Y” directions are more confusing. To find them we must look at the UVs.

Basically, when you look at the UVs, X is left-right and Y is up-down. This means that the orientation of the UVs does affect the colors of the normal map! If you rotate the UVs of your model by 90 degrees, the colors of a new normal map will be completely different.

This explains why there is always a “seam” between different clusters of UVs when they have different rotations, and why world-space normal maps don’t have this “problem”. However, tangent space stays accurate during animation, unlike world space.
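If it helps to see it concretely, here is a rough sketch (in Python, with made-up example data, not anyone’s production code) of the standard per-triangle tangent derivation most bakers use. The UV deltas appear directly in the math, so rotating a UV cluster rotates its tangent frame, which is exactly why the colors jump at a seam:

```python
# Minimal sketch of the usual per-triangle tangent/bitangent derivation.
# p0..p2 are triangle positions, uv0..uv2 their UV coordinates (all made up).
# The UV deltas drive the solve, so rotating a cluster's UVs rotates its
# tangent frame -- hence the color jump across a seam.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    e1, e2 = sub(p1, p0), sub(p2, p0)                # position edges
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]      # UV edges
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)                # assumes non-degenerate UVs
    tangent   = tuple(r * (dv2 * e1[i] - dv1 * e2[i]) for i in range(3))
    bitangent = tuple(r * (du1 * e2[i] - du2 * e1[i]) for i in range(3))
    return tangent, bitangent

# Example: the same triangle with its UVs rotated 90 degrees -> the tangent
# swings to a different axis, so the baked colors change completely.
p0, p1, p2 = (0, 0, 0), (1, 0, 0), (0, 1, 0)
print(triangle_tangent(p0, p1, p2, (0, 0), (1, 0), (0, 1)))    # tangent (1, 0, 0)
print(triangle_tangent(p0, p1, p2, (0, 0), (0, 1), (-1, 0)))   # tangent (0, -1, 0)
```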

If you are still having problems, it can only mean that your renderer is not accurately calculating tangent space. ZBrush’s normal maps are very accurate.

Let me say this again: Those seams are normal and will render correctly.

  • :slight_smile:

  • What I was after with “skin” style UVs:

  • Less UV vertices

  • More speed with hardware acceleration

  • Less wasted space in texture page

  • Less problems with mipmaps

  • If you use a map as fragmented as ambient-whisper’s, you will very likely run into problems unless you spend multiple pixels on “edge padding” or whatever you want to call it (see the padding sketch after this list)

  • Easier to debug faulty map calculations

  • Hand painting fixes to maps is easier

  • As you can see, the same stuff is true for color textures, or any other typical surface definition map :slight_smile:
  • Mirroring is actually a problem if you use object- or world-space normal maps, which are OK for static objects…
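To illustrate the “edge padding” point above, here is a very rough sketch of the idea (Python with NumPy; the image and coverage-mask names are made up for illustration): border texels of each UV island get copied outward into the empty space, so filtering and mipmapping don’t pull the background color into the seams.

```python
# Rough sketch of "edge padding" (dilation): grow each UV island's border
# colors outward into the unused texels. `image` and `covered` are invented
# names -- a numpy HxWx3 image and a boolean HxW mask of texels covered by UVs.
import numpy as np

def pad_edges(image, covered, passes=4):
    h, w, _ = image.shape
    img, mask = image.copy(), covered.copy()
    for _ in range(passes):                          # each pass grows islands by 1 texel
        new_img, new_mask = img.copy(), mask.copy()
        for y in range(h):
            for x in range(w):
                if mask[y, x]:
                    continue                         # already covered, keep it
                neighbours = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
                samples = [img[ny, nx] for ny, nx in neighbours
                           if 0 <= ny < h and 0 <= nx < w and mask[ny, nx]]
                if samples:                          # average the covered neighbours
                    new_img[y, x] = np.mean(samples, axis=0)
                    new_mask[y, x] = True
        img, mask = new_img, new_mask
    return img
```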

-S-S

I’m confused, are we still answering Millrider’s question? You are giving good general advice, but these things don’t explain why there is a color difference across his UV seams.

I’m no expert, but looking at the pics it seems to me the color differences relate to depth and overlapping issues. I don’t know how to fix these things, but I think the answer is somewhere in between the two explanations given previously.

Ah ok, I think I now understand why there’s a difference in colors and why it can’t be any other way (unless the UV seams are aligned along exactly the same axis). Thanks a lot for the useful help, everybody.
Now I just need to make a kickass character to convince my company that Zbrush is the way to go :slight_smile:

Hey, just out of curiosity, has anyone gotten normal maps rendering out of mental ray?

I’ve looked at the CySlice stuff before in Maya, which was very helpful for understanding the process… but I noticed that they don’t render correctly out of mental ray… maybe it just needs a proper shader network?
Normal maps are grrrrrrrreat!

Just as NilreMK pointed out, the seams you see when viewing the shaded model are because most viewers don’t properly adjust the normals from the normal map.

Here’s some more info I posted elsewhere, might help here…

You need a tool/engine that massages the normals of the vertices on your mesh, along each of the UV seams in your model. These normals are then used to transform the per-pixel normals from your tangent-space normal map into world-space… the orientation that’s needed so the lighting (which is in world-space) will be seamless across the model.

Most models need tangent-space normal maps (the ones that look predominantly light blue). If you were to use world-space normal maps (the ones that have full rainbow colors), then you could mirror/rotate UVs to your heart’s content without needing to create new vertex normals. Problem is, world space doesn’t work with any object that deforms or rotates.
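To make the “massaging” step concrete, here is a rough Python sketch (invented example values, not any particular engine’s code) of what a seam-aware viewer does per pixel: it rotates the tangent-space normal from the map into world space using the interpolated per-vertex tangent, bitangent, and normal.

```python
# Rough sketch of the per-pixel step in a "smart" viewer: rotate the
# tangent-space normal out of the map using the interpolated per-vertex
# tangent (T), bitangent (B) and normal (N). Example vectors are invented.

def decode(rgb):
    # the map stores (x, y, z) remapped from [-1, 1] into 0..255
    return tuple(c / 127.5 - 1.0 for c in rgb)

def to_world(n_ts, T, B, N):
    # world_normal = n.x * T + n.y * B + n.z * N  (columns of the TBN matrix)
    return tuple(n_ts[0] * T[i] + n_ts[1] * B[i] + n_ts[2] * N[i] for i in range(3))

# A "flat" light-blue texel (128, 128, 255) comes out as (approximately) the
# vertex normal, whichever way the UV cluster happens to be rotated.
T, B, N = (1, 0, 0), (0, 1, 0), (0, 0, 1)
print(to_world(decode((128, 128, 255)), T, B, N))
```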

So in my experience, I can mirror and rotate UVs to my heart’s content when using tangent space, but I need to use a “smart” viewer to see shading without seams. The only caveat I’ve seen with mirroring is the need to remove overlaps by pushing the mirrored bits outside the 0-1 UV space… if I push them exactly 1 U or V unit out, they’ll still be mapped correctly, but they’ll be properly ignored by the normal map creator tool.
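For what it’s worth, the “push it exactly 1 unit out” trick is just adding 1.0 to U (or V) for every vertex in the mirrored shell; with wrap/repeat texture addressing it still samples the same texels, but a baker that only rasterizes the 0-1 square skips it. A hypothetical helper, just to show the idea:

```python
# Hypothetical helper: shift a mirrored UV shell one full tile in U.
# Wrap/repeat addressing makes it sample the same texels, but a baker that
# only fills the 0-1 square will ignore the offset copy.
def offset_shell(shell_uvs, du=1.0, dv=0.0):
    return [(u + du, v + dv) for (u, v) in shell_uvs]

mirrored_ear = [(0.10, 0.60), (0.25, 0.60), (0.18, 0.80)]   # made-up shell
print(offset_shell(mirrored_ear))   # -> [(1.10, 0.60), (1.25, 0.60), (1.18, 0.80)]
```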

There’s some free code on the Nvidia site that helps solve the seams problem. http://developer.nvidia.com/object/NVMeshMender.html

Also a PDF file with some info about the issue… http://developer.nvidia.com/attach/3534

Hope this helps somebody. Took me some sleuthing to figure out.

I had this problem with objects when I used the ATI Mapper. I fixed it by selecting all my edges and setting them to SOFT in Maya. Then I applied the normal map to my original model.
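For context, setting edges to SOFT in Maya means each vertex normal becomes the average of the normals of the faces sharing it, so the shading (and tangent basis) varies smoothly across the edge. A tiny illustrative sketch of that averaging, assuming a simple indexed-triangle mesh (not Maya’s actual code):

```python
# Illustrative only: "soft" (smooth) vertex normals are the normalized sum of
# the face normals of every triangle that shares the vertex.
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def smooth_normals(positions, triangles):
    normals = [[0.0, 0.0, 0.0] for _ in positions]
    for i0, i1, i2 in triangles:
        p0, p1, p2 = positions[i0], positions[i1], positions[i2]
        e1 = tuple(p1[k] - p0[k] for k in range(3))
        e2 = tuple(p2[k] - p0[k] for k in range(3))
        fn = cross(e1, e2)                       # area-weighted face normal
        for idx in (i0, i1, i2):
            for k in range(3):
                normals[idx][k] += fn[k]
    out = []
    for n in normals:                            # normalize each accumulated normal
        length = math.sqrt(sum(c * c for c in n)) or 1.0
        out.append(tuple(c / length for c in n))
    return out
```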

Hope this helps… :smiley:

:idea:
:lol:

Hi,

I thought I would resurrect this thread since aurick points to it in the FAQ section of the site.

Does anyone know if you need to lay out your UV coordinates at a consistent relative scale to each other?

This shows some quite obvious seams rendered in Max 8 with mental ray.

test01.jpg

My normal map was created by first generating a displacement map in ZBrush, then using Displacement Exporter with the default Normal 32 [code DE-LCGK-FAIAJA-Normal32] export setting.

Having read through this thread, it suggests that seams in the viewport are normal, so maybe we just need to make sure the geometry on either side of the seam is roughly the same scale…?

It’s just an idea, but I was hoping an expert might put me straight on this.

Thanks

Spencer

Hey, I could really do with some help on my seams problem. I have tried every way I can think of to get rid of the seams but nothing’s working… I can’t paint them out as I will need to generate too many normal maps; I just need to find a workflow that creates seamless normal maps at render time.

I have even tried using the xNormal app to generate the normal map from the hi-res geometry and the low-res version, but I can only get a 512x512 pixel image out and it STILL has seams… I’m not talking about seams in the viewport, I mean in the render.

Something else that I am sure would work is to bring both meshes into Max or Maya and use the built-in normal map generation tools, but neither package will accept my hi-res OBJ because it’s too big.

After I tried everything else I thought, why not take both meshes into ZBrush 2 and use ZMapper to make the normal map from the hi-res version onto the low-res model? But ZBrush 2 wouldn’t import the hi-res mesh that ZBrush 3 created for me. Not sure why; I know ZBrush 3 supports 64-bit but my machine is only 32-bit.

Please could somebody shed some light on this for me.

Thanks

Spence

You might want to check your filtering type in whatever 3D program you are using. By default, whenever you plug an image into Maya, “Filter Type” under File Attributes is set to Gaussian; change this to Off. (In Max it is under Bitmap Parameters > Filtering, where it is set to Pyramidal; set it to None.) Often this will solve your filtering problems.