Ok, I will post some images and stats if the following description does not suffice. I am busy today and it's Friday.
I am roughly using a 2-4k poly base mesh and going up to level 7, a few million tris basically. At level 7 the subdivided quad size, relative to the texels in tangent space, is just about sub-texel. I am baking a 4k normal map with careful UV placement for capturing normals in tangent space.
The point being that I want smoothed vertex normals to be captured, because the level of detail is definitely adequate.
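For a rough sanity check of the quad-vs-texel claim (the numbers below are hypothetical placeholders, and I'm assuming level 1 is the unsubdivided base mesh with each level quadrupling the quad count):

```python
# Quick sanity check of the quad-vs-texel claim (hypothetical numbers).
base_quads = 4000                         # base mesh, roughly 2-4k polys
level = 7                                 # assuming level 1 = unsubdivided base
quads = base_quads * 4 ** (level - 1)     # each level quadruples the quad count
texels = 4096 * 4096                      # texels in a 4k map
uv_coverage = 0.8                         # hypothetical fraction of UV space used
print(f"{quads:,} quads vs {texels * uv_coverage:,.0f} covered texels "
      f"-> {texels * uv_coverage / quads:.2f} texels per quad")
```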
Ok, here is what I think is happening in the rasterization of NMaps. Basically ZBrush calculates the normals using the face normals of the subdivided mesh by default (hopefully there is a way to use the unbroken vertex normals instead). This of course is not ideal: it creates many facets in the NM and ruins the smooth look of the captured shape.
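To illustrate what I mean by the two behaviours, here is a rough sketch of the general technique (a toy triangle mesh, not how ZBrush actually rasterizes its maps):

```python
import numpy as np

# Two triangles sharing an edge, bent slightly so flat vs. smooth differs.
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0.3]], float)
tris  = np.array([[0, 1, 2], [1, 3, 2]])

def normalize(v):
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Per-face normals (what a flat/faceted bake effectively samples).
fn = normalize(np.cross(verts[tris[:, 1]] - verts[tris[:, 0]],
                        verts[tris[:, 2]] - verts[tris[:, 0]]))

# Smoothed per-vertex normals: average of the faces touching each vertex.
vn = np.zeros_like(verts)
for corner in range(3):
    np.add.at(vn, tris[:, corner], fn)
vn = normalize(vn)

def bake_sample(tri_index, bary, smooth=True):
    """Normal a rasterizer would write into one texel of the map."""
    if not smooth:
        return fn[tri_index]                 # constant per face -> facets
    n = bary @ vn[tris[tri_index]]           # barycentric interpolation
    return normalize(n)                      # renormalize the blended vector

bary = np.array([1 / 3, 1 / 3, 1 / 3])       # sample at the face centroid
print("flat  :", bake_sample(0, bary, smooth=False))
print("smooth:", bake_sample(0, bary, smooth=True))
```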
If there is a way to capture smoothed normals, I have not found any documentation that points me to the proper function.
Another reason NMaps look “Chunky” is that there seems to be no supersampling going on. This would obviously raise render times, but it would vastly improve the output. Only being able to output 4096x4096 maps does not allow me to do any box filtering on my own.
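What I mean by doing the box filtering myself, sketched in numpy (assuming a float normal map with components in [-1, 1], and assuming a larger bake were possible in the first place):

```python
import numpy as np

def box_downsample_normal_map(nmap, factor=2):
    """Average factor x factor texel blocks of a float (x, y, z) normal map,
    then re-normalize the averaged vectors."""
    h, w, _ = nmap.shape
    nmap = nmap[: h - h % factor, : w - w % factor]           # crop to multiple
    blocks = nmap.reshape(h // factor, factor, w // factor, factor, 3)
    avg = blocks.mean(axis=(1, 3))                             # box filter
    return avg / np.linalg.norm(avg, axis=-1, keepdims=True)   # renormalize

# e.g. bake at 8192x8192 (if that were an option), then:
# out_4k = box_downsample_normal_map(bake_8k, factor=2)
```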
PS (off topic)
Another suggestion would be to write a tool that creates per-pixel ambient occlusion data in the same way that the NMaps and Displacement maps are captured.
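The basic idea, sketched below with a hypothetical ray_hits_mesh() test against the subdivided mesh (not anything ZBrush exposes), would be to shoot hemisphere rays from each baked texel position and store the fraction that escape:

```python
import numpy as np

rng = np.random.default_rng(0)

def hemisphere_dirs(normal, count):
    """Random unit directions on the hemisphere around 'normal' (rejection sampled)."""
    dirs = []
    while len(dirs) < count:
        d = rng.normal(size=3)
        d /= np.linalg.norm(d)
        if np.dot(d, normal) > 0:
            dirs.append(d)
    return dirs

def ao_for_texel(position, normal, ray_hits_mesh, samples=64, max_dist=0.1):
    """Per-texel ambient occlusion: 1.0 = fully open, 0.0 = fully occluded.
    'ray_hits_mesh(origin, direction, max_dist)' is a hypothetical ray test
    against the subdivided mesh, returning True on a hit."""
    hits = sum(ray_hits_mesh(position + normal * 1e-4, d, max_dist)  # offset to
               for d in hemisphere_dirs(normal, samples))            # avoid self-hits
    return 1.0 - hits / samples
```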