ZBrushCentral

real-time stereo editing plugin?

How about ChromaDepth technology? It's cheap and easy to realise: you need a material based on z-depth that shades red in front, green in the middle, and blue for the deepest areas.

I think it can be created easily and quickly, and performance would stay the same.
I haven't seen it "live" yet, though. The glasses use diffractive prisms and cost around $10 (!). Official site: http://www.chromatek.com/
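The red-to-green-to-blue depth ramp described above can be sketched as a simple mapping from normalized depth to RGB. This is just an illustration of the idea, not actual ZBrush material code (ZBrush doesn't expose its shaders through an API like this):

```python
def chromadepth_color(depth):
    """Map a normalized depth (0.0 = nearest, 1.0 = farthest) to an RGB
    triple for the ChromaDepth effect: red in front, green in the middle,
    blue at the back.  Linear interpolation between the three stops."""
    d = min(max(depth, 0.0), 1.0)      # clamp to [0, 1]
    if d < 0.5:
        t = d / 0.5                    # 0..1 over the near half
        return (1.0 - t, t, 0.0)       # red -> green
    t = (d - 0.5) / 0.5                # 0..1 over the far half
    return (0.0, 1.0 - t, t)           # green -> blue
```

Feeding this the z-buffer value of each pixel would give exactly the ramp the ChromaDepth glasses expect.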

Thanks for that link. I had not heard of this before. It’s very interesting. I’m going to inquire around further about it.

It can be simulated by a MatCap material with a circular ramp image (red in the center, green in the middle, and blue at the borders), but that isn't true depth, just an interesting effect (it's simply based on face angle).
I tried another variant with a red material and green fog, but that gives only a red-green gradient; I can't get a 3-color gradient that way.
And without the glasses I can't check whether it looks right.
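The circular ramp image for the MatCap can be generated programmatically. Here's a small stdlib-only sketch that builds the pixels and writes them as a binary PPM file (the image size and the linear blend between stops are my assumptions):

```python
def circular_ramp(size=256):
    """Return rows of (r, g, b) pixels forming a circular ramp:
    red at the center, green at half radius, blue at the border."""
    cx = cy = (size - 1) / 2.0
    max_r = cx                         # center-to-edge distance
    rows = []
    for y in range(size):
        row = []
        for x in range(size):
            r = min(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 / max_r, 1.0)
            if r < 0.5:
                t = r / 0.5
                rgb = (1.0 - t, t, 0.0)    # red -> green
            else:
                t = (r - 0.5) / 0.5
                rgb = (0.0, 1.0 - t, t)    # green -> blue
            row.append(tuple(int(c * 255) for c in rgb))
        rows.append(row)
    return rows

def write_ppm(path, rows):
    """Write the pixel rows as a binary PPM (P6) image."""
    h, w = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (w, h))
        for row in rows:
            for r, g, b in row:
                f.write(bytes((r, g, b)))
```

The PPM output can then be converted to whatever format ZBrush accepts for MatCap textures.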

You can buy the glasses on eBay or search the web (on eBay a pair is ~$12-15 with international delivery).

I think it would take Marcus 10 minutes to write a depth shader with 3 linear color points. :wink:

Thanks for that explanation. I could not follow it however since I am new. Is there a way to find someone who can do this?

I think marcus_civis could make it easily :wink:

Circularly polarized displays can work as well: no weird colors, and it looks great too. But it needs a special monitor (or at least a special screen).

http://www.businesswire.com/portal/site/google/?ndmViewId=news_view&newsId=20071009005718&newsLang=en

These work really well, and don’t require funky colors. But I can’t imagine any of these being easy on the eyes for sculpting.

$10 vs $5000 :slight_smile:
It's good and interesting, but very expensive, and the display size is limited.
And the main thing: I think it would still need some support from the application anyway; otherwise, how would it know the depth information to polarize…

I read about a 3D TV released in Japan; I think it uses a polarization-based illusion. That TV is bigger and costs around $4000.

3D monitors are overkill, really.

If you have a Direct3D signal, any NVidia card can (at a loss of framerate) move a duplicate camera a few virtual inches and display the stereo views in alternating or interlaced patterns for use with a $30 pair of LCD glasses. And there are similar drivers for OpenGL.
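The "move a duplicate camera a few virtual inches" part of that trick is simple to sketch: offset two eye cameras along the view's right vector. This is a minimal illustration of the geometry (the 6.5 cm default separation is an assumption, roughly human interocular distance):

```python
def stereo_cameras(position, right, eye_separation=0.065):
    """Given a camera position and its unit 'right' vector, return the
    left- and right-eye positions for a parallel stereo camera pair.
    eye_separation is in world units (default ~6.5 cm)."""
    half = eye_separation / 2.0
    left_eye  = tuple(p - half * r for p, r in zip(position, right))
    right_eye = tuple(p + half * r for p, r in zip(position, right))
    return left_eye, right_eye
```

The driver then renders the scene twice, once from each eye, and presents the two images in alternating or interlaced patterns for the shutter glasses.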

Most 3D display solutions leverage those same drivers, so again it becomes a question of whether the self-contained presentation is worth such an increase in cost. Hint: it’s not. Just go with the LCD glasses.

That said, those drivers don’t work in a windowed environment. Which is something every sculpting platform I’m aware of requires. You need to find one which is truly full-screen, with menus drawn in 3D space rather than overlaid by the operating system. Or you need a custom solution to be created from scratch by the makers of your sculpting platform.

Sadly, neither of those are ever going to happen.

One other possibility…

Take a pair of cheap sunglasses. Poke out one of the lenses. Wear and enjoy.

It’s not a perfect solution, but it works more often than you’d expect. Try orbiting around your model. If the effect doesn’t kick in, switch lenses, or orbit the opposite direction.

This is also good for turning normal broadcast television into 3D (and those technologies which claim to convert 2D to 3D in realtime work on pretty much the same principle).

The idea is that you’re introducing a delay before light reaches one of your eyes, such that each eye is looking at different frames of the same realtime animation. When an object rotates in Z, or moves in X, you already get depth cues through parallax distortion, but the forced time disparity takes it a step further, simulating what you might have seen from a dual-camera setup.
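The frame-delay principle can be sketched as a small buffer that feeds one eye the current frame and the other eye a frame from a few steps back. This is only an illustration of the mechanism; which eye is delayed depends on which lens is darkened and which way the motion goes:

```python
from collections import deque

class PulfrichDelay:
    """Feed one eye the current frame and the other a frame delayed by
    `delay` steps, mimicking the darkened-lens time lag of the
    Pulfrich effect."""
    def __init__(self, delay=2):
        self.buffer = deque(maxlen=delay + 1)

    def push(self, frame):
        """Add the newest frame; return (current, delayed) frames."""
        self.buffer.append(frame)
        delayed = self.buffer[0]   # oldest frame still in the buffer
        return frame, delayed
```

Until the buffer fills, the "delayed" eye simply sees the oldest frame available, which matches what happens when the animation starts.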

Of course, when it isn’t working, you’ll get headaches and eye fatigue, and you’ll look like an idiot. But for some of us, that’s not much of a change. :lol:

Seems I do not get automatic notification of replies…

Original poster here. The best solution I’ve heard of, but have not seen in action, is the SpatialView plugin for C4D. Apparently everything on screen is properly displayed in stereo 3D. Ya need an autostereoscopic monitor, but I think the plugin also does anaglyph (red/green) and polarized. IZ3D is one of the autostereoscopic monitors; it will work with Direct3D. SpatialView offers one of their own. Way outta my league for total cost. I am still asking around about Chromadepth. I did find a post filter for Max, but post work is not what I want.

Thanks for the input. I was surprised by the number of replies. Usually, stereo 3D for a 3D app as a forum subject gets a poo-poo, won’t work, a yawn, or no replies period. I just think it’s such a cool concept. I still think it’ll happen one day and headaches won’t be an issue.

I swear, I didn’t make that up. It’s okay to try it.

Side note: I just stumbled across something I wrote about this back in 2003.

“This is called the Pulfrich effect, because German physicist Carl Pulfrich discovered it back in 1922. Interesting bit of trivia: Carl was blind in one eye, and could thus never experience the effect named for him.”

Wikipedia has a lot more to say about it.

Original poster here:

I am exploring the Chromadepth possibility first because I have the company’s prism glasses coming soon.

Can someone explain in some detail what would be needed (if it’s possible) to apply a preset color gradient to the active viewport, so that distance from the viewpoint (the center point of the picture plane) is indicated by color change? In the case of Chromadepth: pure red at the point where the line of sight from the viewport camera first intersects an object or plane, then changing through orange and green to pure blue for the furthest object in the viewport, along any radial line of sight from the middle of the picture plane. This is for active editing, not post work.
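What's being asked for here amounts to a piecewise-linear gradient over the near-to-far distance range. A sketch of just that mapping (the stop positions are arbitrary assumptions, and the application would have to supply the per-pixel distance along each line of sight):

```python
# ChromaDepth-style gradient stops: (normalized distance, (r, g, b)).
# Only the color order red -> orange -> green -> blue is from the
# description above; the exact stop positions are assumptions.
STOPS = [
    (0.00, (255,   0,   0)),   # pure red at the nearest intersection
    (0.33, (255, 128,   0)),   # orange
    (0.66, (  0, 255,   0)),   # green
    (1.00, (  0,   0, 255)),   # pure blue at the farthest object
]

def depth_to_color(distance, near, far):
    """Map a distance along the line of sight to a gradient color.
    `near`/`far` are the nearest and farthest intersections in view."""
    d = (distance - near) / (far - near)       # normalize to [0, 1]
    d = min(max(d, 0.0), 1.0)
    for (d0, c0), (d1, c1) in zip(STOPS, STOPS[1:]):
        if d <= d1:
            t = (d - d0) / (d1 - d0)
            return tuple(round(a + t * (b - a)) for a, b in zip(c0, c1))
    return STOPS[-1][1]
```

Normalizing against the nearest and farthest visible surfaces (rather than fixed clip planes) keeps the full red-to-blue range in use no matter how deep the scene is.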

See again the sample Chromadepth example attached.

Thanks for all the various comments and suggestions.

Attachments

Teapot3.jpg

I actually tried that busted-out sunglasses thing a few years ago. It was a lot of fun, but when my eyes started crossing without them on, I got a little freaked out and stopped doing it. It made me think of all the times mom said “keep doing that and you’ll get stuck that way!” :lol:

I tried the Chromadepth glasses today and you can read about it at:

http://209.132.96.165/zbc/showthread.php?p=456032#post456032

What about the movie function?
is that possible?

Yes, if you’re recording a movie and you have your shading set up properly, then it will play back with it.

I have a pair of old ChromaDepth glasses.

My best guess is that a MatCap rainbow shader plus a proper background gradient should get it done.

ahh, a button would be great :>
thx though…

Original poster here: nice to see there is some interest in Chromadepth for a 3DCC app such as ZBrush. Chromadepth is the only means I could think of to possibly create depth while editing. Other methods would be stereo pairs or offset red/green pairs, neither of which ZBrush can do. The bust above didn’t work with the Chromadepth glasses, BTW.

Here’s to all ZBrush users, programmers, and plugin devs:
http://leonar3do.com/
Some info for your inspiration.
By: Moe.

I think you would need a camera function that just collects and projects the pixels the camera actually sees, rather than rendering two full screens of the same model, or two rendered screens where only the view we see is rendered. (I don’t know if ZBrush currently renders from all sides within the software, or if it only has a render-what-you-see kind of mode.)