
GPU Rendering (3D Studio Max question)

Hi all… I'm in need of some help here. I work for a company that organizes events, and every week I present renders of how the events will look. The thing is that the renders take ages… they're really complex, full of lights (100+), hundreds of different materials, etc…

The catch! :confused: The company's software engineer told me that 3ds Max could make use of my ATI X550 PCIe card to make my renders much, much faster…
But I can't figure out how to make 3ds Max use my GPU!

Does anyone know how I can accomplish this difficult mission? :rolleyes:
I can't even find anything about it on the web :td:

Talk soon,
ArkangelFX

Hi

GPU-based rendering for 3ds Max is, as far as I know, only available via NVIDIA Gelato, so you need an NVIDIA card for that. Then you can squeeze some real-time juice outta it :smiley:

http://www.nvidia.com/page/gelato.html

/ Max

Thanks a lot! I've got a GeForce FX 5200 at home, so I'll give that GPU renderer a try! If it really improves my rendering times and it's not too hard to make animations work well (I saw something about .pyg and .rib files and I really don't get it… it seems I can't make Gelato render to an .avi or anything…), I'll make my boss buy me a nice Quadro FX (not the US$3,900 one… something more like US$600) and throw this crappy Radeon X550 in the garbage…

Talk soon,
Arkangel

It won't be fast on a 5200.

I tried the simple vasefield scene (comes with Gelato) on my 5200 in the basement and it took 72 seconds. It took 9 seconds on my newer card (a 7600 GS).

I've been reading up on Gelato on their boards and in the little bits and pieces I can find, and it seems to be a bit faster in some areas and a bit slower in others compared to other serious renderers.

Well, that's good to know… I'll use my 5200 to see if Gelato is a speed improvement over mental ray… if I get some improvement using Gelato, I'll get my hands on a nice Quadro FX…

The thing I don't understand is: I'm used to mental ray, V-Ray and the default scanline renderer… but when it comes to Gelato, I've seen some things about .rib and .pyg files…? How can I render a full animation in 3ds Max to an .avi or .mov file?

talk soon,
ArkangelFX

Running it on a 5xxx card won't be a good indicator of its speed.

There's a plug-in for Max, so you don't need to make .rib (RenderMan) files or .pyg (Python for Gelato) files to run it.

I've got another question… what about animations? How do I make Gelato render, say, frames 0 to 200 and then build an uncompressed 30 fps NTSC .avi?

Talk soon,
ArkangelFX

Will the Max plug-in handle that for you?

Interesting bit about the speed of Gelato on the 5200. I was playing with Aqsis, then converting the .rib files and running them through Gelato, and even on that older card Gelato was still outpacing Aqsis. Aqsis is pretty darn slow from what I've seen of it, though.

Hi,
This past weekend I tried Gelato on my 5200 and voilà! 12 seconds for the vasefield render… Gelato seems to run much faster than V-Ray and mental ray. The only thing is that when I export a time segment like frames 100 to 200, I get each frame in a separate .pyg file and have to render each one separately using a batch file… something like the sketch below.
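A little Python script can do the same job as the batch file. This is just a rough sketch of the idea (it assumes the command-line renderer is invoked as gelato frame0100.pyg and that it's on the PATH… check your install):

import glob
import subprocess

# Render every exported frame (frame0100.pyg ... frame0200.pyg) one by one.
# "gelato" is assumed to be the name of the command-line renderer; adjust
# the name/path if your install differs.
for pyg in sorted(glob.glob("frame*.pyg")):
    print("rendering", pyg)
    subprocess.run(["gelato", pyg], check=True)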

Now my question is… once I have the 100 frames in .tiff format… is there any app that can make a 30 fps .avi or .mov from my frames? Or how do film studios put it all together?

Talk soon,
ArkangelFX

If I understood you right, you use Max. When you have, say, a hundred frames named frame0000.tiff, frame0001.tiff, etc., go to the Rendering menu in the menu bar (the menu next to File, Edit, etc., not the Render dialog itself). All the way down you'll find the RAM Player.

Click it and a new window will open. Click the Open Channel A button, browse to the folder with the frames, click frame0000.tiff and enable "Sequence"; it should indicate how many frames there are. Then click Open. It will ask how much memory to use (I never change it and keep the default settings), load the animation into memory, and from there you click Save and save it as an .avi or .mov or whatever you like :slight_smile:

(I don't know if the explanation is really clear hehe :s, if not let me know :slight_smile:)
Hope this helps :slight_smile:

Is there something like this for ATI cards as well?
Greets

First of all, thanks very much for the advice! I think there are some other solutions for GPU rendering on ATI cards, but I couldn't find out how… besides that, ATI and NVIDIA both sell graphics cards designed for non-real-time rendering…

Talk soon,
ArkangelFX

Would QuickTime (Pro) be able to do that?

Just something that I have always wondered.

Don't really know… I just tried Vegas 6, but when I import the .tiffs and place them one after another… each frame lasts like 2 seconds… :confused:

Talk soon,
ArkangelFX

PS: I've tried another GPU renderer… RTSquare… but the quality is really poor!

Sounds more like a slideshow maker than a video encoder.

I did some searching and came up with mencoder.

http://web.njit.edu/all_topics/Prog_Lang_Docs/html/mplayer/encoding.html
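Something along these lines might turn a numbered frame sequence into a 30 fps AVI (untested on my end, and I'm not sure every mencoder build reads TIFF through the mf:// demuxer, so you may need to convert the frames to PNG first):

mencoder "mf://frame*.png" -mf fps=30:type=png -ovc lavc -lavcopts vcodec=mjpeg -o output.avi

The mf:// part tells mencoder to treat the image files as a video stream at the given frame rate.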

Edit: maybe you could ask about QuickTime here… http://discussions.apple.com/category.jspa?categoryID=122

Thanks for that tip! I'll give mencoder a try when I have some time…
I still need to figure out how to solve my rendering time problem…
The thing is that I don't want to lose quality, but 30 hours for a 5-minute clip is outrageous…

Talk soon,
ArkangelFX

Having Max generate all of those .pyg files might actually work well for you if you really want to decrease your render time. You could easily get a couple of half-decent NVIDIA cards in a couple of machines and run half of the .pygs on one and half on the other, something like the sketch below.
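A rough sketch of how the split could work, in Python (the script name and arguments are made up for illustration; it assumes the same gelato command-line renderer and frame naming as before):

import glob
import subprocess
import sys

# Give each machine every Nth frame: run "python render_part.py 2 0" on one
# box and "python render_part.py 2 1" on the other.
num_machines = int(sys.argv[1])  # how many machines share the work
my_slice = int(sys.argv[2])      # which slice this machine renders (0-based)

for pyg in sorted(glob.glob("frame*.pyg"))[my_slice::num_machines]:
    subprocess.run(["gelato", pyg], check=True)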

I'm really curious about how you end up solving this, so if you don't mind, I hope you keep giving updates.

There's a little problem with that… I live in Argentina… and what would a decent machine be? A Pentium 4 3.0 GHz / 1 GB DDR2 / GeForce 6600 GS? That costs like US$700 here in Argentina… or more!

I still have to think a lot about how I'm going to solve it… and make sure the solution I pick is the one that best fits my problem… it's a lot of money just to give something a try…

Actually, I'm thinking of migrating to Maya and using RenderMan to render on loads of machines (cheap ones… like US$100) or even using mental ray distributed rendering… dunno yet :confused:

Talk soon,
ArkangelFX

Max is capable of net rendering as well, I think, with Backburner? Don't know if that might be a solution. It comes for "free" with Max 8.

Yeah… sure, Max has net rendering… but Max 8 also comes with mental ray, and with Max 8, mental ray's distributed rendering also comes for free… The thing is that if a 5-minute clip renders in 30 hours, with 2 machines it'll take 15 hours… but I need it to take less than 10 hours, so assuming it scales linearly I'd need at least 4 machines (30/3 is still exactly 10)…

Talk soon,
ArkangelFX

Well, finally I've got something going on here!
I've tested GPU rendering on ATI and NVIDIA cards using Gelato and RTSquare, and you know what… they suck! It's really time-consuming to batch-render 2,000 frames that take 1 minute each using those… you need extra time to script the batch for those 2,000 frames and the render time isn't really that different… (that's for NVIDIA). As for ATI, I tested RTSquare and the quality sucks… it's really awful…

So I started a little render farm with some computers at work:
Workstation: Pentium D 2.8 GHz / 1 GB DDR2 / ATI X550 / 160 GB x2 (fast RAID)
Slave1: Pentium D 2.8 GHz / 1 GB DDR / 80 GB x2 (fast RAID)
Slave2: Duron 800 MHz / 256 MB DIMM / 40 GB IDE

And the times really decreased!! I'm so happy… maybe in the near future I can get my hands on a dual Xeon or dual Opteron… but it's tooooo expensive here in Argentina!

Talk soon,
ArkangelFX