Rendering

From the beginning I had been hoping to use the Viper render farm, as its rendering capabilities far surpass those of my own PC. However, when I tried to use the Viper shelf exporter and upload the .ass (Arnold Scene Source) files to the University server, I realised my home upload speed was too slow: transferring everything would have taken around 100 hours.
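
As a rough back-of-envelope of where a figure like that comes from (the file size and upload speed below are assumptions for illustration, not my actual numbers):

total_size_gb = 350          # assumed combined size of the exported .ass files
upload_mbps = 8              # assumed home upload speed in megabits per second

upload_mb_per_s = upload_mbps / 8                  # megabits -> megabytes per second
hours = (total_size_gb * 1024) / upload_mb_per_s / 3600
print(f"~{hours:.0f} hours")                       # ~100 hours at these figures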

I then remembered that, with access to the University's remote lab, I could upload just my Maya file, generate the .ass files on a lab machine, and then transfer them to the server over the University's much faster connection.

When I first tested this I waited through the night, but my files never reached the top of the job queue because Viper was handling a lot of requests. As there was no ETA for when my job would start or finish, I decided to render chunks at home as and when I completed the individual scenes.

This meant rendering at the minimum requirements for the assignment first, then attempting a higher-quality render at the end.

Rendering in chunks also meant that Natalie could use them as references progressively rather than waiting until the very end.

Stereoscopic vs Monoscopic

Using Viper I managed to render out just under two minutes of animation in 2K in a fairly short period of time, splitting my major scenes up so that I could render each one as it was completed. With the videos rendered out, I brought them into Premiere Pro and exported them together, making sure to tick the ‘Is VR’ option so that applications such as YouTube recognise the file as a 360 video.

I uploaded the video to YouTube and waited a few hours for it to be processed. Once it was live, I realised the quality was a lot lower than I had initially expected. I thought it might be YouTube’s compression or the wrong export options in Premiere, but I believe it was actually because I had rendered the video as stereoscopic above-and-below.

A stereoscopic render consists of two images taken from slightly different perspectives, composed either left and right or above and below within a single frame. When viewed as a VR video, the small difference between the two images tricks the mind into seeing depth. However, because both images have to share one frame, it effectively halves the resolution each eye sees.
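
To make that halving concrete, here is a minimal sketch; the 4K frame size is just an example:

width, height = 3840, 2160      # example: a 4K equirectangular frame

# Above/below: both eyes share the frame vertically, so each eye gets half the height.
print("above/below per eye:", width, "x", height // 2)   # 3840 x 1080

# Left/right: the frame is split horizontally instead.
print("left/right per eye:", width // 2, "x", height)    # 1920 x 2160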

With a monoscopic render the overall resolution is higher, as it only uses one spherical wrap, but it lacks the depth that stereoscopic VR provides, making it feel more like viewing a flat image on a screen when in VR.

As I still had the Viper render farm available to me, I rendered another stereoscopic version, this time in 4K, which removed a lot of the blurriness but still wasn’t the best quality. Since 4K didn’t take too long to render, I decided to try a monoscopic render as well, as the overall resolution would be better.

2K, 4K and 8K

In the end I found that submitting a job on a Saturday or Sunday was best, as there were generally fewer jobs in the queue, allowing me to use around 18 cores. This meant I was effectively rendering frames roughly 18 times faster than at home, and I didn’t have to worry about my computer crashing or overheating during the night. Renders that would potentially have taken days took only a few hours on the Viper farm.
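
As a rough sketch of that speedup (the 96-hour single-machine figure is an assumption chosen to match “days”, and near-linear scaling across cores is assumed):

home_hours = 96                  # assumed single-machine render time ("days")
cores = 18                       # cores available on a quiet weekend
farm_hours = home_hours / cores  # assumes frames parallelise cleanly across cores
print(f"~{farm_hours:.1f} hours on the farm")   # ~5.3 hours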

Trying different resolutions, I eventually made my way up to 8K, with each frame coming out at roughly 1.5GB.

Luckily Viper has an automatic function that compiles all of the frames into a single small video you can download, rather than having to download every frame yourself and compile them on your own computer.

Render testing

Single frame renders 

While setting up the scene in Maya I would do multiple single-frame renders after completing a sequence, to get a rough idea of how long the full video would take to render on my own PC. I estimated this by taking the time needed to render a single frame, multiplying by the total number of frames, then dividing by 60 twice to convert seconds into hours; the estimates came out anywhere between 20 and 100 hours.
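
That calculation, written out as a tiny helper; the sample frame times and counts below are made-up values chosen to show the arithmetic, not measurements from the project:

def estimate_hours(seconds_per_frame, total_frames):
    # Single-frame render time x frame count, divided by 60 twice.
    return seconds_per_frame * total_frames / 60 / 60

print(estimate_hours(25, 2880))    # -> 20.0 hours (~2 min of animation at 24 fps)
print(estimate_hours(125, 2880))   # -> 100.0 hours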

Checking in-progress renders 

When I did my first render test on the Viper render farm, I found that it had rendered out in a perspective view rather than 360. Despite double- and triple-checking that the selected camera was set to my 360 camera, frames I downloaded from the in-progress render were still coming out in a perspective view. I talked with Jason and found that just setting the camera in the Render Settings wasn’t enough: I needed to change the viewport perspective to the 360 camera too. Thankfully, that fixed it from there on.
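
In Maya’s Python terms the fix amounts to something like the sketch below; vr_cameraShape1 is a placeholder for whatever the 360 camera’s shape node is actually called:

import maya.cmds as cmds

# Make only the 360 camera renderable, not the default perspective camera.
cmds.setAttr('perspShape.renderable', 0)
cmds.setAttr('vr_cameraShape1.renderable', 1)

# The step I was missing: also look through the 360 camera in the viewport,
# rather than only selecting it in the Render Settings.
cmds.lookThru('vr_cameraShape1')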

HTC Vive headset

To get a better idea of what it felt like to view the 360 experience, I loaded up my Vive headset and, using an application on Steam, was able to view it in true 360. The difference in depth and general immersion was far more apparent when wearing the headset, something you can’t really appreciate when viewing from a phone or computer screen. It’s also something that wouldn’t have been possible if I had rendered out as monoscopic instead of stereoscopic, since monoscopic lacks the depth stereoscopic videos give.

When I showed the video to a couple of people on the Vive headset, I got a lot of good feedback on the sound and on how the headset experience is completely different from viewing it on a screen.
