Genesis

VIRTUAL PRODUCTION PLATFORM

Genesis is our Virtual Production platform. Virtual production enables filmmakers to make better creative choices much earlier in the production process, leading to higher-quality results. Genesis is a multi-year development project that has been used on some of the biggest movies of recent years.

VISUALIZE CG CONTENT

With the increasing complexity and amount of VFX work in today's films, it has become essential for the director and the film crew to be able to visualize CG content directly on set, integrated with any live-action elements being shot. Beyond visualizing content, it is equally important to be able to edit this content live on set, through direct manipulation or performance capture, and to record any modifications or animations generated during the take. This workflow is called Virtual Production.

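As a rough illustration of the record-and-replay part of that workflow, here is a minimal Python sketch of capturing live edits during a take; the object names, channels and on-disk format are placeholders, not Genesis's actual data model.

```python
import json
import time

class TakeRecorder:
    """Capture timestamped edits to CG elements during a take."""

    def __init__(self):
        self.start_time = None
        self.samples = []  # (seconds since take start, object, channel, value)

    def begin_take(self):
        self.start_time = time.monotonic()
        self.samples.clear()

    def record(self, object_id, channel, value):
        # Timestamp every edit relative to the start of the take so the
        # take can be reconstructed and replayed offline later.
        self.samples.append(
            (time.monotonic() - self.start_time, object_id, channel, value))

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.samples, f)

# A set-dressing edit made while the cameras roll:
recorder = TakeRecorder()
recorder.begin_take()
recorder.record("set/tree_01", "translate", [1.2, 0.0, -3.4])
recorder.save("take_001.json")
```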

AUTOMATICALLY PUSH PRODUCTION ASSETS

Genesis is hooked up to Tessa, our asset management system, so we can automatically push production assets and all of their dependencies into Unity, as well as generate new content that can be ingested back into our production pipeline. We are working on a USD integration that will allow us to push virtually any asset from our VFX pipeline straight into the engine.

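To make the idea concrete, here is a minimal sketch built on USD's Python API (`pxr`); the `tessa` client and its `published_usd_path`/`dependencies` methods are hypothetical stand-ins for our real asset management interface.

```python
from pxr import Usd, UsdGeom

import tessa  # hypothetical client for our asset management system

def export_asset_to_usd(asset_id, out_path):
    """Compose an asset and all of its dependencies into one USD stage."""
    stage = Usd.Stage.CreateNew(out_path)
    root = UsdGeom.Xform.Define(stage, "/Asset")
    stage.SetDefaultPrim(root.GetPrim())

    # Depth-first walk of the dependency graph; each dependency is
    # assumed to already be published as a USD layer on disk.
    seen = set()

    def visit(dep_id):
        if dep_id in seen:
            return
        seen.add(dep_id)
        layer_path = tessa.published_usd_path(dep_id)  # hypothetical call
        prim = stage.DefinePrim(f"/Asset/{dep_id}")    # assumes dep_id is a valid prim name
        prim.GetReferences().AddReference(layer_path)
        for child in tessa.dependencies(dep_id):       # hypothetical call
            visit(child)

    visit(asset_id)
    stage.GetRootLayer().Save()
```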

REALISTIC REAL-TIME RENDERING

Thanks to a collaboration with the Technicolor R&I team, we have pushed the boundaries of real-time rendering and camera effects in order to achieve incredibly realistic looks. We now have physically based depth of field, motion blur, area lights, indirect lighting and many other effects that act as key differentiators from our competitors. In addition, we have recently developed an internal geometry streaming format that is fast enough to let us play back final character animations in real time, complete with all secondary deformations (such as muscles and cloth).

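To give a simplified picture of what a format like this involves, here is a sketch of a fixed-topology vertex stream in Python with NumPy; the on-disk layout is an assumption for illustration, not our actual format.

```python
import numpy as np

def write_stream(path, frames):
    """frames: iterable of (num_verts, 3) float32 vertex-position arrays,
    one per frame, for a mesh whose topology never changes."""
    frames = [np.asarray(f, dtype=np.float32) for f in frames]
    header = np.array([len(frames), frames[0].shape[0]], dtype=np.uint32)
    with open(path, "wb") as f:
        header.tofile(f)
        for positions in frames:
            positions.tofile(f)

def open_stream(path):
    """Memory-map the stream so any frame is available in O(1),
    which is what makes real-time scrubbing and playback possible."""
    n_frames, n_verts = np.fromfile(path, dtype=np.uint32, count=2)
    return np.memmap(path, dtype=np.float32, mode="r",
                     offset=8,  # skip the two uint32 header fields
                     shape=(int(n_frames), int(n_verts), 3))

write_stream("hero.geo", [np.zeros((4, 3)), np.ones((4, 3))])
stream = open_stream("hero.geo")  # stream[frame] -> fully deformed mesh
```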

MULTI-DEVICE SUPPORT

The challenge of Genesis rendering is that we need to support a multitude of different input and output devices at the same time. We will have some users in VR, the DOP looking through one or more virtual cameras, and operators sitting at desktop computers. Each device requires a different frame rate or resolution, so our system can enable or disable a variety of rendering features depending on the limitations or requirements of the specific device in use.

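A simplified sketch of that per-device logic in Python; the profile names, frame-rate targets and feature flags are illustrative assumptions, not our actual settings.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceProfile:
    name: str
    target_fps: int
    resolution: tuple[int, int]
    disabled_features: set[str] = field(default_factory=set)

# Every device renders the same scene, but trades visual features
# for the frame rate it needs (e.g. 90 fps for comfortable VR).
PROFILES = {
    "vr_headset": DeviceProfile("vr_headset", 90, (2160, 1200),
                                {"motion_blur", "depth_of_field"}),
    "virtual_camera": DeviceProfile("virtual_camera", 48, (1920, 1080),
                                    {"indirect_lighting"}),
    "desktop": DeviceProfile("desktop", 30, (2560, 1440)),
}

def feature_enabled(profile: DeviceProfile, feature: str) -> bool:
    return feature not in profile.disabled_features

assert not feature_enabled(PROFILES["vr_headset"], "motion_blur")
```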

SYNCHRONIZED DATA STREAMS

There is a lot more to Genesis than its real-time rendering component: the platform is a distributed system in which multiple independent machines communicate over the network via synchronized data streams. This enables collaborative editing, with multiple users dressing the set together, controlling the cameras, changing the lighting or generating new animations via performance capture or puppeteering. Its distributed nature also lets us delegate very expensive tasks, like solving performance capture data, to powerful dedicated machines.

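The sketch below shows what one message on such a stream could look like in Python, assuming the machines share a common clock; the stream names, message fields and transport are illustrative.

```python
import json
import socket
import time

def make_message(stream_id, payload):
    return json.dumps({
        "stream": stream_id,   # e.g. "camera/A", "mocap/actor_01"
        "time": time.time(),   # shared timeline used to align streams
        "payload": payload,
    }).encode()

# Each machine publishes its own stream over UDP; subscribers merge
# incoming streams by timestamp so camera moves, lighting edits and
# mocap stay in sync, and heavy solves can run on dedicated machines.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(make_message("camera/A", {"fov": 35.0, "pos": [0, 1.7, -5]}),
            ("127.0.0.1", 9000))
```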

SCRIPTABLE COMMAND API

After a new take is shot and recorded on set, we can ingest all the data back and reconstruct the whole take offline at any time. At that point we can re-render it with all the expensive rendering features turned on and produce a very high-quality daily. This feature required us not only to record all the data from a take precisely, but also to develop a scriptable command API for Unity, through which we can queue commands loaded from a script or received via socket over the network.

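Conceptually, the command queue behaves like the Python sketch below (the real API lives inside Unity, in C#); the JSON command schema, port and function names are assumptions for illustration.

```python
import json
import queue
import socket
import threading

commands = queue.Queue()  # drained by the engine once per frame

def load_script(path):
    """Queue commands from a script file, one JSON command per line."""
    with open(path) as f:
        for line in f:
            commands.put(json.loads(line))

def listen(port=9100):
    """Queue commands received over the network."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("0.0.0.0", port))
    srv.listen(1)
    conn, _ = srv.accept()
    with conn, conn.makefile() as f:
        for line in f:
            commands.put(json.loads(line))

threading.Thread(target=listen, daemon=True).start()

# Engine side: execute queued commands in order, e.g.
# {"cmd": "load_take", "args": {"take": "take_001"}} followed by
# {"cmd": "render", "args": {"quality": "daily"}}.
while True:
    cmd = commands.get()
    if cmd["cmd"] == "quit":
        break
    dispatch(cmd["cmd"], cmd.get("args", {}))  # hypothetical dispatcher
```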
