Game Engines
Unity
More indie. Essentially, Unity acted as middleware to take care of the technical set-up and allow for creative production, unhampered by the 1s and 0s of low-level code. Unity also created extensive documentation to explain the engine and lower the barrier to entry. Alongside this accessible documentation was a strong community for sharing information.
Unreal
Unreal is still used more by large teams that require higher-fidelity graphics. On a spectrum from AAA to indie development, Unreal Engine still sits nearer the AAA end than Unity does.
Unity aims to let developers release products for a much wider range of target platforms than Unreal.
In particular, Unity has a strong focus on mobile device development, an area Unreal has traditionally emphasised less.
- Have an understanding of the Virtual Reality creation toolbox available within the Unity Game Engine
- Be able to load a pre-existing Unity project and view it within a Head Mounted Display (HMD)
In a video game or VR experience, worlds are created from what we can call assets. These are the raw component parts of a virtual scene, and can be visual or sound-based. Here are some examples of the assets we might expect to be involved in a video game or VR production (a short code sketch below shows how such assets are brought into a scene).
- 3D Models, e.g. characters, props, vehicles, flora/fauna
- 2D Textures, e.g. grass, brick, skin
- Terrain, e.g. mountains, lakes
- Sound Design, e.g. dialogue, Foley (e.g. footsteps)
- Music, e.g. songs, score
- User Interface, e.g. Heads-Up Display (HUD), icons
A head-up display (HUD) is a projector that beams essential driver information into your field of view, so that it appears superimposed on the road ahead in mid-air. Although this may sound distracting, HUDs are carefully designed to be almost invisible when they're not being looked at directly. In games and VR, the HUD plays the same role: an overlay of essential information (health, score, objectives) drawn over the player's view.
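As a rough illustration of how assets like these become part of a scene at runtime, the C# sketch below loads a prefab and an audio clip and brings them into the world. The asset names and the use of a Resources folder are assumptions made for the example, not part of the course material.

```csharp
using UnityEngine;

// Hypothetical sketch: how imported assets become part of a scene at runtime.
// The asset paths below ("Props/Crate", "Audio/Footstep") are illustrative only
// and assume the assets sit in a Resources folder within the project.
public class AssetSpawner : MonoBehaviour
{
    void Start()
    {
        // Load a 3D model (prefab) and place an instance of it in the scene.
        GameObject cratePrefab = Resources.Load<GameObject>("Props/Crate");
        if (cratePrefab != null)
        {
            Instantiate(cratePrefab, new Vector3(0f, 0f, 2f), Quaternion.identity);
        }

        // Load a sound-design asset and play it at this object's position.
        AudioClip footstep = Resources.Load<AudioClip>("Audio/Footstep");
        if (footstep != null)
        {
            AudioSource.PlayClipAtPoint(footstep, transform.position);
        }
    }
}
```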
The core building blocks of video games and virtual reality developed in game engines are the same.
With this in mind, is VR simply a delivery mechanism? It is an interesting thought.
Regardless, game engines can create the virtual, interactive worlds we need for both video games and VR productions, and can export those same worlds in the formats required for both TVs and headsets. This is the "shared DNA" between traditional video games and VR experiences developed within game engines.
What is Virtual Production (VP)?
Virtual production means that film productions can now use virtual environments that interact with live action. Background imagery, displayed via large LED screens, is now used in place of real-life locations. The fact that this can be done with real-time graphics means that camera movements can be synced with these backdrops. Many processes that would normally have been part of a long post-production process can now be done on set.
Capturing traditional post-production special effects "in camera" cuts down on the work required after a shoot. Virtual production also impacts pre-production.
The blocking and framing of shots can be planned in advance with virtual production. This idea of "pre-visualisation" is not entirely new, but virtual production improves it greatly. Directors can now see the effect of their camera movements within a scene weeks before arriving on set.
"In the same way that digital cameras allow immediate review of shots, virtual production allows us to create high-end images in real time so we can see straight away if we got the shot, or if we need to go again."
Source: https://medium.com/@thefocus/virtual-production-101-how-does-it-work-how-can-it-revolutionise-vfx-1c7e80ade0f2
Game Engine Development Tools
The game engine itself: as we've already explored, the two major real-time game engines to consider are Unity and Unreal Engine. Both are free to download initially; payment plans may become necessary when you decide to release a product.
For VR development, however, we want to stick to PC, since Apple does not yet properly support VR. As a rough minimum specification:
- 4-core CPU minimum
- GPU with a benchmark score of 10,000 or higher
- 8 GB RAM minimum
- SDK (see below)
Headsets/HMDs (Hardware)
Before you begin, you need to know the target platform of your VR experience. The tethered headsets produced by Oculus, HTC and Valve can all be used for VR development:
- Oculus Rift S
- Oculus Quest / Quest 2
- HTC Vive
- Valve Index
SDK (Software)
The target platforms above require an extension to the game engine, in the form of a Software Development Kit (SDK). For Oculus HMDs this is the Oculus XR Plug-in; for Valve or Vive HMDs it is OpenVR.
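As a sketch of what such a plug-in gives you once installed, the following script logs which XR device Unity has loaded at runtime. It assumes an XR plug-in has already been enabled for the project; the exact device name reported depends on the plug-in in use.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch, assuming an XR plug-in (e.g. Oculus XR Plug-in or OpenVR)
// has already been installed and enabled in the project: log which XR device
// the engine has actually loaded. The exact name string depends on the plug-in.
public class XRDeviceCheck : MonoBehaviour
{
    void Start()
    {
        if (XRSettings.isDeviceActive)
        {
            Debug.Log("Active XR device: " + XRSettings.loadedDeviceName);
        }
        else
        {
            Debug.LogWarning("No XR device active - check XR Plug-in Management settings.");
        }
    }
}
```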
Unity
- God view: Unity's Scene view gives a free-flying, "god's-eye" view of the world you are building
- Hierarchy and Inspector windows: the Hierarchy lists every object in the scene and its parent-child relationships; the Inspector shows the selected object's components and properties (see the sketch just below)
- Project window: contains the project's assets … package the experience
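To make the parent-child idea concrete, here is a minimal sketch that sets a parent from a script rather than by dragging in the Hierarchy. The object name used is purely hypothetical.

```csharp
using UnityEngine;

// Minimal sketch of the parent-child relationship seen in the Hierarchy window,
// set from code instead of by dragging. The object name "Hand" is hypothetical.
// A child inherits its parent's position, rotation and scale.
public class AttachToParent : MonoBehaviour
{
    void Start()
    {
        GameObject hand = GameObject.Find("Hand"); // hypothetical scene object
        if (hand != null)
        {
            transform.SetParent(hand.transform);      // make this object a child of "Hand"
            transform.localPosition = Vector3.zero;   // snap to the parent's origin
            transform.localRotation = Quaternion.identity;
        }
    }
}
```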
- Guest speaker: Liam Walsh is Creative Technology Director of Interactive Arts at Nexus Studios in London. Liam focuses on interaction design and storytelling for immersive productions. Points from his talk:
- How immersive work is consumed
- How users change their reality
- How his role bridges the gap between tech and creative
- The game engine as an iterative way of working
- The underlying maths
- How much of this is still going to change
- Set your project up in Unity and connect the headset via USB to test … you will need an SDK
- Oculus example project
- Oculus Integration package
- In Unity: Window > Package Manager
- File > Build Settings > Platform …
- Enable VR support: Edit > Project Settings > XR Plug-in Management (or Player) > install the plug-in for your headset; VR is now enabled (a quick runtime check is sketched below)
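Once VR support is enabled, a quick sanity check is a small script that reads the HMD's tracked position using Unity's XR input API. This is a sketch for testing only, not part of the official setup steps.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Minimal sketch, assuming VR support has been enabled as described above:
// log the headset's tracked position each frame as a quick check that the
// HMD is actually talking to Unity before building a full scene.
public class HeadsetTrackingCheck : MonoBehaviour
{
    void Update()
    {
        InputDevice headset = InputDevices.GetDeviceAtXRNode(XRNode.Head);

        if (headset.isValid &&
            headset.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 headPosition))
        {
            Debug.Log("HMD position: " + headPosition);
        }
    }
}
```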
Resources
- Unity Engine Case Studies
- Unreal Engine Case Studies
- Unity Learn
- Unreal Learning and Support
- Unity Community
- Unreal Engine Community
- Unity Documentation
- Unreal Engine Documentation
- Unreal Engine Virtual Production Hub
- The Virtual Production Field Guide
- Disney Gallery: Star Wars: The Mandalorian
- StoryFutures Academy presents: Masterclass with Mohen Leo, ILMxLAB
- https://www.unrealengine.com/en-US/blog/virtual-production-field-guide-a-new-resource-for-filmmakers
- https://docs.unity3d.com/Manual/index.html
- https://learn.unity.com