KEY CONCEPTS

Virtual Production is a process used in the film and TV industry that allows filmmakers to work with digital environments and set pieces in real time while actors perform in front of a green screen or LED screen. This technology enables directors, cinematographers, and production designers to see the virtual world they are creating as it takes shape, allowing them to make decisions on lighting, camera angles, and framing on the spot.

This process can be used to create visually stunning scenes, film hazardous stunts with almost no risk, or adjust the time of day or year to fit the story, making it easier and more cost-effective for filmmakers to bring their creative vision to life.

By reducing the need for travel, transport and single-use set pieces, Virtual Production can also significantly reduce the carbon footprint of the film and TV production process. This is noteworthy because traditional film production is known to have a significant impact on the environment, with some reports estimating an average CO2 footprint of 2840 tons for tentpole film productions. By implementing Virtual Production at scale, the industry can significantly reduce its environmental impact.

The IVPC project is an initiative focused on developing courses that provide Nordic film and TV professionals with skills and knowledge in the field of Virtual Production. The courses are tailored to the context of Nordic professionals, including the need to develop sustainable practices in the face of the climate crisis.

The project’s primary aim is to provide blended-format courses that introduce the practical application of VP and related technologies to the film and television sectors. Through the IVPC courses, Nordic film and TV professionals can gain a conceptual understanding of Virtual Production and its practical applications, along with tools to calculate the much lower CO2 footprint of VP. With this knowledge, they can develop competitive and sustainable industry practices while staying up to date on digitalisation and innovation trends.

IN THE BACKGROUND

Virtual Production sets can use a variety of setups to deliver backgrounds for a shoot, including LED walls, back projection, and green screens. Each setup has its advantages, depending on the specific needs of the production.

LED walls, also known as LED screens, are large displays made up of individual LED panels. These walls can display pre-rendered or real-time graphics and video as backgrounds for the shoot. One of the main advantages of LED walls is that they provide realistic and interactive lighting for the actors, making it easier to capture the scene’s atmosphere and mood. Additionally, since the backgrounds are displayed in real time, the actors can react naturally to their surroundings, enhancing the realism of the final footage.

Back projection, on the other hand, uses a projector to display visuals onto a screen behind the actors. The primary advantage of this setup is that it is often lower in cost than LED solutions and easier to set up, making it a popular choice for smaller productions. It should be considered, however, that back projection emits less light and therefore demands very light-sensitive cameras and lenses. In its favour, some back projection setups avoid creating “moiré” patterns in the image, allowing for a steeper shooting angle and enabling the camera to be placed closer to the screen.

Green screen, also known as chroma keying, involves filming the actors in front of a green backdrop that’s later replaced with a digital background in post-production. The advantage of green screen is that it offers the most flexibility in terms of the backgrounds that can be added later, as almost any digital image or video can be used. However, green screen requires more post-production work to blend the foreground (including actors) with the digital backgrounds seamlessly, and the creatives on set cannot see the final result as it is being created. Choosing the right setup will depend on the specific needs of the production, including the budget, the desired level of immersion and interactivity on set, and the post-production resources available.
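As a simplified illustration of what happens in post-production, the sketch below uses Python with OpenCV to key out a green backdrop and replace it with a digital background. The file names and green-threshold values are placeholder assumptions that would be tuned per shoot; real compositing pipelines also handle edges, colour spill and motion blur far more carefully.

import cv2
import numpy as np

# Placeholder file names: a frame shot against a green screen and the digital background.
foreground = cv2.imread("greenscreen_frame.png")
background = cv2.imread("digital_background.png")
background = cv2.resize(background, (foreground.shape[1], foreground.shape[0]))

# Convert to HSV and mask the green range (thresholds are illustrative and need tuning).
hsv = cv2.cvtColor(foreground, cv2.COLOR_BGR2HSV)
green_mask = cv2.inRange(hsv, np.array([35, 80, 80]), np.array([85, 255, 255]))

# Keep the live-action foreground where the pixel is not green, the new background elsewhere.
composite = np.where(green_mask[..., None] == 255, background, foreground)
cv2.imwrite("composite.png", composite)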

CAMERA TRACKING

Camera tracking is the process of capturing and recording the movement of a camera in real-world space. On a Virtual Production set, camera tracking is used to seamlessly integrate real-life camera movements with virtual environments. By using camera tracking, the virtual backgrounds or sets can be rendered in real time to match the camera’s movement and position accurately. This allows the actors to interact with the virtual environment more realistically, and camera tracking also provides more creative freedom, allowing for dynamic camera movements and shots that would be impossible or too expensive to achieve in the real world.
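As a rough sketch of what that tracking data drives, the Python example below (using NumPy; the axis convention and rotation order are assumptions, since every tracking system defines its own) builds a camera-to-world transform from a tracked position and pan/tilt/roll angles. A render engine would invert this each frame to obtain the view matrix for the virtual background.

import numpy as np

def rotation_from_pan_tilt_roll(pan, tilt, roll):
    # Compose a rotation matrix from tracked angles in radians.
    # Assumed convention: z is up, pan about z, tilt about x, roll about y.
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])   # pan
    rx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])   # tilt
    ry = np.array([[cr, 0, sr], [0, 1, 0], [-sr, 0, cr]])   # roll
    return rz @ rx @ ry

def camera_to_world(position, pan, tilt, roll):
    # 4x4 transform placing the virtual camera where the tracked real camera is.
    m = np.eye(4)
    m[:3, :3] = rotation_from_pan_tilt_roll(pan, tilt, roll)
    m[:3, 3] = position
    return m

# One tracked sample per frame; the engine uses the inverse as its view matrix.
pose = camera_to_world(np.array([1.2, 0.8, 1.6]), pan=0.10, tilt=-0.05, roll=0.0)
view_matrix = np.linalg.inv(pose)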

Another artistic advantage of camera tracking in Virtual Production is the ability to pre-visualise and plan shots more accurately. Directors and cinematographers can experiment with different camera angles and movements and preview the final footage in real-time. This saves time and money by avoiding reshoots and helping to make more informed creative decisions.

In short, camera tracking on a Virtual Production set provides a more natural and realistic integration of virtual and live-action elements. The artistic advantages of camera tracking include the ability to experiment with different camera angles and shots, mix live-action footage with virtual elements, and create a more immersive visual experience for the creatives on set.

In addition to tracking the movement of the camera, tracking the movement of camera lenses is essential in Virtual Production. This technique is known as “lens tracking” or “focus tracking.” The goal of lens tracking is to ensure that the virtual environment is rendered correctly, taking into account changes in focal length and depth of field. Lens tracking is achieved by attaching markers or sensors to the camera lenses, which allow for accurate tracking of lens movements. By tracking the focal length and depth of field, lens tracking ensures that the virtual environment’s depth and perspective match those of the live-action footage. This provides a more natural and realistic integration of virtual and live-action elements, making it easier for actors to interact with the virtual environment convincingly.
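To make the focal-length part concrete: the virtual camera’s field of view can be derived from the tracked focal length and the sensor width, so the rendered background keeps the same perspective as the photographed foreground. The Python sketch below is a minimal illustration; the sensor width is an example value, not a prescribed standard.

import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm):
    # Horizontal angle of view from focal length and sensor width (thin-lens approximation).
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Example: a 35 mm lens on a sensor 36 mm wide gives roughly a 54-degree horizontal FOV,
# which the virtual camera must match as the real lens is zoomed or swapped.
print(horizontal_fov_degrees(focal_length_mm=35.0, sensor_width_mm=36.0))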

GAME ENGINE

A game engine, such as the Unreal Engine, is software used in Virtual Production for creating and rendering virtual environments in real time. It provides a range of tools and features to create and manipulate virtual sets and environments that can be rendered to the background displays (LED walls or projection) or stored for later use in post-production. The engine’s real-time rendering capabilities allow directors, cinematographers, and production designers to experiment with different camera angles, lighting, and visual effects, and preview the final footage in real time. This allows for more informed creative decisions.

The engine’s ability to integrate with other technologies, such as camera and lens tracking, further enhances its capabilities in Virtual Production. By accurately tracking the movement of the camera and lenses, the engine ensures that the virtual environment is rendered correctly, taking into account changes in focal length and depth of field. In summary, a game engine is an essential tool in Virtual Production. It provides the features necessary for creating virtual environments and rendering them in real time, enabling greater creative freedom and a more natural integration of virtual and live-action elements on set, as well as before and after the shoot.
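The sketch below is a generic, engine-agnostic illustration of the per-frame loop this integration implies; read_tracker and update_virtual_camera are hypothetical placeholders, not Unreal Engine API calls. Each frame, the latest camera and lens data are read and applied to the virtual camera before the background is rendered.

import time

FRAME_RATE = 25.0  # assumed project frame rate

def read_tracker():
    # Placeholder for the camera/lens tracking feed: position, rotation, focal length.
    return {"position": (0.0, 1.5, -2.0), "rotation": (0.0, 0.0, 0.0), "focal_length_mm": 35.0}

def update_virtual_camera(sample):
    # Placeholder: a real engine would apply the pose and lens data to its virtual camera here.
    print("virtual camera <-", sample)

def render_loop(frames=3):
    for _ in range(frames):
        sample = read_tracker()        # latest tracked camera and lens state
        update_virtual_camera(sample)  # keep the virtual camera in sync with the real one
        # the engine would now render the background frame and send it to the LED wall
        time.sleep(1.0 / FRAME_RATE)

render_loop()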



MANAGED BY

The project has been managed by the Norwegian Film School, which is part of the Inland Norway University of Applied Sciences.

ADDRESS

HINN – LILLEHAMMER

Postboks 400

2418 Elverum

Inland Norway University of Applied Sciences

FINANCED BY

IVPC has been funded by the Nordplus Adult Programme.