FPS provides on-set supervision and teams for all aspects of virtual production; this ranges from lightweight AR tools used to roughly block out set and crowd extensions, to running highly complex LED volumes for in-camera compositing.
fARsight GO is Framestore's on-set visualisation tool, allowing creatives to preview a live composite of CG set extensions, objects, characters and animations within their physical set. It is a lightweight, easy-to-use handheld application that gives creatives a live window into their VFX content while recceing and shooting. fARsight GO is an iOS application, utilising Apple's ARKit framework for tracking, human pose estimation and depth-based compositing.
fARsight GO supports custom film-backs and lens packs and allows you to frame up on virtual content with centimetre accuracy. It is built using Unreal Engine, so assets created in our other UE4 tools are easily portable to fARsight GO. It is also an excellent tool for briefing actors working against blue screen, allowing all the storytelling collaborators to see the virtual environment for themselves. The total footprint is very light, requiring only a single operator, an iPad, some fiducial markers and a supporting laptop.
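As a rough illustration of how a film-back and lens pack determine framing, the horizontal field of view of an idealised pinhole camera follows directly from the film-back width and the lens focal length. This is a minimal sketch of that relationship; the function name and example values are illustrative and not part of fARsight GO:

```python
import math

def horizontal_fov_degrees(filmback_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view of an ideal pinhole camera,
    given the film-back (sensor) width and lens focal length."""
    return math.degrees(2 * math.atan(filmback_width_mm / (2 * focal_length_mm)))

# A Super 35 film-back (~24.89 mm wide) paired with a 35 mm lens
# gives a horizontal field of view of roughly 39 degrees.
fov = horizontal_fov_degrees(24.89, 35.0)
```

Matching the virtual camera's film-back and focal length to the physical lens in use is what lets the composited CG content line up with the live camera's framing.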
In Camera VFX Creation & Supervision
In Camera VFX is a methodology for shooting real-time visual effects during a live action film shoot. It brings digital environments, characters and other elements directly to the set, blending the digital and physical worlds for even greater immersion and collaboration. The technique relies on a mixture of LED lighting, live camera tracking, and real-time rendering with off-axis projection to integrate foreground actors seamlessly with virtual backgrounds. Its primary goal is to remove the need for green screen compositing by producing final pixel results in camera.
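Off-axis projection here means building an asymmetric view frustum from the tracked camera's position relative to the LED wall, so the wall's content renders with correct perspective for that camera rather than for a fixed viewpoint. A minimal sketch under simplifying assumptions (an axis-aligned screen at z = 0; names are illustrative, not engine code):

```python
def off_axis_frustum(eye, screen_min, screen_max, near):
    """Asymmetric (off-axis) frustum extents at the near plane for a
    tracked eye/camera in front of an axis-aligned screen at z = 0.
    eye: (x, y, z) with z > 0 (perpendicular distance to the screen).
    screen_min, screen_max: (x, y) corners of the screen."""
    ex, ey, d = eye
    scale = near / d  # similar triangles: screen plane -> near plane
    left   = (screen_min[0] - ex) * scale
    right  = (screen_max[0] - ex) * scale
    bottom = (screen_min[1] - ey) * scale
    top    = (screen_max[1] - ey) * scale
    return left, right, bottom, top

# Camera centred on a 4 m x 2 m wall, 3 m back: a symmetric frustum.
l, r, b, t = off_axis_frustum((0.0, 0.0, 3.0), (-2.0, -1.0), (2.0, 1.0), 0.1)

# Camera tracked 1 m to the right: the frustum becomes asymmetric,
# which is what keeps the background perspective-correct in camera.
l2, r2, b2, t2 = off_axis_frustum((1.0, 0.0, 3.0), (-2.0, -1.0), (2.0, 1.0), 0.1)
```

These extents correspond to the left/right/bottom/top values of a standard perspective frustum; in a real volume each LED panel gets its own frustum, recomputed every frame from the live camera track.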
In Camera VFX is created within an immersive LED volume and is fully supported by Unreal Engine through multiple systems such as nDisplay, Live Link, Multi-User Editing, and Web Remote Control. The visual fidelity achievable today with real-time rendering can reach final pixel quality, with lighting, reflections, digital environments and visual effects elements all captured live in-camera.
Motion Base & Motion Control Supervision
As an extension of the previs and techvis process, FPS can provide programmatic data for, and supervision of, motion bases (motion platforms) on a shoot. A motion base is a mechanism that recreates the feeling of being in a real motion environment: a platform mounted on a number of telescoping cylinders, driven hydraulically or electrically, which extend and contract to move the platform using motion control data or joystick control.
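The extend-and-contract behaviour described above is an inverse kinematics problem: given a desired platform pose, compute each cylinder's length as the distance between its base anchor and the transformed platform anchor. A minimal two-leg sketch with roll about a single axis (the geometry and names are illustrative; real bases typically use six actuators and a full six-degree-of-freedom pose):

```python
import math

def leg_lengths(base_pts, plat_pts, translation, roll):
    """Inverse kinematics for a simple motion platform: each actuator
    length is the distance from its base anchor to the rolled-and-
    translated platform anchor. roll is in radians about the x axis;
    translation is the platform origin (x, y, z) relative to the base."""
    c, s = math.cos(roll), math.sin(roll)
    lengths = []
    for (bx, by, bz), (px, py, pz) in zip(base_pts, plat_pts):
        # rotate the platform anchor about x, then translate it
        ry, rz = c * py - s * pz, s * py + c * pz
        world = (px + translation[0], ry + translation[1], rz + translation[2])
        lengths.append(math.dist(world, (bx, by, bz)))
    return lengths

# Two anchors a metre either side of centre; a pure 0.5 m heave
# extends both legs equally, while adding roll makes one leg
# shorten and the other lengthen, tilting the platform.
base = [(0.0, -1.0, 0.0), (0.0, 1.0, 0.0)]
plat = [(0.0, -1.0, 0.0), (0.0, 1.0, 0.0)]
heave_only = leg_lengths(base, plat, (0.0, 0.0, 0.5), 0.0)
heave_and_roll = leg_lengths(base, plat, (0.0, 0.0, 0.5), 0.1)
```

Programmatic data from previs or techvis (a per-frame pose track) would be fed through this kind of solve to produce per-actuator motion control curves.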
A motion base is usually shot on a green screen stage, with the shot element composited back into the main plate; this makes lighting an integral consideration of the shoot to ensure elements integrate seamlessly. A motion base may also be used as a simulator, in which its movement is synchronised with a visual display of the outside-world scene, and it can be used effectively for in camera VFX.
Motion Control & Simulcam
Motion Control is the term for electronic equipment whose motion is pre-programmed; it is often used where the same precise motion needs to be repeated. It is most commonly used to repeat camera moves with camera rigs such as the Technodolly crane. Motion control can also drive Cablecam or Spydercam camera systems, motion bases and robotic arms. Virtually anything that needs to move in a controlled and precise manner can be driven via a motion control system.
Simulcam is a virtual production camera system that treats the physical camera as a virtual camera, using the set or stage as a capture volume in which to track it. This allows CG environments, as well as pre-animated and real-time motion captured characters, to be superimposed and integrated into a live action shoot. The tool is especially useful for camera framing.
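Conceptually, superimposing CG content through a tracked camera comes down to projecting world-space points through the camera's pose and lens model into image coordinates, so the CG overlay lines up with the live plate. A minimal pinhole sketch with an unrotated camera (the names are illustrative, not Framestore's implementation):

```python
def project_point(world_pt, cam_pos, focal_px, image_size):
    """Project a world-space point to pixel coordinates for a camera at
    cam_pos looking down the +z axis (rotation omitted for brevity).
    focal_px: focal length in pixels; image_size: (width, height)."""
    x = world_pt[0] - cam_pos[0]
    y = world_pt[1] - cam_pos[1]
    z = world_pt[2] - cam_pos[2]
    if z <= 0:
        return None  # point is behind the camera; nothing to draw
    u = image_size[0] / 2 + focal_px * x / z
    v = image_size[1] / 2 + focal_px * y / z
    return u, v

# A point 5 m straight ahead of the camera lands at the image centre.
uv = project_point((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 1000.0, (1920, 1080))
```

In a real Simulcam setup the tracked camera supplies a full rotation and position every frame, and the lens model includes distortion, but the framing preview rests on this same world-to-image projection.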