Virtual production has a reputation for being inaccessible to indie filmmakers and small production studios, but it is possible to use the technique on a small-scale shoot. With an indie-level budget of just $10,000, we made "Alien Mutiny," a science-fiction short film inspired by the Alien franchise.
Here’s how we did it:
Step 1: Concept Development
We went into this project with the goal of creating a proof-of-concept for the possibilities of virtual production for indie filmmaking. After looking at some preliminary renders, we settled on a science fiction short film loosely inspired by the Alien franchise.
Step 2: Scripting
We kept the plot pretty simple: two space pirates, Ellen and Wyatt, are expectantly waiting for their fellow crew members to return from a mission when they realize that the probe that docked on their spaceship contains something much more sinister: a Xenomorph. Our script was about 3 pages long for a 4-minute short.
Step 3: Storyboarding
With a script in place, we could start storyboarding. Storyboards are important for any film production, but they’re especially vital for virtual production projects because they will help dictate what the virtual art department (VAD) will build in UE5.
We made sure to indicate which shots would be filmed practically, which would have to be filmed on the LED wall, and which would be rendered completely in UE5.
Step 4: Environment Building
Our team spent the better part of 6 weeks building the levels/environments for our short film in Unreal Engine 5.
An important thing to take into consideration at this step is optimization, which ensures that the environment runs smoothly on the LED wall and doesn't cause any latency or frame-dropping issues. As a rule of thumb, an optimized environment should run at roughly double the frame rate your camera will shoot at.
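The rule of thumb above is easy to sanity-check on paper. Here's a small sketch (in plain Python, not Unreal code) of that headroom calculation; the 2x multiplier is the guideline from this article, not a hard spec:

```python
# Headroom check for an LED-wall environment: the environment's render rate
# should comfortably exceed the camera's capture rate.

def meets_headroom(env_fps: float, camera_fps: float, multiplier: float = 2.0) -> bool:
    """Return True if the environment renders fast enough for the wall."""
    return env_fps >= camera_fps * multiplier

# Shooting at 24 fps, we'd want the environment holding roughly 48 fps or better.
print(meets_headroom(52.0, 24.0))  # True: 52 >= 48
print(meets_headroom(40.0, 24.0))  # False: below the 2x target
```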
Building only what's needed in the frame is one of the core pillars of optimization, which is why it was so important that we create our storyboard first. Without a plan for which shots we'd film in front of our LED wall, our VAD team might have built things that never actually got used for the short film. Besides wasting time and effort, unused assets also require extra rendering power and therefore slow your project down.
It’s also important to keep in mind that assets that require complex calculations, like particle systems, lighting systems, and objects with complex geometry, may also slow your project down. The art of optimization lies in creatively figuring out how to fit these assets into the environment in a way that takes up less rendering power. Read more on how to optimize environments here.
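One way to think about those expensive assets is as a per-frame millisecond budget: at a given target frame rate, every frame has a fixed time budget, and each particle system, dynamic light, or dense mesh eats into it. The asset names and costs below are made up purely to illustrate the idea:

```python
# Hypothetical per-frame budget check. At 48 fps, each frame has ~20.8 ms
# of render time to spend; expensive assets each consume a slice of it.

FRAME_BUDGET_MS = 1000 / 48  # target frame rate from the 2x rule of thumb

asset_costs_ms = {               # illustrative numbers, not real profiler data
    "xenomorph_particles": 4.0,
    "corridor_dynamic_lights": 6.5,
    "hero_control_panel_mesh": 3.2,
}

total = sum(asset_costs_ms.values())
print(f"{total:.1f} ms used of {FRAME_BUDGET_MS:.1f} ms budget")
print("over budget" if total > FRAME_BUDGET_MS else "within budget")
```

In practice you'd get the real per-asset costs from Unreal's profiling tools rather than guessing them.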
Step 5: Motion Capture
We can’t have an Alien short film without a Xenomorph, and to make our Xenomorph, we needed some motion capture.
An affordable alternative to expensive motion capture suits is Move AI’s Move One app, which does single-camera motion capture with just an iPhone. We used the motion data generated through Move One to create our Xenomorph.
Step 6: Set Design
As we’ve said many times before, virtual production is all about blending the virtual world with the physical one, and set decoration goes a long way to making that happen.
Our talented set designer, Thomas Cram, created the spaceship’s control panel and other set design pieces from various calculators and gadgetry from a local thrift store. The control panel was central to many of our shots, and giving our actors something to interact with really helped sell the virtual production effect. We also used things like rags hanging from a C-stand to add foreground elements and help dirty the frame.
Step 7: Casting
When casting for a virtual production film, it isn’t necessary to seek out actors who already have experience acting on VP sets. In fact, placing this restriction may limit the pool of actors you can pick from.
We found that while certain things like flipping the set (which we’ll explain later) can be confusing for your actors, filming with ICVFX virtual production can help immensely with immersion, leading to a better performance.
One of our actors, Tavis Putnam, said filming on an LED wall is “definitely better than pretending [a background] was there.”
Step 8: Techvis & Previs
With everything else in place, the last step before filming was techvis and previs. This included testing the environment on our LED wall to ensure everything was running smoothly, and testing various lighting setups to make sure our lighting matched the environment.
Step 9: Filming
After several weeks of preparation, it was time to film! We filmed in front of our office's LED wall, which is built from ROE Black Pearl 2V2 (BP2V2) LED panels, our choice for their reliability and durability, and powered by the industry-leading Brompton Technology Tessera S4 LED processor.
As far as LED volumes go, our wall is relatively small at 20 ft x 10 ft, which limits our camera movements. Since our camera was essentially fixed to one spot, we cheated multiple "angles" by moving the virtual camera in the environment and shifting our physical set to match.
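The trick boils down to simple rotation math: the physical camera never moves, so each new "angle" is just a yaw offset applied to the virtual camera (with the set turned to match). The angle names and offsets below are illustrative, not our actual shot list:

```python
# Faking coverage with a fixed physical camera: each shot gets a yaw offset
# on the virtual camera instead of a physical camera move.

def virtual_yaw_for_angle(base_yaw_deg: float, offset_deg: float) -> float:
    """New virtual-camera yaw for a shot, wrapped to the [0, 360) range."""
    return (base_yaw_deg + offset_deg) % 360

shot_offsets = {"master": 0, "reverse": 180, "side": 90}  # hypothetical shot list
for name, offset in shot_offsets.items():
    print(name, virtual_yaw_for_angle(30, offset))
```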
When turning the set, it’s important to consider lighting as well. Instead of moving our lights every time we turned the set, our DOP had the bright idea to light in a triangle and simply rotate the responsibilities between the lights. We used two ARRI Skypanel S60-C lights and an ARRI Orbiter, so their temperatures, saturations, and hues could easily be adjusted when switching responsibilities.
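The "light in a triangle" idea can be sketched as a simple role rotation: three lights stay physically fixed, and when the set turns, the key/fill/back responsibilities shift one position around the triangle. The light names below stand in for our two SkyPanels and the Orbiter, but the exact role assignments are illustrative:

```python
# Three fixed lights in a triangle; turning the set rotates the roles,
# not the fixtures.
from collections import deque

lights = deque(["skypanel_A", "skypanel_B", "orbiter"])  # fixed positions
roles = ["key", "fill", "back"]

def assignments() -> dict:
    """Current role-to-light mapping."""
    return dict(zip(roles, lights))

print(assignments())  # initial setup
lights.rotate(1)      # set turned: responsibilities shift one position
print(assignments())
```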
Lighting is important for any production, but it's especially important for virtual production projects because it helps blend the physical set with the virtual environment. The key to this is DMX (Digital Multiplex) lighting, which has long been used in film production. DMX lights can be placed within an environment in Unreal Engine and easily controlled to work with your scene. For example, for one of our scenes, we placed blinking red lights in our environment and used an automatic trigger to match our lights in the physical space.
Of course, certain shots need more than just motivated lighting to look realistic.
For example, for a shot of Wyatt walking down a hallway, we created a spline in the environment and matched our walking pad speed to the spline speed. To create a sense of parallax, we also placed foreground elements on a dolly track and slowly pulled it in the opposite direction. Lastly, we used a fog machine to dirty the frame, which helps make the background look less video game-esque.
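The speed matching above can be sketched as two linked values: the virtual camera's spline speed mirrors the walking pad, and the foreground dolly creeps the opposite way, scaled down because nearer objects should appear to move faster relative to the far background. The 0.2 parallax factor here is a made-up illustrative value, not our measured setting:

```python
# Linking walking-pad speed to spline speed, with an opposing, scaled
# dolly speed for foreground parallax.

def rig_speeds(pad_speed_m_s: float, parallax_factor: float = 0.2):
    """Return (spline_speed, dolly_speed) for a given walking-pad speed."""
    spline_speed = pad_speed_m_s                      # background tracks the actor
    dolly_speed = -pad_speed_m_s * parallax_factor    # foreground drifts opposite
    return spline_speed, dolly_speed

spline, dolly = rig_speeds(1.2)
print(spline, dolly)  # dolly is negative: it moves against the walk direction
```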
We filmed on the Blackmagic Design URSA Mini Pro 4.6K G2, our powerhouse for virtual production, using the HTC VIVE Mars CamTrack system, one of the best camera tracking systems at an indie price point. SmallRig also sent us the FreeBlazer Carbon Fibre Tripod, which made it easy for us to move our camera when flipping our set thanks to its lightweight design.
Step 10: Editing
We wanted to line up our short film's release date with the theatrical release of Alien: Romulus, which was a day after shooting was scheduled to wrap up. So, we decided to do a live edit: between setups, we'd bring our footage on a hard drive to our editor so he could cut while we were still filming.
After the initial cuts, we colour-graded our footage to make sure it matched with the final Unreal Engine renders. We kept the post-production VFX minimal, as most of our VFX was captured in-camera. We sprinkled in a little sound design using Soundstripe and Everyday Cinematic Sounds, and voila! We had an indie virtual production short film, ready to present to the world.
While the short film came out a couple of days after we had originally planned, it was an amazing feat nonetheless. It was released to positive reception on YouTube, and we're very proud of our work.
Throughout the production of this short film, we kept a bi-weekly vlog detailing each step and the challenges we faced along the way. If you haven’t already watched the series, you can check it out here.