So, while I’m buried in development for Chainsaw Fairytale (aka Fairytale Princesses vs the Horror Genre – I just CANNOT seem to decide between the two titles), the whole AI thing keeps rearing its head. Obviously.

Let’s talk about the Virtual Filmmaking thing first. As Chainsaw Fairytale largely takes place in a fairytale/fantasy landscape that starts out bright and cheerful but gets darker as the movie progresses, it seemed like a pretty obvious candidate for virtual filmmaking. Everyone’s top choice, an immersive LED soundstage, Mandalorian-style, is likely to remain outside our budget, but that hasn’t stopped me grabbing a few figures and looking at them. My logic in this case: even if the going daily rates for LED soundstages are terrifying to a low-budget filmmaker, they might still be feasible if the pre-production process allowed for such an intense level of planning that the shooting schedule could be dropped to a handful of days.
This was a trick I learned on my third movie, KillerKiller, which was mainly shot in a location we couldn’t really afford: a sprawling former mental hospital with a terrifying past (which I wrote about over here). We managed to shoot a ridiculous percentage of our film in that incredibly atmospheric location by eliminating everything with too many variables. At the scripting stage, we worked out that most of the kill scenes needed to be shot somewhere cheaper. We also realised that if we filmed with available light (with reflectors where necessary) we could massively reduce the set-up time for each shot. Much as I’d have loved to shoot the whole goddamn thing at night, it would probably have trebled the cost of production.

So all the dialogue took place in that brilliant, sprawling location; we rehearsed somewhere cheap and then hit the ground running in the hospital, making incredibly quick decisions and using the location as best we could. We got about 75% of the finished film in the can over the course of three days. The remaining 25% we took our own sweet time over, in cheaper locations and with more complicated set-ups. It was a steep learning curve, and I learned a lot.
So I’ve been trying to work out to what degree we could apply similar principles to virtual filmmaking on a low budget. There are a lot of similarities. An LED studio is an incredibly costly environment, but it struck me that if that cost gets factored in at the screenwriting stage, there might be possibilities for getting big chunks of the film done quickly in that environment. The main thing that worried me was the variables: what kind of turnaround time might there be for swapping out one Unreal Engine environment for another? Might there be calibration issues that could swallow up hours of potential shooting? These are the sorts of things I’m still digging into, so I’m not ruling an LED studio out at this point, at least not until I’ve got my head around the workflow a little better.
Assuming, however, that we don’t go with an LED studio, where does that leave us? Well, I’ve been looking at VIVE Mars Cam Track as a solution for shooting in a green-screen environment instead. It presents a few more problems in terms of lighting the actors convincingly enough to bed them into the virtual environment (on an LED stage, the light thrown out by the walls does a lot of that heavy lifting), but I’ve been looking into current developments in Image-Based Lighting to see whether we might be able to counter the problem that way.
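For anyone who wants the nuts and bolts, here’s a rough sketch of what I mean by pulling lighting cues out of the virtual set. It’s illustrative rather than anything production-ready: it assumes Python with OpenCV and NumPy installed, and the HDRI filename is made up; the idea is that you’d export an equirectangular HDRI from whatever environment you’re shooting against. All it does is estimate an overall ambient colour, plus a crude key-light direction from the brightest patch of the map, which could then inform how the practical lights on the green-screen stage get set.

```python
import cv2
import numpy as np

# Hypothetical equirectangular HDRI exported from the virtual environment.
hdri_path = "forest_set.hdr"
img = cv2.imread(hdri_path, cv2.IMREAD_ANYCOLOR | cv2.IMREAD_ANYDEPTH)
if img is None:
    raise FileNotFoundError(f"Couldn't read {hdri_path}")
img = cv2.cvtColor(img.astype(np.float32), cv2.COLOR_BGR2RGB)  # linear RGB radiance

h, w, _ = img.shape
lum = 0.2126 * img[..., 0] + 0.7152 * img[..., 1] + 0.0722 * img[..., 2]

# Solid-angle weight: rows near the poles of an equirectangular map cover
# less of the sphere, so they should count for less.
lat = (0.5 - (np.arange(h) + 0.5) / h) * np.pi       # +pi/2 at the top row
weight = np.cos(lat)[:, None]                         # shape (h, 1)

# Overall ambient colour of the environment (weighted average over the sphere).
ambient = (img * weight[..., None]).sum(axis=(0, 1)) / (weight.sum() * w)

# Crude key-light estimate: the brightest region of the (blurred) map.
blurred = cv2.GaussianBlur(lum * weight, (0, 0), sigmaX=9)
y, x = np.unravel_index(np.argmax(blurred), blurred.shape)
azimuth = ((x + 0.5) / w) * 360.0 - 180.0             # degrees around the stage
elevation = (0.5 - (y + 0.5) / h) * 180.0             # degrees above the horizon

print(f"ambient RGB (linear): {ambient}")
print(f"key light: azimuth {azimuth:.1f} deg, elevation {elevation:.1f} deg, colour {img[y, x]}")
```

In practice you’d more likely feed the HDRI straight into your lighting or compositing package rather than script it by hand, but it shows the principle: the virtual set can tell you where the light falling on the real actors ought to be coming from.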
Aside from the Virtual Filmmaking stuff, I’ve also been looking into the options for crowd-related VFX for a couple of short, high-impact shots in the script. The standard-bearer for this kind of thing has been MASSIVE for a good couple of decades, but people have kindly been tipping me off about some other options that could potentially work out cheaper. It’s quite nice to be working through this kind of development/pre-production (God, the lines get blurry when you’re creating some elements in a computer over a year before you’ll be filming the live-action elements).
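For a sense of what even the simplest crowd tools are doing under the hood, here’s a toy sketch, and I should stress it’s nothing like MASSIVE’s agent AI: just a few hundred dots (with entirely made-up numbers) steering towards a goal while shoving apart from each other, which is roughly enough to previs how a crowd surge reads in a wide shot. Again, Python with NumPy, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
N, FRAMES, DT = 300, 240, 1 / 24             # agents, frames, seconds per frame

pos = rng.uniform(-20.0, 20.0, size=(N, 2))  # metres, top-down plan view
vel = np.zeros((N, 2))
goal = np.array([0.0, 60.0])                 # where the crowd is surging to

trajectory = []                              # per-frame positions for previs
for frame in range(FRAMES):
    # Steer towards the goal.
    to_goal = goal - pos
    to_goal /= np.linalg.norm(to_goal, axis=1, keepdims=True) + 1e-6

    # Cheap separation: push away from any neighbour closer than 2 metres.
    diff = pos[:, None, :] - pos[None, :, :]              # (N, N, 2)
    dist = np.linalg.norm(diff, axis=2) + 1e-6
    near = (dist < 2.0) & (dist > 1e-5)                   # excludes self-pairs
    push = (diff / dist[..., None] * near[..., None]).sum(axis=1)

    accel = 1.5 * to_goal + 2.0 * push - 0.5 * vel        # goal + separation + drag
    vel += accel * DT
    pos = pos + vel * DT
    trajectory.append(pos.copy())

# trajectory could now be baked out (CSV, point cloud, etc.) and used to
# scatter low-res character proxies in the previs scene.
print("final crowd spread (m):", trajectory[-1].std(axis=0))
```

The serious packages layer far smarter agent behaviour, terrain and actual animation on top of this, but the basic blocking idea starts in the same place.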
And that brings me to the AI side of things, which seems to be developing at such a dizzying speed that I can barely keep track of it from week to week, let alone consider what the landscape is realistically going to look like by next July. I talked about AI quite a bit in my recent talk at Horror-on-Sea – the possibilities, the threats and the way it’s going to upend the entire creative landscape whether we’re ready for it to or not. The moral argument about the way AI might be used in finished productions is an enormous can of worms that I’m not even going to peek into in a short piece like this, but the tools that can be used for pre-vis and development are incredibly useful. Whether it’s character design or creating something rather more detailed than a storyboard to give your DoP a sense of your desired shots, the tools available right now are a million miles away from the ones that were available at this point last year.
My motto has always been ‘do stuff whilst it’s hard to do’. If you’re a filmmaker sitting around waiting for stuff to get easier, you’re sitting around waiting for your competition in that sector to increase by 1000%. The only reason my first flick got decent distribution (and, by extension, the only reason I’ve maintained a career for the last 20 years) was because I did it when it was really tough. There were only 16 British horror movies made that year, before the digital boom, and it was a damn sight harder to make stuff and a damn sight easier to get people to pay attention once you’d finished.
I’m embracing the future and turning it into the most messed-up fairytale you’ve ever seen.