
Sigma_001

Unreleased

An engineer (John D. Michaels) at a major tech company sneaks a skeptical journalist (Maria-Elena Laas) inside to interview what he believes is the first sentient AI (Chase Simone).


15 minutes   |   AI Short Form

PASSWORD ./enter

Be sure to use the "dot-slash."


Directed By: Quinn Halleck
Production Company: Late Arrivals 

 

Written By: Drew Maron

Written By: Quinn Halleck


Produced By: Mike Gioia

Produced By: Ian Eck

Produced By: Quinn Halleck

Produced By: Bryce Dressler

 

Full Credits in Film

The Concept

This project began with a true story: I found myself captivated and inspired by reports that a major tech company might have developed a potentially sentient AI. My goal was to shed light on the absence of adequate policies governing this groundbreaking technology.


I hope to spark meaningful conversations and a deeper understanding of the profound impact that artificial intelligence is having on our lives.


Why AI?

The film didn’t begin as an “AI film.” It started as a very traditional narrative project with a story the team and I loved. Early in the writing process, we began to see tools emerge that could help us tell our story better: the rise of deepfakes, the emergence of voice-changing AI tech, and, of course, ChatGPT.

 

As we used one tool, we’d discover another. And another. And another. Eventually, our secondary goal became leaning into this new AI pipeline as much as possible, using AI tools to help us with our script, storyboards, actor prep, and eventually VFX. We embraced AI as a collaborator.

With “Sigma_001,” I asked each of my crew members to find the latest AI tools in their respective departments and apply them in their creative process.


Workflow

Pre-Production

Early versions of ChatGPT served as a creative sounding board for my writer and collaborator, Drew Maron. They guided our script and enriched our story. We used conversations with these language models and transcripts of engineers interacting with AI to ensure Sigma’s behavior felt true to real-world technology. Prompt engineers utilized tools like DALL-E and Midjourney to generate concept art for our world and characters, while Runway helped us turn that art into storyboards and previsualization.


Principal Photography

As we moved into production, I asked my actors to use AI chatbots tailored specifically to help them find their characters, uncovering nooks and crannies of backstory that would not otherwise have made it into our world.


VFX

In post-production, Gavin Bedford, Ian Eck, and artists known as CoffeeVectors and Druuzl used tools such as DeepFaceLab to deepfake my character’s face, shifting it from one persona to the next. And Geraldo Gutierrez used AI voice generation from a company called ElevenLabs to bring a voice to our AI character, “Sigma.”


The Result

The overarching goal was twofold: create a captivating AI origin story and, however meta, embrace the very AI tools we were using to help tell it. It wasn’t simply about cool new visual effects; it was about expanding the bandwidth of our entire creative process.

 

I mention the names of my crew not only to give them credit, but to illustrate my belief that it takes real artists to bring a film to its final vision.

 

In all, over 100 people worked on this film, alongside nearly 15 different AI tools.
