Binghamton Herald
Wednesday, May 13, 2026

Why ‘House of David’ director thinks AI can save Hollywood jobs

by Binghamton Herald Report
May 13, 2026
in Business

In 1926, director Cecil B. DeMille hired hundreds of workers to build a set of Jerusalem inside the DeMille Studios in Culver City for the classic silent film “The King of Kings.”

A century later, Jon Erwin filmed his biblical epic “The Old Stories: Moses,” starring Ben Kingsley, on the same studio lot, now owned by Amazon MGM Studios.

Except now, much of the architecture, desert locations and supernatural elements of the three-episode miniseries were generated through artificial intelligence. The prequel to the “House of David” series debuts on Amazon Prime on Thursday.

A production that traditionally would have taken months to shoot and require multiple locations was filmed entirely in one week with a crew of just 100 people — who never left Los Angeles.

“We did this massive sword-and-sandal epic, and we never left a soundstage, very similar to how James Cameron does ‘Avatar’ or how Jon Favreau does ‘The Mandalorian,’” said Erwin, the director of the series. “When you preserve the performance and the work of the crews and the department heads, then you can do things that are incredibly cost-effective for studios.”

As Hollywood grapples with rapid technological change, a growing number of filmmakers and companies in Southern California are using AI tools to radically rethink how films and TV shows are made.

“Some are still resisting, but many are recognizing that, for better or worse, AI is here and not going anywhere and it is important to reimagine what film creation can look like in light of the new possibilities AI creates,” said Victoria Schwartz, director of the entertainment, media, and sports law program at Pepperdine Caruso School of Law.

A screen of LED panels called “the Volume” is used to film scenes for director Jon Erwin’s series “The Old Stories: Moses.”

(Genaro Molina / Los Angeles Times)

Erwin is among the first working directors at a major streaming platform to fully integrate AI into a commercial production.

Last month, he launched Innovative Dream, a Manhattan Beach production services company backed by Amazon. The company will rent its virtual production facilities to other studios and develop training programs for emerging filmmakers.

Although much of Hollywood is bracing for AI to hollow out jobs, Erwin argues the opposite: that AI, applied ethically around human performances, can return at least some production jobs that have been outsourced even as other positions are eliminated.

“I think the greater threat of job loss in our industry is actually just how expensive things have gotten and how long they take to make,” Erwin said. “If you can make things quicker, and you can make things at a price point that studios will say ‘yes,’ you can employ more people in aggregate and create jobs.”

Although computer graphics have been essential to Hollywood since the 1990s, they traditionally required hundreds of artists and months of post-production work to place actors or crowds in digital worlds. Much of the labor-intensive visual effects work known as rotoscoping was outsourced to shops in India and other countries with much lower labor costs than in California.

By 2019, productions such as Disney’s “The Mandalorian” series advanced this further by using massive LED screens to project photorealistic digital worlds (“Star Wars” ships, forests, deserts) as actors performed in costume in front of them. A virtual art department spent months designing the digital environments, then loaded them onto the large screen on the day of the shoot.

AI takes the process a step further.

Through “Moses,” Erwin is championing what he calls “hybrid” filmmaking: a workflow that marries live-action performance with AI-enhanced virtual production. The process collapses what used to be separate phases, filming with actors and visual effects, into work that happens almost simultaneously. Scenes shot on set are made available to multiple editors and AI artists within minutes on the production floor, who show near-finished sequences back to the cast and director.

“You can create assets in three or four days, not 10 weeks. And that means you can actually kind of generate the environment while you’re shooting,” he said.

Erwin, 43, grew up in Alabama and built his career on faith-based films such as “I Still Believe” and “Jesus Revolution.” He had spent years trying to tell biblical stories at the scale portrayed in the source material.

When he pitched “House of David,” a drama about the life of King David, studio executives were initially skeptical. “I was told to just come up with a smaller idea,” he said.

To portray Goliath’s origin story, actors were filmed on green screens and AI was used to generate a mythical sequence involving dark sky, rain, mountains and angels with wings.

It marked one of the first integrations of generative AI in a major commercial production. The series, which premiered last year, drew 44 million viewers worldwide and reached No. 1 on Prime Video in the U.S.

By Season 2, the team was using 30 different tools, both traditional and AI, to generate images, sounds and video. They pivoted from shooting solely on location in Greece to filming some scenes in L.A. in front of an LED wall.

AI was used to generate battle scenes and expand background crowds to thousands of people in a fraction of the time traditional CGI required. The number of AI-generated shots jumped from 70 in Season 1 to 400 in Season 2.

Jeff Thomas, a generative AI filmmaker who directed two episodes of Season 2, said each episode was made for less than $5 million, defying studio consensus that the show required a “Game of Thrones”-level budget of $12 million to $15 million per episode. Erwin declined to disclose the budgets for the “House of David” series or the “Moses” prequel.

“The Bible describes that battle as there was 100,000 people on each side. Well, it’s never been portrayed like that because we’ve never had the resources,” Erwin said. “We’re finally able to show that scope and scale.”

Erwin conceived of “Moses” over Christmas, wrote the script in January and made a four-minute trailer generated entirely by AI. Amazon greenlighted the series later that month.

Kingsley had a short window before his next commitment, so Erwin prepared and shot all three episodes on a soundstage in a week — a project that would have previously taken six months to prepare.

For the pivotal Red Sea scene, Erwin generated the water volumes and tidal waves in less than an hour using AI models from Chinese company Kling AI and Palo Alto-based Luma AI, work that would have taken weeks in the traditional process. The team wrote text prompts that explored 18 different variations of the sea parting and discarded the ones that didn’t work, enabling Kingsley to react to a tidal wave projected onto a 360-degree LED wall.

“‘Moses’ really represented a whole new method of filmmaking for me,” Erwin said.

Jon Erwin stands in front of a screen of LED panels he used to film “The Old Stories: Moses.”

For “The Old Stories: Moses,” director Jon Erwin used AI for wide shots, stunt-heavy battle sequences and to generate large crowds to showcase the grand scope of biblical stories. The red line he said he wouldn’t cross is using it in place of actors.

(Genaro Molina / Los Angeles Times)

For crucial scenes portraying the palace hallway in Egypt, where Moses talks to the Pharaoh, they built cardboard boxes as the columns in the palace, and “reskinned” them with intricate carvings using AI. Although the set could accommodate only 20 extras, they used AI to create hundreds of background actors.

Erwin also used generative AI to digitally extend partially built sets featuring sand and rocks and to “de-age” Kingsley to appear as a young Moses.

But some things were off limits for AI, including Kingsley’s performance.

“I just think our faces are so intricate and the micro expressions are so intricate, so that’s always real,” he said.

Instead, AI was used to co-design the character: Erwin originally imagined a bald Moses, but based on Kingsley’s feedback, they fine-tuned the look with weathered hair and mustache.

“The line in the sand for me is replacing an actor,” Erwin said. “I don’t want to be in the industry if I can’t work with actors.”

Jon Erwin’s “hybrid” production involves generating a variety of environments such as forests, deserts, or battle sequences using AI, and projecting them on the LED screen.

(Genaro Molina / Los Angeles Times)

When asked about the background extras displaced by AI crowd generation, Erwin said that’s the wrong way to think about it.

“It’s not a comparison of what ‘Moses’ would have cost otherwise. It’s that ‘Moses’ would have never been made otherwise, and that’s the way you have to think about it,” he said.

Overall contraction in Hollywood has led to fewer films being shot on location in Los Angeles and a 30% drop in entertainment industry jobs since their 2022 peak.

“I think you can do those things three to five times faster, at less than 30% the cost,” he said. “I actually see this tool set as an antidote to the job loss problem in our industry.”

© 2024 Binghamton Herald or its affiliated companies.
