Hollywood at a Crossroads: “Everyone Is Using AI, But They Are Scared to Admit It”

For horror fans, Late Night With the Devil marked one of the year’s most anticipated releases. Embracing an analog film filter, the found-footage flick starring David Dastmalchian reaped praise for its top-notch production design by leaning into a ’70s-era grindhouse aesthetic reminiscent of Dawn of the Dead or Death Race 2000. Following a late-night talk show host airing a Halloween special in 1977, it had all the makings of a cult hit.

But the movie may be remembered more for the controversy surrounding its use of cutaway graphics created by generative artificial intelligence tools. One image of a dancing skeleton in particular incensed some theatergoers. Leading up to its theatrical debut in March, it faced the prospect of a boycott, though that never materialized. — Read More

#vfx

Filmmakers Launch AI Studio Late Night Labs 

A group of filmmakers is launching an AI film and animation studio and has snagged some A-list advisors.

Eric Day, Benjamin Michel, and Nick Confalone have launched LA-based Late Night Labs with Poker Face star Natasha Lyonne and Blue Beetle director Angel Manuel Soto among its advisors.

The trio are using generative AI in the creative process but are hoping that the new technology can also provide artists with “tangible ownership” of what they create. — Read More

#vfx

Google targets filmmakers with Veo, its new generative AI video model

It’s been three months since OpenAI demoed its captivating text-to-video AI, Sora, and now Google is trying to steal some of that spotlight. Announced during Google’s I/O developer conference on Tuesday, Veo — the company’s latest generative AI video model — can generate “high-quality” 1080p resolution videos over a minute in length in a wide variety of visual and cinematic styles.

Veo has “an advanced understanding of natural language,” according to Google’s press release, enabling the model to understand cinematic terms like “timelapse” or “aerial shots of a landscape.” Users can direct their desired output using text, image, or video-based prompts, and Google says the resulting videos are “more consistent and coherent,” depicting more realistic movement for people, animals, and objects throughout shots. — Read More

#vfx

The Great Flattening

Apple did what needed to be done to get that unfortunate iPad ad out of the news; you know, the one that somehow found the crushing of musical instruments and bottles of paint to be inspirational:

…Creativity is in our DNA at Apple, and it’s incredibly important to us to design products that empower creatives all over the world…Our goal is to always celebrate the myriad of ways users express themselves and bring their ideas to life through iPad. We missed the mark with this video, and we’re sorry.

The apology comes across as heartfelt — accentuated by the fact that an Apple executive, Tor Myhren, put his name to it — but I disagree with Myhren: the reason why people reacted so strongly to the ad is that it couldn’t have hit the mark more squarely. — Read More

#vfx, #strategy

Kingdom of the Planet of the Apes’ VFX lead argues that the movie uses AI ethically

Right now, every industry faces discussions about how artificial intelligence might help or hinder work. In movies, creators are concerned that their work might be stolen to train AI replacements, that their future jobs might be taken by machines, or even that the entire process of filmmaking could become fully automated, removing the need for everyone from directors to actors to everybody behind the scenes.

But “AI” is far more complicated than ChatGPT and Sora, the kinds of publicly accessible tools that crop up on social media. For visual effects artists, like those at Wētā FX who worked on Kingdom of the Planet of the Apes, machine learning can be just another powerful tool in an artistic arsenal, used to make movies bigger and better-looking than before. Kingdom visual effects supervisor Erik Winquist sat down with Polygon ahead of the movie’s release and discussed the ways AI tools were key to making the movie, and how the limitations on those tools still make the human element key to the process. — Read More

#vfx

How AI adds to human potential

Generative AI is advancing at a breakneck pace, prompting questions on risk and opportunity, from content creation to personal data management. In a special live recording, we delve into the ways AI can augment human work and spur innovation, instead of simply using AI to cut costs or replace jobs. Host Jeff Berman joined a seasoned AI researcher, Intel’s Lama Nachman, and a young start-up founder, Scale AI’s Alexandr Wang, on stage at the Intel Vision event in April 2024. They explore topics like AI’s disruption of creative industries, mitigating its biggest risks (like deep fakes), and why human critical thinking will be even more vital as AI technology spreads. — Read More

#podcasts, #vfx

Next Stop Paris — AI Production Output from TCLtv+

Read More

#vfx

SoA survey reveals a third of translators and a quarter of illustrators losing work to AI

Survey on generative AI highlights the growing impact of new technologies on creative careers, and an urgent need for ethical development that works within copyright laws

Throughout January 2024, we ran a survey of our 12,500 members and other authors, receiving nearly 800 responses on respondents’ experiences of generative artificial intelligence (AI) systems, and their views and concerns about the future impact on creative careers.

The findings demonstrate not only the deep uncertainty about the future role of generative AI in the profession, but also the impact it is already having on careers and livelihoods. — Read More

#vfx

How Hollywood’s Most-Feared AI Video Tool Works — and What Filmmakers May Worry About

As generative artificial intelligence marches on the entertainment industry, Hollywood is taking stock of the tech and its potential to be incorporated into the filmmaking process. No tool has piqued the town’s interest more than OpenAI’s Sora, which was unveiled in February as capable of creating hyperrealistic clips in response to a text prompt of just a couple of sentences. In recent days, the Sam Altman-led firm released a series of videos from beta testers who are providing feedback to improve the tech. The Hollywood Reporter spoke with some of those Sora testers about what it can, and can’t, really do.

… [Walter] Woodman [of Shy Kids, a Toronto-based production company,] says he considers Sora another tool in his arsenal, similar to Adobe After Effects or Premiere. “It’s something where you bring your energy and your talents and you work with it to make something,” he explains. “There’s a lot of hot air about just how powerful this is and how this is going to replace everything and how we don’t need to do anything. That’s really undervaluing what a story is and what the components of a story are and what the role of storytellers is.” — Read More

#vfx

200+ Artists Urge Tech Platforms: Stop Devaluing Music

STOP DEVALUING MUSIC. An open letter signed by over 200 musicians calls on AI developers, tech companies, platforms and digital music services to stop using AI to “infringe upon and devalue the rights of human artists.” — Read More

#audio, #vfx