Luma promoted its AI video generator with some very familiar incorporated monsters

Last week, AI startup Luma posted a series of videos created using its new video-generating tool Dream Machine, which the company describes as a “highly scalable and efficient transformer model trained directly on videos.”

The only problem? At about 57 seconds in, the Dream Machine-generated trailer for Monster Camp — an animated story about furry creatures journeying to a sleepaway camp — features a slightly AI-smudged but still recognizable Mike Wazowski from Pixar’s Monsters, Inc. Many people noticed that multiple characters, and the trailer’s overall aesthetic, look borrowed from the franchise, and the questions quickly started pouring in.

Was it fed a prompt asking for animation in a Pixar style? Was it trained on material that includes the Disney studio’s work? That general lack of transparency is one of the biggest concerns about these kinds of models, as Dream Machine joins OpenAI’s Sora and Google’s VideoPoet and Veo among the many text-to-video AI tools shown off in recent months.

Luma hyped its Dream Machine model as the future of filmmaking, featuring “high quality, realistic shots” created simply by typing prompts into a box. Watching videos of cars racing down a dissolving highway or an awkwardly narrated sci-fi short, you can sort of see why bullish fans of this tech were quick to hail it as an innovation.

Currently, Luma is encouraging people to sign up and play with Dream Machine for free, but the company also has “Pro” and other tiers that charge users fees for more features. We reached out to Luma for comment about where it sources the footage Dream Machine is trained on but did not hear back by the time of publishing.

Disney hasn’t publicly commented on what Luma seems to be up to, and it’s possible that the company hasn’t even noticed. But at a time when people have been pushing for more transparency about the datasets powering AI tools like the ones Luma is building, things like Monster Camp make it hard not to look at the generative AI ecosystem as prone to plagiarism.