
Luma AI debuts Dream Machine 1.6 with camera motions


AI video generators are multiplying in number and increasing in realism, but one big roadblock preventing them from replacing traditionally filmed video is a lack of fine-grained control.

Many leading AI generators let users enter text prompts and/or upload still images that the underlying models convert into motion, but the resulting video clips are often a surprise and may feature unrealistic or bizarre motion.

Now one well-regarded AI video generation startup, Luma AI, is adding a new set of more precise controls for users and releasing its newest AI video generator model, Dream Machine 1.6.

Specifically, Luma’s Dream Machine 1.6 offers a list of 12 different camera motions that users can apply when typing a text prompt into the generation bar on its website.

These include:

  1. Pull Out
  2. Pan Left
  3. Pan Right
  4. Orbit Left
  5. Orbit Right
  6. Crane Up
  7. Crane Down
  8. Move Left
  9. Move Right
  10. Move Up
  11. Move Down
  12. Push In

Users access these by typing the word “camera” at the beginning of their prompt, whether starting from a still image or pure text, and should see a dropdown menu automatically appear listing all of these options.
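For readers who want a feel for how those phrases read in practice, here is a minimal, hypothetical Python sketch that prepends one of the twelve documented motion phrases to a scene description. The phrase list comes from the article itself; the build_prompt helper and the exact comma-separated formatting are assumptions for illustration, not part of Luma’s product or any official API.

```python
# Illustrative sketch only: composing prompts that lead with Dream Machine 1.6's
# documented camera-motion phrases. The motion list is taken from the article;
# the prompt formatting below is an assumption, not Luma's documented syntax.

CAMERA_MOTIONS = [
    "Pull Out", "Pan Left", "Pan Right", "Orbit Left", "Orbit Right",
    "Crane Up", "Crane Down", "Move Left", "Move Right", "Move Up",
    "Move Down", "Push In",
]

def build_prompt(motion: str, scene: str) -> str:
    """Prepend a 'camera <motion>' phrase to a scene description."""
    if motion not in CAMERA_MOTIONS:
        raise ValueError(f"Unknown camera motion: {motion!r}")
    return f"camera {motion.lower()}, {scene}"

# Example: a push-in on a static scene description.
print(build_prompt("Push In", "a lighthouse on a rocky coast at dusk"))
# -> camera push in, a lighthouse on a rocky coast at dusk
```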

Many of these camera motions are self-explanatory, though those unfamiliar with cinematography may at first wonder what they mean.

Fortunately, Luma is also helping bring new users along for the ride by showing a small 3D animation pop-out beside each camera motion that previews what the user will see when applying it to the resulting generated clip.

Initial reactions are wildly positive

Among early adopters in the burgeoning AI video creation scene, those who have tried Luma’s new Dream Machine 1.6 camera controls say they are a significant upgrade and addition to their toolset.

“The new 1.6 model appears to be fine tuned on specific phrases (such as camera push in, camera orbit left) which is helpful as there are typically many different ways to describe the same camera movement,” wrote AI video creator Christopher Fryant in a direct message on the social network X sent to this VentureBeat journalist. “Knowing which phrases are fine-tuned saves a lot of time on guesswork.”

“Additionally the range and strength of the camera motion seems to have been increased quite a bit. The results show a definite increase in dynamic motion,” Fryant added.

Similarly, AI video creator Alex Patrascu wrote on X that the update was “top stuff!”

Feature arms race among AI video providers

The addition of camera motions follows the release of Luma Dream Machine 1.5 last month, which promised higher-quality, more realistic text-to-video generations.

It also puts Luma in direct competition with Runway’s Gen-2 model, which added a variety of motion features, including a Multi-Motion Brush, earlier this year.

Close AI industry observers have also spotted indications that Runway plans to introduce a similar feature for its latest and most realistic AI video generation model, Gen-3 Alpha Turbo, which is considered by many AI video creators to be the “gold standard” in quality.

Meanwhile, OpenAI’s Sora, which started the year off by wowing observers with its realism, remains unreleased to the public seven months later.

Regardless, the addition of camera controls to Dream Machine 1.6 shows that AI video continues to advance at a rapid pace, offering users more fine-grained control and ever higher-quality visuals, bringing it closer to what a traditional director can achieve at a fraction of the time and cost.

Enterprise decision makers looking to equip their companies with cutting-edge video production tools for internal videos or external-facing marketing would do well to consider Dream Machine 1.6 among their options.
