How I made a cinematic movie trailer using Midjourney
So this week I did a fun little experiment using Midjourney and another tool called Runway ML, a video AI tool with features such as text-to-video, image-to-video, video-to-video, and more.
Runway ML does a very similar job to Midjourney but for videos, and it has been at the forefront of generative AI video production so far.
In this newsletter, I’m going to write about how I produced a movie trailer using Midjourney images and Runway’s new tool, which turns images into videos. This feature is very promising, and it is only the tip of the iceberg compared to what these tools will be able to do very soon.
So here is the video I made using Midjourney and Runway:
Step 01: Brainstorming with an AI assistant
For the first step, I wrote a plot for a short movie trailer with my favourite AI assistant, Pi.ai.
This AI assistant is designed to be capable of deep conversations; therefore, it’s a great companion for when I want to brainstorm ideas.
Step 02: Prompt crafting
I used my prompt crafting skills to turn the plot I generated with Pi.ai into some good Midjourney prompts for cinematic imagery. The keywords that gave the images their cinematic, dreamy look were: cinematic, dreamy landscape, and pink grass.
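To give you a sense of the format (this is an illustrative example rather than one of the exact prompts from the project), a prompt built around those keywords might look something like this:

cinematic still of a lone traveller walking through a dreamy landscape of pink grass, soft golden light, film grain --ar 16:9

The --ar 16:9 parameter sets a widescreen aspect ratio, which suits trailer footage.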
Step 03: Setting up the Runway account
I signed up for a free Runway ML account but ran out of credits partway through the exercise, so I ended up buying the Standard plan for $15 (which I will probably cancel, since I don’t need it beyond this experiment).
Step 04: AI video production
I used the “Text/Image to Video” tool, uploaded the images from Midjourney, and clicked the generate button without entering any text prompt.
The downside of Runway is the lack of control over the outcome: you can’t provide a description or prompt for the video, so you can only upload your image and hope that Runway animates it well. Because of this, you may end up regenerating some clips that aren’t good enough on the first try.
Make sure the images you input contain some indication of movement, for example people, cars, planes, clouds, or anything else you would expect not to remain still in a video. This way, Runway will produce much better results.
Step 05: Editing
I brought the videos, which are only about 4 seconds each, into a new Adobe Premiere project, made a few edits, added music, and was done! We have a cool cinematic trailer!
Finally, Runway is a promising tool that could revolutionise video production. Imagine if something like Runway got integrated with popular video editing tools such as Adobe Premiere or Final Cut Pro! This would be a game-changer in many ways. You would no longer need a huge budget to produce a film, so many people who never previously got the chance to make one could compete with directors working with million-dollar budgets!
For more weekly articles like this, feel free to subscribe to my free newsletter “Sundays’ Pivotpoint”.