AI-generated video is still in the embryonic stage.
At this point, we are dealing with technology that is nowhere near ready for primetime. However, some of the largest companies in the world are starting to enter the space with their generative models.
The latest was Meta, which brought its Movie Gen AI to market. This came on the heels of Sora being introduced (but not released for public use) by OpenAI. Google and others are also diving deep into this realm.
It is easy to dismiss the idea simply because the technology is lagging. However, this is where people get caught off guard.
Chatbots really entered the narrative 23 months ago with the release of ChatGPT. Since then, we have seen a massive explosion of innovation. In the text realm, there are hundreds of models to choose from. On top of that, companies like Microsoft and Salesforce are starting to integrate the technology into their existing applications.
Video is going to follow a similar path.
Meta Working With Moviemakers
Meta is doing the same thing OpenAI did.
Upon announcing Sora, OpenAI began engaging with people in the business to test the product and provide feedback. Naturally, the more people who use it, the better the model becomes.
Meta is taking a different approach, however, since it entered the AI realm by going open source. This means anything it brings to market can be used as the basis for further development.
The starting point is to work with people in the business.
The tech giant announced Thursday that it has been working with horror studio Blumhouse and select creators as part of a pilot program for Movie Gen, its generative-AI video models. The company said it will continue to expand the program in 2025.
Again, nothing surprising here. Meta will use the feedback it receives to further develop the model's capabilities, honing the algorithms to produce the output people want.
For the pilot, Blumhouse selected a group of filmmakers to test out the technology and use Movie Gen’s AI-generated video clips as part of larger pieces: actor and director Casey Affleck (“I’m Still Here,” “Light of My Life”); Aneesh Chaganty (“Searching,” “Run”); and the Spurlock sisters (“The Breakline”), who are participants in Blumhouse’s first annual Screamwriting Fellowship.
This is something that is likely to be expanded as time passes. That said, this is the only announcement we have thus far.
Job Destruction
One of the advantages of following technology is that it becomes easy to spot the corporate mantra. We get the same nonsense stated by these companies, right up to the point where they change it.
At the top of the list is "this isn't going to replace people because they are the lifeblood of our company." Instead, executives insist it is merely a tool meant to help them.
That is the narrative until the layoffs are announced. Of course, technology is never given as the reason people lose their jobs.
This is no different.
“Artists are, and forever will be, the lifeblood of our industry,” Jason Blum, founder and CEO of Blumhouse, said in a statement. “Innovation and tools that can help those artists better tell their stories is something we are always keen to explore, and we welcomed the chance for some of them to test this cutting-edge technology and give their notes on its pros and cons while it’s still in development.”
Should we take Jason Blum at his word? I wouldn't.
The reality is that companies are always looking to cut costs. Reducing payroll, i.e., paying people, is one of the easiest ways to do that.
Does this mean all filmmakers and people associated with the industry will be out of work? No. However, as with most technologies, look at the overall numbers.
For example, each scene done with software means a set does not have to be built, so set builders lose out. The costume department is not required. Makeup artists and hair stylists do not show up, since the computer generates the scene.
Over time, this begins to add up.
Here is the kicker: whatever level the models are operating at today, this is the worst they will ever be. They will only get better from here.
Outside Threat
What is interesting is that Hollywood will help train the very models that ensure its own demise.
The disruption is not going to come from the traditional studios. While they will adopt the software, the major damage will come from outside the industry.
Joe Rogan could be considered one of the most influential people in media, at least based upon his paycheck. His podcast gets millions of viewers per episode.
Of course, we have to mention that he did not come from traditional broadcasting. Yet here he is, inviting presidential candidates onto his show.
Hollywood is going to get destroyed by the hundreds of thousands of aspiring filmmakers who suddenly have very powerful tools at their disposal. It will likely take another couple of years before what Meta and OpenAI are bringing out is up to par with audience expectations.
This becomes devastating because it is a numbers game. Even if 98% of what is generated is pure garbage, the 2% that is of acceptable quality (with maybe 1/10th of 1% being outstanding) will pull eyeballs away.
We are seeing the same path followed with news, sports commentary, and music. Eventually, this will work its way up the scale to TV shows and, finally, movies.
Posted Using InLeo Alpha