Movie ratings are vital to a film’s bottom line and determine its impact on audiences. Traditionally, a movie is manually rated by humans watching it, taking into account violence, drug abuse, and sexual content.
This dynamic could soon change with the rise of artificial intelligence (AI). Recently, researchers at the USC Viterbi School of Engineering used AI tools to rate a movie within seconds. Most impressively, the rating can be produced from the movie script alone, before a single scene is shot. This means movie executives could develop a script, make edits, and know its likely rating in advance of filming.
The newly developed approach could have a financial impact on studios, and it could also help creative teams develop and edit a story based on its predicted impact and response from viewers.
The research was led by Shrikanth Narayanan, University Professor and Niki & C. L. Max Nikias Chair in Engineering, alongside a team of researchers from the Signal Analysis and Interpretation Lab (SAIL) at USC Viterbi.
Applying AI to Scripts
After applying AI to movie scripts, the team found that linguistic cues can signal behaviors involving violence, drug abuse, and sexual content before the characters act them out. These content categories are commonly used to rate today’s movies.
The team used 992 movie scripts that Common Sense Media had determined to contain violent, substance-abuse, and sexual content. The non-profit organization makes movie recommendations for families and educational institutions.
The trained AI was then applied to the 992 scripts to identify risky behavior, patterns, and particular language. The model receives a script as input and processes it through a neural network that scans for semantics and sentiment expressions.
The AI works as a classification tool, labeling sentences and phrases as positive, negative, aggressive, or some other descriptor. Words and phrases are also classified into three categories: violence, drug abuse, and sexual content.
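The pipeline described above can be illustrated with a deliberately simplified sketch. The actual USC model is a trained neural network over script semantics; here a hypothetical keyword lookup stands in for that classifier purely to show the shape of a sentence-level labeler that tallies the three content categories across a script:

```python
# Illustrative sketch only: a keyword lookup stands in for the researchers'
# trained neural network. Keyword lists here are invented for the example.
RISK_KEYWORDS = {
    "violence": {"gun", "shoot", "stab", "fight", "blood"},
    "substance_abuse": {"drunk", "heroin", "overdose", "syringe"},
    "sexual_content": {"kiss", "undress", "seduce"},
}

def classify_sentence(sentence: str) -> list[str]:
    """Return the risk categories whose keywords appear in the sentence."""
    words = {w.strip(".,!?").lower() for w in sentence.split()}
    return sorted(cat for cat, kws in RISK_KEYWORDS.items() if words & kws)

def rate_script(sentences: list[str]) -> dict[str, int]:
    """Count how many sentences are flagged per category across a script."""
    counts = {cat: 0 for cat in RISK_KEYWORDS}
    for sentence in sentences:
        for cat in classify_sentence(sentence):
            counts[cat] += 1
    return counts
```

In the real system, `classify_sentence` would be replaced by the neural network's prediction, but the aggregation idea — per-sentence labels rolled up into per-script content counts — is the same.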
Victor Martinez is a doctoral candidate in computer science at USC Viterbi and lead researcher.
“Our model looks at the movie script, rather than the actual scenes, including e.g. sounds like a gunshot or explosion that occur later in the production pipeline,” Martinez said. “This has the benefit of providing a rating long before production to help filmmakers decide e.g. the degree of violence and whether it needs to be toned down.”
“There seems to be a correlation in the amount of content in a typical film focused on substance abuse and the amount of sexual content. Whether intentionally or not, filmmakers seem to match the level of substance abuse-related content with sexually explicit content,” he continued.
Findings and Correlations
One of the researchers’ findings was that a movie is highly unlikely to contain high levels of all three risky behaviors, likely because of the standards set by the Motion Picture Association (MPA). They also found correlations between risk behaviors and MPA ratings: for example, the MPA appears to put less emphasis on violence and substance-abuse content as sexual content increases.
“At SAIL, we are designing technologies and tools, based on AI, for all stakeholders in this creative business — the writers, film-makers and producers — to raise awareness about the varied important details associated in telling their story on film,” Narayanan said.
“Not only are we interested in the perspective of the storytellers of the narratives they weave, but also in understanding the impact on the audience and the ‘take-away’ from the whole experience. Tools like these will help raise societally-meaningful awareness, for example, through identifying negative stereotypes.”
The research team also includes Krishna Somandepalli, a Ph.D. candidate in Electrical and Computing Engineering at USC Viterbi, and Professor Yalda T. Uhls of UCLA’s Department of Psychology.
The research was presented at the EMNLP 2020 conference.