Predictive modeling through the lens of the film industry

Predictive modeling is the process of using data and algorithms to build models that forecast future events and trends, or the likelihood that they will occur. This process has found application in many industries, including the film industry.

Predictive analytics in the film industry can be used to predict the success of a film based on historical data on previous works, audience preferences, and demographic factors. This helps studios and directors make decisions regarding financing, marketing, and distribution of films.
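As a very rough illustration of this idea, the sketch below fits a simple regression model to a handful of synthetic past releases and uses it to score an upcoming project. The features, the numbers, and the choice of a linear model are assumptions made purely for the example; real studio models draw on far richer data.

```python
# A minimal, hypothetical sketch of predicting box-office revenue from
# historical film data. The features and figures are synthetic and serve
# only to illustrate the general idea described above.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [budget ($M), marketing spend ($M), lead-actor popularity (0-10)]
X_train = np.array([
    [50, 20, 7],
    [120, 60, 9],
    [15, 5, 4],
    [80, 30, 6],
    [200, 90, 10],
])
# Target: worldwide box office ($M) of those past releases
y_train = np.array([140, 450, 30, 210, 900])

model = LinearRegression()
model.fit(X_train, y_train)

# Score a hypothetical upcoming project
upcoming = np.array([[70, 25, 8]])
print(f"Predicted box office: ${model.predict(upcoming)[0]:.0f}M")
```

In practice such a model would be trained on thousands of releases and combined with demographic and audience-preference data, but the workflow is the same: historical data in, a forecast out.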

Moreover, in cinema, predictive modeling can play a significant role in crafting engaging, captivating storylines as well as in visual effects. It enables the creation of imaginative worlds, vivid action scenes, and fantastic characters that leave a lasting impression on audiences.

Films that involve predictive modeling tend to reflect the technological development of society at the time of their production. For instance, the appearance of virtual reality, artificial intelligence, or biometric technologies in movies reflects the current trends and possibilities of the digital world.

Predictive modeling in cinema can serve not only to reflect current technological development but also to explore potential future scenarios and the impact of new digital technologies on society and human life. Movies whose plots revolve around technologies directly related to building predictive models can stimulate discussion of the possible consequences and ethical issues raised by the introduction of such innovations.

Let’s recall a few movies and TV series where the plot touches on the subject of predictive modeling.

Surprisingly, the Soviet film “Sluzhebny roman” (“Office Romance”), which premiered back in 1977, indirectly touches on this topic. At first glance, the film focuses on the romantic relationship between the protagonists, Lyudmila Prokofievna Kalugina and Anatoly Efremovich Novoseltsev, and the comically clumsy situations they find themselves in at work. A closer look at where the characters work and at minor details, however, makes it clear that the plot unfolds at a Moscow statistical agency and shows the everyday work of Soviet statisticians of that era. The agency's many employees keep statistical records for various branches of Soviet industry.

The agency is divided into several departments, such as the chemical industry department, the construction department, and the department of light industry, or, as Novoseltsev calls it, the department of “easy” industry; the work of these departments determines how the Soviet people will be supplied with goods. Moreover, the memorable episode in which Lyudmila Prokofievna severely reprimands Novoseltsev for a report based on unverified data makes it clear that the employees collect procurement statistics and that their work, in essence, amounts to building a predictive model of consumer demand. The same episode gives us Kalugina's line that later became a catchphrase: “Statistics is a science, it does not tolerate approximation.”

At the same time, it is interesting to look at the accounting and computing machinery that fills this Soviet feature film and shows how well equipped the agency was for its time. The installation of computers is mentioned directly by Lyudmila Prokofievna, who has a Videoton-340 video terminal on her desk; it was produced in Hungary in the 1970s and used together with an early-generation mainframe computer (apparently housed in a separate large hall). Ordinary employees make do with simpler devices. On Novoseltsev's desk, for example, stands a bulky Soemtron 220 calculator, produced in the GDR from 1966 to 1977 and built on discrete transistor logic with gas-discharge indicator tubes. Other employees have small mains-powered calculators on their desks, presumably Elka 50M models produced in Bulgaria from 1978, and ordinary abacuses hang on the wall, presumably for the more conservative statisticians.

Thus, the popular Soviet film “Sluzhebny roman”, which has acquired cult status, is remembered not only for its humor and unhurried storytelling, but also for reflecting the technological development of Soviet society in the second half of the 20th century. Beyond the romantic storyline, Eldar Ryazanov's film draws our attention to how the early digital revolution changed the working tools of the statisticians of that time as they built their predictive models.

Let’s consider an example of a movie from a later time period.

To our deep regret, the end of the 20th century and the first decade of the 21st were marked by a significant number of terrorist attacks around the world, which seriously affected the international situation and security. These attacks raised public concern and forced governments to concentrate their efforts on fighting terrorism and improving security systems at both the international and national levels. Their influence was felt in every sphere of society, including the film industry: during this period, a large number of films and series were produced on the subject of international security and the development of counter-terrorism strategies, including the construction of predictive models.

In the movie “Source Code” (2011), the plot is directly related to predictive modeling. The film, directed by Duncan Jones, tells the story of a secret government project that has developed a technology allowing people to dive into the “source code” and reconstruct past events in order to prevent future terrorist attacks.

The protagonist, Colter Stevens, becomes a participant in this project against his will: the military and the scientists use him to enter the last eight minutes in the life of another man, who died in an explosion on a train, so that another explosion in downtown Chicago can be prevented. He goes back in time again and again, relives the same events, and tries different courses of action in an attempt to change the outcome.

Predictive modeling in “Source Code” allows the characters to analyze past events and predict possible scenarios. They model different options based on the information they have and make decisions aimed at reaching the desired outcome and saving the city from another tragedy. This creates intrigue and tension in the film and opens up opportunities to explore the concepts of time, reality, and personal identity.

Accordingly, the technology behind the secret project in “Source Code” is inextricably linked to predictive modeling and its use to analyze the past and predict future events. This is a key element of the film and adds science fiction and intrigue to its storyline.

Continuing this look at technological evolution from a historical perspective, it should be noted that the period from the second decade of the 21st century to the present has been one of intensive technological development that has affected many areas of life and led to significant changes in society. Key trends and achievements of this period include the spread of mobile technologies, the growth of social networks and dating applications (Instagram, Tinder, etc.), the rise of cloud technologies, breakthroughs in artificial intelligence, and the rapid expansion of online commerce. The film industry, of course, could not remain indifferent to this wave of digital progress, which has swept through almost every area of activity.

For example, the science fiction anthology series Black Mirror, which premiered in 2011, is permeated throughout by a single leitmotif: the impact of information technology on society and interpersonal relationships.

In the episode “Hang the DJ” (2017), the creators use their favorite device of taking a familiar element of modern life, in this case a dating application, to the point of absurdity. Here the concept of predictive modeling underlies the dating algorithm: a handheld device continuously collects information about a participant's behavior with various partners and uses the accumulated data to select the ideal match. For the system to study behavioral reactions, thoughts, and dreams and take a “cast of the mind” needed to select a fully compatible couple, a participant must first go through relationships with many partners, while the duration of each relationship and the identity of each partner are dictated by the system.
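Purely as a toy illustration of the kind of mechanism the episode imagines, and not of the show's actual algorithm, the sketch below learns a participant's preferences from a few logged relationships and then scores new candidates. The trait names, the scores, and the use of ridge regression are all invented for the example.

```python
# A toy sketch of preference learning for matchmaking. Everything here is
# invented for illustration; it is not Black Mirror's algorithm.
import numpy as np
from sklearn.linear_model import Ridge

# Each past relationship: partner traits [humour, spontaneity, calmness] (0-1)
past_partners = np.array([
    [0.9, 0.2, 0.4],
    [0.3, 0.8, 0.6],
    [0.7, 0.7, 0.2],
    [0.2, 0.1, 0.9],
])
# How satisfied the participant reported being with each partner (0-10)
satisfaction = np.array([8.0, 5.5, 7.0, 3.0])

# Learn a simple preference model from the logged relationships
preference_model = Ridge(alpha=1.0).fit(past_partners, satisfaction)

# Score new candidates and propose the best predicted match
candidates = np.array([
    [0.8, 0.4, 0.5],
    [0.1, 0.9, 0.3],
])
scores = preference_model.predict(candidates)
best = int(np.argmax(scores))
print(f"Proposed match: candidate {best}, predicted satisfaction {scores[best]:.1f}")
```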

Accordingly, this episode explores predictive modeling in the context of dating and relationships, raising questions about freedom of choice, the role of technology in our personal lives, and its influence on our decisions. It invites viewers to reflect on how far we are prepared to rely on algorithms and predictions in interpersonal relationships, and on how much they can determine our happiness and our future.

Thus, predictive modeling is often used in the film industry to improve the quality and success of films. It helps to predict trends, adapt to audience tastes, and optimize the filmmaking process. Moreover, films that feature predictive modeling can serve as a mirror of the technological innovations and ideas of the time when they are made. The movies described above also let viewers glimpse possible future technologies and imagine how they could affect our lives. Such films act as a kind of window onto the world of technological progress and inspire viewers to think about where technological development may lead.

It follows that predictive modeling has become an integral part of modern reality and is used in a wide range of fields. Its widespread use has driven increased interest in the development of user-friendly software for building predictive models.

An important breakthrough in this area is the development of automated machine learning (AutoML), which lets users bring all stages of the modeling process together in a single platform or system. This has simplified and automated modeling and reduced the need for separate specialized programs and tools at each step. Data analysts can now focus on uncovering the hidden patterns, correlations, and trends that feed accurate predictive models, while the AutoML platform handles data pre-processing and pipeline optimization. An example of such a development is the ANTAVIRA platform.
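As a minimal open-source sketch of the general idea behind AutoML, and not of the ANTAVIRA platform itself, the example below chains pre-processing and a model into a single scikit-learn pipeline and automates the search over candidate settings with cross-validation; the dataset, the model, and the parameter grid are arbitrary placeholders.

```python
# A minimal sketch of automating a modeling workflow: one pipeline that
# bundles pre-processing and a model, with the tuning done by a search
# rather than by hand. Illustrative only; not the ANTAVIRA platform.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("scale", StandardScaler()),                 # automated pre-processing step
    ("clf", LogisticRegression(max_iter=1000)),  # candidate model
])

# The "automation": search over candidate settings instead of hand-tuning
search = GridSearchCV(
    pipeline,
    param_grid={"clf__C": [0.01, 0.1, 1.0, 10.0]},
    cv=5,
)
search.fit(X_train, y_train)

print("Best settings:", search.best_params_)
print("Hold-out accuracy:", round(search.score(X_test, y_test), 3))
```

Full AutoML systems extend this pattern to automated feature engineering, selection across many families of models, and deployment, which is what makes them attractive to analysts who would rather spend their time on the data than on the plumbing.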