By Chyung Eun-ju and Joel Cho
Did the title compel you to click on the article? It's not surprising, as marketing practitioners often target consumers' emotions, particularly negative ones, because they are powerful motivators.
Recently, we watched an episode titled "Joan is Awful" from the Netflix series "Black Mirror." The episode centers on a fictional streaming platform called Streamberry, which bears a striking resemblance to Netflix. Users unknowingly grant the platform the rights to their likeness and personal lives, and Streamberry turns the events of their daily lives into shows about them. In this particular episode, the main character, Joan, finds her own life transformed into a TV drama starring Salma Hayek, with her worst qualities highlighted for comedic effect.
The twist is that the CEO of Streamberry is turning every single user's life into a personally customized show called "[Insert Name] is Awful," because self-doubt, neurosis and introspection are what make content go viral. The CEO explains that the company tested a version called "Brian is Awesome," but people were simply not interested in that kind of content.
In the episode, the CEO says, "It didn't chime with their neurotic view of themselves. What we found instead was when we focused on their more weak or selfish or craven moments, it confirmed their innermost fears and put them in a state of mesmerized horror. Which really drives engagement."
Also, the Joan we are watching isn't really Joan, but a version of Joan that the "source [real] Joan" watches on her screen.
Streamberry portrays Joan's life with embellished details that amplify her worst qualities, because the audience can't take its eyes off that kind of juicy content. "Joan is Awful" makes an assertive modern critique of the inhumane approach used to engage audiences and the unethical uses of data, all made possible by the "Quamputer," an infinite content creator capable of willing entire multiverses into existence. The computer can steal users' stories and manipulate its actors' likenesses. This is all legal because users sign away their rights to Streamberry when agreeing to the terms and conditions, thereby allowing the company to exploit their image. In addition, the streaming service knows all the details of Joan's life because it can spy on people through their data. Joan's lawyer compares this surveillance to targeted advertising using cookies: just as mentioning an item or hobby seems to summon related banner ads or Instagram ads on one's personal device, Streamberry uses the details of users' daily lives to create personalized content.
A study conducted by Microsoft researcher Daniel McDuff and Wharton School marketing professor Jonah Berger explored the influence of emotions on the sharing of different advertisements. They used automated facial recognition technology to analyze people's reactions to ads, and the findings revealed that not only did positive emotions compel people to share ads, but negative emotions such as disgust, fear and anger also played a role. Interestingly, some negative emotions, such as sadness, decreased the likelihood of sharing.
Physiologically arousing emotions, such as those experienced when encountering a snake or engaging in a fight, trigger a heightened state of alertness. Emotions that accelerate the heart rate have been found to increase the likelihood of a video going viral. Thus, to market a product or service effectively, the content needs to evoke emotions that fire up consumers. While positive emotions can also elicit sharing, negative emotions, when used strategically, serve as a powerful means of engaging consumers. We've all witnessed the power of disinformation, particularly when it triggers fear and anger.
Marketing is based on the fundamental principle that consumers seek to enhance their self-perception and strive to confirm their self-held views. Consequently, manipulations that deliver negative feedback threaten an individual's self-view and increase the need to self-enhance and engage in compensatory consumption.
Ever wonder why Instagram ads are so powerful?
Frances Haugen, a former Facebook data scientist turned whistleblower, leaked an internal study which found that 13.5 percent of U.K. teenage girls said their suicidal thoughts became more frequent after they started using Instagram. Subcommittee Chair Richard Blumenthal said, "Facebook exploited teens using powerful algorithms that amplified their insecurities." Negativity has a way of being more addictive than positivity, and this poses even more serious problems now that our personal lives are no longer truly private.
The "Black Mirror" episode raises an interesting and relevant question about the consequences of inadequate regulation of how corporations use our data. Although it tries to pin the root of the issue on technology itself, it is not far-fetched to say that it also paints a picture of the control and power a corporation can wield when it possesses a vast amount of its users' personal data.
It is not by chance that we are seeing countries move toward legislation that regulates the use of personal data, specifically how corporations process and handle it.
There's no denying that big data corporations like Meta and TikTok already possess an astonishing amount of users' personal data. So it is not very difficult to imagine a world where a fictional streaming platform such as the one in the "Black Mirror" episode becomes a reality. Recent technological advancements, especially in AI, leave little doubt about the feasibility of such a future.
In the past, before we compiled so much data, marketers and companies may not have known our true selves, as consumer surveys could be influenced by lies or biases. However, digital footprints cannot be falsified. The Google search bar, in particular, is known to be one of the most brutally honest and revealing platforms in existence. This vast amount of personal data can be used for personal targeting without our full understanding of the consequences. Companies could delve into our insecurities and tap into our fears without us even recognizing it.
Is it really our choice to buy a product when we don't know all the facts? Do we even fully comprehend what we agree to in the terms of service every time we register? As generative technology becomes more prevalent in creative spaces, without timely and proper regulation that keeps pace with each technological advance, the grim story portrayed in "Joan is Awful" might not be far from becoming our reality.
Chyung Eun-ju (ejchyung@snu.ac.kr) is studying for a master's degree in marketing at Seoul National University. Her research focuses on digital assets and the metaverse. Joel Cho (joelywcho@gmail.com) is a practicing lawyer specializing in IP and digital law.