Videos Don't Lie, Right?
by Jacquelyn Whiting
Sometimes it feels eerie to me that we live in an era where so much of what was science fiction only a decade or two ago is now reality and the norm in many of our lives. The 1999 Disney Channel movie Smart House is the story of a fully automated dream house that becomes an overbearing, angry, and jealous presence in the lives of the family that inhabits it. The personality in that house is named PAT, short for "personal applied technology" (instead of Alexa or Hey, Google), but she was just as omnipresent, if not omniscient. How is it that the more time we spend interacting with machines, the better equipped they are to finish our sentences?
When I talk with my really techy friends about machine learning, the substance of the conversation quickly goes over my head. Even when I can follow it, I can't replicate it. Then I found How to Speak Machine: Computational Thinking for the Rest of Us by John Maeda, and things became clearer. Woven through technical and daily-life examples of the inner workings of machines are pearls of wisdom about what it means to be human and the good, the bad, and the ugly of our relationships with machines. One of those passages gave rise to the post you are reading right now. Here is what Maeda said: "The actual machinery of computing is complicated yet understandable; the social impact of this complicated machinery becomes complex when it involves as many humans as it does today."
Machines have made it possible for humans to do heart-warming, miraculous, provocative, and tragic things. Do you remember when Natalie Cole and her dad, Nat King Cole, sang a duet together? Well, they didn't actually sing together in the same time and space; machines united their voices and presence to create a duet for the ages. We didn't see the technology that brought these iconic performers into one virtual space as threatening or insidious. In fact, Natalie Cole's project won six Grammys that year! Heart-warming for sure.
When we think of machines, of machine learning, of artificial intelligence, we frequently slip into the rabbit hole of machines taking over. Maeda offers a nuanced and thoughtful consideration of what AI is and can be: "Machine learning feeds off the past…which is why if we keep perpetuating the same behavior, AI will ultimately automate and amplify existing trends and biases…AIs are not to blame when they do bad things. We will be the ones to blame for what they do in the service of us." Remember PAT from Smart House? When the family became disillusioned with her interference and tried to power her off, she animated herself as a hologram to become a more "real" presence in the family members' day-to-day lives.
At some point in the somewhat recent past, Photoshop, originally a product name, became a verb and then an adjective, as in "Can you photoshop out my double chin?" and "That's fake. It's photoshopped." Here we are in the next phase of media manipulation: video alteration and fabrication. While the Coles' 1992 engineered duet broke ground with its technical creativity, perhaps PAT was a harbinger of the insidious deep fake videos that creators of disinformation disseminate today. (See the News Literacy Project's piece on deep fakes, https://newslit.org/tips-tools/deepfakes-when-you-cant-believe-your-own-eyes/.) Whether it is altered playback speed making someone appear to stumble and slur, or completely computer-generated video in which someone appears to say or do things that were never said or done, deep fakes are horrifying for the deceit they perpetrate and the damage they do to our civil discourse.
Organizations like Common Sense Media are diving into the deep fake issue and creating lessons to help educators discuss this content with their students (https://www.commonsense.org/education/articles/are-deepfake-videos-a-threat-to-democracy). To examine the impact of deep fakes with your students, start with a definition of "media." My definition is: media is content created for a purpose. With this in mind, students can critically view these videos and ask: Who is the intended audience? What message does the creator want that audience to receive? What does the creator hope the audience will do? Say? Believe? How does the creator benefit from the consumption of this media? How does this video impact relationships? How does the impact and prevalence of manufactured video compare with that of manipulated still images? What can and should be done about these videos?
Now add nuance to their discussion. Show them that the same machine learning has also been used to contribute to national debate and to provide satirical commentary on our current national state of being. While they may not agree with the message, the medium is being used with transparency to engage audiences in reflection and dialogue. For example, Trey Parker and Matt Stone (the creators of South Park) partnered with actor and voice artist Peter Serafinowicz to create "Sassy Justice," a deep fake intended as both satirical commentary and comedic entertainment; its creators say they trust their audience to distinguish manipulated media from reality and entertainment from intentional deception. (NOTE: "Sassy Justice" is not appropriate for young audiences. Please preview it before showing it - https://www.youtube.com/watch?v=9WfZuNceFDM.)
Manuel and Patricia Oliver also entered the machine-manipulated video production arena this election season. Manny and Patricia are Joaquin "Guac" Oliver's parents. Guac was killed in the shooting at Marjory Stoneman Douglas High School in Parkland, Florida. He would have turned 18 in time to vote for the first time in the 2020 election. His parents had a video made that animated Joaquin in a powerfully lifelike way so that the artificially engineered Guac could urge people to vote and champion the causes for which he could no longer advocate (https://www.youtube.com/watch?v=m6I_wEetSck). It is a potent, gut-wrenching testimony to their love for their son and commitment to their cause.
Machines aren't out to get us, though we may be out to get each other. And we are teaching machines to do our bidding. Even though popular science fiction portrays machines as evil-doers bent on destroying humanity, ultimately machines are a reflection of their creators. If we don't always like what we see, we need to remember we are looking in the mirror.
Additional Resource
Jackie's Information Literacy Choice Board - https://docs.google.com/presentation/d/e/2PACX-1vR0CthE5PhdCT72HLv9gW88Fnsksi9kPPevvduhQwokyw7jIAFnMrhCdpT3ic4xpl12xaycijQc1nNP/pub?start=false&loop=false&delayms=3000