
Interview: A Screenwriter and a Computer Scientist Critique The Social Dilemma

We asked AI&F Founding Experts Craig Detweiler and Gilad Berenstein to share their perspectives on the message and storytelling approach of The Social Dilemma, Craig as a theologian, media critic, and screenwriter, and Gil as a start-up entrepreneur. Craig brings a Christian perspective to his work, while Gil has been considering, from the standpoint of his Jewish faith, the values and ethics issues inherent in AI-driven technologies like those employed in social media.

Q:  The Social Dilemma is a documentary.  How would you compare it to fictional series around AI and technology like Silicon Valley and Black Mirror as a broad influencer in the rapidly emerging cultural critique of Big Tech?

Craig:  Hollywood has been adapting stories that raise red flags regarding technology and A.I. for a long time. The road from 2001: A Space Odyssey and Blade Runner to The Matrix and Ex Machina is quite well trod. What makes a comedy/satire like Silicon Valley and a documentary like The Social Dilemma similar is their primary subject: the creators of the programs and applications we use. While Mike Judge mocks the players in Silicon Valley, The Social Dilemma invites them in to express their horror at what they’ve created. And that makes a big difference in how the audience receives it. A New York Times reporter may note that Steve Jobs didn’t give his own children an iPad, but when you hear the coders express regrets onscreen about what their innovations have created, that’s a different story. It shifts from social critique toward confession. As a Stanford grad, the filmmaker had access to technologists that made this an inside story of self-examination and doubt rather than an external, investigative report, which results in a much greater ‘scoop’ than most books and articles have mustered.

Gil:  I came into this documentary well informed about social algorithms, the attention economy, gamification, and the like. Even so, I found The Social Dilemma to be very powerful and illuminating. Certain elements, like the human personification of Facebook’s algorithms, triggered feelings of intense unease and concern.

I remember similar feelings of nervousness as I watched the Black Mirror series. I remember telling my wife that I couldn’t watch more than one episode at a time because of the intense discomfort I felt after each viewing. For me, the genius of Black Mirror is that every episode opens in a world quite similar to our own, one I can easily imagine us arriving at in the coming years. Then the narrative unfolds, revealing a dystopian world that is genuinely terrifying and realistic.

What makes The Social Dilemma even more powerful is that we are already living in that world and the cat is already out of the bag.

Silicon Valley felt very different. To me, that show comes across as much more a critique of the ridiculous nature of Silicon Valley the place and its key players than a true social critique or global warning sign. Silicon Valley was silly and fun, whereas The Social Dilemma, especially in the narrative about the family of five, paints a world that is no fun at all.


Q:  As The Big Short did for esoteric financial vehicles, The Social Dilemma anthropomorphizes algorithms to explain how they influence social media consumers. How well did this work from a creative and communications point of view?

Craig: To see the three characters ‘turning the dials’ on a social media user in real time makes something quite disembodied like A.I. seem more threatening and maybe even malevolent. And that is the challenge with all good drama. The Terminator and Agent Smith are chilling adversaries because they are both embodied and capable of morphing in real time. In The Social Dilemma, the morphing is applied to our social media feeds. While Eli Pariser may have warned us about The Filter Bubble, The Social Dilemma figured out how to dramatize it–which makes it far more frightening for audiences. By giving A.I. a face/hands/body, The Social Dilemma delivered the chills it was looking for.

Gil:  Very well. I found the personification of the algorithms disturbing and frightening, as it put a sinister, deliberate face on this whole opaque operation. I also found the detailed explanations of A/B testing and similar techniques powerfully edifying, showing viewers how systematically social networks work to increase engagement and desired behaviors. They highlight the fact that while you open the app without thinking much about its design, the folks on the other side have spent years optimizing every button, color, and pixel to achieve their business objectives.
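For readers unfamiliar with the A/B testing Gil mentions, the core mechanic is simple: show two design variants to randomly split traffic, measure which one produces more of the desired behavior, and ship the winner. The sketch below simulates that loop; the variant names and click-through rates are entirely hypothetical, not drawn from any real platform.

```python
import random

random.seed(42)

# Hypothetical "true" click-through rates for two button designs.
# A real platform measures these; here we simulate them.
TRUE_RATES = {"control": 0.10, "new_design": 0.18}

def simulate_session(variant: str) -> bool:
    """Return True if the (simulated) user clicks the button."""
    return random.random() < TRUE_RATES[variant]

def ab_test(n_users_per_variant: int = 10_000) -> dict:
    """Split traffic evenly across variants and record observed click rates."""
    observed = {}
    for variant in TRUE_RATES:
        clicks = sum(simulate_session(variant) for _ in range(n_users_per_variant))
        observed[variant] = clicks / n_users_per_variant
    return observed

rates = ab_test()
winner = max(rates, key=rates.get)  # the design that gets shipped
print(rates, "->", winner)
```

Run at platform scale, thousands of experiments like this, each nudging a button, color, or pixel, are how "every element is optimized" in practice.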


Q: As a cultural critique of technology, do you feel The Social Dilemma makes a compelling case for the downside effects of Facebook’s business model?

Craig:  Many had leveled these kinds of warnings (I even wrote a book about the iGods in 2013), but a 90-minute movie proved far more effective at connecting to the fear already residing in viewers’ hearts and minds. Timing matters. This filmed critique arrived after the testimonies before Congress and while Facebook’s own employees had begun questioning its policies publicly. The Social Dilemma was the right film in an election year that has been all about fear of voter manipulation, election fraud, and divisiveness. So many reached out to me after watching the doc to say, “See? I knew it.” It confirmed so many suspicions about why we feel Facebook continues to divide us for the sake of profit.

Gil: Yes, but I think it could have been done more straightforwardly. I don’t think most members of the public are aware of the different types of business models available to technology companies nor the foreseeable benefits and disadvantages of each.

For example, what if Facebook utilized a subscription model rather than an ad model? How would that alter their prioritization and the guidelines underpinning their algorithms and recommendations? What if they used a pay-as-you-go model or a freemium approach like video games? Each of these models would lead to a different set of incentives, and as Warren Buffett always reminds us: incentives really do matter.


Q:  Is it possible for a company to build a legitimate business model based on capturing users’ attention and data describing their personal interests, and on selling access to that data to third parties to influence those users in ways undisclosed to them? Or is Facebook’s model simply irredeemable?

Craig: I remember how odd it seemed that Facebook demanded I sign an NDA just to walk into their building. Why does a company that traffics in our information guard its own so closely? Talk about a disconcerting disconnect! Clearly, many technologists assumed that the invisible hand of the marketplace and the power of information that wants to be free would suffice. The Social Dilemma calls a mantra like ‘Don’t be evil’ into question. A laissez-faire approach to technology does not seem to suffice; the profit motive supersedes whatever mantra may have been set forth as an ideal. So to prevent companies from ‘being evil’, we will likely need to enact much stronger privacy laws that protect consumers and keep us from being bought and sold: basically, enforced ethics. Or we will opt out of the current systems and move toward those that offer more privacy and protection and don’t function as manipulative monopolies.

Gil: It is certainly possible to ethically utilize an attention-focused business model, but the key to doing so is radical transparency, something the big tech companies are staunchly opposed to.

While it is theoretically possible, the vast majority of users do not understand what data is being tracked, how granular that data is, how often it is collected, and how it is ultimately utilized and secured. A truly transparent system would make it simple to see and delete the data already collected about you. That system would tell you why a given ad or piece of content is being shown to you and what monetization is taking place behind the scenes. A truly user-centric system would enable users to set goals for what they hope to get out of their interaction with the platform (like catching up with old friends, being informed about new recipes, or seeing cute puppy photos), and that platform’s algorithms would actually work to help users accomplish those goals rather than just drive revenue.

It is even possible for social networks to let users choose between different sorting algorithms, or perhaps even upload their own, well suited to their personal goals and desires.
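Architecturally, the pluggable-ranking idea Gil describes amounts to treating the feed's sort order as a user-selected strategy rather than a fixed, engagement-maximizing one. Here is a toy sketch under that assumption; the posts, fields, and ranker names are all hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    timestamp: int           # hypothetical minutes-since-epoch
    predicted_clicks: float  # the platform's engagement prediction

# A hypothetical three-item feed.
FEED = [
    Post("old_friend",   "life_update", timestamp=300, predicted_clicks=0.2),
    Post("brand_page",   "ad",          timestamp=100, predicted_clicks=0.9),
    Post("cooking_blog", "recipe",      timestamp=200, predicted_clicks=0.5),
]

# A registry of interchangeable ranking strategies, each a sort key.
# The last one stands in for a user-supplied ranker serving stated goals
# (here: catching up with friends and finding recipes).
RANKERS = {
    "newest_first":   lambda post: -post.timestamp,
    "max_engagement": lambda post: -post.predicted_clicks,
    "my_goals":       lambda post: 0 if post.topic in {"recipe", "life_update"} else 1,
}

def render_feed(choice: str) -> list:
    """Sort the feed with whichever ranker the user selected."""
    return [post.author for post in sorted(FEED, key=RANKERS[choice])]
```

With the same three posts, `max_engagement` surfaces the ad first, while the goal-driven ranker pushes it last; the design choice is simply which key function the platform lets the user pick.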


Q:  The Social Dilemma has been criticized as long on critique and short on solutions.   Do you believe that bringing a better understanding of human nature derived from the ancient wisdom of the Bible (Old or New Testament or both) could help to address Facebook’s challenges?

Craig:  Our ancient wisdom is often adapted to the guiding metaphors of each era. God as disinterested clockmaker arose from an era when we were just starting to grasp and predict the parameters of space and time. The challenge is to adapt enduring truths to a particular era while also retaining the critique they carry. So in the industrial age, we had to decide that treating children like cogs in a machine was inhumane. People of faith and conscience celebrated the dignity and worth of every person, founding disciplines like social work and ecclesial innovations like Sunday School for working kids who weren’t getting an education. In the information and computer age, we must decide that we are more than data processors, and that we cannot treat others as mere objects or data to be harvested, mined, or manipulated. So we need ethicists and religious leaders to remind us that we are embodied people rather than just ‘meatspace’. We are still developing the equivalents of social work for a digital age, but we are slowly discovering the depths of these dehumanizing challenges.

Gil:  Yes and yes! The documentary is certainly short on solutions but I don’t fault it for that; sometimes when the problem at hand is this massive, one needs to work to understand it before one can properly evaluate solutions. Further, as the documentary points out, it sure seems shortsighted to ask the same people who created the problem and who profit from it to be the ones who solve it.

The main dilemma is the way these networks exploit human weaknesses and age-old shortcomings for profit. To address this, while maintaining the numerous benefits derived from technology and social networking, society needs to draw on lessons accumulated over millennia of human history across religion, philosophy, and culture. Together, we can create a better future for us all.


Gilad Berenstein, Craig Detweiler

<b>AI&F Founding Expert Gilad Berenstein</b> recently wrapped an envisioning role as an Entrepreneur-in-Residence at the Allen Institute for Artificial Intelligence (AI2) in Seattle. Previously, he was the Founder and CEO of Seattle-based travel personalization startup Utrip, which utilized AI and human experts to help travelers plan highly personalized trips. Gilad is a graduate of the UW Foster School of Business, where he obtained both his undergraduate and master’s degrees. He is passionate about travel, technology, food, innovation, and history, and is a congregant at Temple De Hirsch Sinai in Seattle.
<br>
<b>AI&F Founding Expert Craig Detweiler</b> is a professional screenwriter, longtime film and tech culture critic, educator, and Christian theologian in Los Angeles, and the author of numerous books including <i>A Matrix of Meanings: Finding God in Pop Culture; Into the Dark: Seeing the Sacred in the Top Films of the 21st Century; iGods: How Technology Shapes our Spiritual and Social Lives; and most recently Selfies: Searching for the Image of God in a Digital Age.</i>
