Surveillance capitalism is, in part, an extension of the post-panopticon's dispersed network of surveillant assemblages. It is not merely about tracking individuals but about extracting their behavioral data for corporate profit, and the same techniques have subsequently been adopted by governments as tools of citizen control.
We start with some background on memetic ideas, and how ideology can become a pervasive way of thinking that blinds us to alternate paths. I compare religious thinking to the belief in technology as a means of progress. That progress is measured through efficiency, and tech belief falls into a modernist trap of assuming efficiency is progress. This "religion," or ideology, leads to the dehumanization of individuals when progress (efficiency) becomes "essential," because essentialism and efficiency are inherently at odds with societal and individual needs. As Andrew Feenberg discusses, we made technology to serve human ends, but instead we are reduced to a means-ends relationship that serves technology.
Moore's law maps out the exponential growth of processing power on a circuit, which seems to correlate with the exponential growth in tech produced and data recorded.* In the early days of the internet there was a torrent of information, which made tracking or finding anything a "needle in the haystack" problem until Google stepped in. As Google developed its search engine, it found additional data tracking the behavior of online users. This wasn't especially helpful for advancing the stated goals of the product (a better search engine), but over time Google figured out how to use this "exhaust data" to predict user behavior with increasing accuracy, which is vital for targeting consumers.
*In the podcast I made a sloppy misrepresentation of Moore's law, aligning it with the production of tech hardware; it is really about the density of processing power on a chip. Moore's second law concerns the rising cost of producing that hardware.
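The exponential curve Moore's law describes can be sketched in a few lines. This is an illustrative model only: the two-year doubling period and the 1971 baseline (roughly the transistor count of Intel's first microprocessor) are assumptions chosen for the sketch, not part of the original episode.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count on a chip, assuming a fixed doubling period.

    Illustrative defaults: ~2,300 transistors in 1971, doubling every 2 years.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Print the projection at ten-year intervals to show the exponential blow-up.
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Even under these rough assumptions, the projection grows from thousands to tens of billions of transistors over fifty years, which is why the curve is so often invoked to explain the flood of devices and data described above.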
This behavioral extraction model treats human behavior as a valuable new resource to be mined, with corporations and governments willing to buy both the data and the technology (and techniques) that produce it. The model (surveillance used for capitalist ends) relies on uninformed consent and is ethically dubious, yet Congress was apprised of the issues as early as 1997 and has taken no firm action to curb the practice. More telling still, instead of protecting citizens' rights, the government has been caught using the same techniques to further undermine privacy protections.
Facebook, Tinder, and Pokémon Go are all examples of manipulation, addictive behavior modification, and the expansion of data collection into real-world results and profits.
One thing I forgot to mention in the podcast: the concept of "friction," or more accurately the removal of friction and of any obstacle that could thwart easy online participation. It is a key element not only in gamifying and entertaining, but in making spending time and money online so easy that impulse control is diminished, while time online increases until dependence becomes habitual.
We then move into the reversal, from online surveillance to real-world surveillance using techniques pioneered online, along with some context on what the internet was promised to be (a democratic, equal-access platform) and how that promise has also been reversed until the "new norm" is barely questioned.
Walmart is a key example of using panoptic and post-panoptic technology to increase efficiency at the cost of dehumanizing employees, both through part-time low-wage labor and through behavioral indoctrination and intimidation. The company has been called not only "the Beast from Bentonville" but also the "Panopticon of Time."
Google is building campuses that employees never need to leave, a parallel to Margaret Atwood's MaddAddam apocalyptic dystopia series, in which corporate employees live in utopian gated compounds and produce unethical products tested on the masses living in a poverty-stricken, precarious wasteland.
Finally, we close on two examples of China using facial recognition technology, panoptic surveillance, and social ostracism to control its citizens: 1) the social credit system, and 2) the tracking of the Uyghur minority population and forced sterilization.
If you enjoyed the content, please help offset the costs of production.
REFERENCES / RESOURCES
Wiki: Surveillance Capitalism [link]
The Conversation: “Explainer: what is surveillance capitalism and how does it shape our economy?” [link]
Wiki: Shoshana Zuboff [link]
Wiki: Moore’s Law [link]
Forbes: 90% of data produced in the last 2 years [link]
Forbes: “A short history of big data” [link]
Andrew Feenberg: “Questioning Technology” [link]
The Atlantic: “Everything We Know About Facebook’s Secret Mood Manipulation Experiment” [link]
David Shaw: “Facebook’s flawed emotion experiment: Antisocial research on social network users” [link to pdf]
Judith Duportail: “Love me Tinder” “The algorithm of love: A trip to the bowels of Tinder” [link]
Guardian Article: “I asked Tinder for my data. It sent me 800 pages of my deepest, darkest secrets” [link]
Haven and Stoneman: “Walmart: The Panopticon of Time” 2009 [link to pdf]
Cato Institute: "Uyghur Genocide Shows Urgency of Combating Neo-Malthusianism" [link]
NY Times: “One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority” [link]
Previous Sources brought up again:
Gilles Deleuze [link]
Michel Foucault, Discipline and Punish [link] 1975
Stanley Kubrick, "Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb" [link]
Black Mirror (Netflix): Nosedive [link]
Margaret Atwood: MaddAddam Series [link]
Ernest Cline: Ready Player One [link]
I didn’t get to this in the podcast, but it is worth discussing later:
Trevor Paglen: AI image classification [link]
“How the U.S. Military Buys Location Data from Ordinary Apps” Vice, Nov 16, 2020 [link]
"Federal Agencies Use Cellphone Location Data for Immigration Enforcement: Commercial database that maps movements of millions of cellphones is deployed by immigration and border authorities" The Wall Street Journal, Feb 2020 [link]
“IRS Used Cellphone Location Data to Try to Find Suspects: The unsuccessful effort shows how anonymized information sold by marketers is increasingly being used by law enforcement to identify suspects” The Wall Street Journal, June 2020 [link]