
Tuesday, July 12, 2022

The Dogs of WWW

@kkomaitis and @j2bryson discuss the anniversary of the New Yorker cartoon On the Internet, nobody knows you're a dog.

Obviously this is no longer true. Konstantinos Komaitis raises the important topic of surveillance capitalism and government snooping. There is more than enough data to know how many dogs you have, what you call them, how often you take them for walks, which other dogs and dog-owners you meet in the park, and how much you spend on dog-food and veterinary bills.

Joanna Bryson also raises the topic of deep fakes. Does this mean that some of those cute dogs we see on the Internet don't even exist? Or are they perhaps shifting our understanding of what counts as existing?

 

The title of this post is a reference to the words Shakespeare gives to Mark Antony:

Cry Havoc!, and let slip the dogs of war.

In its original meaning, crying havoc is a signal for looting and plunder. On the internet, this would include stealing your data and stealing your identity. 

In its article on the dogs of war, Wikipedia reproduces a Punch cartoon from 1876, showing Russia threatening war against Turkey in revenge for its losses in the Crimean War twenty years previously. Isn't history interesting?


Wikipedia: On the Internet nobody knows you're a dog, The Dogs of War, Crimean War (1853-1856), Russo-Turkish War (1877–1878)

Monday, March 08, 2021

Polarizing Filters

In photography, a polarizing filter can manage your reflections and darken your skies: this may be a good metaphor for what happens in media and communications. While filtering and polarization were (and remain) well-known phenomena within traditional media, there are enhanced mechanisms on the Internet, whose effects are still not fully understood.

The term filter bubble was introduced by Eli Pariser around 2010, drawing on earlier work by Nicholas Negroponte and Cass Sunstein, to refer to a phenomenon previously known as echo chamber or information cocoon. If you get all your news from a single partisan source - for example, a single newspaper or TV channel - you could easily get a one-sided view of what is going on in the world. In the USA, the CNN audience doesn't overlap very much with the Fox News audience. Barack Obama is one of many who have expressed concerns about the consequences of this for American democracy.

The concept is easy enough to challenge if you take it too literally.

The images of chambers and bubbles conjures up hermetically sealed spaces where only politically like-minded participants connect and only ideologically orthodox information circulates, but this seems highly improbable. ... We cluster but we do not segregate. Bruns pp 95-96

Or if you imagine that filter bubbles are a technologically determined problem associated exclusively with the Internet. As Ignas Kalpokas notes in his review of Bruns, 

It is easy to slide into a form of fundamentalism, particularly when researching something as pervasive as social media, by simply assuming that the architecture and policies of the dominant platforms determine everything. ... Once we attribute causation to technology, we can comfortably and conveniently avoid responsibility for any societal ills and the ensuing necessity to put some effort towards ameliorating them. Kalpokas

However, the problem identified by Sunstein twenty years ago was not just about filters and fragmentation, but also about group polarization - the fact that there are internal and external forces that push individuals and groups to adopt ever more extreme positions. This is akin to Bateson's notion of schismogenesis. Axel Bruns stresses the importance of this.

The problem, in short, is polarisation, not fragmentation, and such polarisation is not the result of our use of online and social media platforms. Bruns p 105

I agree with his first point, but I want to qualify his second point. While the internet is certainly not the only cause of political polarization, its influence cannot be completely discounted. The techno-sociologist Zeynep Tufekci has identified a specific mechanism on the Internet that appears to have a polarizing effect similar to that predicted by Sunstein - the recommendation algorithms on social media designed to keep users engaged for as long as possible, showing them progressively more outrageous and extreme content if that's what it takes. She argues that this is built into the business model of the tech giants.

This is consistent with something Des Freedman noted in 2012,

The digital sphere is not a parallel economy but one that accentuates the tensions between the creativity and collaboration of a generative system and the hierarchies and polarisation prioritised by a system that rests, above all else, on the pursuit of profit. Curran Fenton Freedman p 92 my emphasis

In the same volume, Natalie Fenton noted Sunstein's argument that group polarisation is likely to become more extreme with time (p 168).

Many of the comments under Dr Tufekci's CBC interview are from what one might call nudge denialists - people who point out how easy it is to switch off the auto-play function on YouTube, and who claim never to be influenced by recommendations from the tech giants. Yeah, right. But that's not the point. Nobody said you can nudge all of the people all of the time. But the tech giants are certainly capable of nudging some of the people some of the time, at massive scale.

As a former computer programmer herself, Tufekci is able to explain the extraordinary power of the recommendation algorithms deployed by the tech giants. The tech giants choose to develop and use these algorithms in the pursuit of profit; and legislators and regulators around the world may or may not choose to permit or control this. So this deployment is not a matter of technological determinism but is subject to social and political choice. I wonder who is best placed to nudge governments to do the right thing?
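
To make that mechanism concrete, here is a toy sketch in Python - emphatically not the actual code of YouTube or any other platform - showing how a recommender that greedily optimises a single engagement metric can drift a user's feed towards ever more intense content. The catalogue, the intensity scores and the predicted_watch_time model below are all invented for the illustration.

    # Toy sketch (not any platform's real algorithm) of engagement-driven drift.
    # All names, scores and the "intensity" field are illustrative assumptions.
    import random

    # Hypothetical catalogue: each item has an intensity score (0 = mild, 1 = extreme).
    CATALOGUE = [{"id": i, "intensity": i / 99} for i in range(100)]

    def predicted_watch_time(item, history):
        """Assumed engagement model: users watch slightly longer when content is
        a little more intense than what they watched last (a deliberate caricature)."""
        last = history[-1]["intensity"] if history else 0.0
        step_up = item["intensity"] - last
        return max(0.0, 1.0 - abs(step_up - 0.05) * 10) + random.uniform(0.0, 0.1)

    def recommend(history, candidates):
        """Greedy engagement maximiser: pick whichever candidate the model
        predicts will keep the user watching longest."""
        return max(candidates, key=lambda item: predicted_watch_time(item, history))

    history = [CATALOGUE[5]]        # the user starts with fairly mild content
    for _ in range(20):
        history.append(recommend(history, CATALOGUE))

    print([round(item["intensity"], 2) for item in history])
    # The intensities tend to creep upward: nobody chose extremism as a goal,
    # it falls out of optimising a single engagement metric.

The point of the sketch is that nobody has to program extremism in as an objective; it emerges from the choice of metric, and the choice of metric is a business decision rather than a technical inevitability.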

 

Update

A few days after I posted the above, further details emerged about Facebook's approach to polarization, including a detailed exposé by @KarenHao (click on her name for Twitter discussion), which in turn appears to have prompted an internal Facebook meeting on Polarization and our Products reported by Ryan Mac and Craig Silverman.


Another Update

John Naughton's latest article has alerted me to a quantitative study of internet usage and polarization, published in 2017, which appears to show that the growth in political polarization has been greatest in those demographic groups with comparatively less internet use.

While this evidence provides a warning not to overstate the Internet Causes Polarization thesis, it doesn't fully refute it either. A plausible explanation of these findings is that those who are less familiar with the workings of the Internet may be more vulnerable to its effects. I look forward to seeing further empirical studies.




Levi Boxell, Matthew Gentzkow and Jesse M. Shapiro, Greater Internet use is not associated with faster growth in political polarization among US demographic groups (PNAS, 114/40, 3 October 2017) HT John Naughton, Is online advertising about to crash, just like the property market did in 2008? (The Guardian, 27 March 2021)

Axel Bruns, Are Filter Bubbles Real? (Polity Press, 2019)

James Curran, Natalie Fenton and Des Freedman, Misunderstanding the Internet (1st edition, Routledge 2012). Note that some of this material didn't make it into the 2016 second edition.

Karen Hao, How Facebook got addicted to spreading misinformation (MIT Technology Review, 11 March 2021)

Ignas Kalpokas, Book Review: Are Filter Bubbles Real? by Axel Bruns (LSE Blogs, 17 January 2020)

Ryan Mac, Facebook: Polarization and Our Products (Twitter, 11 March 2021)

Ryan Mac and Craig Silverman, Facebook is good for America actually, says Facebook executive (Buzzfeed News, 12 March 2021)

Thomas Nagel, Information Cocoons (London Review of Books, 5 July 2001)

Eli Pariser, Did Facebook's big new study kill my filter bubble thesis? (Medium/Backchannel, 7 May 2015)

Cass Sunstein, The Law of Group Polarization (John M. Olin Program in Law and Economics Working Paper No. 91, 1999)

Zeynep Tufekci, YouTube, the Great Radicalizer (New York Times, 10 March 2018). Is YouTube Radicalizing You? (CBC via YouTube, 22 April 2018)

 

Wikipedia: Echo Chamber (Media), Filter Bubble, Schismogenesis

Related posts: Social Networks and Received Opinion (July 2010), The Pursuit of Truth (December 2016), Polarization (November 2018), Technological Determinism (December 2020), Optimizing for Outrage (March 2021), Bias or Balance (March 2021)

Monday, September 26, 2005

Unambiguous Threat

We are asked to believe that the mass media (including television and internet) are inherently progressive, and support democracy everywhere. In September 1993, Rupert Murdoch claimed that satellite TV was "an unambiguous threat to totalitarian regimes everywhere". China immediately banned private satellite dishes. 

Murdoch then embarked on a long process of placating, reassuring and (as many websites describe it) courting the Chinese authorities. But according to the BBC (19th September 2005), he remains disappointed with the results of this process.

Actually, unambiguous threats are a bit unfashionable, even in military circles. In 2000, one of Clinton's military advisors, US Admiral William A. Owens, said that ambiguous threats posed a greater challenge than unambiguous ones. (Revolutionizing Warfare, Blueprint Magazine 2000)

Like us, our allies face an ambiguous world. The need to cut through ambiguity, especially at operational and tactical levels, has replaced the need to offset the prowess of a superior adversary posing an unambiguous threat. Sharing dominant battle-space knowledge - the key to modern deterrence - will reassure our friends and allies.

It now seems that some media giants are happy to share "battle-space knowledge" with the Chinese authorities. For example, Yahoo passed the identity of a journalist to the Chinese. (Murdoch criticized this decision.) And Microsoft is willing to enforce the Chinese vocabulary blacklist (which includes the word "democracy"). So much for Thomas Friedman, who argued in his 1999 book The Lexus and the Olive Tree that two great democratizing forces—global communications and global finance—would sweep away any regime which is not open, transparent and democratic.  

Sources: Bloomberg, Guardian, Andrew Leonard, George Monbiot