Feeling the Algorithm Working: Machine Learning as Cultural Product and Methodology for Cultural Analysis – Tanja Wiehn

This blog post looks at the perception of machine learning through three different stages of feeling and in interaction with theoretical discourses. These stages reflect my encounters with the technological layers of machine learning throughout the first year of my PhD. In forming my argument, I draw on methodological and theoretical engagements with digital cultures, algorithmic studies and critical posthumanism. In a first step, I introduce the uncanny in encounters with technology. Here I look at an art piece by the Berlin-based artist duo PWR, which used the code of a machine learning algorithm to create meaningfulness in a text, or rather a meaningful text. The piece was part of their exhibition at Annual Reportt in Copenhagen in January 2018. In a second step, the blog post dives into the practical aspects of machine learning as a methodology in the digital humanities, reflecting on a workshop I took part in this summer at Leipzig University. By looking at practice settings of machine learning, that is, its use within the digital humanities and its artistic interpretation, I intend to unfold cultural understandings of this technology and to analyze it with the help of a critical posthumanist perspective. To conclude, I want to turn to a feeling of confidence. In the final paragraph of this blog post, I combine acts of deciphering the magical elements of machine learning through algorithmic studies and through the discourse of unboxing and knowing algorithms. I argue, therefore, for an affirmative and interdisciplinary thinking about machine learning technologies and their in- and outputs.

1) THE UNCANNY

“Pandæmonium is a logistical nightmare. Pandæmonium is a pancomputational dream sequence. Pandæmonium shows a future where digital networking has merged with fundamental reality. Everything is an interface to something else. Everything is connected to everything else. Everything is inhabited by autonomous agents acting according to opaque programming.”[1]

The technological heart of PWR’s art exhibition is not the nine-minute-long video but a code: a code that is produced automatically upon entering the website of the exhibition at the space of Annual Reportt. It is an “[…] algorithm which autonomously generates seemingly meaningful text, based on an analysis of linguistic consensuses. This code was used to create a double-sided poster, while also being output on this website.”[2] I will set the actual code-written text on the poster and on the webpage aside for now, even though it is worth taking the analysis of this text into consideration at a later point in time. With the blog post being only a proposal, I use the self-written code merely as an entry point for my discussion. Worth mentioning briefly at this point is the range of discourses provoked by algorithmic cultures in research, their authority and their impact in creating feelings, and, looking at it from the other end of the spectrum, the agency of algorithmic cultures in detecting patterns of sentiment in data. With machine learning being a basis for artificial intelligence (Alpaydin 2016), the debate on the usage of data-driven technologies as and for algorithmic cultures is in full bloom (Gillespie 2014; Ruppert, Isin, Bigo 2017; Roberge, Seyfert 2016).


Figure 1) Code generated upon entering PWR’s exhibition webpage
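
PWR’s own code is not published beyond what appears on the website, so the following is only a minimal, hypothetical sketch of how a text generator might derive “linguistic consensuses” from a corpus: a small Markov-chain model in Python that records which words tend to follow which, and then recombines them. The function names and the toy corpus (taken from the exhibition statement quoted above) are my own illustrative choices, not PWR’s method.

```python
import random
from collections import defaultdict

def build_model(corpus, order=2):
    """Map each sequence of `order` words to the words observed right after it."""
    words = corpus.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model[key].append(words[i + order])
    return model

def generate(model, length=30):
    """Walk the model, repeatedly choosing a continuation that was seen in the corpus."""
    state = random.choice(list(model.keys()))
    output = list(state)
    for _ in range(length):
        successors = model.get(tuple(output[-len(state):]))
        if not successors:  # dead end: this word sequence never continued in the corpus
            break
        output.append(random.choice(successors))
    return " ".join(output)

# Toy corpus standing in for whatever texts the artists analysed.
corpus = (
    "Everything is an interface to something else. "
    "Everything is connected to everything else. "
    "Everything is inhabited by autonomous agents acting according to opaque programming."
)
print(generate(build_model(corpus)))
```

The sketch is deliberately crude; its point is that the “seemingly meaningful” output is a statistical recombination of patterns already present in the training texts, which is precisely where the uncanny effect discussed below takes hold.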

However, with this art piece the duo PWR puts emphasis on one particular notion that is undergoing a significant shift in the posthuman condition. I am referring here to Braidotti’s conceptualization of the era of the posthuman predicament, which is deeply connected to the notion of “Man” no longer being the measure of all things, to the Anthropocene, and to the multilayered complications surrounding this point in time (Braidotti 2013, 2018). Through PWR’s artwork, the fear and hype of machines being able to learn and act independently becomes an encapsulated moment. danah boyd and M.C. Elish problematize not only the faith and reliance in data-driven technologies, but moreover the magical and uncanny moments of artificial intelligence (Elish and boyd 2017). They refer to the “uncanny valley” (Mori 2012), a term borrowed from Mori that marks a moment in which the technological other is either too far off from the wanted end product, e.g. an absurd recommendation or advertisement, or, in the other case, comes too close, uncannily close (Elish and boyd 2017). The old humanistic ideal of human exceptionalism clings very much to the uncanny feeling towards machines being enabled to write meaningfulness into a text and having the ability to express and produce feelings. If machines are allowed to have and make feelings, what is left for the human? A rethinking of this form of human exceptionalism has to take place on diverse levels in and out of cognitive capitalism (Braidotti 2018). PWR’s piece is only one example from art that challenges the notion of the neutrality of technology while also grasping the ambivalent feelings towards perceptions of machine feelings. The work Pandæmonium thereby touches upon ideas of automatism, of machines having learned to perform writing as if human.

2) THE DOUBT

Doubt is a feeling that registers a shift from certainty towards uncertainty on a particular matter. Doubt forms in this case around preconceptual ideas of machine learning and their concrete application. In the summer of 2018, I took part in two workshops of the European Summer University in Digital Humanities. There I introduced myself to the programming language Python and, in a workshop on Reflected Text Analysis,[3] trained machine learning algorithms on a set of literary and non-literary texts. The emphasis here lay on the idea of a reflected text analysis, a “lifting of the veil”[4] of machine learning in the digital humanities. The training of algorithms in Python shaped my preliminary thoughts and understandings of machine learning, especially within this branch of the humanities, the digital humanities. The workshop at ESU fed my need to understand the function and usage of machine learning algorithms for sentiment analysis, but it also opened a way into the contemporary discourse on algorithms. Where lies the critical intersection of reflections on the machine, machine learning and the posthuman in humanities research projects? How can tools and methodologies create new knowledge on a transdisciplinary level? What is needed to reflect upon machine-driven methodologies in new research fields like the digital humanities? Using my doubt in exercising the machine learning algorithms is thereby an occasion for engaging with the contemporary discourse on transdisciplinary knowledges being established (Braidotti 2018).

In making use of two different concrete machine learning algorithms, testing and training them, making them work on a text to detect sentiments, the perception of the algorithm as a black box shifts. “The understanding of data and what it represents, then, is not merely a matter of a machine that learns but also of humans who specify the states and outcomes in which they are interested in the first place.” (Bucher 2018, 25). Doubt as a feeling towards a technology like machine learning is useful for overcoming the reproduction of the image of the unknown, black-boxed algorithm, as Taina Bucher reminds us in her recent book. I therefore propose rethinking the notion of machine learning through a concept of machine teaching. In this, I follow researchers who have been working on understandings of algorithms through ethnographic methodologies (Seaver 2017). Taina Bucher’s understanding of algorithms as eventful (Bucher 2018) becomes valuable here as well, as both examples manifest the processual nature of algorithms in machine learning from two different ends of the spectrum. Moreover, it enables a shift from the idea of algorithms as tools and technological instruments towards a more fruitful notion. Bucher stresses the embeddedness of algorithms in systems, technological and cultural ones alike (Bucher 2018). Furthermore, my proposal is also supported by research that challenges the repetitive image of algorithms as black boxes that need to be opened, unveiled or made visible, an image deriving from a deeply humanistic standpoint (Bucher 2018, Gillespie 2014).
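
To make this shift of perception more concrete, here is a minimal sketch of the kind of supervised sentiment classification practiced at the workshop, assuming a scikit-learn pipeline in Python. The example sentences, the labels and the choice of library are my own assumptions for illustration and do not reproduce the actual workshop materials.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Humans specify the "states and outcomes": which example texts count as which sentiment.
train_texts = [
    "The evening was full of quiet joy.",
    "She could not shake the feeling of dread.",
    "A warm, hopeful letter arrived at last.",
    "The news left him hollow and bitter.",
]
train_labels = ["positive", "negative", "positive", "negative"]

# Turn the texts into word-frequency vectors and fit a classifier on them.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

# The trained model assigns one of the human-defined labels to unseen text.
print(model.predict(["An unexpected kindness brightened the whole day."]))
```

What such a sketch makes visible is exactly Bucher’s point: the “learning” depends on the categories, labels and example texts that humans have specified beforehand, which is why hands-on training of an algorithm unsettles the image of the inaccessible black box.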

3) THE CONFIDENCE

Concluding this blog post, I want to advance towards a feeling of confidence in relation to the technology of machine learning. It seems an uncommon feeling when considering the posthuman predicament and its highly complex issues. Rosi Braidotti makes her argument for a critical posthumanism based on the idea of a new need for a theoretical framework able to analyze the challenges of cognitive capitalism (Braidotti 2013). The examples I used in this post were meant to underline briefly the complexity of beginning a cultural analysis of machine learning: a technology that is part of a cultural product, as with the artwork by PWR, and that simultaneously becomes part of the methodological framework of a cultural analysis, as I outlined with the example of the digital humanities embedding machine learning in the detection of sentiment in texts. I want to point to this ambivalence of machine learning establishing feelings either by creating meaningfulness or by detecting meaning and feelings through the reading of a text. A standpoint from critical posthumanism strengthens the idea of an affirmative politics towards the potentials of machines and machine feelings, without neglecting the ties and embeddedness of algorithmic cultures in the complexity of systems, one of which is still advanced capitalism. Katherine Hayles’ version of the posthuman reminds us of the fact that “human life is embedded in a material world of great complexity, one on which we depend for our continued survival.” (Hayles 1999). I therefore see it as necessary to bring together theoretical discourses on the methodological impacts of algorithmic cultures and the critical reflections deriving from critical posthumanism, not only to strengthen the reflective aspect of new research outputs, as within the digital humanities. By highlighting the different layers of my encounters with machine learning, I also want to stress a notion of becoming posthuman that does not follow a transhumanist imaginary (Hayles 1999; Braidotti 2013, 2018), but considers the strengths of the technological other, as well as the potentialities of non-normative kinship for the era of the Anthropocene (Haraway 2015).

On a less gloomy note, I will come to the end of this blog post with Tiziana Terranova’s vision of algorithmic systems: “The basic idea is that information technologies, which comprise algorithms as a central, do not simply constitute a tool of capital, but are simultaneously constructing new potentialities for post-neoliberal modes of government and postcapitalist modes of production.” (Terranova 2014, 216). This feeling of confidence has its origin in knowledge about newly establishing terminologies and interdisciplinary research discourses that are in the making. A critical posthuman perspective on algorithmic cultures therefore supports my argument for an engagement with these research environments. The look in this post at artistic products of machine learning, as well as at machine learning as a research methodology, emphasises the role algorithmic cultures will continuously play in the creation of feelings of ambivalence.

 

Works used:

Alpaydin, Ethem. Machine Learning. MIT Press, 2016.

Braidotti, Rosi. “A Theoretical Framework for the Critical Posthumanities.” Theory, Culture & Society, 2018.

Braidotti, Rosi. The Posthuman. Polity Press, 2013.

Bucher, Taina. If…Then: Algorithmic Power and Politics. Oxford University Press, 2018.

Drucker, Johanna. “Why Distant Reading Isn’t.” Publications of the Modern Language Association of America, vol. 132, no. 3, 2017, pp. 628–35.

Haraway, Donna. “Anthropocene, Capitalocene, Plantationocene, Chthulucene: Making Kin.” Environmental Humanities, vol. 6, no. 1, 2015, pp. 159–165.

Hayles, Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. University of Chicago Press, 1999.

Mori, Masahiro. “The Uncanny Valley: The Original Essay by Masahiro Mori.” IEEE Spectrum, 2012, http://spectrum.ieee.org/automaton/robotics/humanoids/the-uncannyvalley.

Seaver, Nick. “Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems.” Big Data & Society, vol. 4, no. 2, 2017.

Ruppert, Evelyn, Engin Isin, and Didier Bigo. “Data Politics.” Big Data & Society, vol. 4, no. 2, 2017.

Seyfert, Robert, and Jonathan Roberge, editors. Algorithmic Cultures: Essays on Meaning, Performance and New Technologies. Routledge, 2016.

Terranova, Tiziana. “Red Stack Attack! Algorithms, Capital and the Automation of the Common.” 2014. Reprinted in Across & Beyond: A Transmediale Reader on Post-Digital Practices, Concepts, and Institutions, edited by Ryan Bishop et al., Sternberg Press / transmediale e.V., 2016, pp. 202–220.

[1] See the Annual Reportt website: http://www.annualreportt.com

[2] See ibid.

[3] Reflected Text Analysis workshop at ESU 2018: http://www.culingtec.uni-leipzig.de/ESU_C_T/node/940

[4] See ibid.

2 Comments

  1. Hi Tanja. Thank you for your somewhat uplifting post.
    I want to prompt discussion by turning to the specifics of your proposed approach.

    How may an investigation of machine learning (/teaching) actually take place — what are the kinds of things needed for such an investigation? When referring to the workshop in Leipzig, it seems that the way you relieved your doubt was by gaining somewhat ‘technical’ insights into specific branches of machine learning. But is this the only (or the best) way to ‘lift the veil’ of machine learning? In other words, what is the ‘veil’, and how can it be ‘lifted’ — in your approach, is the ability to program machine learning systems enough to fully ‘lift the veil’?
    In a way, I am here referring to your inclusion of an artwork in your post: what is the specific role of the artwork in your investigation, other than to spark a feeling of uncanny-ness?


  2. Hi Tanja, thanks for your piece!
    I wanted to inquire about how you understand the concept of “machine sentiment”. You write: “This I briefly outlined with the example of the digital humanities embedding machine learning in the detection of sentiment in texts. I want to hint to this ambivalence of machine learning establishing feelings either in creating meaningfulness or in detecting meaning and feelings through the reading of a text.” If a machine learning programme can be trained to recognise by “detecting meaning and feelings”, does this mean that the programme is trained to recognise certain keywords? How does a machine “detect meaning”?

    I challenge your claim that a machine is able to “read” a text for meaning in the same way that we do – a machine may be able to parse a text for specific keywords, for example, but the layer of human intervention that is required to make meaning out of this data is the reason why we might group this data under the heading of “examples of sentimental discourse in Julius Caesar”, for instance. Or do you have another theory of machine feeling that does not rely on this humanist conceptualisation? In other words, I am interested to hear what a posthumanist theory of machinic sentiment looks like, if you have given some thought to the matter. You intimate that you may have an answer when you write: “I want to hint to this ambivalence of machine learning establishing feelings either in creating meaningfulness or in detecting meaning and feelings through the reading of a text,” but I would like to hear more.

