Seizing the Means of Subjection: Felix Guattari Among the Machine Learners – Brett Zehner

1 Technological Conceptions of the Subject

Conceptions of the political subject have long been intimately tied to technologies of measure and legibility. Enlightenment Man, the Cartesian subject divided between mind and body, the modern factory workers and their union, the ‘objective’ view from nowhere, Lacan’s ideal analyst – all conceptions of subjectivity are bound to perspectival technologies that mediate human experience. In obvious fashion, conceptions of the Eurocentric subject illustrate the measuring functions of coloniality, whiteness, and the technological organization of capital brought to bear on the individual. However, technology has also been weaponized to trouble static notions of subjectivity. Notably, Donna Haraway’s situated feminist subjects dethrone the god tricks of scientific rationalism (Haraway 1988). Likewise, Sylvia Wynter’s attempt to liberate humanism from ‘Enlightenment Man’ can be seen as a techne in its own right, or, as Wynter called it, a decolonial science (McKittrick 2014).

So in our present moment, how does the recent ubiquity of machine learning affect our conceptualizations of political subjectivity in a time of algorithmic governance? This short essay maps a blind spot in critical media studies concerning desubjectivation within digital culture. I look to Felix Guattari’s concept of machinic enslavement as a crucial term in contemporary political struggles. I then briefly speculate on the tactical possibility of seizing the means of technical desubjection and the “alien rules” of machine learning.

2 Machine Learners and Surveillance Capitalism

For the past two decades, fields of knowledge production that utilize statistics have adopted machine learning as their primary mode of operation (Mackenzie 2015, 434). Due to the advance of computational technology, machines can now be programmed to find patterns in large datasets. Machine learners recursively use those patterns to infer correlations, essentially hailing new performative judgments on the world. But as Kieran Healy quips, machines don’t have to think; they only have to learn. By abstracting concrete social practices into data vectors, machine learners measure, forecast, and modulate human behaviors. Machine learners are now some of the most potent social inscription devices of our age.
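
To make that inscription concrete, here is a minimal sketch in Python with scikit-learn of the operation described above: concrete practices are flattened into data vectors, a pattern is fit, and that pattern is turned back on the world as a forecast. Every feature name, label, and number below is invented for illustration and is not drawn from any system discussed in this essay.

```python
# A toy illustration (not any actual platform's pipeline) of how machine
# learners abstract practices into vectors and return performative judgments.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row reduces a person to a vector of behavioral measurements:
# [pages visited per day, average session minutes, purchases last month]
behavior_vectors = np.array([
    [12, 3.5, 0],
    [45, 22.0, 4],
    [8, 1.2, 0],
    [60, 35.5, 7],
])
# The outcome the system wants to forecast, e.g. "responded to a targeted ad".
responded_to_ad = np.array([0, 1, 0, 1])

# The learner finds a pattern in past behavior...
model = LogisticRegression().fit(behavior_vectors, responded_to_ad)

# ...and recursively applies that pattern as a judgment on new, unseen people.
new_person = np.array([[30, 10.0, 2]])
print(model.predict_proba(new_person))  # forecast probabilities for the new person
```

Nothing in the snippet “thinks”; it only fits and reapplies a correlation, which is the whole of the quip above.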

As the big data revolution ramps up, much attention has been drawn to online platforms that modulate political identities “situated at a distance from traditional liberal politics and removed from civil discourse” (Cheney-Lippold 2011, 165). On the ‘front end,’ we have seen the rise of white supremacists propagating through networks that segregate public opinion, while on the ‘back end’ of computational culture, data practices desubjectify human users for proprietary gain. Capitalism doesn’t care if you’re a fascist, a pacifist, or even a bot, so long as it can extract behavioral information from your actions to be packaged and resold to advertisers. Between the front-end user interface and the back-end logics of computational capital, machine learners are situated at a powerful contradiction within capitalist logics.

Amidst this contradiction, it may seem a moot point to be reconsidering the production of subjectivity within computational media. We, of course, know all about the algorithmic bias built into machines stemming from a long line of quantitative racisms. We also know from scholars such as Jasbir Puar and Hortense Spillers that racialization under neoliberalism is produced through surplus labor and the “right to maim” marked bodies unprotected by the law (Spillers 1987, Puar 2017). Still, political subjectivity seems to be the theory that won’t die. Now, emerging scholarship at the intersection of identity and machine learning has opened new pathways for considering the question of subjection.

As Marion Fourcade and Kieran Healy remark in “Seeing Like a Market,” the state used to be the only apparatus with the technological power to track its subjects; this is no longer the case. The recent ability of machine learners to track online users’ digital footprints, or their “data exhaust,” marks an important moment for what Shoshana Zuboff calls surveillance capitalism. For instance, every action a user performs on a system is considered a signal to be analyzed, packaged, and subsequently fed back into the system. The quantity of user data is much more important than its quality. As long as an action online can be converted into data, it can be utilized in predictive behavioral models. Zuboff explains that no online action is too trivial to be aggregated, repackaged, and sold again (Zuboff 2015, 79). “Facebook likes, Google searches, emails, texts, photos, songs, geo-location, communication patterns” are all considered lucrative data to marketing firms and myriad other companies (79). Let’s be clear, though: this is not merely a social media concern. The targeting of the poorest members of society continues unabated, only now it operates through various forms of data surveillance and predatory credit scoring (Fourcade and Healy 2016, 31). In this extractive logic, we see an impersonal form of subjection at the heart of surveillance capital. Zuboff argues that technique supplants authority, and that “discipline and control produce a certain knowledge of human behavior independent of consent” (Zuboff 2015, 81). New forms of power emerge, alienating persons “from their own behavior while producing new markets of behavioral prediction and modification” (75).
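
As a way of picturing this feedback loop, the following hedged sketch (Python with scikit-learn; every event type, feature, and label here is hypothetical, and the `log_loss` option assumes a recent scikit-learn release) shows how each logged action can be converted into a signal, folded back into a behavioral model, and returned as a prediction that shapes what the user is shown next.

```python
# A schematic sketch of the extractive loop: action -> signal -> model update
# -> prediction fed back into targeting. All fields below are invented.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss")  # an incrementally updatable learner
classes = np.array([0, 1])              # 0 = ignored targeted content, 1 = clicked it

def featurize(event):
    """Reduce one logged action ("data exhaust") to a feature vector."""
    return np.array([[event["dwell_seconds"], event["scroll_depth"], event["hour_of_day"]]])

# The stream never ends; the quantity of signals matters more than their quality.
event_stream = [
    ({"dwell_seconds": 4, "scroll_depth": 0.1, "hour_of_day": 23}, 0),
    ({"dwell_seconds": 90, "scroll_depth": 0.9, "hour_of_day": 13}, 1),
    ({"dwell_seconds": 12, "scroll_depth": 0.3, "hour_of_day": 9}, 0),
]

for event, outcome in event_stream:
    x = featurize(event)
    model.partial_fit(x, [outcome], classes=classes)  # every action is re-ingested
    score = model.predict_proba(x)[0, 1]              # and fed back as a prediction
    print(f"targeting score after this action: {score:.2f}")
```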

Despite the innovation in considering machine learning’s effect on social identity, there still seems to be a lack of rigorous attention to the desubjectivation at the heart of algorithmic governance. For instance, John Cheney-Lippold asks, “What does the banality of competing for a job interview using machine learning to predict future friendships say about subject formation?” (Cheney-Lippold, 8). However, this line of questioning still focuses only on the subject at the level of language and self-awareness. It only thinks the subject as a ‘user.’ Even though data analysis seems to aggregate our most intimate habits, surveillance remains automated and deeply impersonal as it bypasses individuated modes of subjectivity.

Critical media studies, if it focuses merely on the identity side of subjective technologies, is limited in offering new insights into the forces at play in our present moment. I argue that the acceleration of predictive techniques, machinic enslavement, and the recent fascist return through supposedly democratic networks of connectivity all force us to consider more robust theories of desubjection.

3 Machinic Enslavement and Technologies of Desubjection

What are machines of subjection and desubjection?

Felix Guattari argues that the main mode of capitalist production is not the production of commodities but the production of subjectivity. In the broad sense used here, subjectivity is an individual’s relation to themselves. Guattari tries to move the point of analysis from a political economy to a subjective economy, which he sees as one and the same. He even once quipped that capitalism launches new subjectivities like new models of cars.

Guattari’s conceptual arsenal is repurposed from cybernetics as a way to theorize an emancipatory “cybernetic Marxism.” As Matteo Pasquinelli notes, for Guattari the machine is constitutive of subjection, not its “other” (Pasquinelli 2015, 176). This “machinic animism” allows Guattari to critique capitalist technologies at the same level as ‘being.’ Guattari’s concept of the machine, then, is not merely an assemblage of any parts whatsoever or a flattened ontology. Guattari clarifies that under the regime of cybernetics, surplus value becomes inherently machinic (Deleuze and Guattari 1987, 506). Language, labor, automated decision-making, and the abstraction of concrete practices into data management all comprise the new apparatuses of capture. Machine learners, for instance, can extract surplus value without a subject doing any ‘work’ in the traditional sense (502). So instead of the self-aware, rational liberal subjects of old, in machine learning we find the desubjectified dividual of Deleuze’s societies of control (Deleuze 1992). Dividuals are the swirls of information, units of attention, nervous actions, aggregated twitches, and pre-cognitive impulses circling the black hole of identity. And in our present moment, dividuals are the divided bits of information that are reconstituted to infer a subject (Cheney-Lippold 2011, 169).
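
To give the dividual a concrete, if deliberately crude, shape: the sketch below (plain Python with numpy; every tracker, feature, and category name is hypothetical) shows how fragments of behavior held by separate systems can be recomposed into a profile and matched against pre-existing market categories, so that a ‘subject’ is inferred without anyone ever being asked anything.

```python
# A crude sketch of dividuality: scattered behavioral fragments are recomposed
# and matched to a market category, inferring a "subject" after the fact.
import numpy as np

# Fragments of one person, captured separately by different systems.
fragments = {
    "ad_network": np.array([0.8, 0.1]),       # e.g. click-through tendencies
    "location_broker": np.array([0.2, 0.9]),  # e.g. mobility patterns
    "credit_scorer": np.array([0.5, 0.4]),    # e.g. repayment behaviors
}

# Recomposition: the fragments are concatenated into a single profile vector.
profile = np.concatenate(list(fragments.values()))

# Market categories the system already holds (centroids built from prior dividuals).
categories = {
    "impulsive_spender": np.array([0.9, 0.2, 0.3, 0.8, 0.6, 0.3]),
    "cautious_saver": np.array([0.2, 0.1, 0.1, 0.3, 0.4, 0.7]),
}

# The "subject" is whichever category the reassembled fragments fall nearest to.
inferred = min(categories, key=lambda name: np.linalg.norm(profile - categories[name]))
print(f"inferred market identity: {inferred}")
```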

Most importantly for our purposes, Guattari identifies two essential systems of power working in a concurrent yet contradictory manner. On the one hand, we face systems of social subjection. Social subjection categorizes us with assigned identities – it gives us a gender, a race, a profession – a position of symbolic representation. However, the production of an individuated subject is coupled with a completely different process that proceeds through desubjectivation. For instance, focus groups long ago stopped using questionnaires in favor of measuring biometric responses. Guattari defines this process as machinic enslavement, which dismantles the individuated subject, consciousness, and representations, acting on both pre-personal and supra-individual levels. In machinic enslavement, the individual is no longer instituted as an “economic subject” or a “citizen.” She is instead considered “a gear, a cog, a component part in financial and various other institutional assemblages” (Lazzarato 2014, 25). (For a brilliant explication of Guattarian theories, see Lazzarato’s Signs and Machines.)

Capitalism is so successful because it operates heterogeneously at the intersection of social subjection and machinic enslavement. We are all caught in a double bind between individuation and our dividual parts, unknown to ourselves.

In Guattari’s own words:

“Capitalism seizes individuals from the inside. The ideal of Capital is no longer to bother with individuals endowed with passions, capable of ambiguity, hesitation, and refusal as well as enthusiasm, but exclusively human robots. It aims to erase, neutralize, if not suppress, any categorization founded on something other than its own axiomatic of power and its technological imperatives. When, at the close of the chain, capital “rediscovers” men, women, children, the old, rich and poor, manual laborers, intellectuals, etc., it pretends to recreate them by itself, to redefine them according to its own criteria. But, precisely because it intervenes on the most functional levels—sensorial, affective and practical—the capitalist machinic enslavement is liable to reverse its effects, and to lead to a new type of machinic surplus-value accurately perceived by Marx” (Guattari 1996, 262).

The critique Guattari advances in his theories of desubjection (a critique I extend to machine learners) is aimed at those critical theories that deal only with language and/or recognition while ignoring the desubjectivation process and its non-representational semiotics.

4 Seizing the Means of (de)Subjection?

Now desubjection poses many interesting political problems.

Maurizio Lazzarato writes that “One must follow Guattari’s advice to “exit language” by doing two things: dissociate subjectivity from the subject, from the individual, and even from the human, and cease considering the power of expression as exclusive to man and subjectivity” (Lazzarato 2014, 182).

Following Lazzarato, the question I pose is this: in the dual fight against fascism and late technocapitalist austerity, would it be desirable to seize the means of desubjection? Or rather, must we destroy the means of subjection? Dividuals are the congealed dead labor of the digital era, a forever undead potential for resubjection. Yet how can dividuality be organized away from the reterritorializing politics of representation and repressive identities? Is there a power inherent to the dividual subject?

The politics of machine learning are not yet entirely clear. What is clear is that machine learning needs endless supplies of data. Any data will do. And increasingly, that data can be unstructured.

The processes by which machine learners operate are actually becoming less understandable to the designers engineering their functions (Fourcade and Healy 2016, 11). With no hypothesis and no pre-existing model, machine learners experiment in ways that are pseudoscientific and virtually unrecognizable to their engineers. What emerges is what Luciana Parisi calls the “alien rule” of algorithmic ubiquity: “Far from making the rational system of governance more efficient, this new level of determination forces governance to rely on indeterminate probabilities, and thus to become confronted with data that produce alien rules. These rules are at once discrete and infinite, united and fractalized” (Parisi 2013, 11).
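
To suggest what “no hypothesis, no pre-existing model” can look like in practice, here is a small, hedged sketch (Python with scikit-learn; the text snippets are invented): unstructured language is flattened into a vector space and partitioned by a rule the system discovers rather than one anyone states, and the resulting groupings need not correspond to any category the engineer had in mind.

```python
# An unsupervised sketch: no labels, no stated hypothesis, only a discovered rule.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

scraped_posts = [
    "loved the new phone, battery lasts forever",
    "protest downtown tomorrow, bring water",
    "battery died again, returning this phone",
    "water cannon used on the crowd downtown",
    "cheap flights and hotel deals this weekend",
]

# Unstructured text is flattened into a term-frequency vector space...
vectors = TfidfVectorizer().fit_transform(scraped_posts)

# ...and partitioned by a rule the system finds, not one its engineers wrote.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for post, label in zip(scraped_posts, labels):
    print(label, post)  # the clusters are legible to the model, not necessarily to us
```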

What will the potentials of alien rules and desubjectivation lead toward? How can we hijack data vectors away from the endless reproductive markets of post-humanism and veer off toward an inhuman futurity? Desubjection must be theorized from a more situated context. For instance, Guattari’s project explicitly looked to unseat the tyrannical subjectivity embodied at all scales of “western man – male and white.” Too often, Guattari’s technological theories of the subject have been utilized to continue the project of mutational capital and to seek the production of new subjectivities – any subjectivity whatsoever. One must wonder if Guattari’s exploration of desubjection can find a more radical usage today. For one, afro-pessimism refuses to validate, affirm, or strengthen forms of subjectivity presently produced under capitalism (Aarons, 18). Similarly, the negative identity politics that Madhavi Menon proposes in Indifference to Difference is quite compelling “not because it has to do with identity” but because it has to do with the “mundane radicalism of the desire to desubjectivize all categories” (Menon 2015).

Of course, the false empiricism of machine learners is problematic, as it uses arbitrary rules and predictions to justify social segregation (Mackenzie 2015, 441). However, for those political struggles not interested in recognition but invested instead in functional power and the right to opacity, machine learning may offer an opportunity to turn its alienating weapons against the subjects of privilege.

WORKS CITED

Aarons, K. “No Selves To Abolish: Afropessimism, Anti-Politics and the End of the World.” Online journal article. http://www.metamute.org/editorial/articles/wandering-abstraction#sdfootnote3sym (accessed 19/05/2017).

Cheney-Lippold, John. 2011. “A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society 28, no. 6: 164-181.

Deleuze, Gilles. 1992. “Postscript on the Societies of Control.” October 59: 3-7.

Deleuze, Gilles, and Félix Guattari. 1987. A Thousand Plateaus: Capitalism and Schizophrenia. University of Minnesota Press.

Fourcade, Marion, and Kieran Healy. 2016. “Seeing Like a Market.” Socio-Economic Review 15, no. 1: 9-29.

Guattari, Félix. 1996. Soft Subversions. Semiotext(e).

Haraway, Donna. 1988. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14, no. 3: 575-599.

Lazzarato, Maurizio. 2014. Signs and Machines: Capitalism and the Production of Subjectivity. Semiotext(e).

Mackenzie, Adrian. 2013. “Programming Subjects in the Regime of Anticipation: Software Studies and Subjectivity.” Subjectivity 6, no. 4: 391-405.

Mackenzie, Adrian. 2015. “The Production of Prediction: What Does Machine Learning Want?” European Journal of Cultural Studies 18, no. 4-5: 429-445.

McKittrick, Katherine, ed. 2014. Sylvia Wynter: On Being Human as Praxis. Duke University Press.

Menon, Madhavi. 2015. Indifference to Difference: On Queer Universalism. University of Minnesota Press.

Mumford, Lewis. 1967. The Myth of the Machine: Technics and Human Development. Harcourt, Brace & World.

Parisi, Luciana. 2013. Contagious Architecture: Computation, Aesthetics, and Space. MIT Press.

Pasquinelli, Matteo, ed. 2015. Alleys of Your Mind: Augmented Intelligence and Its Traumas. Meson Press.

Puar, Jasbir K. 2017. The Right to Maim: Debility, Capacity, Disability. Duke University Press.

Spillers, Hortense J. 1987. “Mama’s Baby, Papa’s Maybe: An American Grammar Book.” Diacritics 17, no. 2: 65-81.

Zuboff, Shoshana. 2015. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology 30, no. 1: 75-89.

8 Comments

  1. Hi Brett, thanks for a really thought-provoking contribution! My question is a bit speculative in that it’s slightly beyond the scope of your writing, but I wonder if you’ve given any thought to the feminisation of digital labour and its role in paving the way for the automation of digital labour? You write that “Language, labor, automated decision-making, and the abstraction of concrete practices into data management all comprise the new apparatuses of capture. Machine learners, for instance, can extract surplus value without a subject doing any ‘work’ in the traditional sense (502).” Given the way women’s work is still routinely recognised as less valuable than that of men, I wonder if and how this extraction of surplus value operates unevenly across hierarchies of race, gender, class, and nationality – and, relatedly, if and how it exerts efforts to appear even, despite these discrepancies in subject position/relative labour-power.

    Specifically, I have in mind those workers (often women, queer, working class, or otherwise marginalised) who are tasked with identifying and flagging offensive material on social media platforms like Facebook and Instagram (see https://www.wired.com/2014/10/content-moderation/). I’m thinking about how the processes of automation – for example, offensive materials being removed or reported on platforms – might seem to be carried out by machine learning software trained to recognise certain features of an image, but are in fact inspected and meticulously removed by marginalised, exploited digital labourers whose minimum-wage paychecks are no recompense for the traumas they endure flagging highly offensive, psychologically harrowing material from the web. Platforms do not clearly outline what happens after images are reported – they omit this information, and so elide the personal, traumatic, and human work that resolving these issues often involves. I’d really like to hear your thoughts on this, if you have any.


    1. thanks for your thoughtful read and the suggestions. you bring up an interesting point here. the play between social subjection (identity-based positions in the division of labor, which tend to be marginalized) and de-subjectivization is something that has been troubling me for a long time now. i think you’re right to speak of automation as a myth, a smokescreen used to hide the realities of digital labor. what is fascinating to me is the play between re-humanizing machine learning (focusing on labor and the human) and following the alienating features of AI (unknown to the human). somehow being able to hold the contradictions together would be quite the task, but i would like to see what could happen if we can account for the social injustices of the ubiquity of machine learners and their application in old systems of power, while also experimenting with the ways that machine learners could potentially show us something different or alienating about ourselves, something that can’t be understood in terms of capitalist metrics or the temporalities of digital labor.


  2. dear brett, thx for sharing!

    – the inversion you propose makes me think about the interaction between processes of subjectivation and desubjectivation. you refer to it as a double bind towards the end of your paper. this reference seems to apply well to the functioning of POV technologies of vision within the frame of what I’m trying to call the POV-opticon (replacing the Panopticon as the regime of visibility of modernity): the regime of visibility outlined by the explosion of POV (point of view) technologies of vision – mobile phones, VR and AR technologies – that enables forms of data-veillance based on the simultaneous unfolding of the processes of subjectivation and desubjectivation (here’s the paper in case you’re curious: https://www.academia.edu/37648284/From_Panopticon_to_POV-opticon_drive_to_visibility_and_games_of_truth). the question for me here would be: how does the double bind act in relation to the POV-opticon? the guattarian passage from political economy to subjective economy you mention at the beginning makes me think of a form of POV politics acting via the POV-opticon and its corresponding games of truth (bubbles and fake news) that turn into a regime of truth (forcing here a distinction only sketched by Foucault) once the datafication of the subject (its algorithmic de-subjectification) is inscribed back into the subject himself in the form of a POV-data double that thus re-orients the subject towards the desired affects and behaviors (subjectivation). to practice the inversion you propose means then to reinvent the forms of de-subjectivation as non-representational semiotics à la Guattari, capable of resisting POV-capture. following this line, the question would then be to understand how to inject de-subjectivation as “exit from language” into the algorithmic de-subjectivation at work in 21st-century media algorithmic governance, happening simultaneously with its positive re-inscription or subjectivation.

    looking forward to talk more!


    1. thanks for the thoughtful read! POV-opticon is super interesting. i wonder how sound practices and embodied media fit into your concept. i think i remember reading somewhere, maybe it was a bifo book, that the brain is always in the dark. part of my project is trying to come to terms with opacity and the tactics of invisibility… through poetics, speech, the things that exit language or exit visuality. not in an excessive manner but in a way of recoding and retooling the world. or destroying the world (deleuze). not sure. maybe let’s talk about how the subjective and the visual need to be separated? looking forward to getting into it!


  3. Thanks for sharing Brett.

    I agree we are quite similar in our interest in subjectivity. I find your reference to Guattari quite enlightening. I’m not an expert on him so my comments and questions might seem a bit naive. It seems to me that the question “is there power inherent to the dividual subject?” relies on two elements. On the one hand, what is meant by ‘power’? There are a few avenues for this: power as knowledge, truth, and discourse is an obvious one, of course. Or the topic of agency could be discussed. So does the dividual have agency? Now it seems to me that what Guattari is claiming is that somehow as a cog, a piece of the machine, or in your case some data, the dividual’s agency is fractured; it’s divided or perhaps shared. If it’s the latter, then the question might be how one shares agency, or what is to be done with shared agency. If it’s the former, then we can ask: is there some clinamen binding the dividual? Is that the social subjection? Or can it be something else? Does it need to be something else? I suppose what I’m exploring is the libidinal economy as that which binds. I’m not going to say dividuals in my case. I’m not familiar enough with Guattari to make that claim and I think the notion of libidinal economy as Frank B. Wilderson talks about it can be quite violent. So the clinamen doesn’t always produce a favourable or morally sound relationship. Anyway, sorry for the waffle, and I look forward to hearing more next week.


    1. I appreciate these comments. The dividual is such an interesting category that is hard to really come to terms with. Shared dividuality is really where the potential is. A kind of data labor relation beyond cooperation in a conscious sense. Not sure. But I do look forward to our discussions!


  4. Hi Brett, thank you for the thought-provoking writing. I am sorry this comment is late – I recently moved abroad again and am battling the usual visa routine in parallel with the inconsistencies of my circadian rhythms.

    Nevertheless, I too am caught in a bind regarding the “possibilities of machine learning” and (de)subjectification.

    Per your article, I am specifically interested in Lazzarato’s (per Guattari) comment regarding “exiting language” as a means of dissociation or even reclamation. I can’t help but focus on the digital here, regarding the concept of language. Meaning, perhaps we have already exited the use of language as a system of relation, or what you might call “social subjectification,” and entered a digital language, which here is simply data. Thereafter, I wonder how ‘digital language’ might make possible instances of “reterritorializing politics of representation and repressive identities”?

    Looking forward to future discussions!


  5. hi brett! like many of the above, i also very closely share your interest in questions regarding the processes of subjectivation as they relate to ML processes that undergird our discursive systems of meaning. i think the guattarian/media ecological framework is very useful here in expanding an analysis -beyond- merely representational practices and towards forms of collective politics that might lie in the realm of the dividual. guattari maybe opens up futurity for us at a place that might otherwise seem an impasse – but i often wonder what it means to seize the process of desubjectivization as a site of politics if the dividual is by nature what is -not- able to appear within a certain regime, logic, or discursive milieu. you ask, “What will the potentials of alien rules and desubjectivation lead toward? How can we hijack data vectors away from the endless reproductive markets of post-humanism and veer off toward an inhuman futurity?” if humanist discourse, inherited from philosophical reason, always serves to recapitulate humanist biases, how can technologies birthed of those same discourses be retooled towards radical ends? this is an interesting question happening in a lot of accelerationist feminist work (like xenofeminism) and has a history in postmodern feminist thought as well, like irigaray and haraway (whom you reference). irigaray also talks about a hypothetical “exit from language” – in her case, it is towards a feminine, non-representational form of communication and meaning-making that we must strive. so i wonder, what does it mean to resist representation in favor of non-representational practices if practices of representation are imbued with long-standing and hidden power dynamics? sorry for the ramble – but i think this nexus of representation, reproduction, and identity politics is very core to a productive way of thinking about ML and i look forward to talking more!

