Why We Hate AI
Why do we hate AI?
It feels emotionally obvious. It's because it's an affront to humanity and our ways of expression; but this is not a way of explaining why, it's just a form of lashing out. I get it, I want to lash out too. But a human/inhuman distinction is a miserable tool to begin throwing around in our ranks.
It feels legally obvious. It's because it's intellectual property theft on an obscene scale; but law is something that is carved and grown as needed by forces of power. Most power in the western world is held in Capital, therefore law is sown and reaped by Capital at its own whims. Rarely do we get to slice back, and the structure of law being under the thumb of Capital means those slices at best return an individual-sized windfall. A windfall is not revolution. Intellectual Property as a legal framework was sown by Capital and it will be reaped only as far back as Capital needs it to be, not so much so that we all benefit.
It feels materially obvious, and this one is far less muddy. It's because we're seeing less work. Vocations that lives were built on are decaying. It's less muddy because we see it directly in the workplace, or in that workplace's absence. What mud remains is because our economy is measured in ways that read output without measuring value.
Commodity Fetishism
Marx writes about how this measure of output hides where value comes from, how the price-at-market of a commodity hides the social relations of the work, the workplace, materials, labour, training – the social networks – that produced it.
A commodity is something sold within a market. Market forces suggest that the value of a commodity is where supply meets demand. A process of production or extraction provides a seller with a commodity, and the price that commodity sells at will converge on that meeting point. The price of a commodity is its output to a market. Price-at-market is about normalising the value of objects in relation to one another: the price of 500kg of grain vs 250kg of flour vs 20000 bagels vs 50 game consoles.
A screwdriver is a commodity, or at least it is when the social relations of production between it and you are obscured. A friend knows you work on personal projects on Thursday evenings; this friend is a machinist who has spare hours in the day and makes for you a screwdriver she knows is just what you need for particular tasks. You receive this gift from her, you know she made it and who she learned from. This screwdriver is not a commodity; it was not placed on a market for a buyer to see. It was produced through labour, but you did not buy it, nor did you coerce it from her in any way.
This friend-made screwdriver is an artefact of a social relationship. So is the screwdriver you bought 5 years ago at a hardware shop, or that came in a game controller repair kit, but the social relationship between you and the people involved in the production of those commodity screwdrivers is obscured. That commodity screwdriver is instead something with a numerical value attached to it, and that numerical value is only what it is when compared to other items at the market.
Enclosure of Networks
What GenAI means in practice is that there are fewer opportunities to connect with people. If a craft or industry collapses, the networks that one draws social, economic, and cultural value from collapse with it. In the intellectual labour sphere we face people simply not connecting with each other, because participants can mechanically produce the artefacts we measure success by without constructing the social connections or growing the critical skills that we value. Those artefacts that once acted as proof of having those skills and connections are now in a crisis of trust.
This is how Capital has operated since inception, and this is where the similarities to proletarianisation emerge from current GenAI work. The economic transformation being sought is one that reaps the value generated by communities of people interacting with one another. We are torn apart from one another.
This is what pains us when we don't see it all collapse. This is what pains us when we wake up and find another thing we depend on has started suckling at Silicon Valley's latest rotten teat. It is violence the way that so much of the world is violence, violence in that the status quo is violent. It hurts, in part, because so many of the people affected by this thought we could get away with it not happening to us. We're too good for this. We had safety. If we didn't have safety we at least had prestige. The work was too artisan, too education-dependent. It couldn't happen here.
Objectivity Fetishism
Life is so fucking difficult right now. Is anyone else noticing this? The pressure on workers to put out as much as possible for as little as possible is the fruit of neoliberal economics. Commodity fetishism is called such because there is a detached way of thinking about commodities that comes with Capital. The commodity is detached from the worker; the networks are refused visibility.
Objectivity is a near-200-year-old method of deferring to the machine. Lorraine Daston and Peter Galison offer a massive tome that traces a genealogy of Objectivity. It's a recent phenomenon, it's highly influential, but it's not best practice.
Objectivity (2007, Daston & Galison) highlights 3 historical modes of doing science. Not 3 in total, but 3 recent ones. The first is truth-to-nature: scientists would observe a subject and try to imagine a perfect form of which that subject is an imperfect instance. The second is objectivity: the idea that mechanical capture of information is the most important part of scientific method, which emerged from a technological change in science, the development of the camera. The third is trained judgement, which emerged after the failures of overreliance on objectivity, as people grew more familiar with objective tools, how to deploy them, and how the user of these tools is always present in their results.
Outside of scientific best practice, through the popular perception, objectivity is deeply enticing. The world is so difficult right now, we have judgement-and-summary automation tools, we're tired, why not use them whenever they appear?
Before the marketing of current-generation GenAI we had people telling lies with statistics in the normal ways, building the story they wanted around data to appear objective to the layman. Pseudoscientific objectivity: the appearance of scientific method through the deployment of objective tools in key areas.
GenAI fits so cleanly into this: we have a machine that produces convincing language and images. We can deploy this in so much of day-to-day life it's maddening. We have machines that produce what we know as artefacts of subjective experience, evidence of our lives, in a mechanically objective manner.
Because of the overvaluation of mechanical objectivity we find ourselves in a "computer says no" situation. A position where it is so painfully difficult to advocate for human writing, art, judgement, general humanities practice, because a mechanical alternative exists. Because that mechanical alternative exists, it's the easiest thing in the world for someone above you to ask "why don't you use the machine?" This carries the implicit statement: you can get more done if you use the machine, and if you don't use the machine we'll replace you with someone who will. It is easy, in positions of exhaustion, to make this argument to yourself too.
This is a detached way of thinking about cultural production. Much like we have a detached way of thinking about commodities, the commodity fetishism, we imagine that the value in information is the final information and not the networks, labour, context, and life that surround that information and mean it is signal not noise. The machine produces something statistically similar to a worker, all social relations that established that distribution hidden.
Commodity fetishism obscures the value drawn from social relations in favour of suggesting the value is in relations between objects. Similarly, we have an objectivity fetishism that obscures social relations of a particular kind of labour in favour of the machine-output objects that rely on that labour to have any current and future use-value.
The Efficiency Mandate
In a workplace that is otherwise maybe treating you well, you might have encountered a demand that you use "Technology XYZ" in your workflow.
This relationship between management and worker is fascinating. Rather than you, as a worker, adopting a technology on your own terms and integrating it into your workflow, there is an external demand to do so. This demand, this mandate, could be mundane, or even appreciated in the long run, if the new workflow is acceptable to the worker. If it isn't, it creates conflict between worker and manager or employer.
Managers and employers have a tool to remove conflict from a workplace that workers tend not to: termination of contract. Getting you fired. As a worker, this is stressful for obvious reasons.
Many workplaces in tech now demand that an AI tool in some form or another be used in workflows, the reasoning being "efficiency." There is an idea that the technology is so flexible and so obviously useful that, no matter the domain, its deployment will improve work efficiency.
What's being demanded here is greater efficiency through the proxy of AI usage. "We have AI now, so you must be more efficient. We will measure two things: your AI usage, and your Productivity."
This increased focus on Efficiency is part of a broader liberal-gone-fascist milieu of "the west" right now. Efficiency and Measure are tied up in this detached way of thinking about process and people and product in a way that erases the relationships that make up the things we make together.
This fealty to "objective perspectives" through measurement of output and measurement of tool use is deeply stressful to many workers who don't eagerly jump to this mode of working that is currently being heavily marketed. It pushes conflict into the workflow from the top-down, it is a professionally polite act of aggression.
Life Support
Someone once shared their favourite comic with me and it quickly became one of my own favourites, forgiving its 2010s-era technologist signifiers. The Subnormality story Message 652 expresses one of the many social relationships in safety-critical engineering. An anonymous apology from an engineer whose work will, through individual and collective fallibility and the law of large numbers, fail some of the people who trust their lives with it. A relationship between a person working on something and those who will trust that work in the future.
Alienation, commodity fetishism, objectivity fetishism, loneliness, psychosis, suicide: these are all failure outcomes of a social world. We as highly-social, highly-specialised information workers are going through a long period of deskilling, capped by this latest introduction of GenAI machinery. This doesn't mean the outcome of these devastating technologies is certain to be a lack of social information labour in the future, but it does mean that we're in a period of uncertainty. There is potential for the perceived use-value of these technologies to collapse and for them to see only niche use, but the current argument:
"We can seemingly reanimate the products of some specialised labour into machine copies of the workers that produced it, so cull your workforce. Throw your workers into a manufactured crisis, and reap the rewards."
is infinitely enticing to Capital, organisations-under-capital, governments-under-capital. It would be enticing to us, if its deployment were strictly on our terms without enforcement or hype.
Capital demands that society be forced into producing value on its terms, under its understanding of the commodity, but we spend so much of our lives grasping for social value. It's from social value that we find and express desires; it's through social circumstances that so much of a life worth living springs.
The value of the work that is currently having its total mechanisation argued for comes from the social relations between the people who produce things through work and the people who engage with that produce. It comes from our ability to know and trust, where we need to, that someone somewhere is saying to us as we say to those who come after us: "I tried so hard because I was thinking of you. I was always thinking of you."