“AI” exists to disenfranchise labor. That’s what it’s for. Regardless of how good these stochastic systems are or what flaws they have, just being able to point at the non-unionized robot whenever employees ask for raises – or anything, really – is incredibly valuable for business. The existence of “AI” and its supporting narrative means that defending your value, and therefore your price on the market, becomes a lot harder.
But that’s the impact at the top. That’s why the C-suite wants those tools to exist and to be deployed. I think “AI” also has a strong effect on the social relationships between peers and coworkers.
Work is not really about finding friends. That happens, but it’s not necessary. What there is – ideally – is a level of respect for one another: the acknowledgement of each other’s expertise, skills and contributions, as well as each other’s weaknesses. All of this allows communities based on solidarity to form. If the people touching computers realize that the workers in the warehouse are part of the same struggle, all those groups together can actually develop some power to make their needs heard and acted upon. If, for example, a company manages to separate the different groups and departments and pit them against each other, this weakens everyone but management.
And I think “AI” has a massive effect on how connected we feel with each other, on whether we see each other as peers.
Let me tell you a story. A few weeks ago a colleague was talking about an issue he had with a specific piece of software that we were – as we usually do – using a bit outside of what it’s intended for. He outlined the problem, then went through the things he had tried to solve it, culminating in an explanation of what would probably work. Another colleague, whom I respect a lot, then responded with “did you ask the AI?” – and it felt like a scene from a movie where the protagonist is just sitting there, hears something, and suddenly a big dude comes in and slaps him in the face with a fish. I was irritated, and it really took me a while to understand why.
It was not the absurdity of the statement (what even is “the AI”?), or the weird dynamic of responding to a colleague who had just presented his solution with the suggestion to use an unreliable search engine instead. It was the feeling of drifting away from that person. Because that statement made it so obvious how, for them, every “here’s a problem” is now connected to “let’s ask the spicy autocomplete.”
In the weeks after that, whenever there was an issue with the stuff coming from that team – when something wasn’t up to par or had a weird structure – I noticed how I kept attributing it to them leaning heavily on slop machines. Which isn’t necessarily true: shit happens, software and hardware are complicated, and sometimes things end up weird or hackish or broken regardless of what tech you use.
But I realized that this event (and all the little similar events before it) ate away at a trust relationship that had developed over years. I was having a sort of perceived “workslop” experience.
“Workslop” is a term for bad, “AI”-generated work product that someone produces to fulfill their work duties on a surface level, and that their coworkers then have to clean up. Say I generate a bunch of code that kinda works, and someone else, trying to run it, realizes it’s a hot mess and has to clean it up. In that example, I would have produced workslop (but might have looked very efficient!).
The experience of workslop (whether it’s real or, as in my story above, mostly perceived) directly erodes social connection, erodes trust in one another, and in the end erodes solidarity. Because why would you stand with a person who doesn’t do their job and offloads their work onto you?
Pride in one’s work under capitalism is a bit of a weird thing: you can be proud of what you did, but that great work will more often than not go unrewarded. What it does do is signal something to your coworkers: doing good work means that you respect the people working with you, the people building on what you did. It means that you try to make their lives as easy as possible, because you’re all in the same boat.
“AI” pushes everyone towards slop. “Just do it. It’s easy. You look innovative. It’s fast. The others can just use the slop machine to fix the mess if one occurs.” But the dissolving effect on the social fabric of the workplace should not be underestimated: we are already working in conditions that make building the foundations of solidarity harder. Teams get pitted against each other based on KPIs; everyone is working from home, never having to meet their coworkers. As usual, “AI” is just gasoline on the fire. But maybe we should start putting out some fires?