This week’s Metal Machine Music is about:
what happens when you platformize the police
a roundup of readings on subjects ranging from the smart city to VC
a bit of history involving a group of French anarchists who went around destroying computers
Subscribe to get this newsletter in your inbox, or read on.
“Platform” is one of those words that, like “innovation,” has become so overextended as to be almost meaningless. If you wrote down all of the things that companies are calling platforms these days, you would end up with a very long list of very different things.
In fairness, this ambiguity has been there since the beginning. And it’s been a productive ambiguity, as Tarleton Gillespie explores in his 2010 classic, “The Politics of ‘Platforms’.” It has been useful for tech firms to define “platform” rather broadly, since the qualities associated with the metaphor—openness, neutrality—are ones that firms can use to absolve themselves of responsibility for what happens on their services. The business model of these firms rests on the fiction that they are not publishers (thanks Section 230!) and thus not liable, legally or otherwise, for the content that their services circulate. The platform metaphor is a valuable tool for sustaining this fiction—particularly now, as it comes under strain on Capitol Hill.
This is why, following Gillespie, it makes sense to see the platform as a discursive phenomenon as much as a technical one. It also probably makes sense to think of this phenomenon as a process rather than a thing: as an ongoing practice of platforming.
One place where the practice of platforming has been producing some alarming effects is the world of policing. Police platforms aren’t all that widely known: there’s been a lot of mainstream conversation in recent years about how law enforcement agencies are using ML-based technologies like “predictive policing” algorithms, but I’ve seen relatively little discussion of the platform angle.
That’s why I was excited to read Stacy E. Wood’s new article, “Policing through Platform.” Wood looks at a cloud-based platform service sold by Axon, the company that makes police tasers and bodycams. The platform enables law enforcement agencies to connect various Axon devices and services—bodycams, tasers, in-car cameras, a digital evidence management system, smartphone apps, etc.—into a single integrated portal. It also makes it feasible for even small police departments to engage in some version of “big data policing” without the cost and headache of managing their own infrastructure—a bit like how AWS made it possible for small companies to get the benefits of a big data center.
Here’s what struck me most as I read the piece:
Patriot Act as ImageNet: The era of big data policing wouldn’t have been possible without a series of post-9/11 federal policy changes that both vastly extended the scale of government data collection and integrated formerly siloed databases from various agencies. To do ML well, you need lots of data. And just as one of the preconditions of the current ML boom was the rise of the web, which offered a new source of abundant training data—ImageNet needed Flickr, for instance—the policy response to 9/11 was a precondition for the emergence of big data policing.
Insulate, insulate, insulate: We’re living at a moment when popular struggles over policing are proliferating. Social movements are pushing back against police violence, organizers and scholars are putting terms like mass incarceration and prison abolition into mainstream circulation, and progressive DAs are winning races around the country (hello Chesa). Interestingly, Axon seems to be marketing its platform with exactly these developments in mind. According to Wood, Axon claims that the use of their platform will lead to “a reduction in the number of false complaints (against the police); decreased use of force… enhanced public trust… [and] decreased litigation.” In other words, platformization can help insulate police departments from criticism, protest, and legal action at a time of growing public anger.
Opacity-as-a-service: Platformization is often presented as a process of opening: you open an API to let developers build apps around your service. Actually existing platforms, of course, are riddled with black boxes: you might be able to talk to Facebook’s APIs, but its internals are totally opaque. In the case of Axon’s platform, however, even the pretense of openness has been dispensed with. The value proposition, it seems, lies precisely in the platform’s opacity. Opacity is a mechanism for hiding what law enforcement does, to preclude the possibility of public oversight. As Wood writes:
“In fact, through Axon’s platforms, even more aspects of police labor are hidden. In the world of platform policing, opacity is a feature not an accident. A lack of understanding about what exactly goes into the functioning of the platform allows for the performance of process, precluding intervention, questioning or dispute. The record claims further authority through this process of automation, even as the sources of data are no less problematic or even more accurate.”
Content creators: Viral videos of police shootings shared on social media have become a major phenomenon in recent years. They have indisputably played a role in propelling the current cycle of popular struggles around policing. Two of the apps within the Axon platform that Wood examines are designed in part to help police counter this dynamic. They enable law enforcement agencies to produce social media narratives of their own, by sourcing and selecting video that seems to substantiate their version of events. Axon Citizen lets members of the public submit smartphone video directly to police via “public evidence submission portals” that can be advertised on social media, while Axon View gives police officers the ability to instantly replay and livestream their bodycam footage. “Mimicking the user interface and informational flow of social media platforms,” Wood writes, “these apps give the impression that police work is another form of content creation.”
Some other stuff
Here’s a handful of other things worth reading:
Urban warfare: Jathan Sadowski has a new piece, “The Captured City,” that’s quite relevant to the above discussion. The “smart city” concept has typically been sold as a way to make cities more convenient, more efficient, more entrepreneurial—think Sidewalk Labs. But Jathan argues that the smart city is in fact primarily about the militarization of urban space. He talks in particular about the Domain Awareness System, a collaboration between the NYPD and Microsoft that uses a vast network of cameras and sensors to create a unified system of ubiquitous surveillance. Here’s Jathan:
“These technologies treated the city like a battlespace, redeploying information systems originally created for military purposes for urban policing. Sensors, cameras, and other networked surveillance systems gather intelligence through quasi-militaristic methods to feed another set of systems capable of deploying resources in response… Contrary to the suggestions of ‘smartness’ shills, these systems are not used by the general public but on it. This urban war machine (as I call it in my forthcoming book Too Smart) is the true essence of ‘smart’ urbanism. It is the next step in the high-tech militarization of society… The idea of the captured city requires an adversarial view of a city’s inhabitants: When the enemy can be anywhere, the battlespace is everywhere; all places and people must be accounted for at all times.”
Minnesota nice: This newsletter has been a bit of a downer so far, so here’s a pick-me-up: “Meet the Immigrants Who Took On Amazon” by Jessica Bruder. It’s a story about a group of Somali immigrants who are organizing for better working conditions at Amazon warehouses in Minnesota, and pulling off the first strike actions the company has seen in North America.
It’s getting crowded: Sai Krishna Kamepalli, Raghuram Rajan, and Luigi Zingales from the University of Chicago have produced an interesting report with a fun title: “Kill Zone.” They look at major acquisitions conducted by Facebook and Google from 2006 to 2018 and find that, in the three years following an acquisition, VC investment in startups in the same space as the acquired company falls by 46 percent and the number of deals falls by 42 percent. Big acquisitions by the tech majors generate “kill zones” that other investors don’t want to enter, in other words, because they figure there’s no hope of competing. If Facebook buys a social photo-sharing app and integrates it into its massive network, why invest in another social photo-sharing app? The report offers an interesting glimpse of how much the Silicon Valley ecosystem has changed over the past decade or so, as the big firms have grown so big that they’re crowding out VC.
Information wants to be free: The Labour Party has just announced a proposal to provide free high-quality broadband to everyone in the UK by 2030. I have a new piece in Tribune about it called “Internet for All.” Taking internet access off the market and making it a social right will improve the lives of a lot of people, particularly those in rural and poor communities. It will also open up political space for a deeper democratization of digital life, as we work our way up the stack from the pipes to the platforms.
In the 1980s, a French anarchist organization called CLODO conducted a series of attacks on computer centers. While “clodo” is French slang for a homeless person, the name was also an acronym—though there seems to be some confusion about what exactly it stood for. A few possibilities: “Committee for the Liquidation and Misappropriation of Computers,” “Computer Liquidation and Hijacking Committee,” and “Committee for Releasing or Setting Fire to Computers.” You get the idea.
In 1980, they broke into the offices of Philips Data Systems in Toulouse and destroyed its computers. In 1985, they firebombed the offices of computer manufacturer Sperry Univac, also in Toulouse. In a letter to Libération, they explained their reasoning:
We are computer workers and therefore well placed to know the present and future dangers of computer systems. Computers are the favorite instrument of the powerful. They are used to classify, control, and repress. We do not want to be shut up in the ghettos of programs and organizational patterns.
In 1984, the great underground magazine Processed World—which is a treasure if you haven’t encountered it before—ran a translation of an interview with a CLODO member that offers a bit more detail on their thinking:
Why do you do computer sabotage?
To challenge everyone, programmers and non-programmers, so that we can reflect a little more on this world we live in and which we create, and on the way computerization transforms this society.
We are essentially attacking what these tools lead to: files, surveillance by means of badges and cards, instrument of profit maximization for the bosses and of accelerated pauperization for those who are rejected…
Aren't you really a bit retro, like the machine breakers of the 19th Century?
Faced with the tools of those in power, dominated people have always used sabotage or subversion. It's neither retrograde nor novel. Looking at the past, we see only slavery and dehumanization, unless we go back to certain so-called primitive societies. And though we may not all share the same "social project," we know that it's stupid to try and turn back the clock.
Computer tools are undoubtedly perverted at their very origin (the abuse of the quantitative and the reduction to the binary are proof of this) but they could be used for other ends than the ones they now serve. When we recognize that the most computerized sector is the army, and that 94% of civilian computer-time is used for management and accounting, we don't feel like the loom-breakers of the 19th century (even though they fought against dehumanization in their jobs). Nor are we defenders of the computer-created unemployed… if microprocessors create unemployment, instead of reducing everyone's working-time, it's because we live in a brutal society, and this is by no means a reason to destroy microprocessors.