Excerpted from David Judd and Zakiya Khabir:
“Software developers are much better compensated than the average worker in the tech industry. Last year in the U.S., the median worker earned a paltry $35,540 compared to $91,320 for software developers and programmers. And this doesn’t include the comprehensive benefits packages that are the industry norm. Even programmers in the lowest 10th percentile make $50,920 per year.
Programmers typically have more control over their working lives than workers on an assembly line or at a checkout counter. Nobody has yet figured out how to Taylorize software development. The task of writing a working program has yet to be broken down into subtasks that can be performed without specialized knowledge and some grasp of the whole system. Individual output is hard to measure effectively, especially in the short term, which makes it difficult for employers to control the pace of work or use carrots and sticks to make up for a lack of internal motivation on the part of employees.
For these reasons, programmers are often thought of as “professionals,” like doctors or lawyers, and not part of the “working class.” But this is misleading. Most programmers are neither self-employed nor employed in an organization where they have a meaningful say, and most can’t expect any kind of tenure or partnership status, no matter how long they work. If anything, older programmers instead start to experience age discrimination.
Programmers are typically hired at-will, and are not generally accepted to have any professional obligations that could supersede management’s authority, nor the ability to do research or pro bono work on company time. Management styles range from the likes of Dilbert’s pointy-haired boss to a hierarchy so flat that it’s almost invisible–until it’s time for somebody to get fired. Still, for most programmers, management authority is a constant.
Of course, this is increasingly true of people who work in hospitals and law offices, too. The larger trend doesn’t bode well for the relative privileges that programmers have acquired.
Since the Industrial Revolution, capitalism has shown a long-term tendency to deskill labor and separate conception from execution in a way that increases the power of management. As industries mature, they tend to try to make workers interchangeable units. That hasn’t happened to developers yet–but it’s worth taking note when perks that used to grant some workers unusual freedom from oversight–like Google’s “20 percent time” or remote work at Yahoo and Reddit–are stripped away. (This despite the disproportionate impact of a forced relocation on working parents, and thus on the industry’s vaunted diversity initiatives.)
In part because the measurement of programmer productivity is an unsolved problem, software developers are often expected to work long hours. In Marxist terms, management settles for extracting more absolute surplus value because it can’t extract more relative surplus value.
At crunch times, particularly in startups and the gaming industry, 60- to 80-hour workweeks are routine. Perks like catered meals–increasingly common, at least in areas like Silicon Valley and San Francisco with highly competitive labor markets–also serve the purpose of keeping employees in the office longer.
And programmers can rarely expect to leave work behind when leaving the workplace. Even if they are not on-call to fix a crashed server, they typically need to read, program and attend meetups and conferences on their own time in order to keep up with a rapidly changing field.
The imperative to extract every drop of productivity from the workforce is not unique to the tech industry. It appears anywhere that profit comes from the difference between the value of what workers produce and the wages they are paid–that is, throughout capitalism.
It may sound strange to suggest that someone coding, for example, a smartphone app whose single purpose is to let you send your friends the message “Yo” is not only producing value, but a surplus on top of an above-average salary. Yet in a society in which cash value trumps human need, it’s true–at least assuming the Yo app, a real example, ever makes money for its investors.
Despite the most celebrated tech companies’ aversion to explicit hierarchy, their widespread use of stock options as compensation, and other corporate techniques to convince workers that every member of “the team” is on the same side, it’s clear that the people at the top know that wages come at the expense of profits.
Thus, Apple, Google, Intel, Adobe, Intuit and Pixar have not yet disentangled themselves from antitrust and class action lawsuits resulting from an agreement–initiated by Google and Apple, and eventually involving dozens more companies, with more than a million employees in total–to cap wages by refusing to compete for each other’s employees.
This cartel resulted in several billion dollars in lost wages, which went straight to corporate profits. But this transfer, which the law might recognize as theft, is in fact only the tip of the iceberg. It represents a deviation from the ideal of a competitive market, which itself offers no guarantee to workers that they’ll receive value equivalent to what they provide.
Across the whole U.S. economy, hourly productivity grew by 80 percent in the last 40 years, while hourly income for the median worker grew only 10 percent. In the software industry, measured productivity has grown 12 percent a year for the last 25 years–meaning that it doubles about every six years. Wages have increased in tech, but not that fast.
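The doubling claim follows directly from the compound-growth arithmetic. As a quick sanity check (my own illustration, not from the article), 12 percent annual growth implies a doubling time of ln(2)/ln(1.12):

```python
import math

# Doubling time for a quantity growing at a constant annual rate:
# doubling_years = ln(2) / ln(1 + rate)
rate = 0.12
doubling_years = math.log(2) / math.log(1 + rate)
print(f"doubling time: {doubling_years:.1f} years")  # about 6.1 years

# Cumulative effect over the 25-year span the article cites:
cumulative = (1 + rate) ** 25
print(f"growth over 25 years: {cumulative:.0f}x")  # roughly 17x
```

At 12 percent a year, measured productivity multiplies roughly seventeenfold over 25 years, which puts the gap with wage growth in stark relief.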
Software developers often buy into the idea of advancement by individual merit–either through a liberal lens in which meritocracy is an ideal we still need to work toward, or a libertarian one in which everyone is already where they deserve to be, top or bottom.
This is partly a trickle-down illusion, based on an aspiration to have more in common with Larry Page, Sergey Brin and Mark Zuckerberg than with a bank teller or barista. But it also has a basis in the following reality: Most programmers personally know a lot more colleagues who became unhappy at some workplace and quit for a better job than colleagues who achieved something through any kind of collective organizing. In this context, it’s easy for those who believe they are being mistreated to feel isolated and even personally inadequate, rather than seeking solidarity from co-workers.
The cultural barriers aren’t insurmountable. It’s hard to explain this year’s union drive at Gawker as the product of some sclerotic old-economy work structure, for example. And collective organizing could very obviously bring benefits that are hard to negotiate individually: transparency around salary and promotions, with equal pay for equal work; reasonable scheduling and accommodations like child care for people with families; the right to contribute to open source projects or veto the release of insecure, privacy-violating or otherwise unprofessional code.
Most important, however, might be the idea of solidarity. One of the only workers’ organizations among programmers, WashTech, was launched at Microsoft in 1998 by contractors who were fed up with being treated as second-tier employees. Immigrants on H-1B visas frequently face similar problems. Issues of discrimination based on gender and race constantly simmer in tech and occasionally boil over.
Yes, it’s hard to imagine software developers on a picket line, especially in the short term. But it’s also hard to imagine real progress against sexism and racism within tech–or a world where white-collar tech workers relate to the cities they live in as something other than foot soldiers of gentrification and displacement–without a revival of the old radical idea that an injury to one is an injury to all.”