Dangers of the Sharing Economy (1): Is Your Digital Boss Cheating You?

From a series of five contributions for The Nation.

by Frank Pasquale for The Nation

Imagine if you never had to deal with a human boss again. Work projects would flow to your inbox seamlessly. You could take them or leave them as you chose. Want a day off? Just don’t answer email. Freed from the 9-to-5 grind, you’d have a chance to work whenever you want.

Executives at new labor platforms say that future is now. They paint their drivers, hosts, and “taskers” as so much freer than ordinary workers to move to where their talents are best appreciated (on a day-by-day, or even hour-by-hour, basis). It’s Silicon Valley’s version of “right-to-work” laws.

Of course, someone still has to decide who gets prime assignments—or any projects at all. Those decisions, fueled in part by customer feedback, are mediated by computer code. But what happens when the software isn’t accessible to anyone other than top managers and the engineers who worked on it (who are bound by non-disclosure agreements never to reveal what they coded, or why)?

It’s a situation ripe for exploitation, if other experiences with “black box” algorithms online are any guide. Facebook is now a critical intermediary between journalists and readers, but it’s nearly impossible for the average reader (or even a small media outlet) to keep up with all the changes the company makes to EdgeRank, the algorithm that decides which posts get top billing in your feed and which get dumped into obscurity. Researcher Christian Sandvig has already documented plenty of examples of “corrupt personalization” in that setting: algorithmic choices that seem designed less to serve media or users than to accelerate profits for Facebook shareholders. There are also few, if any, opportunities for those damaged by algorithmic shake-ups or irresponsible data collection to challenge the online behemoths.

The same problems are emerging in “real space,” as algorithmic methods automate the distribution of opportunity in corporate platforms. For example, consider how easily mere petulance or spite—from either an Uber passenger or a manager in the firm—could doom a driver. The “Uber Driver Diaries” site has myriad stories of those perplexed by a sudden drop in their status. When a tenth of a point can be the key to making ends meet, the stakes of digital ratings are high. But many are afraid to even contest their status—complain too much, and you might end up branded a “whiner,” or worse.

These problems are old hat for those who have relied on Google to drive traffic to their sites. Metafilter recently complained that a Google algorithm change decimated the site’s traffic, leading to layoffs. The Georgia owner of 2bigfeet.com said that, when Google penalized his site, it was like the highway department taking out the roads to his store. Another online retailer, GourmetGiftBaskets.com, described filling out a “Google Confessional” in response to Mountain View’s suspicions that he was using search engine optimization inappropriately. Once absolved, his site climbed back up the rankings, and it appears safe now, at least until one of Google’s many conglomeratized branches decides to enter the gift basket space.

Real cars, rooms, and people seem a lot harder to manipulate than characters on a screen. But the interface—between hirer and tasker, or rider and driver—is the critical glue holding platform economies together. I have no idea how Uber matches me to a driver, nor does he know how I was matched to him. And if Uber wants to mess with either of us, there’s not much we can do at this point.

That should change. The Electronic Privacy Information Center has led a campaign for “algorithmic transparency,” aimed at ending secret profiling, whether of consumers or workers. All of us participating in the new platform economy deserve a chance to understand exactly how our profiles are being created. And we deserve the right to inspect, annotate, and correct mistakes, or even unfair marks against us. That’s a cornerstone of technological due process. And it would be far more easily implemented in platform co-ops, subject to governance by stakeholders, than in the current crop of corporate platforms. As long as those platforms are beholden to Wall Street’s speculative demands for rapid scaling, they simply can’t afford to be fair or open.

Skeptics may gripe that workers and consumers didn’t have such rights in meatspace, so why grant them in the new, weightless economy? Several responses should be obvious. First, the concentration of power in massive firms like Uber or AirBnB is something new in fields like car services and room rental. Their leverage demands a considered response. Second, we’re constantly hearing about how Internet disintermediation is reducing costs. So why not invest a few of those gains in making sure basic principles of anti-discrimination, due process, and reputational integrity survive our transition to a more digital economy? Where is it written that every extra penny must be routed to some venture capitalist’s trust in the Cayman Islands?

The platform economy as it now stands isn’t an “American Dream” of fairly compensated labor; it’s a lottery. It’s bizarre to see someone become a billionaire on the basis of a few weeks of coding and lucking into network effects at the right place and time. But the demands of speculative capital shouldn’t be the guiding principle behind the platform economy. Older values of fair play and transparency need to be translated into new environments. And we can only hold accountable the systems we can understand.

Continue reading the full article: http://m.thenation.com/article/208057-5-ways-take-back-tech
