In the future you may be working less for “the man” and more for the machine—machine learning, that is. Automation and AI will replace some jobs outright—threatening up to 25% of U.S. jobs, especially those with simple manual and repetitive tasks—according to a study by the Brookings Institution. But even those of us who don’t lose our jobs to automation may still be under its thumb. In 2040, it may be common for algorithms to supervise our work—sometimes to train our AI replacements, but often to optimize our performance in the corporate machine.


For some workers, that day has already come. The same algorithms that devise your Uber route also direct the drivers themselves, making them more a component of the system than individuals doing a job. Drivers are not only directed by algorithms; they also feed systems that evaluate their performance, with details such as how smoothly they accelerate and how customers rate the experience. It’s a continuous, iterative cycle of observing our work and then directing how we work based on those observations, repeated again and again. “Employers have an insatiable appetite for information about employees, whether it’s relevant or not,” says Lew Maltby, head of the National Workrights Institute. “Employers have never passed up the chance to get more information about employees, and it’s hard to believe that they’re going to pass up the chance here.”

New forms of worker surveillance are subtler, however, and may not look like surveillance at first. “Cameras or those kinds of things are more in your face,” says Alexandra Mateescu of tech and policy research organization Data & Society. She contrasts that with modern point-of-sale systems in stores, which track not only sales but how quickly employees ring them up. “Tracking of workers is often embedded in the actual measurement of the work itself. So it’s not very easy to separate the two,” she says.

Tracking employees is not bad, per se. Companies that make or deploy monitoring software often state noble intentions, such as helping office workers understand how they use their time so they can learn to use it more efficiently. In manual labor jobs such as warehousing and trucking, some companies promote monitoring as a way to ensure that workers stay safe by lifting properly or not driving too many hours without a break. But new technologies can exacerbate an existing culture of nosiness or micromanaging.
The seeds of the future are already being planted. Today’s bleeding-edge algorithmic and data-driven surveillance could become the norm in 2040. And the feeling of being observed, no matter how subtle or well-intentioned, could fundamentally change how workers relate to their jobs.

Working-class algorithms

Algorithmic minders may be most visible today among the working class. Gig-economy platforms in fields such as ride-sharing and food delivery are built on algorithms. Surveillance is also coming into more established manual labor professions, such as warehouse work, trucking, and housekeeping services. “I think that regardless of whether or not platform-based work becomes ubiquitous, there are elements of it that are being transported piecemeal to other areas,” Mateescu says. For instance: “A lot of hotel workers now are often guided by an app and told in which order to clean each room.”


Perhaps nowhere is that algorithmic taskmaster more powerful today than in an Amazon fulfillment center. The handheld devices that warehouse workers use to scan packages also allow Amazon to track worker productivity in exacting detail. Workers are held to “the rate”—a calculation of just how fast employees should be able to work. As workers redouble their efforts to meet the current rate, the pace of their work may increase, causing Amazon’s algorithms to set an even higher rate. The software assumes that whatever most workers achieve, all of them should be able to meet, consistently. Workers can receive automated warnings, retraining assignments, or even dismissal if they fail to keep up. “From the consumer point of view and the corporate point of view, the efficiencies that are brought into this system are really compelling,” says attorney Frederick Lane, author of books on technology and law including The Naked Employee. “It’s just a question of whether or not we have gotten ourselves to the point where the efficiencies are starting to have serious adverse impact for human beings.”

Monitoring and automation are also coming to the trucking industry, motivated by a growing shortage of highly qualified drivers as well as skyrocketing insurance rates, says Santosh Sankar, a partner at Dynamo, a venture capital firm that invests in supply chain and mobility companies. Data collection can help make drivers safer, but it also enables companies to develop the algorithms that could someday replace them. For instance, experienced drivers have valuable on-the-ground knowledge, such as which street a building’s loading dock is on. “Little things like that, as you think about a world that’s autonomous, that’s really, really important [data],” Sankar says. Computers are already riding along.
Since 2017, the federal government has required commercial trucks to be outfitted with an electronic logging device (ELD) to monitor and enforce safety requirements, such as limiting the number of hours truckers can drive per day and ensuring that they take prescribed breaks with adequate time for sleep. ELDs are also a rich source of data for trucking dispatchers, insurance underwriters, and autonomous tech developers, Sankar says. The data includes not only where and when truckers drive, but how they drive—in terms of braking, acceleration, and how long they go between breaks. “All of that’s really, really interesting because the fleet manager has an understanding of just how that driver does their job,” Sankar says. It can also help with those high insurance rates. “Maybe we can underwrite insurance more closely to who’s driving that particular vehicle,” Sankar says. “That’s where the world’s going.” In fact, Progressive Insurance is already offering lower rates for truckers who upload safe-driving data from their ELDs.

Dash cameras are also increasingly on board, monitoring both the road ahead and the truckers themselves to see if they are alert and focused. Some trucking companies are even experimenting with sensors to measure heart rate and perspiration. But Sankar acknowledges that truckers bristle at being watched. “They’re just like anybody else. They don’t want somebody looking over their shoulder”—or in this case, right at their face.


As warehousing and trucking go, so could other industries. “In 20 years, we might see that happening not only in the trucking industry, but in other workplaces. And so the surveillance is going to get more granular,” says Gabrielle Rejouis, an associate at the Center on Privacy & Technology at Georgetown University Law Center. “We can imagine that moving to other places where an employer might want to make sure that their employee is looking at their computer for most of the day.” In fact, some software is already starting to watch desk workers just as it does truck drivers.

Big data at the office

Monitoring, done in the name of optimization, is growing in popularity at the office. Take Crossover Worksmart—software marketed as a tool for keeping tabs on remote workers. It monitors keyboard activity and application usage, and takes periodic screenshots and even webcam photos of workers, to create what the company calls a “digital timecard” every 10 minutes. “It’s intended to be ensuring that people are getting paid for productive time,” says Crossover founder and CEO Andy Tryba. “And if it’s nonproductive time . . . then you don’t actually get paid for that time.” Tryba believes that hard metrics can determine how much work is getting done. “Banging away on the keyboard and mouse,” for instance, is an indicator of being productive, he says. Screenshots, meanwhile, take the place of “management by walking around,” so bosses can see exactly what employees are up to.

Tryba claims that Worksmart isn’t a tool for micromanaging, but rather for work coaching. “If you know what application’s in the foreground, what’s in the background, where you are spending your time, who you’re interacting with—these are all leading indicators of how you’re actually working,” Tryba says.
By analyzing the work patterns of the most productive people on a team, he says, managers can encourage new or less-productive workers to follow those patterns. “We believe that that’s a big part of the future of work,” he says. That may be a noble goal, but Worksmart could also be a dream tool for a nosy, micromanaging boss. Tryba grudgingly concedes the possibility that someone will go overboard, but essentially says that the market will sort it out: If a boss drives employees crazy, they can quit. Furthermore, employees opt in to monitoring, he says, when they agree to take the job. He also frames monitoring by Worksmart as a fair price for the ability to work remotely. By 2040, working from the cloud could be the norm. “I don’t believe that my kids will even be looking in [specific] cities for jobs,” Tryba says.

Workplace software is also watching how we interact with each other. Software from a company called Humanyze, for instance, monitors apps such as email and messaging platforms to track how people communicate. Some Humanyze customers even fit their employees with ID badges that track where they go in the office and, via microphones, when they talk.


Humanyze emphasizes that identities are anonymized, with serial numbers for employees grouped by team. That means the employer may see that someone from marketing messaged someone in sales, or that someone from each team met in conference room A. Humanyze claims it is impossible to know the true identity of the employees, just the department they are in. The company also says that it does not record the content of messages or conversations, only the “metadata” for patterns of how large numbers of people communicate and congregate. Plenty of research over the years shows that even with names removed, the remaining details about anonymized people can be used to reestablish who they are. (A 2019 study shows that nearly every American can be identified based on 15 demographic attributes.) But Humanyze says there is no reason to even care about what an individual is up to. No one is getting penalized for chatting too much—or too little—with coworkers in this model. “We really only care about patterns. Are they communicating up to their manager, down to their direct reports, across to their peers?” says Humanyze CEO Ellen Nussbaum. She gives the example of a recent client whose highest-performing salespeople also happened to have the most interaction with engineers. “The company used that to change the footprint of their office plan [to] put sales and engineering closer together, because that leads to better performance,” she says.

If the market for companies such as Crossover and Humanyze grows, office workers may have to get used to being evaluated not just for the work they do but for the way they accomplish it. Anonymous or not, workers could increasingly be thinking about where they put their eyes, how long they spend in an app, whom they message, and where they eat lunch—knowing that those factors are being recorded.
It’s hard to imagine that not causing some additional stress. These data-driven evaluations of workers also seem to go against the trend of flexible work, in which technology allows people to work more on their own terms. The notion that you have to have your butt in a seat at the office from 9 a.m. to 5 p.m. is being replaced by remote work and work-life blending that put the emphasis on results, not methods.

Will the machines win?

These are the early days of algorithmic and data-driven employment, with technologies developing far faster than workers, regulators, or society can adapt. But resistance is starting to build, which may lead to a more nuanced and restricted use of these technologies. “I don’t think in 20 years there will be no regulation of workplace surveillance,” Rejouis says. “Worker rights advocates are calling for more protection.”


Instacart and DoorDash delivery contractors, for instance, are challenging the opacity of the algorithms that determine their fees for assignments, pointing to what look like inconsistencies in how similar-looking jobs pay. They have also put out their own calculations to show what the algorithms typically pay out. The estimates, if accurate, are sobering: Advocacy group Working Washington recently calculated that, after expenses, DoorDash drivers average just $1.45 per hour. Tips from customers are required to make many of these jobs worthwhile, and workers are also pushing for the apps to default to more-generous tip amounts.

Governments are just starting to address algorithmic pay. In 2018, New York City’s Taxi and Limousine Commission imposed a minimum hourly rate for ride-share drivers ($17.22 per hour, after expenses). This provides some certainty about pay and undercuts the ability of Uber or Lyft algorithms to lowball wages.

Other workers are fighting the creep of data gathering into their personal lives. The 2018 West Virginia teachers strike was in part a fight against surveillance through a new health-monitoring requirement. The schools’ health insurer, Humana, was offering a program called Go365 that uses a Fitbit or similar tracker to record activity such as steps and heart rate. Employees who “volunteered” for this and other health monitoring could earn points to reduce their health premiums. Those who didn’t submit to monitoring, or didn’t earn enough points, faced a $500 penalty fee. The state ultimately dropped the program in the face of worker resistance. The teachers’ plight was not unique, as employers across the U.S. have been promoting such data-for-discounts health programs.
“The employer has an economic incentive to collect information to minimize healthcare costs, and given the rise in healthcare costs, that impulse gets stronger and stronger,” says author Frederick Lane. He projects where this may go: “Are you going to promote the person with the sedentary lifestyle, with a BMI of 40? Or, all things being equal, are you going to promote someone who might be around a little bit longer and who costs you less?”

For worker advocates, the lesson from such monitoring is clear: Some employee information should simply be off-limits. “The two examples are health data and data on what an employee does outside of work,” Rejouis says. The latter includes apps on cellphones (even personal ones) that collect GPS data. “There should be just a clear line between a worker’s personal life and their professional life,” she says.

Consumer privacy laws can serve as inspiration for worker privacy protections, Rejouis points out. The new California Consumer Privacy Act gives residents the right to know and review what information has been collected about them by companies they do business with (say, Google, Facebook, or Walmart). Companies are also required to delete the information, or cease sharing it with other companies, if the consumer orders them to. Worker advocates want employees to get similar rights to see and control the data their bosses collect on them.


Illinois just took a small step in this direction. Its new Artificial Intelligence Video Interview Act addresses the growing hiring practice of using AI to analyze a candidate’s appearance and gestures during an interview. A company called HireVue, for instance, enables employers to set up automated video screening interviews. In addition to providing the recordings for managers to review, HireVue’s AI evaluates details such as facial expressions, word choice, body language, and vocal tone to provide a score for the candidate. Illinois’s law doesn’t ban such practices, but it requires that job candidates be notified of the process and get an explanation of how it works. Would-be employers need written consent from the job seeker before they can proceed, and they have to delete all copies of the video if the applicant requests.

Legislation doesn’t always come in order of practical priority. Politicians in Arkansas and Indiana, for instance, are working on laws to ban employers from forcibly implanting microchips in their employees. Panic arose after dozens of workers at Wisconsin technology company Three Square Market agreed to have microchips implanted between their thumb and forefinger in 2017 as an always-available substitute for an RFID keycard. There are no reports of companies in the U.S. or elsewhere even contemplating compulsory chipping of employees, but the very thought of it has been enough to trigger countermeasures.

Society is just beginning to fathom how technology is supercharging nosy bosses—human and robotic alike. Automated warehouse managers, trucker monitoring, and office communication surveillance are just a smattering of what is happening. For every worker protest or new law, there are countless implementations of monitoring technologies.
And as the role of algorithms in running the workplace grows, it may become harder to resist the temptation to feed them with data collected about workers. Most people I spoke with drew some analogy between monitoring workers and data gathering on consumers. Facebook, Google, digital ad services, and other firms have perfected the process of capturing and sorting the thick data trails we leave online and refining them into profiles that define what we like and how to reach us. “Approaches to employee monitoring are starting to look more like consumer marketing, where you have segmenting and targeting,” Mateescu says. “You have HR professionals saying things like, we want to know our employees as well as we know our customers.”

Like it or not, we’ve grown accustomed to consumer monitoring, and more workers are getting used to employee monitoring too. On the flip side, there has been some progress on protecting consumer data, such as California’s new privacy law and a raft of laws in the works in other states. (There’s even a nonzero chance that federal lawmakers will cobble together some national consumer protections.) But even the strictest laws seem likely only to set some guidelines and transparency requirements for the collection of consumer data. No one expects to return to the level of privacy we had before the internet age. In even a best-case scenario, efforts to curb employee data gathering will likely go a similar route: They may set some boundaries, but they won’t turn back the clock.