Could Your Next Boss Be a Robot?

Meet the new boss. She never plays favorites and doesn’t partake in office gossip. She gives clear directions. At performance review time she offers valuable observations and backs them up with examples. She is perfect. Well, almost. If she does tend to repeat herself and wouldn’t give you time off to attend your son’s violin recital, you’ll have to forgive her: She’s a robot.

Totally automated management may seem far-fetched, and, indeed, few foresee the day when the authority figure in the corner office is an automaton. But in recent years, a surprising array of managerial functions has been turned over to artificial intelligence. Computers are sorting job seekers’ resumes for relevant experience and estimating how long a potential employee is likely to stay. They are mapping email exchanges, phone calls and even impromptu hallway interactions to track workflow and recommend changes. Widely used software is analyzing customer data to feed scheduling algorithms, which in turn change when and where workers are deployed.

Connect the dots, and the image of an objective, all-knowing and fully automated manager may spring to mind. But Wharton management professor Peter Cappelli says that when it comes to being the boss, robots are no substitute for humans.

“It is possible for software to provide accounting information, i.e., ‘Here is how you are doing,’ but management is still a much more complicated task of making adjustments to the work being performed in order to meet changing demands, diagnosing problems and offering solutions,” notes Cappelli, director of Wharton’s Center for Human Resources. “While it might be true that robots would be better than some managers, robots cannot yet perform these tasks well,” he adds.

Wharton management professor Matthew Bidwell says he, too, “struggles a bit” with the idea of the automated boss. “At the end of the day … it’s less about having an automated boss and more about taking away some of the mundane functions of the boss and leaving them with more of the judgment calls.”

But the pace and dissemination of sophisticated automation innovations have picked up, and anything that can be coded and tied to the bottom line is an opportunity for model building, says Shawndra Hill, a Wharton professor of operations and information management.

“That said, the problems [in which] answers are more subjective and harder to evaluate are historically the problems that people thought computers would not be best at,” she points out. “[But] if there is data you can link to outcomes, you can still build models—and people are doing that. More and more, companies are dealing with these subjective problems in ways we haven’t thought of before. Every year I am surprised by something that [is automated], and think, ‘Why didn’t someone think of this 20 years ago?’ We’ve reached the point where computers will drive cars. And my guess is that they will be safer than humans.”

Automated Scheduling, Ruined Relationships
One highly visible foray into automated management has upset workers and caused one company to take a public-relations beating. Starbucks employs a widely used software program that examines sales patterns and other data to determine the scheduling of its baristas. The practice was thrown into harsh light by a lengthy New York Times feature that documented the consequences of an algorithm-dictated schedule on flesh-and-blood workers.

Such computer programs do not take into account whether unpredictable hours mesh with the needs of workers who must arrange child care and meet other domestic obligations. Starbucks said it gives baristas a week’s notice of their hours, but The Times found few instances of that happening.

The article drew hundreds of responses—from workers in big-box retail, grocery chains, banks, medical facilities and a major art museum—who complained that computerized scheduling systems had made their lives a time-management hell. “This automated software removes the human equation,” wrote a Lowe’s worker from Woodbury, N.J. “This system has massively strained my family life and has had a hand in ruining relationships. And to add insult to injury, the schedule barely receives a passing glance by management.”

Bidwell says, in effect, computers don’t hurt people—people with computers hurt people. “That’s the problem with algorithms and scientific management. The challenge with scientific management was that you had some terribly smart engineer sitting up in an office somewhere dreaming up clever ways to get things done without taking into account the way things were working on the shop floor,” he notes.

“It’s kind of the same here. To take a cynical interpretation, you have employers … pushing flexible scheduling back onto employees, or, in a more charitable interpretation, a bunch of consultants in the head office trying to make systems more efficient, and they are detached from the context in which they are going to operate.”

Either way, there are inefficiencies elsewhere that haven’t been factored in, Bidwell says. “While you may gain some efficiency, you lose some trust in the [employer-employee] relationship, and the result is the employee isn’t willing to go the extra mile, and isn’t going to stick around, either.”

According to Hill, although the technological ability to make automated decisions has grown, consideration of the implications of such practices is not keeping pace. “The Starbucks example is a perfect one,” says Hill. “I imagine the objective was the optimal scheduling system, but they didn’t pay attention to constraints important to their employees, and it became newsworthy. What’s critical is that there is still a human in the loop to make sure the proper constraints are being considered, so the system does not discriminate in ways that run afoul of the law or make employees miserable.”

In fact, the plight of workers who have little or no voice in how they are scheduled is drawing attention from lawmakers. San Francisco and Vermont have enacted laws giving such employees the right to predictable schedules, and President Obama in June directed federal agencies to “carefully” consider their workers’ requests for flexible schedules.

The Persuasive Robot
And yet, automated functions and algorithm-driven decisions are becoming ingrained in the workplace. In nearly all HR categories, from recruitment to performance management, companies participating in the CedarCrestone 2013-2014 HR Systems Survey said they were substantially increasing technology enablement of HR processes. (The survey represented 20 million employees, mostly in the U.S.)

Job loss is a concern: A 2013 Oxford University study examined 702 occupations and estimated that job automation threatens 47 percent of the U.S. workforce. Loan officers, for one, may want to spruce up their resumes. The job category topped the list, with a 98 percent probability of being taken over by computers. Also vulnerable: information clerks and receptionists; paralegals and legal assistants; even fast-food cooks and bartenders.

Taking something shaken, not stirred, from a computer is one thing. But what about taking direction? Recent experiments have shown people to be surprisingly willing to accept the authority of a computer. Researchers at MIT’s Computer Science and Artificial Intelligence Lab recently looked at three team configurations made up of two humans and a computer named “Jim”—one in which the human allocated tasks, one in which the computer allocated tasks and a third in which authority was shared.

The configuration in which the computer alone was calling the shots was not only the most efficient, but also the one preferred by workers, who said that robots “better understood them” and “improved the efficiency of the team.”

Another recent study, by the Human-Computer Interaction Lab at the University of Manitoba in Winnipeg, also found humans willing to take orders from computers, but much less readily than from other humans. Participants were asked to perform a menial task (renaming computer files) for 80 minutes, and a robot named “Nao” was able to exert enough authority to keep 46 percent of participants on task for the full 80 minutes, even as they voiced a desire to quit.

Humans were almost twice as likely — 86 percent — to obey another human, in this case an actor in a white lab coat. Still, researchers were struck that “even after trying to avoid the task or engaging in arguments with the robot, participants still (often reluctantly) obeyed its commands. These findings highlight that robots can indeed pressure people to do things they would rather not do….”

Does this mean workers are ready to accept a computer boss? “I think we’ll need more of those studies to understand what employees’ tolerance is for robots; however, it’s going to be a moving target,” says Hill. “As we become more familiar with automation, and how it can make our lives better, employees will become more flexible and more open to these things.” Among the incentives, she adds, is the fact that most people want to be more efficient themselves, not to mention the appeal of “a more direct link between performance and reward.”

Cappelli sees another driver for workers to accept automated management: “I think if it is presented along the lines of self-management, i.e., ‘You don’t need a boss; you have to report info to our software system [instead],’ they might actually like it.” In other words, “Get rid of a boss,” Cappelli says.

Whether automated functions are a good idea across the board is another question. While automated processing of mundane or even complex tasks can reduce the chance of error, unintended consequences are not uncommon.

For example, the extent to which cockpit automation has contributed to the atrophy of pilot skills has been getting attention. In a study titled “Thoughts in Flight: Automation Use and Pilots’ Task-Related and Task-Unrelated Thought,” published in Human Factors: The Journal of the Human Factors and Ergonomics Society in 2013, authors Stephen M. Casner and Jonathan W. Schooler note that “studies find that cockpit automation can sometimes relieve pilots of tedious control tasks and afford them more time to think ahead. Paradoxically, automation has also been shown to lead to lesser awareness.”

Needless to say, “lesser awareness” isn’t a desirable condition when the safe arrival of a Boeing 747-400 hangs in the balance.

The End of Human Resources
“What you want to do is show employees how changes in [HR] behavior relate to changes in things they care about—productivity, turnover, stress in the workplace, whatever.” This is how Ben Waber, a visiting scientist at the MIT Media Lab and president and CEO of Sociometric Solutions, thinks firms can sell the idea of “people analytics.”

Waber’s firm tracks employee movements during the day through sensor badges, maps email and phone-call patterns, and analyzes workforce behavior like facial expressions and tone of voice — up to 100 data points a minute — to ultimately recommend changes that improve efficiency and productivity.

In one case, at a Boston-area hospital, this meant mapping teamwork patterns among nurses and tracking their foot traffic and interactions with patients in a post-surgical ward. By putting in place a real-time map showing who was doing what and when, the firm says it was able to reduce costs while improving overall patient health and recovery times.

People analytics is becoming so sophisticated, says Waber, that “over the next decade you are going to see it subsume all of the functions HR does.” HR won’t quite disappear, but turning over most of its work portfolio to machines will leave the actual people in the field free to concentrate on more human tasks like conflict resolution and helping employees choose health plans, he notes.

But people analytics also carries Big Brother overtones, since it is so invasive. Will employees object to being so closely scrutinized? Employees generally have no legal protections against such data-gathering, says Laura Pincus Hartman, a professor of business ethics at DePaul University. “I don’t want to come off as totally insensitive, but if you don’t want to submit to giving up that level of privacy, you can [quit],” she notes. She allows that many would not be so quick to give up a paycheck in a wan job market. But “if enough people thought it was too intrusive — let’s say something like video cameras in bathrooms — then no one would be able to do it.”

Waber stresses that the idea is to gather data on organizational dynamics, not to target individuals, and says his company’s contracts with employers — the client list includes Bank of America, Cubist Pharmaceuticals and insurer AXA — require an agreement that no employee be compelled to wear a tracking badge.

So far, 95 percent of employees at client firms have participated. Still, he concedes that it’s important to control who has access to the data, and he favors the adoption of an industry protocol similar to the medical profession’s HIPAA. “From a legal perspective, nothing has changed since the 1980s and closed-circuit TV, so there is nothing to stop someone from using this data — and if that happens, it will have a huge impact on this field,” Waber notes. “People will get creeped out, and that’s when legislation happens, and we will be legislated out of existence.”

Rather than risk that turn of events, he is advocating a framework that would guarantee individuals the right to own the data collected about them, while transferring aggregated data from the hands of private companies into the public domain, where it might serve the greater public good in areas like transportation, public health and government.

Much is at stake. Data is now being referred to as a new asset class. “Personal data is the new oil of the Internet and the new currency of the digital world,” then-European Commissioner for Consumer Protection Meglena Kuneva said in a 2009 speech. A manifesto called The New Deal on Data, promoted by MIT professor Alex “Sandy” Pentland, calls for individuals to be able to opt in or out of having their data shared and to be able to destroy it. It has been signed by 50 CEOs, says Waber.

People analytics is by no means the only new authority figure entering the workplace. One recent market arrival is the Ava 500. Standing five-foot-five, the “telepresence robot” looks like a human-size chess piece topped by a video screen, providing an electronic audio and visual presence for the boss — or anyone else who might be thousands of miles away but wants to virtually see and be seen. You don’t have to worry about getting “Ava” to show up at meetings; mobile, she has already learned the floor plan of the office and can find her way around via a tablet interface controlled remotely by the virtual visitor.

But “Jim,” “Nao” and Ava 500 are just avatars, people analytics is only the beginning of what it takes to be a smart boss, and automatic scheduling systems are proxies behind which managers can choose to hide or stand proudly. All of these are still mere tools, made blunt or refined by their users. Says Cappelli: “So far, the things that are being automated don’t seem to affect the supervisor-employee relationship much. It still matters a great deal to have a good boss, and it is still one of the worst things in the workplace to have a bad one.”

This article originally appeared in Knowledge at Wharton.
