In a hurry to get to Logan Airport during rush hour, Monideepa Tarafdar, the Isenberg School of Management’s Charles J. Dockendorff Endowed Professor, was traveling in an Uber when she noticed something peculiar about the relationship between the driver and the ride-hailing application. The app told the driver to take a certain route, but he knew there was a quicker way to the destination. When Tarafdar asked him to take the route he preferred, the conflicted driver said he couldn’t because he didn’t know if Uber would penalize him for straying from its directions.

“This was a new phenomenon to me,” Tarafdar said. “To see a taxi driver hesitate to take the route he was confident was the best one because an app suggested otherwise.”

The experience prompted Tarafdar to question how Uber drivers interact with the algorithms that form the foundation of the app. She also began thinking about how this “algorithmic control” contributes to a gig economy worker’s “technostress,” a concept in which she is an expert: the effects technology has on the well-being of those who use it for work.

Tarafdar and her co-authors investigated these questions by talking with and surveying Uber drivers, and following their forum discussions. They published two articles: “Examining the Impact of Algorithmic Control on Uber Drivers’ Technostress” (in the Journal of Management Information Systems) and “Algorithms as co-workers: Human algorithm role interactions in algorithmic work” (in the Information Systems Journal).  

Algorithmic Co-Workers: Friendly Peers, Managers or Both?

Boil down the relationship between an Uber driver and the app’s many algorithms and it looks a lot like that between co-workers.

“Algorithms tell the drivers which rides to pick up, where to go, and how to get there. They also tell them about ‘surges,’” Tarafdar said. “Further, they decide what rates the driver should get for the rides.”

So the algorithm works not only as a helpful peer, but also as the employee’s supervisor and performance evaluator.

To be as effective as possible, the algorithm automatically captures and records the driver’s interactions with the app. This creates what Tarafdar calls the “computational learning loop,” wherein the algorithm takes this recorded information and uses machine learning to factor it into future actions.
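In code, that loop reduces to a record–retrain–suggest cycle. The Python sketch below is purely illustrative: every class, field, and method name is invented for this article, and the “retraining” step is a toy stand-in for whatever machine learning Uber actually runs.

```python
from dataclasses import dataclass, field

@dataclass
class TripRecord:
    """One automatically captured driver-app interaction (fields invented)."""
    suggested_route: str   # what the algorithm told the driver to do
    route_taken: str       # what the driver actually did
    pickup_accepted: bool
    fare: float

@dataclass
class RoutingModel:
    """Stand-in for the app's learned policy, not Uber's actual system."""
    history: list = field(default_factory=list)
    deviation_rate: float = 0.0

    def record(self, trip: TripRecord) -> None:
        # Step 1: every interaction with the app is captured automatically.
        self.history.append(trip)

    def retrain(self) -> None:
        # Step 2: a real system would run a machine-learning update here;
        # this toy version just measures how often drivers go off-script.
        deviations = sum(t.route_taken != t.suggested_route for t in self.history)
        self.deviation_rate = deviations / max(len(self.history), 1)

    def suggest(self, origin: str, destination: str) -> str:
        # Step 3: future instructions reflect everything recorded so far.
        return f"model-preferred route from {origin} to {destination}"
```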

Although the algorithm captures many data points, it cannot capture the driver’s emotional reactions, the manifestation of the technostress they experience. Confusion, frustration, and conflict on the driver’s part fly under the algorithm’s radar, creating a “broken human loop”: the algorithm cannot adjust to feedback it never receives.
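Assuming the illustrative TripRecord and RoutingModel from the sketch above, the break is easy to see: nothing in the captured data carries the driver’s inner state.

```python
# Continuing the illustrative sketch above (TripRecord, RoutingModel).
trip = TripRecord(
    suggested_route="route via the tunnel",  # the app's instruction
    route_taken="route via the tunnel",      # the driver complied...
    pickup_accepted=True,
    fare=28.50,
)

model = RoutingModel()
model.record(trip)
model.retrain()
# To the loop this looks like a perfectly smooth interaction. Nothing in
# the record says the driver was frustrated or conflicted about the
# instruction, so no amount of retraining can respond to that signal.
```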

Good and Bad Technostress

Tarafdar and her colleagues found that Uber drivers experience technostress in myriad ways and that in this gig economy context, it has a variety of complex implications.

She found that Uber drivers experience algorithm-driven role conflict when the algorithm gives instructions they disagree with. Drivers also experience algorithm-driven role ambiguity (when instructions are not clear) as well as stress caused by technology-related uncertainties (such as when app features change without notice).

But technostress for gig economy workers isn’t all bad, according to Tarafdar. “Drivers also experience ‘good’ technostress when they think the algorithm guides and pushes them to do their job well by providing useful information,” she said.

Tarafdar says there are ways to improve the algorithms of apps like Uber to mitigate the negative impacts of technostress and bolster the positive. Better app design, she says, can make the computational loop more efficient, and better work design can ensure that the information that needs to go back to the algorithm actually does. “This will repair the broken human loop,” Tarafdar said. “The driver should be able to communicate what they are thinking and feeling back to the algorithm so it can be factored into future algorithmic decisions.”
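One way to picture that repair, continuing the toy model above: add a structured feedback channel so the driver’s reported state becomes part of what the loop learns from. The DriverFeedback fields and the friction measure below are, again, invented for illustration, not a description of any real Uber feature.

```python
from dataclasses import dataclass

@dataclass
class DriverFeedback:
    """Hypothetical structured report of how an instruction landed."""
    trip_id: str
    felt_conflicted: bool      # disagreed with the instruction (role conflict)
    instruction_unclear: bool  # could not make sense of it (role ambiguity)

class FeedbackAwareModel(RoutingModel):
    """Toy 'repaired' loop: behavioral logs plus the driver's own reports."""

    def __init__(self) -> None:
        super().__init__()
        self.feedback: list[DriverFeedback] = []

    def record_feedback(self, fb: DriverFeedback) -> None:
        # The channel the broken human loop was missing.
        self.feedback.append(fb)

    def retrain(self) -> None:
        super().retrain()
        # Count conflicted or confused reports as friction, a signal the
        # next round of suggestions could take into account.
        friction = sum(fb.felt_conflicted or fb.instruction_unclear
                       for fb in self.feedback)
        self.friction_rate = friction / max(len(self.feedback), 1)
```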