The Morality (and Meaning) of Jobs

The intersection of two recent NY Times articles got my attention. One was entitled “Wealthy, Successful and Miserable.” Written by Pulitzer Prize-winning journalist Charles Duhigg for the NY Times Magazine, it describes his shock at discovering that, 15 years out, many of his Harvard Business School classmates were no longer thrilled with their jobs. To the contrary, “they were miserable.”

He described one classmate who had to invest $5 million a day. Which was sort of a ticking time bomb — if he only got $4 million invested one day, then he had to place $6 million the next. It was insanely stressful work, done among unpleasant, backstabbing coworkers. He was paid well — about $1.2 million a year — but hated going to the office.

Duhigg recounts: “I feel like I’m wasting my life,” he told me. “When I die, is anyone going to care that I earned an extra percentage point of return? My work feels totally meaningless.” He recognized the incredible privilege of his pay and status, but his anguish seemed genuine. “If you spend 12 hours a day doing work you hate, at some point it doesn’t matter what your paycheck says,” he told me. “There’s no magic salary at which a bad job becomes good.”

Duhigg did some further research and learned that back in the mid-1980s roughly 61 percent of workers told pollsters they were satisfied with their jobs. But by 2010 that figure had dropped to 43 percent. A lot of the factors are what you would expect: oppressive hours, political infighting, competition, globalization, the “always-on culture” bred by the internet. But there’s also “an underlying sense that their work isn’t worth the grueling effort they’re putting into it.”

Duhigg eventually concludes that the meaning people are looking for is largely about a sense that through one’s work “we’re making the world better.” But he then observes, “what’s remarkable is how few workplaces seem to have internalized this simple lesson.”

All of which suggests an uncomplicated but essential taxonomy of jobs:

  • Jobs that make the world better.
  • Jobs that are more or less neutral in their effect.
  • Jobs that make the world worse.

Presumably, jobs in that last category should be scrupulously avoided — at least by those looking for meaning and/or morality in their work.

“Is Ethical A.I. Even Possible?”

Just a few days later, the NY Times ran a rather different article, “Is Ethical A.I. Even Possible?” It chronicled the challenges that corporations, and their employees, face as they grapple with how to develop new AI capabilities that are not ethically suspect. The article made clear that the drive for profits necessarily pulled companies toward dubious applications. Against which, there seem to be only two bulwarks: government regulation and employee sentiment.

The article profiled the struggle around ethical AI at Clarifai, a small NY company focused on facial recognition and related applications. Some employees had grown increasingly concerned that their work “would end up feeding automated warfare or mass surveillance.” In late January they posted an open letter on a company message board asking CEO Matt Zeiler where their work was headed.

Zeiler then held an all-hands meeting at which he made clear that the company’s technology was likely to end up in autonomous weapons. He also reversed course on the need for an AI ethics officer, saying such a position was unnecessary at a small company. In turn, the employee who had drafted the open letter to Zeiler left the company.

Interestingly, Google worked on the same Pentagon project as Clarifai — but after a protest from company employees, the tech giant ultimately ended its involvement. And after Amazon employees protested the sale of facial recognition services to police departments, Amazon (and Microsoft) called for government regulation.

The article also noted that thousands of AI researchers from across the industry have signed an open letter saying they will oppose autonomous weapons. And Meredith Whittaker, a Google employee and co-founder of the AI Now Institute, which examines the social implications of artificial intelligence, said this is an especially auspicious time for tech employees to use their power to drive change.

Why all the ferment? Two reasons. First, today’s employees have grown up with the understanding that many of their activities — from consumer purchases to stock investing — have important moral dimensions. A decision to buy a piece of apparel sourced from slave labor, or to invest in a company that does serious harm to the environment, is necessarily a moral choice.

Similarly, a decision about where to work is — for those privileged to have choices — a fundamentally moral determination. One might argue, in fact, that owning a shirt or a stock is a form of passive complicity. But a decision to work for a company is a more active and important commitment — one in which the employee is expected to diligently further the aims of his or her employer. Which means such a commitment necessarily has substantial moral ramifications.

Second, AI is entirely reshaping the tech (and business) landscape. Companies like Amazon, Facebook, Google, and Microsoft are facing growing unrest from employees, and potential recruits, about the ethical challenges of their business models. Tech workers have signed open letters opposing Google’s Project Maven contract with the U.S. military, Microsoft’s contract with U.S. Immigration and Customs Enforcement (ICE), and Amazon’s sale of facial recognition technology to law enforcement. Almost always, the underlying issue is an application of AI that employees consider dangerous and undemocratic.

Artificial intelligence has upped the potential for harm from the business models of more and more companies. But it has also created a new class of workers — ones who can meaningfully ‘vote with their feet’ against profits that put society at risk. Thankfully, more and more employees are casting their votes.
