By the Tech: The Future of Work is Starting to Look a Lot Like the Past
Process at work was the innovation of the 19th and early 20th century. During the Industrial Revolution, the biggest compliment you could pay an entrepreneur was to say their business ran like a well-oiled machine. Each part did one specific thing over and over again. The same went for employees – everyone had specific roles to play.
Eventually, these roles came to be performed by following a manual. This was a revelation at first – scale could be achieved beyond what anyone thought possible. You could have a boss in Houston managing a team in New York and know the New York team was doing precisely what you wanted, because they were doing it “by the book.”
In the 21st century, we’re looking at a new version of the same thing.
The 21st century boom
Take a look at some of the most in-demand non-technical jobs of the future. They come in all shapes and sizes. Some, like dentistry, require advanced professional degrees and bring a massive salary to boot. Others, like Market Research Analysts or Carpenters, may not even need a college or university degree yet bring in respectable sums of money.
But employees in each of these jobs are on the verge of a new innovation: being forced to do it “by the tech.” As technologies like AI and machine learning begin to make decisions on our behalf, the brain work we do becomes augmented and automated.
There’s a debate over whether this is robots taking over or humans leveraging tech for their own benefit, but that is the wrong way to characterize it. Instead, the conversation should be about whether we’re making the workforce less intelligent by forcing employees to rely on artificial intelligence.
Low-skilled brain work
Creating “the book” for a job was a way to scale routine work while putting limited trust in the employee. This worked for a few reasons. It was part of a giant wave of 9-to-5’ers who, frankly, didn’t really want or need to care about their jobs outside of working hours. They worked hard when they worked, and left their job at work when they went home. It was a decent system.
The 21st century workforce is all about the brain. You all but have to care about your work outside of work, because it’s during quiet downtimes (read: boredom) that you come up with your best ideas, according to a study by Harvard Business Review. That means the worker needs to think about work constantly. Thus the rise of vision statements and other methods of embedding trust and faith in employees – ways of encouraging them to care about the company as a community, not just a place of employment.
And here we come to the new problem. Technology already changes jobs. The Elevator Operator doesn’t exist anymore because of technology, while the Market Research Analyst and Carpenter are simply made more efficient by it. What happens when technology can make many of their decisions for them? Brain work becomes a low-skill task. You don’t need to be a genius researcher when you’ve got IBM Watson on your side.
Two paths forward
When technology becomes the core decision-making backbone of our jobs, from Market Research to Medical Assistants and more, one of two things can happen for people in these roles: companies can treat them as the new blue-collar jobs, offering 9-to-5 hours, benefits, and a pension. Or companies can treat them as the new stepping-stone jobs, offering career ladders (or lattices) for people to climb higher and higher.
Option one is the easy, but expensive, route. If we return to the era of blue-collar labor with 9-to-5’ers and defined benefit pensions, companies will take on massive balance sheet obligations. There is an upside: more companies focusing on profitability to meet those obligations, so we may see fewer WeWork debacles (although who knows whether we’ll see more Enron situations in the balance).
Option two is the more difficult, but significantly stronger, route. Forcing companies to truly examine what they need to produce economic returns (and meet other goals or obligations) is one thing. But adding transparency – framing professional development not as a solo obligation but as a partnership between employee and employer – will bring about new creativity, and more profit if done right.
If we allow technology to become the decision driver for non-technical roles, we lose humanity. In the old days, a creative (or stubborn) employee could simply put the book down and do what needed to be done. In a world of technology surveilling while it barks orders in a digitally pleasant voice, the ability to be human slowly erodes.