Too many managers spend too much of their time making sure that people are working. And like many other mindless tasks, this one too is being automated.
That’s the premise of a piece from Inc. Magazine, which concludes with:
Managers need to be careful about the way they monitor employees. Whether they do it themselves, or use bots, or use software that uses bots, there is a serious risk that you will hamper employee joy and productivity–and hinder employee honesty–if the employees know they are always being monitored. Life’s most enjoyable, creative moments–in and out of the workplace–come when you know you’re not being supervised or scrutinized.
At first glance, this might seem to clash directly with another finding in research on employee behavior: the Hawthorne effect. That’s the idea that people who know they are being observed tend to improve some aspect of their behavior.
So which is it? Does monitoring people's behavior increase or decrease productivity? The answer is simple: it depends on why they believe they are being watched.
Empowerment vs. Suspicion
When we're doing any task for another person, there are two ways we can think about that relationship. Either we feel empowered by the work and enjoy the challenge and opportunity, or we believe others suspect us of being lazy, incompetent, or untrustworthy, and we're afraid they will interpret anything we do in a negative light.
If you adopt the radical philosophy of trusting your employees, you'll find that most of them want to be productive. And those who don't will show it by breaking your trust, at which point you can find someone else.
But if you don't trust your employees and feel the need to check on them constantly, you have to spend significant resources on supervision: spot checks, security cameras, time clocks, reports. When you fear that people may cheat and put controls in place to ensure they don't, those restrictions become the hallmarks of the workplace.
Trust and Creativity Cannot Be Automated, But…
If you want to give your employees the chance to work without fear of interruption or supervision, and to have the chance to be creative and come up with new ideas, your main course of action is letting go. You can’t define a process for trust or a checklist for innovation. It happens because you’re not trying to make it happen.
But if you distrust people, it’s relatively straightforward to build mechanisms to check up on your staff. You can watch them via closed-circuit television or have them fill out excessive documentation.
In an era of instant messaging and email, the next obvious step is to write software that nags people on your behalf. And if that's a valid idea, why not software that manages your hiring process? Via Fortune magazine:
“People analytics” is fast emerging as a tool in the perpetual war to retain and attract talent. Companies have started hiring data scientists and building or buying software that predicts who will leave and who will make the best senior vice president.
And computer systems aren't just making suggestions about whom to bring into the company; they're also weighing in on whom to show the door. One algorithm knows you're a bad employee:
JPMorgan Chase & Co…is rolling out a program to identify rogue employees before they go astray, according to Sally Dewar, head of regulatory affairs for Europe, who’s overseeing the effort. Dozens of inputs, including whether workers skip compliance classes, violate personal trading rules or breach market-risk limits, will be fed into the software.
“It’s very difficult for a business head to take what could be hundreds of data points and start to draw any themes about a particular desk or trader,” Dewar said last month in an interview. “The idea is to refine those data points to help predict patterns of behavior.”
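The article doesn't say how JPMorgan's system actually works internally. But as a toy illustration of "refining hundreds of data points" into a prediction, the inputs Dewar names could feed something as simple as a weighted risk score per employee. All feature names, weights, and the threshold below are invented for the sketch, not taken from the bank's program:

```python
# Hypothetical weights over the kinds of signals the article mentions.
# (Names and values are illustrative only.)
WEIGHTS = {
    "skipped_compliance_classes": 2.0,
    "personal_trading_violations": 3.0,
    "market_risk_limit_breaches": 4.0,
}

def risk_score(signals: dict) -> float:
    """Sum each signal count times its weight; higher means riskier."""
    return sum(WEIGHTS[key] * signals.get(key, 0) for key in WEIGHTS)

def flag_for_review(employees: dict, threshold: float = 5.0) -> list:
    """Return names whose score crosses an (arbitrary) threshold."""
    return [name for name, s in employees.items() if risk_score(s) >= threshold]

traders = {
    "alice": {"skipped_compliance_classes": 0},
    "bob": {"skipped_compliance_classes": 2, "market_risk_limit_breaches": 1},
}
print(flag_for_review(traders))  # prints ['bob']: 2*2.0 + 1*4.0 = 8.0 >= 5.0
```

A real system would presumably learn its weights from historical cases rather than hard-code them, but the shape is the same: many behavioral inputs collapsed into one number that decides who gets scrutinized. That collapse is exactly where the judgment questions raised below come in.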
Work Is Not Playing Pinball
All of this automation, trying to handle key HR tasks that usually require human judgment, reminds me of the game of pinball. The modern workplace may sometimes seem like a pachinko machine, where we have very little influence over a complex system. An enterprising software engineer might believe they could build a robot to play pinball, and it might even beat some human players.
But that pinball-playing robot would be highly tuned to one particular machine. And even if it could adapt, work is not pinball. People are far more complex than the algorithms they create.
Sometimes, people can surprise you. The unexpected can't be predicted, and it's often the unpredictable that we need in order to move forward.