Algorithms have to factor in human values too
I write a lot about optimizing manufacturing, supply chains, and even employees using sensors and data analytics. While we are nowhere near an ultra-efficient society where every business process or even road trip is optimized, that is the ultimate goal.
As companies start to invest in technology to streamline their operations, I’m starting to question what the societal implications will be. So far, people are mostly thinking about this only as robots taking our jobs. But here’s the dirty secret of optimization: you optimize for just one specific goal.
For many businesses that goal is profit. This will eventually lead to a stark realization: optimizing for profit through automation gives the lie to any other corporate mission statement about serving customers or protecting the environment or whatever else doesn’t directly affect the bottom line.
As we move into a digital world where decisions are increasingly made by machines, it becomes clear that we have to understand exactly what the purpose of every business process is. It also means that we have to find some way of factoring other values, such as work-life balance or protecting the earth, into our algorithms.
Humans can hold two different and competing thoughts in their heads. For example, the highest goal of a business is to make money, while a secondary goal is to serve its customers. But computers, as we program them to optimize for specific outcomes, can’t handle that dichotomy. As every shred of waste is squeezed from a system through careful analysis of data and automation of operations, the room to hold two diverging goals at once disappears.
I’m not saying that making money is evil, just that when we optimize for it over protecting the environment or treating workers well, problems follow. Right now, the humans in charge, be they executives or managers, step in to ensure at least some externalities matter.
Yet, I’m still stunned by the lengths that some companies will go to in order to push profits over people. A recent example of this can be found in the algorithmically precise software that scheduled workers at places like Ann Taylor and Victoria’s Secret on an as-needed basis.
While this practice cuts the waste of having an employee standing around doing nothing during a lull, it wreaks havoc on employees’ lives, making it impossible to schedule childcare or take a second job.
It doesn’t have to be scheduling software. It could be algorithmically determined quotas in a warehouse job that penalize employees for an off day. Examples are all around us, generally coming from companies trying to reduce their operational costs or those that aren’t shy about alienating workers.
In the future, absent such consideration, more companies will face this dilemma. At that point, they will have to make a concerted effort to factor human and other concerns into their algorithms, which may shave a bit off their profits. The alternative is to have regulations in place that put a cost on the things that matter to us as a society.
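To make the idea concrete, here is a minimal sketch (all names and numbers are hypothetical, not from any real scheduling product) of what “factoring human concerns into the algorithm” can mean in practice: a second, weighted term in the objective function. A pure profit optimizer picks the most volatile schedule; pricing in schedule instability can flip that choice.

```python
def score(schedule, instability_weight=0.0):
    """Profit minus a weighted penalty for schedule instability.

    With instability_weight=0 this is pure profit optimization;
    raising the weight puts a price on last-minute shift changes.
    """
    return schedule["profit"] - instability_weight * schedule["shift_changes"]

# Two hypothetical weekly schedules for the same store.
schedules = [
    {"name": "just-in-time", "profit": 1000, "shift_changes": 12},
    {"name": "fixed-weeks", "profit": 950, "shift_changes": 1},
]

# Optimizing for profit alone favors the volatile just-in-time schedule...
best_profit_only = max(schedules, key=lambda s: score(s))

# ...while valuing predictability at, say, 10 units per changed shift
# makes the steadier schedule win instead.
best_with_penalty = max(schedules, key=lambda s: score(s, instability_weight=10))
```

The point of the sketch is that the trade-off never disappears; it just becomes an explicit number someone has to choose, whether an executive or a regulator.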
This isn’t a problem computers will solve.