What’s the difference between having an AI or a psychopath for a manager? Not a lot, we suggest.

2022-12-16

Psychopaths are people with irregularities in the areas of the brain that govern and regulate emotion. In essence, they respond without emotion or emotional consideration, and this makes them cold, calculating and entirely rational. Thus, in discussing moral emotions, one academic, Haidt, suggests that only a psychopath would make entirely rational (non-emotional) decisions. However, psychopaths in organisations (referred to as corporate or organisational psychopaths) tend to do rather well in the workplace, at least in terms of personal advancement, because of their willingness to push themselves forward for promotion and their strong desire for power, money and control over others.

It turns out that AIs exhibit some of the same attributes as psychopaths. At their centre (we nearly said ‘at their heart’, but that may be misleading) they are sorting and classifying machines. They merely apply a set of rules for sorting and labelling things in particular ways – very quickly and across huge amounts of data. AIs do not, however, deal with meaning; their operations are purely what Floridi has referred to as syntactic. They follow rules but are incapable of reflecting on the meaning of those rules, why they are in place or what their implications might be. Horizon, the accounting system used by the UK Post Office, for example, was happy enough to label hundreds of sub-postmasters as criminals by sifting and sorting its financial data and finding shortfalls in hundreds of branches. Transactions recorded in local post offices did not match the money transferred back to the umbrella organisation’s financial system. The only logical conclusion, apparently, was that sub-postmasters were, as an occupational group, habitually criminal; prison sentences, ostracism from communities, repossessed homes and suicides followed. It was mundane bureaucratic violence at its worst – driven by a psychopathic, and as it turns out malfunctioning, AI. A 2019 court ruling vindicated the sub-postmasters and awarded a group of them £57.75m, much of which went to paying their lawyers’ fees.

AIs, like psychopaths, are not troubled in any way by meting out bureaucratic violence and misery in pursuit of their goals. The implications of this for the emerging digitalised workplace are not hard to envisage. As software firms increasingly aim their products at the work of managing, as they are now doing, AIs will take on more and more management decision-making. Many employees can expect their experience of work to decline as a result. If an AI determines, based on its data, that a worker should be fired for poor performance, it will be extremely difficult to mount a defence of any sort. Computer systems and their human underlings tend to resist any suggestion that their ‘data’ is wrong. The originator of the Horizon system sent its chief engineer to court to lie about the robustness of the system. Just try questioning your credit score. If your manager in the future is an AI, then you will be managed by a psychopath whose claims to truth will be almost impossible to contest.

Chris Ivory, Mälardalen University, Sweden & Clive Boddy, Anglia Ruskin University, UK