February 07, 2018
Sir, discussing the use of algorithms when, for instance, evaluating personnel, Sarah O’Connor writes: “The call centre worker told me the software gives lower scores to workers with strong accents because it doesn’t always understand them.” (“Management by numbers from algorithmic overlords”, February 7.)
What, should we assume that a call-centre worker’s capacity to be understood would not be one of the most important factors considered by a human evaluator?
And when O’Connor refers to “the subtle flexibility of human judgment; decisions tempered by empathy or common sense; the simple ability to sort a problem out by sitting down across a table and talking about it”, I must state that this is absolutely not what happens all the time.
Any reasonable algorithm, with access to good historical data, would never ever have concluded, as the human Basel Committee did, that what is perceived as risky is more dangerous to our banking systems than what is ex ante perceived as safe.
PS. Could we envision a world in which more predictable algorithms managed our wives’ reactions… and, if so, would we then not miss their lovable unpredictability?
@PerKurowski