Communication stops without comprehension. Being understood is the most obvious requirement for sustaining communication, whether we are signaling to other humans or to algorithms.

When we ride in a self-driving car, talk to an HR bot, or get matched by a service like Lunchclub, there is an implicit agreement that the program will fit us into buckets. If we don’t fit, it will find the closest neighbor, a common denominator, hopefully without shoehorning us too much (in technical terms, we hope it will not overfit us). It looks for predetermined archetypes in order to serve us. These models might be probabilistic, statistical, neural, or deep, but they are all fixed, steady versions of who you are. They are cybernetically deterministic.
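The bucketing described above can be sketched in a few lines of toy Python. The archetypes, feature names, and numbers here are all invented for illustration; the only point is that a nearest-neighbor matcher assigns every user a bucket, even a user who sits equally far from all of them.

```python
# A minimal sketch (hypothetical archetypes and features) of how a matching
# system forces every user into its nearest predetermined bucket.
import math

ARCHETYPES = {                      # fixed, steady versions of "who you are"
    "manager":  [0.9, 0.1, 0.2],
    "engineer": [0.2, 0.9, 0.3],
    "designer": [0.3, 0.2, 0.9],
}

def nearest_archetype(user):
    """Return the archetype label closest to the user's feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(ARCHETYPES, key=lambda name: dist(ARCHETYPES[name], user))

# A clear fit and a creative misfit: both get a bucket regardless.
print(nearest_archetype([0.85, 0.15, 0.25]))  # lands on "manager"
misfit = [0.5, 0.5, 0.5]                      # roughly equidistant from all
print(nearest_archetype(misfit))              # still assigned a bucket
```

The misfit vector is the interesting case: the system has no way to say "none of the above," so it silently rounds the person to whichever archetype happens to be fractionally closer.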

Being a creative misfit will work against you in that moment.

It is important to linger on this exclusionary tension. These platforms use AI and machine learning to make decisions: who gets surfaced to the top of the applicant pool, who gets matched with whom, and much more. Algorithms are programmed for order, but some of us live in the in-between. That fact alone can determine the value you get from such platforms.

This is bad news for companies, because it means that algorithmic efficiency will cost them cognitive diversity.

Algorithms, being a cybernetic gesture (action to feedback) operating within a siloed environment (sensing only what is fed to them), in fact act against diversity.
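That siloed action-to-feedback loop can be made concrete with a toy simulation. The item names and scores below are invented; the mechanism is the point: a system that senses only what it has already surfaced keeps reinforcing the incumbents, and the pool it ever shows you stops being diverse.

```python
# A toy sketch (invented numbers) of the cybernetic loop: the system senses
# only the items it surfaces, engagement reinforces them, and the surfaced
# pool narrows instead of diversifying.
scores = {"a": 1.0, "b": 0.9, "c": 0.8, "d": 0.7}

def surfaced(scores, k=2):
    """Show the top-k items by current score."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

seen = set()
for _ in range(5):
    top = surfaced(scores)
    seen.update(top)
    for item in top:          # feedback arrives only for what was shown
        scores[item] += 0.1   # so the incumbents pull further ahead
print(sorted(seen))           # the loop never explores "c" or "d"
```

Items "c" and "d" are never shown, so they never receive feedback, so they are never shown: the feedback loop closes on itself.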

That is fine as long as no human beings are involved. Your calculator can’t discriminate against you. But it becomes an issue in decision-making, value-delivering, human-matching systems. Do you feel like your algorithms understand you?

July 20, 2020