When we look at machine learning, the science of using algorithms and statistical models to let computers learn from data and power artificial intelligence (AI), there is an element of predicting the future. There is little human input here; the key is that the machines use history and patterns to determine a consumer's preferences – the machine guesses what you need and places it in front of you.
This is a system that saves time and, in the long run, ensures that everything a person wants is available even before he or she asks. But when you add identity politics to the mix, the system places you and people like you in a group, segregates you, and, in short, takes away your individuality.
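At its simplest, this kind of preference prediction can be sketched as a frequency count over a shopper's purchase history. The example below is a hypothetical toy illustration, not any real recommender system; the item names and categories are invented for the sketch:

```python
from collections import Counter

def recommend(history, catalog):
    """Suggest a catalog item matching the user's most-purchased category.

    history: list of (item, category) pairs from past purchases.
    catalog: list of (item, category) pairs available to recommend.
    """
    # Count how often each category appears in the purchase history.
    category_counts = Counter(category for _, category in history)
    if not category_counts:
        return None  # no history yet, nothing to base a guess on
    top_category = category_counts.most_common(1)[0][0]
    # Return the first catalog item from the user's favorite category.
    for item, category in catalog:
        if category == top_category:
            return item
    return None

# Toy data: a shopper who mostly buys coffee.
history = [("espresso", "coffee"), ("latte", "coffee"), ("bagel", "bakery")]
catalog = [("croissant", "bakery"), ("cold brew", "coffee")]
print(recommend(history, catalog))  # prints "cold brew"
```

Even this trivial sketch shows the pattern the article worries about: the system does not ask what you want, it infers it from which group your past behavior resembles.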
What gets scary is that the machine places us into tribes and can, in fact, coach us into being what it wants us to be. Imagine identity politics moving to a level that herds us into groups, imposes groupthink, and dictates our behavior patterns, or at the very least encourages us all to behave much the same.
We see this happening on college campuses as students try to conform through the way they dress and the language they use. Group or mob think eventually rules the masses. So what happens to the individual?
Machines could start to pressure groups to behave as the system expects them to. In such a system, you will be defined as Hispanic and expected to act accordingly, just as your whiteness will be the factor that determines who you are and how you should behave. Some identities will give people a stronger voice than others.
Would an 18-year-old student be free to drink exactly what he or she wants? Or will this machine-led groupthink tell Whites and Asians to drink one drink, while others should drink another?
Identity politics is here, and consumers should be deciding for themselves, not allowing small groups to form that destroy the individual. Many of us do not want to be categorized by an assigned identity. And what happens when bi-racial people buy their goods and services? Will identity politics continue to divide into sub-groups? It never ends.