As I have sat and watched my home country turn upon itself over the last few months, my heart and my head, like many of yours, have been asking, “How could this happen?”
Many pundits have pointed digital fingers: bad actors, fake news, real news called fake news, foreign influence. The list is long and varied.
I would like to suggest another culprit: the algorithm. AI, or artificial intelligence.
As AI has grown in computing power, ethicists have begun conversations about what ethical and moral AI would look like. I would argue the conversation is functionally too late.
My kids are Avengers fans, and Age of Ultron is a disturbing look at what happens when computers reach sentience (or come to life, if you will) and then, by choice, do the math on humanity and decide to annihilate their creators. Perhaps that is in the future, but I would contend it’s already too late in other areas.

The one thing our nation and our world can agree on is that we are polarized. We have become binary in our opinions, and we all live in the “echo chamber” of those opinions. Search on Google and the results return the echo chamber of your own opinion. Check your social feed and find the echo chamber of your own beliefs. How did this happen? The AI, the algorithm, the code is written this way because of a moral assumption: that people want to hear their own opinions, and that those who hear their own opinions are more likely to purchase what advertisers are selling. Seems like a perverse logic, doesn’t it?
And so we become more and more binary: black and white, us vs. them, survivalists. To the winners, the spoils. To the losers, reparations and payback. Wait, didn’t we try that after WWI? Weren’t the seeds of WWII sown in the payback model that followed it? But I digress.
Should we be surprised? When building these supposedly all-powerful algorithms, we coded for binary. That is how the actual physics of the machines works: on, off, on, off, a billion times a second. But these are ultimately commercial enterprises, and engineers write lines of code in service of systems that seek to maximize profit. Here is where I believe the immoral part came in. These systems create echo chambers and view users not as consumers but as the products to be sold; as Seth Godin so elaborately expands on, we are the product, the data sold to the companies whose consumers we are. These systems are maximized for profit, not for the good of society.
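To make that concrete, here is a minimal, hypothetical sketch of the kind of ranking logic I am describing. The names, the stance scale, and the scoring are my own illustration, not any real platform’s code; actual systems are vastly more complex, but the incentive is the same: score items by predicted engagement, and items that agree with what a user already believes tend to win.

```python
# A hypothetical, simplified feed ranker. The names, scales, and
# scoring below are illustrative assumptions, not any platform's code.
from dataclasses import dataclass

@dataclass
class Item:
    headline: str
    stance: float   # -1.0 to 1.0: where the item falls on some issue

@dataclass
class User:
    stance: float   # the user's inferred position on the same scale

def predicted_engagement(user: User, item: Item) -> float:
    # The assumption baked into the objective: people engage more with
    # content that agrees with them, so agreement scores higher.
    return 1.0 - abs(user.stance - item.stance) / 2.0

def rank_feed(user: User, items: list[Item]) -> list[Item]:
    # The profit-maximizing objective: sort purely by predicted
    # engagement. Nothing here values diversity, accuracy, or dignity.
    return sorted(items, key=lambda i: predicted_engagement(user, i),
                  reverse=True)

user = User(stance=0.8)
feed = rank_feed(user, [
    Item("You were right all along", stance=0.9),
    Item("A nuanced look at both sides", stance=0.1),
    Item("The other side has a point", stance=-0.6),
])
print([i.headline for i in feed])
# The agreeable headline ranks first and the nuance sinks. Repeat
# daily for years, and the echo chamber builds itself.
```

Notice that nowhere in that sketch is anyone malicious. The echo chamber is not a bug someone wrote in; it falls straight out of the objective function.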
But a moral AI, a moral algorithm, can take into account the need we have for diverse opinion, the need we have for humanizing the other side. A search can be customized to show not only your favorite results but unexpected ones, as sketched below. Or what if every Google search revealed not only ads and search results, but non-profits doing good?
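Staying with the same toy model, and with the caveat that the distance threshold and slot count below are invented for illustration, the change is only a few lines: keep the engaging result on top, but reserve room for something from outside the user’s comfortable range.

```python
def rank_feed_with_diversity(user: User, items: list[Item],
                             unexpected_slots: int = 1) -> list[Item]:
    # Start from the same engagement ranking as before...
    ranked = rank_feed(user, items)
    # ...but reserve a slot near the top for an "unexpected" item,
    # one far from the user's own stance. The 0.7 distance threshold
    # and the single reserved slot are arbitrary illustrations.
    unexpected = [i for i in ranked[1:]
                  if abs(user.stance - i.stance) > 0.7]
    feed = ranked[:1] + unexpected[:unexpected_slots]
    feed += [i for i in ranked[1:] if i not in feed]
    return feed

print([i.headline for i in rank_feed_with_diversity(user, feed)])
# Now "The other side has a point" surfaces second instead of last.
```

The point is not this particular heuristic; it is that the objective function is a moral choice, and a different choice is cheap to write.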
I am a deeply opinionated person. I have a strong worldview and an uncompromising faith in a few things. One of those beliefs is a deep conviction of the dignity of each person. You are far more complex than your single opinion on any given issue. Yet our world is not being built by the informed opinions of elders who have walked for generations. It is being shaped not by considerate voices but by technical giants who are still developing their moral compasses as young people. Their funding? That comes from people who have already decided what they believe.
My kids are awesome, but they simply do not have the reasoning skills to deduce the implications of their moral decisions. Does it make sense to have them coding our lives, turning us into lines of data to be sold across platforms to maximize the profits of media and marketing giants?
What then is the solution?
I would like to suggest coding for conscientious culture creation. What does that mean? That those who write the algorithms that run our lives be given mentors, elders, and coaches. That they be given lessons on the impact of their code on the daily lives of the data creators: their moms, their dads, their families. Young adults have the malleability to learn the code and write the future. Who could have helped them foresee, ten years ago, that their decisions about how to answer searches and find friends would get us here?
Because the future is here, and it’s very binary. If our future is going to be more nuanced, we need to code for it.