I am increasingly amazed to watch a new Facebook phenomenon that I have come to call “The Great Divide”. The scenario goes a little like this: someone posts a highly controversial meme or video clip. The fact that it is controversial virtually guarantees a flood of opposing comments, and soon war breaks out among the commenters. It got to the point where I had to ask myself: is the intent truly to sow division among people? What is going on here?
The relationship between this great mystery and MLT is that we are getting more and more involved in the world of data analytics. We are discovering that machine learning and artificial intelligence are all over social media. Media companies increase their revenue with popularity, and the way to increase popularity is to give people what they want. So if Robert, who is extremely conservative and has been favoring right-wing websites and posting corresponding commentary, googles “Donald Trump”, his search results will be all about the greatness of Trump and his exploits. On the other hand, if Celine, who has been displaying left-wing preferences online, googles “Donald Trump”, she will be shown primarily articles about his blunders and faults. How can this be? Isn’t Google impartial? After all, it’s just a machine and should therefore have no bias…
This last assumption is correct. The machine has no bias, no judgement, no feelings. But it has been programmed to profile its clientele: it uses your browsing track record to identify your preferences and interests. If Google and Facebook side with you on every issue, you stay happy and keep using them. This ensures that you will be exposed to all the advertising they transmit, and they in return guarantee their revenue. There is no bad intent behind it besides the accumulation of wealth; the machine has no preference for Left or Right. Nevertheless, there is an ever-growing fallout of which people need to be aware as the algorithms behind it all become better at playing the game.
Suppose, for instance, that you are a well-adjusted individual leaning slightly toward liberalism. Over the next year this leaning is identified, and the news clips, memes and comments filtered to your screen become increasingly liberal, or increasingly harsh on conservatism. This is certain to have an impact on your political views and to push you further and further toward the left-wing extreme. The fact is that there are always two sides to a coin; there is positive and negative in most things. With the world population nearing 8 billion, it is impossible for any single opinion on any subject to be 100% right or wrong. Yet if you are only ever presented with the same side of the coin, you eventually forget the other side.
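The drift described above can be sketched in a few lines of code. This is a deliberately naive toy model, not any platform’s real algorithm: the 20% “amplification” factor (the assumption that an engagement-optimised feed shows content slightly more extreme than the viewer’s current lean) and the profile-update rule are both illustrative inventions, chosen only to make the feedback loop visible.

```python
# Toy model of a personalisation feedback loop.
# Lean is a number in [-1.0, +1.0]: -1 far left, 0 centre, +1 far right.
# All constants below are illustrative assumptions, not real platform values.

AMPLIFICATION = 1.2   # assumed: the feed skews a bit beyond the user's lean
LEARNING_RATE = 0.1   # assumed: how strongly shown content shifts the profile

def step(profile: float) -> float:
    """One round: the feed shows content slightly more extreme than the
    user's inferred lean, and the profile drifts toward what was shown."""
    shown = max(-1.0, min(1.0, profile * AMPLIFICATION))
    return (1 - LEARNING_RATE) * profile + LEARNING_RATE * shown

def simulate(initial_lean: float, rounds: int) -> list[float]:
    """Return the profile's trajectory over the given number of rounds."""
    history = [initial_lean]
    for _ in range(rounds):
        history.append(step(history[-1]))
    return history

if __name__ == "__main__":
    trace = simulate(0.05, 200)  # a user who starts barely right of centre
    print(f"start={trace[0]:+.2f}  after 200 rounds={trace[-1]:+.2f}")
```

A user who starts perfectly centred stays centred in this model, but anyone with even a slight initial lean is nudged a little further each round and eventually pinned near the extreme. That is the coin with only one side showing.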
The goal of this article is not to bash social media, but merely to raise awareness of the factual presence of digital intelligence. Precisely because machines have no bias or feelings, we cannot rely on them to provide a fair and balanced view of the world when their sole purpose is to generate revenue. Social media is fast becoming Phaedrus’s fabled Fox, exploiting us with flattery. The dangers of marketing were long limited to propelling credulous victims into financial debt, but nowadays there is a new byproduct which could become far more dangerous. It is widening the divide between social and political views, turning people who previously had no political interests at all into extremists. All in all, it is reducing tolerance and acceptance by presenting each individual with only one side of every social issue: the side they want to see.
When applying artificial intelligence, we need to be mindful of the potential byproducts and negative impacts that could result. As man and machine interact more and more in the future, I believe we need to be aware of the machine’s lack of conscience and learn to rise above these drawbacks. Otherwise, the Raven’s cheese could very well become our humanity.
By Claude Morin, Facilities Management and Subject Matter Expert
Missing Link Technologies Ltd.
Find Claude's original article Here.