What Comes After AI? 6 Dangerous Things That Will Happen

This is part 3 in a 3 part series on the post-AI possibility of machine-based Artificial Empathy. [Part 1][Part 2]

In my last post, I focused on the trends and changes happening right now that are enabling computers to identify human emotions more accurately and quickly than your closest friends can.

But what will that enable?

This post will answer that by highlighting the most important possible ramifications of bringing Artificial Empathy (AE) online. Some are good, some are bad, and some are truly nightmarish.




Good: The End of Depression and Suicide

This one seems to be the most obvious and closest to reality today. A machine that can detect your true inner feelings and emotions would be super adept at identifying depression and suicidal thoughts.

Scholarly work has already shown that social media postings can provide better data than the self-reported information patients give their doctors. And Facebook has started rolling out AI that determines whether specific posts communicate suicidal intent, flagging them for human moderators or notifying local authorities. But both of those examples just flag data for humans to examine.
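
To make that "flag for humans" pattern concrete, here is a minimal sketch of what such a triage pipeline might look like. Everything in it is hypothetical: the phrase list stands in for a real trained classifier, and the names and threshold are invented purely for illustration.

```python
# Minimal sketch of a "flag for humans" triage pipeline (all names hypothetical).
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str

# Placeholder signal list; a real system would use a trained model over
# language, posting frequency, time of day, and other cues.
RISK_PHRASES = ["can't go on", "no way out", "goodbye forever"]

def risk_score(post: Post) -> float:
    """Return a 0-1 score. A stand-in for real classifier output."""
    hits = sum(phrase in post.text.lower() for phrase in RISK_PHRASES)
    return min(1.0, hits / len(RISK_PHRASES))

def triage(posts: list[Post], threshold: float = 0.3) -> list[Post]:
    """Route high-scoring posts to a human moderator queue.
    The machine only flags; a person still makes the judgment call."""
    return [p for p in posts if risk_score(p) >= threshold]

if __name__ == "__main__":
    flagged = triage([Post("anon42", "No way out. Goodbye forever.")])
    for post in flagged:
        print(f"Escalate to human review: {post.author}")
```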

If the technology gets to the point where the machine knows your feelings from your microexpressions and visual cues, you won’t even have to post your thoughts to tip off the machines.



Good: Maximum Happiness Will Be A Realistic Goal for Governments

Aristotle and Plato argued in ancient times about the ultimate good that society should strive for. Aristotle held that happiness was the ultimate goal in life and that everything else was a means to that end.

In the US Declaration of Independence, happiness is also called out as an unalienable right: “Life, Liberty and the pursuit of Happiness”.

But none of them could measure the actual happiness of society, or how any specific policy choice affected the happiness of everyone in it.

If AE lets us measure every individual’s happiness and track how it changes over time, a government could produce a specific “Happiness” index and use it to drive laws and programs. Happiness as a goal would no longer be something abstract in the realm of philosophy, but something as real and measurable as inflation or unemployment.
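
As a rough illustration (not a real methodology), the sketch below shows how per-person AE readings could be rolled up into a single tracked index and used to estimate a policy’s impact. The scores, names, and simple averaging are all assumptions; a real index would have to grapple with sampling bias, consent, and privacy.

```python
# Hypothetical sketch: rolling per-person AE happiness readings (0-100)
# into a single index, the way inflation or unemployment are reported.
from statistics import mean

def happiness_index(readings: dict[str, float]) -> float:
    """Aggregate individual happiness scores into one population number."""
    return mean(readings.values())

def policy_impact(before: dict[str, float], after: dict[str, float]) -> float:
    """Change in the index after a policy takes effect."""
    return happiness_index(after) - happiness_index(before)

if __name__ == "__main__":
    q1 = {"alice": 62.0, "bob": 55.0, "carol": 71.0}  # made-up readings
    q2 = {"alice": 66.0, "bob": 58.0, "carol": 70.0}
    print(f"Index before: {happiness_index(q1):.1f}")
    print(f"Policy impact: {policy_impact(q1, q2):+.1f} points")
```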


Bad: Advertisers Will Pay For You To Have Specific Emotions

I mentioned in prior posts that online advertisers have moved from paying “per impression” to paying “per ad click” to paying “per specific action,” such as purchases, registrations, or downloads. Each of those eras was represented by a specific advertising platform: DoubleClick, Yahoo, and Google.

Once you can measure emotions related to specific content or brands, you can start charging brands for “shifts in emotion” toward their brands. How many sponsored posts in a feed, or mentions in a YouTube video about a product, will make you feel happy thoughts? Make you smile? Make you laugh?

How many beyond that will make you think that it is overplayed and no longer cool?

Advertisers will be able to pay to have you feel a specific emotion in connection with their product and brand. And it will happen thousands of times a day.
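
To show how a “pay per emotion shift” model differs from pay-per-click, here is a hedged sketch. The emotion scores, the rate, and the field names are all invented for illustration; no ad platform bills this way today.

```python
# Hypothetical sketch of "pay per emotion shift" billing.
from dataclasses import dataclass

@dataclass
class Exposure:
    brand: str
    emotion_before: float  # measured sentiment toward the brand (-1..1) before the ad
    emotion_after: float   # measured sentiment after the ad

RATE_PER_POINT = 0.05  # made-up dollars charged per 0.01 of positive shift

def charge(exposures: list[Exposure]) -> dict[str, float]:
    """Bill brands only for measured positive shifts, not impressions or clicks."""
    bill: dict[str, float] = {}
    for e in exposures:
        shift = max(0.0, e.emotion_after - e.emotion_before)
        bill[e.brand] = bill.get(e.brand, 0.0) + shift * 100 * RATE_PER_POINT
    return bill

if __name__ == "__main__":
    print(charge([Exposure("SodaCo", 0.10, 0.25)]))  # ~ {'SodaCo': 0.75}
```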


Bad: Lying Will Be Impossible, and So Will Private Thoughts

This is a feature of the near future that others have written about: combine persistent augmented reality with emotion detection and you get always-on lie detection, making lying obsolete. But the big leap doesn’t even require AR, which assumes the data must be fed to a specific human to be of value.

If this can be accomplished at internet scale, or even just at the security checkpoints for airports and immigration offices, the impact will be enormous.

What private thoughts do you keep shielded today? Do you disagree with the current political party? Are you planning to look for another job? Do you dream of becoming an actor?

How different will it be to be a teenager if you cannot lie to your parents? The impact on how we interact, and how we expect people to interact with us, will be something never before seen in human history.

Again, going back to the Greeks: can you imagine what Diogenes would say if he walked around with his lamp and could find only honest men, incapable of lying?


Ugly: Mixing Designer Drugs, Augmented Reality, and Mood-Seeking Stimulus Will Be the Most Addictive Human Experience in History

What if you had a computer that just wanted you to be happy, like Joi from Blade Runner 2049 (the digital assistant/girlfriend attentive to all the wants and needs of the main character)? What if you could feel the beauty and amazement people felt when they saw Avatar in 3D for the first time? What if the Tek from TekWar was real?

Now imagine that Joi not only had access to the augmented reality projection from the film, but could also create a fully immersive virtual reality like Tek, and could mix that with powerful, individually tailored psychoactive drugs, like McAfee.

For the adventurous, you could have sensations never before imagined or experienced. And the odds are you would never want to come back. The opioid crisis will look tame in comparison.

Ugly: Future Totalitarian Governments Will Make 1984’s “Big Brother” Look Soft and Free

You have no private thoughts, no ability or reason to lie, and the system provides you with all the stimulus you need to feel maximum happiness. What could be wrong with that?

Well… happiness is not the only emotion that could be cultivated. What if, instead of brands wanting you to love them and be happy, you had a totalitarian government wanting you to fear your fellow citizens and be ready to kill and die for the state?

The list of paranoid authoritarian dictators of the past who would have loved to know exactly who was on their side is long:

  • Stalin
  • Hitler
  • Mao
  • Pol Pot

And many more.

Now imagine that their state-run media has the kind of insight and power we talked about above. There would be no freedom. No escape. Total domination of the individual by the state.

How could you even begin to fight back?
