A consistent theme of my recent conversations with friends is that I’m more pessimistic about AI than most. Of course, I’m more pessimistic about everything than most people; but normally, I have sufficient company in my bleak and hopeless outlook among many of my bleak and hopeless friends.
AI is curiously different. The general feeling among people like me (young dorks with too much time spent in their brains) is that AI is “nifty.” It’s going to be a revolution in human progress, leading us to greater productivity, beauty, safety, and bliss.
And don’t get me wrong…AI might indeed bring lots of benefits. But it also has potential for dystopian nightmares.
This is not to say that AI will necessarily be bad, nor that it is even most likely to be bad. It is to say that there is significant risk to continued AI development, and this warrants serious thought (and serious caution) by regulators, researchers, and AI users.
Summa Incuborum
I’ll just mention the various downsides before focusing on one in particular that has troubled me.
There’s the risk of deep fakes. They could be used in terrorism to break down diplomacy and trigger global war. They could be used by criminals to run sophisticated scams anyone could fall for. There’s the ease of AI-generated revenge porn and its terrible implications for dating and for young people.
There are the economic risks: AI takes over our jobs; basic human functions become automated; and the only careers left for people are the increasingly complex technology roles accessible to an ever-shrinking number of humans. AI steals from artists, and the fulfilling process of artistic expression is finally rendered completely unprofitable by robot competition.
Each of those worries merits its own discussion. But the most pressing issue is how AI interacts with our digital data, which threatens us even now with serious consequences.
A dangerous cocktail
For years now we’ve been handing over our lives to data harvesting. We’ve exchanged our privacy for free and convenient digital services, where we are the product being bought and sold. And, to be honest, it was going fine. Sure, there was always something uncomfortable about Google, Facebook, or whatever collecting every detail about my digital life. But hey, Google Drive is really nice, and I didn’t want to pay for alternatives.
These companies harvested my data and sold it to third parties for marketing revenue. The worst that would happen is I would get an awkwardly specific advertisement.
But two additional things happened, making our data landscape suddenly very hazardous.
First, everything started collecting data. Google collecting your internet searches is intrusive, but it used to basically stop there. Google made a metric load of money on such practices, though, so now it doesn’t stop there. Everyone wants in on the pie. My digital life merged more closely with my actual life. More things are collecting more data than ever before; we all know about our cell phones. Smart speakers and digital assistants stand always ready to record. Tracking on websites has become more sophisticated. Even your car collects data: location, speed, acceleration, when the doors open, and voice and video recordings from inside the vehicle. Suddenly we find ourselves in a situation where our entire selves are digitally harvested, classified, and available on the open market.
Second, AI happened. Our whole lives have been turned into data, but that’s frankly too much data to use. No single human, or even company, can effectively sort through such massive datasets in any way meaningful to the individual. The datasets were never meant to be linked to specific people anyway; the money was in aggregating them for consumer spending trends, which could then be exploited for advertisements or campaigns.
But now AI makes that data comprehensible. Spaces for profit correspondingly open up. And what’s more, unintended (and more sinister) uses present themselves, too.
Let’s take a recent example, one controversial as to whether it was sinister, but one that shouldn’t be controversial as to whether it’s alarming. A Catholic group purchased data from Grindr, a gay hookup app, and analyzed it to identify and publicly out priests who used the application.
From the narrow Catholic perspective: perhaps a genius use of technology to root out sexual sin in the Church, or perhaps a mean-spirited targeting of gay priests.
But from the societal perspective, shocking proof of the privacy hellscape that we live in. Massive amounts of surveillance + a way to analyze that surveillance = a surveillance state.
That need not be a literal State (though it can be). As in the Grindr example, it can be a surveillance state within a particular social group, private institution, or cultural movement.
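To make the mechanics concrete, here is a minimal Python sketch of the kind of data linkage involved, assuming (as reporting on the case suggests) that “anonymized” location pings were matched against known locations. Every device ID, coordinate, and place name below is hypothetical.

```python
# Toy sketch of data-linkage re-identification (all data hypothetical).
# "Anonymized" ad-tech location pings carry only a device ID, but joining
# them against publicly known addresses can unmask the person behind them.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Purchased location pings: (device_id, lat, lon) -- no names attached.
pings = [
    ("device-7f3a", 41.8790, -87.6359),
    ("device-7f3a", 41.8792, -87.6361),
    ("device-2c91", 40.7128, -74.0060),
]

# Publicly known addresses (e.g., a rectory) -- hypothetical coordinates.
known_places = {"St. Example Rectory": (41.8791, -87.6360)}

# Any device that repeatedly pings within ~100 m of a known address
# is effectively re-identified -- no hacking required.
for place, (plat, plon) in known_places.items():
    hits = {}
    for device, lat, lon in pings:
        if haversine_km(lat, lon, plat, plon) < 0.1:
            hits[device] = hits.get(device, 0) + 1
    for device, count in hits.items():
        if count >= 2:
            print(f"{device} seen {count}x near {place}: likely resident")
```

Nothing in that sketch is sophisticated, and that’s the point: re-identification is a trivial join once the data exists. What AI adds is scale, running this kind of matching across millions of devices, thousands of places, and far messier kinds of data than clean coordinates.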
Imagine a world where nothing you do or say is truly private. Employers, governments, neighbors, criminals, foreign powers, militaries, churches… all of them can know who you voted for, what your religion is, your beliefs on a controversial social issue, where you live, when you’re not home, where you spend your time, who you love, your health habits, anything.
We already live in a world where this has been catastrophic. An ill-advised Tweet or Facebook post makes public what should be private (at least practically private), and people are fired, run out of town, hacked, bankrupted, or disowned. Thanks to AI, we’re quickly approaching a point where no Tweet is needed.
Don’t panic (but please be scared)
This all sounds very doom and gloom (because it sorta is). But remember that this isn’t necessarily our future, even if that future heavily features AI.
The point isn’t, as some have argued, that AI is inherently evil or must be completely destroyed. This isn’t meant to be alarmist. It’s meant simply as a call to slow down.
Only insignificant things deserve no caution. You don’t blink at an ant, but you do at a hippo. If AI really is powerful, if it really has the revolutionary potential that so many claim, then it deserves respect. It requires a steady hand, not a blind rush.
I think AI can indeed transform our world. But if so, we must ask: transform it into what, exactly?