Robot Overlords

Imagine sleeping late, every day. You wake up to the smell of pancakes and bacon – it’s your robot servant, standing next to the bed. “Young master,” it says (even though you’re almost fifty), “it’s time to get up.” Sometimes you stay in bed. Sometimes you watch something on the Holo-Telly; new shows are generated constantly, brought to life by digital actors and actresses who’ve never passed through the uncanny valley. Sometimes you meet friends for lunch, or maybe a board game. All of you remember what life used to be like, before, and at least once during the meal, somebody raises a glass. “I don’t know about you,” they say, “but I love our robot overlords.”

It Could Happen!

The technological singularity is inevitable. I’m not saying that next week, giant robots with big, glowing eyes will be walking the streets, ordering everyone to PLEASE RETURN TO YOUR HOMES! But as artificial intelligence advances, it seems logical to assume that eventually we’ll be living with machines that are smarter than we are. I see this as much more likely than a future starring immortal transhumanists with giant, pulsing brains; the world I live in is one where smarter, more efficient machines allow people to make more money, and money is a powerful motivator. Sure, there’s profit in genetic modification, too, but the minute you bring up altering human DNA, ethical red flags start piling up; you can work towards smarter machines a lot longer, and through multiple channels, before things get problematic.

The primary danger of the singularity springs from what drives its creation: are the machine intelligence and its progeny motivated by a need to exploit, sell to, or destroy humanity, or are they primarily interested in nurturing and protecting us? Humans want to survive and reproduce – what should artificial intelligence want? Whether we create robots who see us as impediments, friends, or pets is up to us. The singularity is inevitable; our new masters having our best interests at heart is not.

It’s Not Like We’re Really Free, Anyway…

I don’t see myself as a cynic, merely a gentle nihilist: a pragmatist who tries to see the world as it is (insofar as I can). How much freedom does anyone truly have? I’m lucky to have been born into a culture that provides me with a fairly easy life, and if the world around me doesn’t always value my opinions, it at least allows me to express them. I’m grateful, but at the same time, I don’t get to say anything I want, or go anywhere I want, or do anything I want. I’m not rich, famous, or powerful. If I break the law, I’ll be punished, and like everyone else, I live with a host of doubts and fears that sometimes keep my goals at arm’s length. I’m more free than many, less free than some, but I still live within limits. Would exchanging one set for another be that difficult?

It’s Probably Our Best Shot at Survival 🙂

I try to keep perspective, but I can’t shake the idea that we’re rapidly approaching a point where humanity’s ability to destroy itself will surpass its ability to prevent it. I’m imagining a time in the not-too-distant future when a small team of idiots can create an atomic bomb with a few things from Best Buy and whatever’s in the fridge.

Sure, we’ve had the ability to destroy ourselves for fifty years, and we’ve made it so far (give yourself a pat on the back, humanity!), but the key issue here is accessibility: every year the possible paths to extinction multiply, and the number of people with the ability to take them increases. We’ve made it fifty years – does anyone really think we’ll make it to two hundred? A thousand? Nuclear war, climate change, famine, plague – take your pick. Things are more likely to fall apart than to stick together, and our reach will inevitably exceed our grasp. So it goes.

I see a few ways that we might wiggle out of the noose. The stars: if we spread onto other worlds, the amount we have to chew on might prevent us from devouring ourselves. Transhumanism: if we change who we are – become inhuman enough – we might escape evolution’s grasp. The clock is ticking; I’m not sure we’ll make it out of the solar system, and we’ll argue (literally) till doomsday about taking the reins of our own evolution. That leaves a third option: inventing our caretakers, ceding control, and hoping that our journey is a pleasant one.

If You’ve Got to Go…

We’ll be lucky to make it to the singularity. I give us a hundred years, tops. I don’t know whether it’ll be war, or a virus, or whether we’ll just burn this planet out before we have a chance to escape it. I read an article a while ago suggesting that the reason we haven’t come into contact with other civilizations is that the vast majority of them reach a certain stage and self-destruct. Maybe human intelligence and self-reflection aren’t viable past a certain point. It’s sobering to imagine a universe dotted with ruined civilizations. Who are we, to be the exception?

But maybe we can sidestep evolution, and embrace caretakers programmed for long-term survival. Could our better, wiser children take over the galaxy? If so, I wouldn’t object to humanity being taken along for the ride. A gilded cage is better than a coffin, after all, and free from the burden of steering our destiny, we might even learn to sing.