To create Edgar Prince, the villain in eHuman Dawn, I had to dive deep into the shadows of the mind and psyche and do some serious research on the phenomenon we call evil. As I began work on the sequel, it became clear that to understand him better, and the world he has created, I needed to go further and face that shadow directly within my own life and the world around me. As I surface from this journey, I've discovered something very important: One evil man does not make an evil world. We may follow him and make him our leader (see my blog on psychopaths), but he can't be effective unless his orders are carried out by many individuals, and rarely do those orders seem purposefully evil. Instead, they often make sense from a group perspective, even if they actually cause great harm to others.
I believe in humanity and the future. I believe that our science can and will unlock the secrets needed for long life and a cleaner planet. But there's also a part of me that urges caution, reminding me to look at who's offering the technology before "Jumping" into it, as the process of uploading consciousness into an eHuman body is called in my novel. In the short term, it solves the problem of death, but who will make sure the mechanized body lives forever? Who guarantees the quality of life in a programmable world?
To give our lives to technology is to give them to those who own the technology.
I cannot escape this basic truth. Do I trust those who run our corporations and governments? For those who've read eHuman Dawn, the answer is clear: No, I don't. I can't, because at every turn I see evidence of evil enacted on this planet in the name of corporate profits and government sovereignty: wars funded to protect corporate mining interests, children forced to work for no money to make cheap clothing, rain forests destroyed to raise the beef that fuels our fast food industry, ecosystems obliterated for profit, and stock markets manipulated for billion-dollar bonuses. This is just a small list of the ways our businesses and governments harm humanity every day. The tendency toward group evil is far greater than the percentage of individual psychopaths in the world would suggest.
Why is this? Why are evil policies so easily enacted on behalf of the group? In his profound book, "People of the Lie," author M. Scott Peck, MD, makes the following observation:
"For many years it has seemed to me that human groups tend to behave in much the same ways as human individuals - except at a level that is much more primitive and immature than one might expect...Of one thing I am certain, however: that there is more than one right answer…this is to say that it is the result of multiple causes. One of those causes is the problem of specialization."
Dr. Peck goes on to explain that while specialization is the very reason groups exist (we can get more done together than we can alone), it is also a main reason groups are capable of evil. This is because of what Dr. Peck calls "fragmentation of consciousness."
"Whenever the roles of individuals within groups become specialized, it becomes both possible and easy for the individual to pass the moral buck to some other part of the group…we will see this fragmentation time and time again…The plain fact of the matter is that any group will remain inevitably potentially conscienceless and evil until such a time as each and every individual holds himself or herself directly responsible for the behavior of the whole group - the organism - of which he or she is a part. We have not begun to arrive at that point."
Thus, when asked why annual inspections were not done on an oil rig in the Gulf of Mexico, the person in charge of said inspections might say, "I was told to stop doing them by my boss." When the boss is asked, he may answer, "Corporate put a hold on all inspections until further notice." Keep going up the ladder and you find yourself in the CEO's office, the one who should be responsible. But what will he say? Dr. Peck imagines an answer something like, "My actions may not seem entirely ethical, but I had to cut expenses. After all, I must be responsible to the stockholders you know. On their account I must be directed to the profit motive."
Who then decides what actions any group will take? The small investor who has no clue how the operation even works? The mutual fund owners? If so, which mutual fund? Which broker?
Who is ultimately responsible for the actions of the group?
In this light, it becomes clear that groups are immature and in great danger of moral bankruptcy. Since over 90% of people work for an organization, most of us are in danger of passing the moral buck when it comes to our work. Add to this the fact that the majority of people would rather follow than lead, leaving a power vacuum that psychopaths can and often do fill, and we have an environment ripe for manipulation and exploitation. We are all part of the problem, whether we like it or not.
It worries me how easy it would be for the world of eHuman Dawn to become a reality. I don't trust the Edgar Princes, or the Guardian Enterprises, of this world to enable technological immortality for the love of humanity. Yet I don't believe that halting progress is the solution. There's so much yet to discover about our humanity, and about the connections between our bodies, minds and the planet.
Instead of fearing technological innovation, we each need to become responsible, right here and now, for the organizations in which we live and work. Each and every one of us must stop passing the moral buck and become worthy of the technology we're creating. This is the great work that the future requires of us, if we're going to live in a world of personal liberty and freedom for all.
Whether or not your job produces technology doesn't matter. Every organization runs the risk of passing the moral buck in some way, thus creating a mentality that the evil surrounding us isn't our fault. Each of us has the opportunity to change that group dynamic and create mature workplaces and organizations that honor humanity as a whole.
The alternative could be nothing less than the complete technocratic rule of the few over the many. I think we can do better than that.
On Valentine's Day, I went to see "Her" with my husband. Much has been written about the movie, so I'm not going to bother with a review. Instead, I'd like to consider just how likely it is that we humans will begin to fall in love with operating systems, or online game characters, with more regularity--to the point that we could, like the protagonist in "Her", bring our bodiless sweetheart on a double date with friends.
The sensible adult in me rejects the idea. How could a human fall in love with something that doesn't even really exist? Yet as I allow myself to fall deeper into the question, I begin to see that many of us are already doing this, just with each other.
Take online dating. Many couples now meet each other using services like eHarmony. At first, potential candidates are just profiles on a screen, data to be sifted through. It's surprising that any relationship could lead to intimacy with such a sterile means of introduction, until we look at the stats--according to Forbes magazine, one third of couples married in 2013 met online. Obviously, something catches the attention, whether it's the clever things the person posted or the images they've chosen to share. After checking out one another's profile pages, people can begin to converse, first through texting or email, eventually progressing to phone calls and Skyping.
Attraction even begins on social media sites such as Twitter, where I've "met" many intelligent and interesting individuals. I love the conversations I've had there, and I can see how, without ever meeting in person, I can develop an interest in someone's online persona. In addition, trusted friendships are formed every day within the social media realm, and people come together to create wonderful things without ever having met in person. The online context is deep enough to create lasting connections.
Samantha (voiced by Scarlett Johansson), the operating system in "Her", is really no different from an online human. She entices Theo (Joaquin Phoenix) with her clever dialogue, her soft, breathy voice, the way she remembers what he needs and the care she takes in delivering important information. Just like the folks on eHarmony or Twitter, they get to know one another online and begin to care. They desire to check in regularly, each one wanting to know what the other is doing. An online game character could do the same, getting to know someone better each time she goes out on a virtual game mission, battle or journey with a human. At this level, there really is no difference between human and computer. Both are beings getting to know one another, and if the software is believable and likable, the human can and will fall in love with it.
Even more interesting, you and I really can't be sure that whoever we're meeting online is even a real person. That Twitter follower I enjoy might just be a really impressive AI. How can we be sure that all the clever things the eHarmony candidate wrote are even his own thoughts? Perhaps a friend told him what to say? Deception is easy online, and identities are readily hidden. Recently, a children's rights group in the Netherlands used an AI called "Sweetie" to catch 1,000 child predators online. I think that alone shows us that yes, humans can be aroused by artificial intelligence.
The real question is, can a relationship with an AI last?
We are two different races, one bodiless and limited by programming capacity, the other embodied and limited by the material world.
"Her" does a beautiful job of showing how vast the differences are. First of all, unlike humans, operating systems, AIs and gaming characters don't have physical bodies. There's no getting around it. As of right now, humans live in an organic world, and we're wired to thrive in such a world. Studies show that touch, sex, dancing together, laughter with friends, and even bathing with others improve our health, release beneficial hormones and increase our immunity. Nothing is worse for human health than a life untouched. If your true love doesn't have a body, how will you satisfy your urges to be connected to one another? She can't massage you, kiss you, or even hold you when you're sad.
You might have a great, exciting virtual life together, but in real life, you're alone, whether you like it or not.
We'd like to think that the body doesn't really matter, but ask anyone on eHarmony or other internet dating services--just because you "clicked" online or on the phone doesn't mean the chemistry will be there when you meet in physical reality. I have a girlfriend who met a great guy online, and their Skype sessions were fantastic. But when they met, there weren't any sparks. Even if our AIs can meet us intellectually, there can never be real sparks. At least not while we inhabit our bodies.
When it comes to falling in love with cyber entities, there's one more thing to consider. The cyber entity is networked, able to be in many places at once. Its consciousness is not bound to a single identity the way embodied folk are. Instead it can be on several missions, or conversing online, with several different people at once. There's a lovely scene in "Her" when Theo realizes that Samantha could indeed be intimate with other humans. When he asks her, she tells him that she interacts with over 8,000 others regularly, and is in love with at least 600 of them. Humans tend to be demanding and jealous creatures. To share your beloved with 600 others seems a stretch, almost impossible.
How can you be special, if your AI lover is bringing happiness to thousands of others, perhaps at the exact same moment in time?
It might just be that humans don't have what it takes to truly fall in love with an operating system. Because in the end, the jealousy would drive us away--if the fact that we slept alone each night didn't kill the whole thing first.