A Book that Changed the Future...
When Dr. Susan Calvin looks back on her career as a robopsychologist, she can’t help being saddened by the robotic evolution she has witnessed. Yet it is not clear whether her sadness is a result of the technology or of the humans creating the technology. When asked if robots are so different from humans, she replies, "Worlds different. Robots are essentially decent."
Isaac Asimov’s book I, Robot is a landmark in science fiction that not only transformed the genre but also introduced generations to the ethical dilemmas that artificial intelligence presents. The book is a collection of stories, all of which were originally published in the 1940s. Today, as we regularly talk to computers in our homes, ponder the impact of driverless cars, and increasingly automate our lives, I, Robot remains relevant and fresh.
Asimov’s work revolves around the Three Laws of Robotics:
- A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
- A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These laws set forth an ethical operating system that is not too far removed from the ethical systems of most major religions. The central question this book asks is whether humans can create artificial intelligence without losing control over it. The answer is not clear.
The Future of Work
Today, machine learning and artificial intelligence are on the verge of transforming work. No industry is safe: both white-collar and blue-collar workers will feel the impact. Automation is quickly replacing jobs in call centers, insurance underwriting, sales, agriculture, service work, and many other fields. An Oxford study found that 30% of current positions may be automated by the year 2030.
Technology and Ethics
The rise of artificial intelligence creates a range of ethical challenges. If millions of people lose their jobs, what responsibility does society have to ensure that they do not end up in poverty? Should the nations of the world establish a code for how robots and artificial intelligence are used in war or used against their own citizens? Are laws such as the Three Laws of Robotics realistic? What incentives would companies or governments have to bend these laws?
Emotional Attachment to Technology
Throughout I, Robot, humans form relationships with robots, but it is not always clear that the robots reciprocate those feelings equally. In what ways do humans anthropomorphize or grow attached to technology? Does technology enhance our relationships with others by connecting us, or is it an obstacle to authentic relationships?
Technology and Spirituality
Several of the stories in I, Robot have religious or spiritual themes, and the Three Laws of Robotics themselves reflect an ethical code. Over the course of history, how have scientific discoveries impacted religious perspectives? Today, many people anticipate a coming “singularity” in which humans merge with machines. How does this anticipation reflect a religious fervor? Does the development of complex artificial intelligence present a challenge to the centrality of humans within religious practice?
Gender, Race & Technology
The main character of I, Robot, Dr. Susan Calvin, is portrayed as an emotionless scientist who handles tough situations, but often does so in a way not much different from the robots themselves. As one of the few female characters in I, Robot, she presents the stereotype of the “frosty girl” (to quote the first page of the book’s introduction), a poor representation of women and an especially poor representation of women in science. What stereotypes about women in the sciences persist today? I, Robot has few characters who are women or who are not white. Does the book’s silence act as a commentary about who benefits from technology? How are the benefits and burdens of today’s “information society” distributed across racial and gender lines?
What Counts as Life?
As computing power increases, is it possible that computers may come to life? Would we know? Is life a simple matter of processing power, or is it something else? Do our definitions of life need to change in light of this technology? How does our treatment and use of robots provide insight into our own moral and ethical strengths and weaknesses?
About the One Book Program
For thousands of years, humans have used stories to communicate knowledge about the world. Stories provide contexts for our understanding of facts, emotions, discoveries, history, relationships, and all kinds of human interaction. For this reason, the Moraine Valley Library and the Moraine Valley Bookstore invite all members of the community to come together to discuss a selected story in the One Book, One College program. Join us as we share knowledge across disciplines, exchange new ideas on useful topics, and enrich our curriculum in new ways. For more information, contact us at (708) 974-5709 or swanson[email@example.com.
Previous One Book, One College Selections
- Mary Shelley’s Frankenstein (2004)
- Mark Haddon’s The Curious Incident of the Dog in the Night-Time (2005-2006)
- George Orwell’s 1984 (2006-2007)
- The Autobiography of Malcolm X as Told to Alex Haley (2007-2008)
- Elizabeth Royte's Garbage Land (2008-2009)
- Studs Terkel's Working (2009-2010)
- Rebecca Skloot's The Immortal Life of Henrietta Lacks (2010-2011)
- Roxana Saberi's Between Two Worlds (2011-2012)
- Tony Horwitz's Confederates in the Attic: Dispatches from the Unfinished Civil War (2012-2013)
- Max Brooks’ World War Z: The Oral History of the Zombie War (2013-2014)
- James Baldwin's Giovanni's Room (2014-2015)
- José Angel N.'s Illegal: Reflections of an Undocumented Immigrant (2015-2016)
- Lin-Manuel Miranda's Hamilton: The Musical (2016-2017)
- Andrea L. Pino & Annie E. Clark's We Believe You: Survivors of Campus Sexual Assault Speak Out (2017-2018)
- G. Willow Wilson & Adrian Alphona's Ms. Marvel: No Normal (2018-2019)