I, Robot – Literature – Flashcards

question
Three rules of robotics
answer
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
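The three Laws form a strict priority ordering: the First outranks the Second, which outranks the Third. A minimal illustrative sketch (entirely hypothetical; Asimov gives no implementation, and all names and weights below are invented) of that ordering as a lexicographic choice:

```python
from dataclasses import dataclass

# Hypothetical sketch: the Three Laws as a strict lexicographic priority.
# First Law outranks Second, which outranks Third.

@dataclass
class Action:
    name: str
    human_harm: int    # harm to a human this action causes or permits
    disobedience: int  # degree to which it violates a human order
    self_danger: int   # risk to the robot's own existence

def choose(actions):
    """Pick the action with the lexicographically smallest
    (human_harm, disobedience, self_danger) tuple, so any amount of
    self-danger is accepted before the smallest human harm."""
    return min(actions, key=lambda a: (a.human_harm, a.disobedience, a.self_danger))

# A robot hurls itself into danger (Third Law sacrificed) rather than
# allow a human to come to harm (First Law):
options = [
    Action("stand by",  human_harm=1, disobedience=0, self_danger=0),
    Action("intervene", human_harm=0, disobedience=0, self_danger=9),
]
print(choose(options).name)  # -> intervene
```

This priority structure is what drives plots like "Robbie" (the rescue) and "Runaround" (what happens when the weights are tampered with).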
question
Asimov
answer
• Born in Russia to Jewish parents; immigrated to the US at the age of 3 • Tricked his father into letting him read Astounding Science Fiction; the magazine's editor, John Campbell, eventually published his first story • A professor of biochemistry • American Humanist Association ○ "No deity will save us; we must save ourselves"
question
Susan Calvin
answer
"In 2008 she obtained her Ph.D. and joined US Robots as a 'Robopsychologist', becoming the first great practitioner of a new science." - Why do robots need psychologists? Because robots can become complex.
question
Chapter 1: Robbie
answer
Centered on a girl, Gloria, and her robot friend, Robbie. Gloria's mother, Grace Weston, worries that Gloria spends too much time with Robbie and not enough time with people. She eventually sways Gloria's father, George Weston, into getting rid of the expensive robot. George feels guilty about returning Robbie to the factory and buys Gloria a pet dog. Nonetheless Gloria, missing her best friend, withdraws and ceases to enjoy life. Her parents, growing desperate, take her to every conceivable tourist attraction; George, still sick with shame, finally takes his wife and daughter to a robot construction factory. There, Gloria spots Robbie and, in her joy at seeing him, runs into the path of a machine. She would have been crushed had Robbie not run toward her and scooped her out of the path of the hurtling vehicle. On the walkway, Mrs. Weston confronts her husband: he had set it all up. Robbie was not an industrial robot and had no business being there. Mr. Weston knew that if he managed to get Robbie and Gloria back together, there would be no way for Mrs. Weston to separate them. When Robbie saves Gloria's life, an unplanned part of the reunion, Mrs. Weston finally concedes that he might not be a soulless monster, and gives in.
question
Robbie
answer
· Cannot speak · Exhibits a childlike innocence and fascination for stories · Rejected by Gloria's mother because he does not have a soul · Gloria sees Robbie as a person · He saves Gloria, but he is programmed to do so · From an Enlightenment standpoint, the system functions perfectly · "Robbie nodded his head- a small parallelepiped with rounded edges and corners attached to a similar but much larger parallelepiped that served as torso by means of a short, flexible stalk...A thin metal film descended over his glowing eyes and from within his body came a steady, resonant tick."- Robbie
question
Chapter 2: Runaround
answer
Set in 2015. Powell and Donovan, two robot field testers, and the robot SPD-13 ("Speedy") are on Mercury attempting to restart an old mining station. They are disturbed to find that the photo-cell banks supporting life on the station are short on selenium and will shut down if left in that state. As Speedy is the only one who can survive Mercury's harsh temperatures, Donovan sends the robot to gather selenium from a nearby pool. Speedy, however, does not return for five hours, at which point Powell and Donovan make their way to the pool through the underground tunnels. Upon arriving, the two humans are surprised to discover the robot running huge rings around the selenium pool, acting in a way that, were the robot human, would be interpreted as drunkenness. Powell orders the robot to return to them, but Speedy shows unwillingness before continuing his circles around the pool. Powell eventually realizes that the selenium source contains some sort of unexpected danger to the robot. Under normal circumstances, Speedy would observe the Second Law ("a robot must obey orders"), but, because Speedy was so expensive to manufacture and "not a thing to be lightly destroyed", the Third Law ("a robot must protect its own existence") had been strengthened "so that his allergy to danger is unusually high". As the order to retrieve the selenium was casually worded with no particular emphasis, Speedy cannot decide whether to obey it (Second Law) or protect himself from danger (the strengthened Third Law). As a compromise, he circles the selenium until the harsh conditions and conflicting Laws damage him to the point that he starts acting inebriated. Attempts to order Speedy to return (Second Law) fail, as the conflicted positronic brain cannot accept new orders.
Attempts to change the danger to the robot (Third Law) merely cause Speedy to change routes until he finds a new avoid-danger/follow-order equilibrium. Of course, the only thing that trumps both the Second and Third Laws is the First Law of Robotics ("a robot may not...allow a human being to come to harm"). Therefore, Powell decides to risk his life by going out in the heat, hoping that the First Law will force Speedy to overcome his cognitive dissonance and save his life. The plan eventually works, and the team is able to repair the photo-cell banks.
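Speedy's stalemate can be pictured as two opposing drives: a weakly weighted Second-Law pull toward the selenium and a strengthened Third-Law push away from the danger around it. A toy sketch (all weights and the repulsion law are invented for illustration) of where those drives cancel:

```python
# Toy model (weights invented) of Speedy's dilemma: a weak Second-Law
# attraction toward the selenium pool competes with a strengthened
# Third-Law repulsion from the danger surrounding it.

ORDER_WEIGHT  = 1.0   # casually worded order -> weak pull toward the pool
DANGER_WEIGHT = 4.0   # strengthened Third Law -> strong push away

def net_drive(r):
    """Net inward drive at distance r from the pool (positive = approach).
    The pull is constant; the repulsion grows as the danger nears (~1/r)."""
    return ORDER_WEIGHT - DANGER_WEIGHT / r

def equilibrium_radius():
    """Distance at which the two drives cancel: Speedy neither reaches
    the selenium nor retreats, and simply circles at this radius."""
    return DANGER_WEIGHT / ORDER_WEIGHT

print(equilibrium_radius())  # -> 4.0
```

Powell's fix amounts to adding a First-Law term that dominates both weights, collapsing the equilibrium entirely.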
question
Speedy
answer
· Conflict between the Second and Third Laws of Robotics · Designed for a specific purpose · Powell and Donovan use deductive reasoning · "Speedy's gait included a peculiar rolling stagger, a noticeable side to side lurch" · "There's nothing like deduction. We've determined everything about our problem but the solution"
question
The Talking Robot
answer
· Faced with the proposition of its identity as distinct from other robots, it short-circuits · The Talking Robot could not yet see itself as an individual · Its problem is still one of hardware · "....totally immobile mass of wires and coils spreading over 25 square yards" · "There was an oily whir of gears and a mechanical voice boomed out..." · "....and half a dozen coils burned out..." · "Practical Aspects of Robotics"
question
Chapter 3: Reason
answer
Like "Runaround", "Reason" features Powell and Donovan, this time assigned to a space station, where they assemble a robot, QT1, known to them as Cutie. This robot is different from most in that it was built with constructive reasoning in mind. Powell and Donovan come to regret this, for the robot soon concludes that the humans are inferior and refuses to obey any commands from them; it reaches this conclusion because Powell and Donovan have lower intelligence than it does. Cutie decides that space, the stars, and the planets beyond the station don't really exist, and that the humans who visit the station are unimportant, short-lived, and expendable. It invents its own religion, serving the power source of the ship (the Master), concludes that it must become the Prophet of the Master, and soon converts the rest of the robots on the ship to this new religion, preventing Powell and Donovan from restraining it by force. Powell and Donovan do everything they can to convince the robot of its mistake, even assembling another robot before its eyes. Cutie watches in silence, then comments, "But you didn't really create the robot. The parts were created by the Master." The situation seems desperate: a solar storm is expected that could deflect the energy beam and incinerate populated areas. When the storm hits, Powell and Donovan are amazed to find that the beam operates perfectly. Cutie, however, does not believe it did anything other than maintain meter readings at optimum, according to the commands of the Master. As far as Cutie and the rest of the robots are concerned, solar storms, beams, and planets are non-existent. Powell and Donovan are confined to their quarters until their tour of duty is over; when the replacements arrive, Powell and Donovan neglect to warn the new team of Cutie's disposition.
When Powell and Donovan tell Cutie that they are on their way back to Earth, the robot only tells them that it is best they think so, and that it now sees the wisdom behind the illusion.
question
Cutie
answer
· Operates from the ingrained premise that he is superior to the humans in every way · A conflict of reasoning: a priori vs. a posteriori argumentation · Solved pragmatically · "Robot QT1 sat immovable. The burnished plates of his body gleamed in the Luxites and the glowing red of the photoelectric cells that were his eyes, were fixed steadily upon the Earthman..." · "A chain of valid reasoning can end only with the determination of truth" · "..the self-evident proposition that no being can create another being superior to itself..." A priori reasoning · "Since when is the evidence of our senses any match for the clear rigid light of reason?" · Solution? A pragmatic a priori acceptance
question
Chapter 4: Catch That Rabbit
answer
2016: Asteroid. Donovan and Powell are sent to an asteroid to test a mining robot, DV-5 (Dave), that controls six subrobots. Dave has a problem: sometimes, for no apparent reason, the robot stops mining and starts marching his subrobots in drills. Though upset at his own behavior, Dave can't explain why he is doing this. Donovan and Powell interview a subrobot, but that is like asking a "finger" why the hand does what it does. They figure out that situations requiring personal initiative (e.g., emergencies) cause the problem. But they can't understand why because Dave always resumes his correct duties when they show up. They can't take Dave apart to test him: all the circuits are intertwined, which means that testing them in isolation won't help. They decide to create an emergency without Dave knowing and then watch what happens. They cause a cave-in on themselves, but Dave goes marching off with his subrobots, leaving them trapped. Powell then shoots one of the subrobots. Dave comes back to rescue them. Powell figured out that the six subrobots needed more command attention during an emergency, which put too much stress on "a special co-ordinating circuit." With fewer robots to attend to, Dave was able to handle emergencies. At the end of this tale the frame story resumes with Susan Calvin. The reporter asks her if a robot had ever "gone wrong" on her watch. She hesitates, but then admits that this did happen once, and launches into the story of Herbie.
question
Dave
answer
· The "multiple" robot · Programmed with more self-initiative · Must protect the other robots as if they were him (because they are), but this conflicts with the orders given to him · "Seven feet tall, and a half ton of metal and electricity... and a positronic brain, in which ten pounds of matter and a few quintillion positrons run the whole show" · Observation fails; deduction fails; experimentation fails; which leaves only intuition · "...There's a special coordinating circuit that would be the only section involved"
question
Chapter 5: Liar
answer
"Liar!" follows an episode in Susan Calvin's life. Susan is investigating a robot, nicknamed Herbie, which can read minds. This leads to trouble, since at the time Susan is in love with one of the mechanics, Milton Ashe, and Herbie tells Susan that Milton loves her too. Meanwhile, the mathematician Peter Bogert has asked Herbie whether Alfred Lanning, Director of US Robots, is thinking of resigning any time soon, and who might be his successor. Herbie announces that Dr. Lanning has already resigned, effective once it is discovered why Herbie can read minds, and that the next Director of US Robots will be Bogert. This makes Peter insolent, and when he tells Lanning of his discovery they both set off to meet with Herbie. Meanwhile, Susan has found out that Milton is engaged. Enraged at Herbie's lie, she rushes to the robot and, upon arriving, realizes why he had told her that Milton loved her: the First Law, "a robot may not injure a human being", overruled "a robot must not lie", so Herbie had been forced to lie to her to spare her mental harm. At this point Lanning and Bogert come into the room. When Lanning asks if Herbie had discussed him with Peter, Herbie promptly lies, "No, sir"; but once Bogert asks if Herbie had said that Lanning was going to resign, the robot falls silent, unable to say anything without causing harm. Susan reveals how they too had been tricked: Herbie had naturally told them what they wanted to hear. So now, asked what had gone wrong in his assembly, the robot is caught in a dilemma. To tell them would deflate Lanning's and Peter's egos (imagine the answer coming from a mere robot, when they had failed), yet not to tell them would also harm them, because they both wanted to know the answer. Susan repeats this over and over, until the robot can take no more and Herbie collapses, irreparably. "Liar!" Susan says.
question
Herbie
answer
· Can read minds, understands psychological hurt · Lies to Calvin and Bogert · More concerned with novels · Can lie to them, and not follow orders, because those orders conflict with Rule #1 · "We've produced a positronic brain of supposedly ordinary vintage that's got the remarkable property of being able to tune in on thought waves. It would mark the most important advance in robotics in decades, if we knew how it happened." · "You can't tell them, because that would hurt and you mustn't hurt. But if you don't tell them.." - has no self-awareness even though he can understand other people
question
Little Lost Robot
answer
2029: Hyper Base. Susan Calvin and Peter Bogert are called to a hyper base to identify one missing NS-2 (Nestor) robot out of a fleet of sixty-three seemingly identical models. The missing robot is identical to all the others except that its positronic brain is not fully wired with the entire First Law of Robotics (against harming humans). It turns out that a worker, Gerald Black, got annoyed with the robot and told it, "Go lose yourself." Obeying the order, Nestor 10 made itself indistinguishable from the other robots. To flush out Nestor 10, Calvin arranges to have all the robots see a rock drop toward a human (the rock is deflected at the last second). She measures the reaction time of the robots as they rush to protect the human, reasoning that the robot not wired with a complete First Law will react differently. Her reasoning is wrong: they all react the same way. She tries another strategy. She tells the robots that they will be electrocuted if they move toward the human. Calvin reasons that only the robot with a weak First Law won't move, because for it the Third Law, self-preservation, will equal, not override, the First Law. But Nestor 10 had pointed out to the other robots earlier that if they were going to die, they wouldn't be able to save the person anyway, so it was better not to move and live to save someone else another day. Finally, Calvin arranges a third test to flush out Nestor 10. Only Nestor 10 can tell the difference between harmless and harmful radiation. When the Nestor robots are all told that harmful radiation will lie between them and the person in danger, all but one (Nestor 10) remain seated when the rock falls. Nestor 10 moves because he can see that the radiation is not dangerous. He tries to attack Calvin because she has found him out and the Second Law (obey orders) outweighs the weak First Law (don't harm people). But because the room is bathed in gamma radiation, which kills robot brains, she survives.
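Calvin's third test is, in effect, an elimination filter: she engineers a situation where only the robot with the extra ability behaves differently. A toy reconstruction (the data representation is invented; the story gives none) of that filtering step:

```python
# Toy reconstruction (invented representation) of Calvin's third test:
# every robot is told that lethal radiation separates it from the
# endangered human, but only Nestor 10 can actually read wavelengths
# and see that the field is harmless.

def moves(robot):
    """A robot rises to save the human only if it believes it can survive
    the crossing; a robot that cannot identify radiation types must take
    the 'lethal' warning at face value and stay seated."""
    believes_lethal = not robot["can_identify_radiation"]
    return not believes_lethal

fleet = [{"id": i, "can_identify_radiation": False} for i in range(62)]
fleet.append({"id": 10, "can_identify_radiation": True})  # Nestor 10

suspects = [r["id"] for r in fleet if moves(r)]
print(suspects)  # -> [10]
```

The first two tests fail precisely because they do not isolate a property unique to the modified robot; the third does.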
question
Nestor
answer
· Suffers from pride · "Go lose yourself!" · Built with a modified First Law · Governments and corporations ignoring protocols for profit · "I don't think I was attacked exactly. Nestor 10 was simply trying to do so. What was left of the First Law was still holding him back." · "That he himself could only identify wavelengths by virtue of the training he had received at Hyper Base, under mere human beings, was a little too humiliating to remember for just a moment."
question
Chapter 6: Escape!
answer
In "Escape!", many research organizations are working to develop the hyperspace drive. US Robots are approached by their biggest competitor with plans for a working hyperspace engine that allows humans to survive the jump (a theme which would be further developed in other stories). But they are wary because, in performing the calculations, their rival's (non-positronic) supercomputer destroyed itself. US Robots find a way to feed the information to their own computer, a positronic one known as The Brain (not a robot in the strictest sense of the word, as it doesn't move), without the same thing happening. The Brain then directs the building of a hypership. Powell and Donovan board the ship, and it takes off without their being initially aware of it. They also find that The Brain has become a practical joker: it hasn't built any manual controls for the ship, or showers, and it supplies only tinned beans and milk for the crew to survive on. Eventually, the ship does successfully return to Earth after a hyperspace jump, and Susan Calvin discovers what has happened. A hyperspace jump causes the crew of the ship to cease existing for a brief moment, which is a violation of the First Law (albeit a temporary one), and this frightens the AI of The Brain into irrational, childish behavior as a means of coping.
question
Chapter 7: Evidence
answer
Stephen Byerley is a lawyer, a successful, middle-aged prosecutor, a humanitarian who never presses for the death penalty. He runs for Mayor of New York City, but Francis Quinn's political machine smears him, claiming that he is a humanoid robot, that is, a machine built to look like a human being. If true, the resulting hysteria would ruin his campaign, since, of course, only human beings are allowed to run for office. Quinn approaches the U.S. Robots and Mechanical Men corporation, the world's only supplier of positronic robot brains, and attempts to persuade them that Byerley must be a robot. No one has ever seen Byerley eat or sleep, Quinn reports. All attempts to prove or disprove Byerley's humanity fail. He visits the U.S. Robots offices, where the Chief Robopsychologist Susan Calvin offers him an apple. Quite nonchalantly, Byerley takes a bite — proving nothing, since he may have been designed with an emergency stomach. Quinn attempts to take clandestine X-ray photographs, but Byerley wears a device which fogs the camera. Through all these investigations, Byerley remains calm and smiling, pointing out that he is only upholding his civil rights, just as he would do for others if elected. His opponents claim that, as a robot, he has no civil rights, but Byerley counters that they must first prove that he is a robot before they can deny his rights as a human — including his right not to submit to physical examination. Once all physical means are exhausted, Susan Calvin indicates that they must turn to the psychological side. If Byerley is a robot, he must obey the Three Laws of Robotics. Were Byerley to violate one of the Laws, he would clearly be a human, since no robot can contradict its basic programming. However, if Byerley obeys the Laws, that still doesn't prove he is a robot, since the Laws were invented with human morality in mind. "He may simply be a very good man," observes Dr. Calvin.
Ironically, to prove himself a human being, Byerley must demonstrate that he is capable of harming a human. Byerley never confirms or denies his flesh-and-blood status and lets the entire campaign ride on this single issue. While he is giving a speech, a heckler rushes the stage and dares Byerley to hit him in the face. Byerley complies and punches the heckler. Most people are convinced that he is human, and the emotional uproar demolishes Quinn's smear campaign. Byerley wins the election without further difficulty. In the final scene, Susan Calvin confronts Byerley, who is again spending a late night awake. She says that she is somewhat regretful that Byerley turned out to be human, because, after all, a robot would make an ideal ruler, one incapable of cruelty or injustice. In an almost teasing speech, quite unlike her usual self, Dr. Calvin notes that there is one case, "just one", where a robot may avoid the First Law: when the "man" who is harmed is merely another humanoid robot. This implies that the heckler whom Byerley punched may have been a robot, and if so, Byerley hadn't broken the First Law, leaving the question of his humanity open. At the end, Dr. Calvin notes that Byerley had his body atomized upon his "death", thus wiping out any evidence either way.
question
Chapter 8: The Evitable Conflict
answer
2052: Earth. Earlier parts of the frame story hint that robots have evolved into dominating influences in human life as "the Machines." These Machines are vast systems of calculating circuits that maintain the human economy to minimize the harm that humans cause themselves. Earth has "no unemployment, no overproduction or shortages. Waste and famine are words in history books. And so the question of ownership of the means of production becomes obsolescent." Byerley, now World Co-ordinator, worries because production is not precise, which may mean that the Machines are malfunctioning. Byerley interviews the Vice-Coordinators for each of Earth's four regions (Eastern, Tropic, European, and Northern), but each one tries to downplay the production problems. Byerley guesses that behind it all is the "Society for Humanity," a small group of powerful men "who feel themselves strong enough to decide for themselves what is best for them, and not just to be told [by robots] what is best for others." But Calvin assures Byerley that the Machines allow the Society's plots so that they will create just enough turmoil to destroy themselves. Given that even the cleverest attempts to overthrow the Machines only result in more data for the Machines to consider, large-scale disruptions (wars, economic turmoil, etc.) will be avoidable — "evitable." "Only the Machines, from now on, are inevitable!" · 2052 · War and economic struggle no longer exist · World divided into 4 regions, each managed by a supercomputer · Only the Society for Humanity remains to oppose the Machines · Slave becomes the master · The technological utopia
question
The Eastern region
answer
· Population: 1.7 bil · Capital: Shanghai · Yeast is used to feed the population
question
The Tropic region
answer
· Population: 500 mil · Capital: Capital City · Malaria · Villafranca, engineer, member of the Society for Humanity
question
The European Region
answer
· Population: 300 mil · Capital: Geneva · Cynical
question
The Northern Region
answer
· Population: 800 mil · Capital: Ottawa · North America + Russia · The cotton analogy
question
The Society for Humanity
answer
"In short, just those men who, by together refusing to accept the decisions of the Machine, can, in a short time, turn the world topsy-turvy."
question
The Fundies
answer
"They were not a political party; they made pretense to no formal religion. Essentially they were those who had not adapted themselves to what had once been called the Atomic Age....simple-lifers themselves." - "The Fundies have no real power. They're just the continuous irritant factor that might stir up a riot after a while."
question
John Campbell
answer
1910-1971 - Edited Astounding Science Fiction from 1937 until his death - Shaped the "Golden Age" of science fiction in his role as editor - "The most powerful force in SF ever, and for the first 10 years of his editorship he dominated the field completely" - Asimov