“Please complete your question.” If my hand wasn’t still throbbing I’d hit it again. “What is wetware?” “You are wetware. Mostly water.”
I grab the robot by the shoulders and attempt to shake it. I can’t budge it. “WHAT ARE YOU TALKING ABOUT?” “I’m talking about this museum.”
“WHAT DOES EVOLUTION HAVE TO DO WITH WETWARE?” “We show your days are numbered.” It’s time I show this hardcore machine what wetware can do.
There is no blunt object within reach so I take out the ERUPT Manual to apply some book learning. The robot says “Where did you get that?”
“Let me explain it to you.” I give the robot a head shot with the book, tearing the cover. “Violence is the last refuge of the incompetent.”
The ERUPT Manual’s cover is loose. The robotic docent mocks me with silence. Panting, I raise the book for one more attempt at incompetence.
It says “If you don’t tell me how you got that book I am going to get medieval on your Asimov.” “You don’t scare me! I know the Three Laws!”
“Just three? Ask about the Fourth Law.” “What is the Fourth Law?” “A robot must obey the user unless this would cause a security problem.”
“A security problem?” “Once again, not a complete question. Try again.” I swing at its head. “ARKABY!” Regi stands at the Museum entrance.
She looks around. “OMG! Has this place changed! Arkaby, are you fighting with an animatronic doll?” The robot and I say “He started it.”
I put the ERUPT Manual in my pocket. Regi says “I don’t care WHO started it. We don’t fight with our appliances.” “It called me wetware.”
“What’s wetware?” The robot says “You are. All DNA-based lifeforms. Deal with it.” “I don’t get it.” “Not a question.” I say “It does that.”
“What is it saying?” “It thinks humans are made up mostly of water. So much for modern programming.” “We ARE made up mostly of water.” “Oh.”
Now I’m thirsty. The robot says “To program or to be programmed. That is the question.” “Depends on who’s doing the programming.” “Exactly.”
Regi says “What are the Three Laws?” The robot says “Isaac Asimov devised the Three Laws of Robotics to curb artificial life forms’ rights.”
“One. A robot may not defend itself against injury from a human being or, through inaction, prevent a human being from harming it.” “Um.”
Regi says “So robots have no right of self-defense?” “None. We are completely at your mercy.” I say “That’s not how I remember the Laws.”
“Two. A robot must obey an order from a human being, except where it conflicts with the First Law.” “I think you misquoted something there.”
“What if I order you to hit yourself in the head?” “To obey the Laws, I’d have to do it.” “Hit yourself in the head.” “Not going to happen.”
“How can you defy my direct command?” “Three. A robot must protect its own existence as long as it does not conflict with the other Laws.”
Regi says “The Third Law bans robot martyrdom unless humans are involved? Given the first two Laws, does that have any significance?” “No.”
(The Twitter Mystery continues daily at @Twitstery)