GCN : June 2014
The Warrior is significantly larger than the FirstLook, but it can also use its robotic arm for bomb disposal or to clear routes of objects. And the Warrior's ability to climb stairs also suits it for tasks inside buildings.

Duke Energy, for example, is using the Warrior at several of its nuclear stations in North Carolina to handle radioactive material, such as filters used to clean water in the reactor vessel.

"Before the use of the iRobot 710, personnel handled these filters manually by using a six-foot pole to remove and maneuver filters from a radioactive storage container and lowering them into another container used for the long-term storage of radioactive filters," said Valerie Patterson, spokeswoman for Duke's McGuire Nuclear Station.

DARPA CHALLENGES

While current deployments of robots show impressive capabilities, if the results of the most recent DARPA Robotics Challenge (DRC) are any indication, it won't be long before humanoid robots capable of driving a vehicle, climbing a ladder and removing debris become commonplace.

The DRC challenges competing teams to develop robots that can perform the kinds of tasks that would be useful not only for certain military operations but also for domestic emergency response situations, such as driving vehicles, moving objects and using tools.

A 2013 trial included eight tasks for competitors: operating a vehicle, navigating difficult terrain, climbing a ladder, clearing debris, manipulating doors, removing an object from a wall, manipulating valves and operating a hose.

The robot from Team Schaft -- a Japanese venture that was recently purchased by Google -- won the 2013 DRC trials.

"The function of these robots in the DRC is to somehow demonstrate search and rescue capabilities that can be used in a human-engineered environment," SRI's Mahoney said.
But before we see broader applications of those technologies, the costs have to come down while performance is maintained. "That's really important," he said. •

Can we delegate our morals to robots?

An Army robot is tasked with transporting needed medicine to a field hospital. Along the way, it encounters a wounded soldier. Should the robot drop its cargo and carry the wounded soldier instead?

That's one of the scenarios posed by Matthias Scheutz, principal investigator of a project funded by the Defense Department's Office of Naval Research aimed at developing robots that can make moral decisions. As the military fields robots with increasing degrees of autonomy, such scenarios are more likely. As a result, it would be, well ... unethical of robot designers not to consider in advance what the machines should do in such circumstances.

According to Scheutz, the team -- with researchers from Tufts University, Brown University and Rensselaer Polytechnic Institute -- is developing algorithms to enable robots to weigh a variety of factors. GCN asked Scheutz, professor of cognitive and computer science at Tufts University, how morals might be adapted to machines.

GCN: In humans, ethical systems and, especially, behaviors seem to be influenced by unconscious factors, past traumas, etc. Would robots with embedded moral or ethical logical systems vary from one robot to another?

Scheutz: They could.

GCN: Who will decide on the ethical rules for robots?

Scheutz: This is not our decision but the decision of those who will deploy the robots. However, I expect all ethical rules to conform with national and international laws (e.g., international humanitarian law).

GCN: Humans often modify their ethics as a context changes and interactions with other humans exert influence. What about robots? Are you planning for an interactive element?

Scheutz: The robots will have different levels of ethical reasoning, some more involved and sophisticated than others. They will also be able to simulate or follow human ethical reasoning (to some extent) to be able to understand why some humans might arrive at a particular conclusion that's possibly different from what the robot inferred. In that sense, the robots will be able to work with different ethical systems, even though their actions will always be guided by the same system.

GCN: Can you give any details about the logic that will be used? How are factors in decision-making given weights? Would the robot learn from actual outcomes? Or would human panels judge outcomes and call for changes in logic?

Scheutz: We are working on ways for the robot to be able to justify its decisions to humans. It is critical that humans be able to understand why the robot did what it did, not just because it determined that a particular course of action had the highest utility.

GCN: Humans often deceive themselves, making decisions under stress that they wouldn't have made sitting around a table with other humans. Might we expect robots, immune to stress, to make "braver" decisions?

Scheutz: The robots' decisions will not be subject to factors known to modulate decisions in humans, including stress and negative emotions.

Scheutz earned a Ph.D. in philosophy at the University of Vienna as well as a Ph.D. in cognitive and computer science at Indiana University. "I have always been fascinated by the question of what a mind is and how it is possible that some physical systems (e.g., humans) can have minds," he said.

"I think computer science and philosophy are complementary in many ways. Philosophy provides a framework to talk about minds and mental states while computer science provides the tools for implementing such frameworks and thus understanding them at a mechanistic level." --- Matthias Scheutz

--- Patrick Marshall
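Scheutz's "highest utility" remark hints at how such trade-offs are often modeled in software. As a purely illustrative sketch -- not the project's actual algorithm; the function, the weights and the factor names below are all invented for this example -- a robot might score each candidate action as a weighted sum of factors and retain a human-readable justification for the choice it made:

```python
# Hypothetical weighted-utility decision sketch. Weights and factor
# values are made-up numbers for the medicine-vs-wounded-soldier
# scenario; real systems would be far more involved.

def choose(options, weights):
    """Score each option by a weighted sum of its factors and return
    the winning action plus a human-readable justification."""
    def utility(factors):
        return sum(weights[name] * value for name, value in factors.items())

    best = max(options, key=lambda o: utility(o["factors"]))
    justification = ", ".join(
        f"{name}={value} (weight {weights[name]})"
        for name, value in best["factors"].items()
    )
    return best["action"], f"chose '{best['action']}' because {justification}"

weights = {"lives_at_risk": 0.6, "mission_urgency": 0.4}
options = [
    {"action": "deliver medicine",
     "factors": {"lives_at_risk": 0.5, "mission_urgency": 0.9}},
    {"action": "carry wounded soldier",
     "factors": {"lives_at_risk": 0.9, "mission_urgency": 0.2}},
]

action, why = choose(options, weights)  # action -> "deliver medicine"
```

With these invented numbers, delivering the medicine scores 0.66 against 0.62 for carrying the soldier, and the returned justification string is the kind of explanation Scheutz says humans must be able to inspect.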