May 20, 2010

Mechanical Engineering Professor Steven Shooter

(Editor's note: "Ask the Experts" will resume weekly publication in the fall. Until then, please enjoy this edition originally posted this past spring.)

LEWISBURG, Pa. — Welcome again to "Ask the Experts," a regular web feature that highlights the expertise of various Bucknellians in a range of topics related to current news events and other timely subjects.

This week, we asked Mechanical Engineering Professor Steven Shooter to talk about robots, the challenges of designing them to act like humans and how they might be used in the future. Shooter and Keith Buffinton, a mechanical engineering professor and interim dean of the College of Engineering, are the lead researchers in a long-term project to develop a bipedal walking robot for military and search-and-rescue operations.

Q: You and fellow researcher Keith Buffinton have received more than $2 million in federal funding to develop a bipedal walking robot. Tell us more about this project.

A: It's a shared project with the Institute for Human and Machine Cognition in Florida that's been going on for a little over two years. What we're looking at is a humanoid robotic system for urban environments, places where there are a lot of people. Urban environments are a challenge for robots because they're messy and dynamic: they change constantly, there's stuff in the way, and they're designed for people.

Until really the last 10, maybe 20 years, most robots were used in industry, like the ones you might have seen in auto plants and similar production facilities. For industrial robots, you design the environment for the robot: you take a standard robot and build the environment around it to make it effective. As robotics got better, there was a movement away from that toward what are called service robots, special-purpose robots designed for specific environments.

Now we're moving into robotic systems that interact more with people. They're more general purpose and need to operate in our environment, which was designed for us, not for them. And that's what we're working on.

This project is sponsored by the Office of Naval Research, so there are a lot of search-and-rescue type applications. You can imagine, for example, how these robots might have been used in Haiti after the devastation there. You can send them into places where you don't want to put people in harm's way, navigating through that type of environment to look for people to rescue or objects to find. You can also imagine using these robots for bomb disposal. Right now the military uses fairly basic robotic systems for bomb disposal. If you saw "The Hurt Locker," the robot in that movie is a pretty basic system, limited in how far and where it can go. Being able to send a humanoid robot into that type of environment would be a huge advantage.

Q: What are the challenges in developing such a robot?

A: We do things automatically that we don't even think about. Something we take for granted, such as walking around, is a big task for a robotic system. There have been humanoid-looking robots developed in Japan, like Honda's ASIMO. They look incredible, but they operate in highly scripted environments. The developers basically control that environment and send the robot running around, but if anything is off, if you were to push it, for example, it falls over.

Just balancing on two feet, which is a really big deal if you've watched a toddler learn to walk, is the first step. Then being able to go up stairs is another big one, and getting around obstacles in your way is another. For example, if I told you to get up and go between those two chairs (situated closely together in Shooter's office), you could do that, even though it's a tight space, because you would turn sideways and step through. That's a big deal for a robot. Those are the big challenges, and that's what we're really looking at: how to walk first, which our robot is doing, and then how to walk where we want it to go, despite obstacles.

Q: Tell us about the robotics projects that you have worked on with students, such as the snowboard-testing robots.

A: The snowboard robot project was for a company in York, Pennsylvania, that designed and built snowboards. They wanted to test their snowboards as they worked on designs, to identify what made a good snowboard good and what made a bad one bad. If you have to wait until the next ski season to test a design, it slows development. So we developed a robot that simulated the snowboarder, from the legs down, back in the laboratory. We could lock it into the bindings and put the board into any position a human could, and actually more than a human could. That let us simulate the conditions on the mountain in the lab and give feedback to drive the design process. The students designed, built and tested that robot. They learned by doing it. It's one thing to look at something on paper. It's another to get it running.

Another project with students was to develop a robotic system to test a bicycle transmission, to simulate riding a bike. Pedaling isn't a constant force; your leg's mechanical advantage changes through the stroke, so there are certain points where you push harder. We had to simulate and recreate that. The transmission shifted automatically. Then we added automatic resistance so we could simulate going up and down mountains. In an hour, we could put almost a lifetime of hurt on that transmission in a controlled environment. Again, that project was done entirely with undergraduates.
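To make that concrete, here is a minimal Python sketch of the kind of crank-angle-dependent torque profile such a test rig might command. The sinusoidal model and every name and number in it are illustrative assumptions, not details from the actual student project:

    import math

    def pedal_torque(crank_angle_rad, peak_torque_nm=40.0, baseline_nm=10.0):
        """Crude model of pedaling torque over one crank revolution.

        Each leg's torque peaks when its crank arm is roughly
        horizontal (maximum mechanical advantage) and drops to a
        baseline at top and bottom dead center. The two legs work
        180 degrees apart.
        """
        leg1 = max(0.0, math.sin(crank_angle_rad))
        leg2 = max(0.0, math.sin(crank_angle_rad + math.pi))
        return baseline_nm + peak_torque_nm * (leg1 + leg2)

    # Sweep one revolution in 10-degree steps, as a rig controller might.
    for deg in range(0, 360, 10):
        print(f"{deg:3d} deg -> {pedal_torque(math.radians(deg)):5.1f} N*m")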

Q: What is the future of robots?

A: We're not quite at the stage you see in the movies yet. I took my kids to see "Iron Man 2" the other day. I think that's great; it shows engineering in a positive light in a lot of ways, but we're not quite there. A robot that does our menial tasks, like Rosie in "The Jetsons," is still far off in the future.

What we'll see in the near future is more robots doing things for us in dangerous places. They're already being used to inspect bridges, for example, where a human would otherwise have to climb underneath. And back when I was in grad school, I developed a robot to go into high-radiation environments in nuclear power plants. That's another place you don't want to send people if you can avoid it.

And I think we'll start seeing more general-purpose robotic systems. Instead of having a robot that's designed only to go into a nuclear power plant, for example, the same type of robot will be able to do many different kinds of things and have more flexibility. That's really what we're trying to do with our bipedal walking robot project. We're taking a modular design approach: we want the different peripherals, the different elements we're designing, to be able to go on multiple robots.

Work on robotics has always been integrative and interdisciplinary. We are already seeing more involvement by non-engineering fields such as psychology and education. I have even collaborated with a professor in psychology on a robotics project. As the technology gets better, we'll see even more contributions from diverse fields.

Q: What's next for the bipedal walking robot?

A: This year, the students designed a head for the robot (which until recently has consisted of only a torso and legs), because we're at the stage now where we want to be able to send it places and have it communicate back to an operator. But we wanted to design the head in a way that would allow you to put it on any robot. Remember when you couldn't just take any printer and plug it into your computer? It had to be the same brand, for example. Well, now you have USB ports. It's all the same connector, and when it plugs into your computer, your computer knows, "Hey, that's a printer," or "Hey, that's an external hard drive."

We designed the head that way, with what's called distributed intelligence, so that the control of the head happens on the head itself. It communicates back only when it needs to.
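As a rough illustration of that plug-and-play, process-locally idea, here is a minimal Python sketch. The Peripheral interface and everything in it are hypothetical; none of these names come from the actual robot:

    import random

    class Peripheral:
        """A plug-and-play module: it runs its own control loop
        locally and reports to the central controller only when it
        has something to say, much like a USB device."""

        name = "generic"

        def step(self):
            """Run one local control cycle; return a message or None."""
            return None

    class Head(Peripheral):
        name = "head"

        def step(self):
            # All stabilization and tracking happens here, on the head.
            # Only a notable event gets reported upstream.
            if random.random() < 0.05:  # stand-in for a real detection
                return "object of interest in view"
            return None

    # The central controller, like a USB host, doesn't care what the
    # module is, only that it speaks the Peripheral interface.
    modules = [Head()]
    for tick in range(100):
        for m in modules:
            msg = m.step()
            if msg is not None:
                print(f"tick {tick}: [{m.name}] {msg}")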

The head has 3D vision. We wanted an avatar-like experience, literally; we want the robot to be like an avatar. So you wear 3D goggles, and wherever you turn your head, its head turns. But some things it does automatically and locally. Think back to the toddler example: as you walk, your head compensates for your body movement to keep everything stable. If you didn't do that, you'd get dizzy very quickly. If you just took a camera, and we did this, mounted it on the robot and transmitted the images, it would shake all over the place, and you'd get seasick really quickly. Your head compensates, and your eyes compensate, to keep things steady. So what we have on the robot head is a sensor that senses its position and automatically compensates. The operator doesn't even have to think about it.
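In the simplest terms, the head subtracts its own sensed tilt from the operator's commanded gaze before moving the cameras. A minimal Python sketch of that idea, assuming simple angle commands (a real gimbal would run a closed feedback loop):

    def stabilize(gaze_cmd_deg, body_pitch_deg, body_roll_deg):
        """Counter-rotate the camera by the body's measured tilt so
        the transmitted image stays level, the way your head and
        eyes compensate as you walk."""
        pitch_cmd, roll_cmd = gaze_cmd_deg
        return (pitch_cmd - body_pitch_deg, roll_cmd - body_roll_deg)

    # The operator turns their head; the robot's sensed sway is
    # cancelled locally, on the head, before images are transmitted.
    operator_gaze = (10.0, 0.0)   # desired pitch and roll, degrees
    body_tilt = (3.5, -2.0)       # from the head's own position sensor
    print(stabilize(operator_gaze, *body_tilt))  # -> (6.5, 2.0)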

What we're trying to do is reduce what's called the cognitive load, the demands on the operator from having to think about everything. There are so many things we do without thinking; we balance ourselves without thinking. We want that to be done automatically. We want the operator to just decide where they want to go and where they want to look. Everything else we want done by the robot automatically, and that's what we're moving toward.

This year, the students designed the head (which they will test on the robot in June). Last year they designed feet with sensors in them. Without these sensors, the robot doesn't have any sense of when it touches the ground. You know what it's like to walk with your feet asleep. Well, that's the way it's walking without these sensors. The sensors can tell it where the center of pressure is, information it can use to help balance itself.
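The center-of-pressure calculation itself is straightforward: it is the force-weighted average of the sensor positions under the foot. A small Python sketch, with hypothetical sensor placements and readings:

    def center_of_pressure(sensors):
        """Each sensor is ((x, y) position in meters, force in newtons).
        Returns the force-weighted average position, or None if the
        foot is not touching the ground."""
        total = sum(force for _, force in sensors)
        if total == 0:
            return None
        x = sum(pos[0] * force for pos, force in sensors) / total
        y = sum(pos[1] * force for pos, force in sensors) / total
        return (x, y)

    # Four hypothetical sensors, two at the heel and two at the toe:
    readings = [((0.00, -0.03), 120.0), ((0.00, 0.03), 110.0),
                ((0.20, -0.03), 40.0), ((0.20, 0.03), 30.0)]
    print(center_of_pressure(readings))  # weighted toward the heel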

And now I have a master's student who is designing an arm for the robot so that it can open doors and do those types of things. We're gradually adding capabilities, but all in a modular way, so you don't have to reinvent the wheel for each robotic system. You can take this platform approach and plug and play.


