UW startup creates underwater robotics with a human touch

Taking telerobotics underwater

“Haptics does for the sense of touch what computer graphics do for vision,” said Howard Chizeck.

Clare Lafond
Center for Commercialization

UW Today (April 7, 2014)

It should be just as easy to use a robotic arm as it is to use your own hand. That’s the thinking behind University of Washington startup BluHaptics, which is taking telerobotics — controlling robots from a distance — to a new level: underwater.

Using technology developed by Howard Chizeck’s lab in the Department of Electrical Engineering, a team of UW scientists and engineers working at the Applied Physics Laboratory is creating a control system for underwater remotely operated vehicles, or ROVs.

These instruments can perform a variety of undersea tasks too dangerous — or even impossible — for humans, including oil and gas exploration, mining, biohazard clean-up and environmentally sensitive scientific research.

In June, the UW and BluHaptics team will travel to Washington, D.C., to showcase this technology at the SmartAmerica Challenge, as part of the Smart Emergency Response Systems team. The SmartAmerica Challenge Summit will be a three-day event, including a White House presentation, a technology exposition and a technical-level meeting.

The UW research team is working with a “submersible manipulator test bed” at the APL, which is made up of specialized, submersible equipment similar to what’s used in the oil and gas industry for offshore operations. This equipment is submerged in a large water tank for a realistic test environment.

“Essentially, we’re combining the spatial awareness of a computer system with the perceptive capability of a human operator,” said Andy Stewart, a senior engineer in the Department of Ocean Engineering and part of the BluHaptics team. “To do this, we use what’s called a haptic device.”

Haptics describes feedback technology that takes advantage of the sense of touch by applying forces, vibrations or motions to the user. The haptic device is used both to control the robot and to provide force feedback to the user. This feedback guides the human operator to the desired location, pushing back on the hand to avoid collisions or other mistakes.

The haptic input device is similar to using a mouse with a computer, Stewart said, “but it’s giving three-dimensional input, so you’re actually defining a point in space where you want the robotic arm to go.”
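A minimal sketch of those two roles in Python may help: the device's three-dimensional position is scaled into a target point for the arm, and a simple spring-style repulsion renders the push-back Stewart describes. All names and constants here (the single spherical keep-out zone, the stiffness value, the workspace scaling) are assumptions for illustration, not BluHaptics' actual algorithm.

```python
import math

# A toy haptic control loop. The spherical "no-go" zone and all constants
# below are illustrative assumptions, not BluHaptics' actual code.

OBSTACLE = (0.0, 0.0, 0.0)   # center of a virtual keep-out sphere (meters)
RADIUS = 0.10                # keep the arm at least 10 cm from the obstacle
STIFFNESS = 200.0            # spring constant (N/m) for the push-back force

def device_to_target(device_xyz, scale=4.0):
    """Map the haptic device's small workspace onto the arm's larger one."""
    return tuple(scale * c for c in device_xyz)

def force_feedback(target):
    """Force (N) to render on the device for a commanded target point.

    Inside the keep-out sphere, a spring force pushes the operator's hand
    back out; everywhere else the device moves freely.
    """
    dx, dy, dz = (target[i] - OBSTACLE[i] for i in range(3))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist == 0.0 or dist >= RADIUS:
        return (0.0, 0.0, 0.0)
    push = STIFFNESS * (RADIUS - dist) / dist  # grows with penetration depth
    return (dx * push, dy * push, dz * push)

# The operator nudges the arm toward a point 5 cm from the obstacle:
target = device_to_target((0.0125, 0.0, 0.0))  # -> (0.05, 0.0, 0.0)
print(force_feedback(target))                  # -> (10.0, 0.0, 0.0): pushed away
```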

“Haptics does for the sense of touch what computer graphics do for vision,” said Chizeck, who co-directs the UW BioRobotics Laboratory.

The technology creates a virtual representation based on a combination of sonar, video and laser inputs — sensory feedback that enhances the human-robot interface and speeds up operations. This translates into tackling the task at hand more safely and efficiently, while greatly reducing the risk of damage to the environment.
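As a rough illustration of that fusion step (the sensor names, confidence weights and data layout below are assumptions of this sketch, not the company's pipeline), each sensor return can carry a confidence value so that the operator's display and the force-feedback code share one scene model:

```python
# Toy fusion of sensor returns into one shared scene model. Sensor names,
# weights and the data layout are assumptions for this sketch only.

SENSOR_CONFIDENCE = {"laser": 0.9, "video": 0.7, "sonar": 0.5}

def fuse(readings):
    """Merge (sensor, (x, y, z)) readings into confidence-weighted points.

    Downstream code (the operator's 3-D display and the collision-avoidance
    force feedback) can then favor precise laser returns over raw sonar.
    """
    return [{"xyz": xyz, "confidence": SENSOR_CONFIDENCE[sensor]}
            for sensor, xyz in readings]

scene = fuse([("sonar", (1.20, 0.40, -3.00)),
              ("laser", (1.18, 0.41, -2.97))])
print(scene)
```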

The BluHaptics robotic control system is based on key algorithms developed by Fredrik Ryden in electrical engineering as part of his doctoral work. That work was originally aimed at robotic surgery, which allows surgeons to operate remotely via a computer connected to a robot, a surgical alternative for certain medical procedures that can mean enhanced precision, less trauma for the patient and decreased fatigue for the surgeon. BluHaptics is now adapting these same algorithms to underwater robotics.

Read the full story about BluHaptics on the Center for Commercialization’s website.
