Distributed October 23, 2001
For Immediate Release
News Service Contact: Cynthia Ferguson



National Science Foundation

Brown computer scientists win five awards for information technology

With projects that will break new ground in everything from virtual reality to robotic devices for the severely disabled, computer scientists from Brown University recently won five of the competitive National Science Foundation awards for information technology research.


PROVIDENCE, R.I. — Computer scientists at Brown University have won five of the highly competitive National Science Foundation awards for information technology research. Their projects – which, if fully realized, will allow medical students to participate in a surgical procedure that took place months ago, or quadriplegics to move a robotic arm simply by thinking about it – won the Brown researchers more than $4.5 million of the $156 million distributed.

A team headed by Andries van Dam, professor of computer science, won $1.75 million to develop what van Dam calls an “immersive electronic book,” a three-dimensional interactive video that would let medical students witness an operation that has already taken place, possibly thousands of miles away, as if they were right in the room. If peering over the surgeon’s shoulder doesn’t give them the view they need, they can walk – virtually – to the other side of the operating table for a better look.

For years, medical students have watched surgical procedures on videotape, but it’s a medium, van Dam says, that surgeons find only marginally effective. Because the video has a fixed point of view and lacks three-dimensionality, “watching a video is not even close to actually seeing the procedure,” he insists.

The team’s goal is a three-dimensional time machine, one that would allow students to “walk” around a life-sized, high-fidelity reconstruction of the original event, and to move backward and forward in time. Techniques can be reviewed again and again, and the surgeon can describe them in great detail – at no risk to the patient.

Van Dam and his colleagues are focusing on trauma surgery, although the technology they plan to develop has dozens of possible applications. Teaching trauma surgery, they note, is especially difficult because some situations occur so infrequently that many surgeons encounter them only a few times in their careers. The most effective way to learn the appropriate response to a particular trauma is to observe the surgery firsthand, and then to actively participate in the procedure – but these opportunities are rare. With immersive electronic books, the experience would be widely available.

None of this will happen soon, van Dam concedes. At the expiration of the NSF grant in 2004, “we’ll be at the Kitty Hawk stage, hoping to fly for half a minute 20 feet off the ground,” he says. By 2010, however, he envisions a much longer flight.

By that time, van Dam hopes to see a talented surgeon performing an innovative procedure as a sea of cameras captures a three-dimensional model of her operating room. After surgery and back in her office, the surgeon will don a pair of 3-D stereo glasses, as will several colleagues around the country. All will then be “inside” a video of the procedure, free to discuss aspects of the operation without compromising the safety of the patient.

A media specialist then creates an annotated version of the three-dimensional electronic book, which will be available immediately – via the Internet – to students and surgeons around the world. Now they, too, can step inside the surgeon’s operating room, hear her explanation of the surgery and the comments of her colleagues, watch her methods from any vantage point, and even move back in time to review an earlier step. Eventually van Dam wants students to feel and manipulate the surgical tools, but simulating the sense of touch, he says, is particularly challenging.

Van Dam’s project is part of a multi-institutional collaboration funded by the National Science Foundation. Work on a complementary component of the project – real-time scene acquisition and reconstruction – is under way at the University of North Carolina, the University of Pennsylvania and the Pittsburgh Supercomputing Center.

Willing an arm to move

In a separate project, an interdisciplinary team at Brown may move the medical world closer to giving those who are severely disabled some independent mobility. If the goal of this team is ultimately reached, victims of spinal cord injuries will be able to operate a robotic arm or other device simply by thinking about it.

Michael Black, associate professor of computer science, Elie Bienenstock, associate professor of applied mathematics, and John Donoghue, professor of neuroscience, were awarded $446,970 from the National Science Foundation to pursue this ambitious project. It involves modeling the behavior of brain cells so that a computer chip implanted in the brain can interpret the signals and relay them to a robotic appliance.
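To give a flavor of the decoding problem the team describes – turning patterns of neural firing into a movement command – here is a minimal sketch of a classic “population vector” decoder. The cosine-tuning model, neuron count, and firing rates are invented for illustration; this is not the Brown team’s actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative cosine-tuning model: each simulated neuron fires fastest
# when the intended movement is in its "preferred direction."
n_neurons = 50
preferred = rng.uniform(0, 2 * np.pi, n_neurons)  # preferred angles (rad)
baseline, gain = 10.0, 8.0                        # spikes/sec (made up)

def firing_rates(theta):
    """Simulated population response to an intended movement angle."""
    return baseline + gain * np.cos(theta - preferred)

def decode(rates):
    """Population-vector decoder: average the preferred directions,
    weighted by how far each neuron's rate sits above baseline."""
    w = rates - baseline
    x = np.sum(w * np.cos(preferred))
    y = np.sum(w * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

intended = np.pi / 3                                 # the "thought" direction
rates = firing_rates(intended) + rng.normal(0, 1.0, n_neurons)  # noisy
estimate = decode(rates)
print(f"intended {intended:.2f} rad, decoded {estimate:.2f} rad")
```

Even with noisy firing rates, averaging across the population recovers the intended direction closely – the same statistical averaging that makes decoding from many recorded cells more reliable than from any single one.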

When a person has lost all mobility, the pathway between his brain and his limbs has been cut off or impaired. “What we’re trying to do is restore that pathway,” says Black. “More important, we want to restore that person’s pathway to the world.”

Even a crude robotic arm – one that can just lift a drink, turn up the heat or turn off the television – gives a quadriplegic some control over his or her environment, Black notes. If successful, the coupling of the appliance and the person would become so tight that it would function as an extension of the body. The device might one day be attached to a wheelchair, allowing a disabled person to move around in the world and interact with objects.

Other projects funded

Pascal Van Hentenryck, professor of computer science, has received $1.48 million to work with a team from Brown, the Massachusetts Institute of Technology and the Georgia Institute of Technology on scheduling and planning models that involve unknown variables. This class of problems, known as stochastic combinatorial optimization, is among the toughest in computer science.
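A toy illustration of the idea (my own example, not drawn from the grant): fix a job order before uncertain processing times are known, estimate each candidate schedule’s expected lateness by simulating many scenarios, and keep the best – the “sample average” strategy at the heart of many stochastic optimization methods. All durations and deadlines below are invented.

```python
import itertools
import random

random.seed(1)

# Three jobs with uncertain durations, given as (low, high, mode) for a
# triangular distribution. The order must be fixed *before* the actual
# durations are revealed; each job pays a penalty for finishing late.
jobs = {
    "A": (2, 9, 4),
    "B": (1, 3, 2),
    "C": (3, 12, 5),
}
due = {"A": 6, "B": 4, "C": 10}

def expected_lateness(order, trials=5000):
    """Monte Carlo estimate of total expected lateness for a fixed order."""
    total = 0.0
    for _ in range(trials):
        t = 0.0
        for j in order:
            t += random.triangular(*jobs[j])
            total += max(0.0, t - due[j])
    return total / trials

best = min(itertools.permutations(jobs), key=expected_lateness)
print("best order:", best)
```

Here brute force over six orderings suffices; the research challenge is that realistic scheduling problems have astronomically many orderings, so neither the enumeration nor the naive sampling scales.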

Eugene Charniak, professor of computer science, was awarded $449,442 to refine a parsing program that could vastly improve current speech recognition systems. Charniak’s interests lie primarily in the relationships of words to one another – that is, in the syntax of a sentence – rather than in sound recognition.
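The reranking idea can be sketched with a much cruder stand-in for a parser: score each acoustically plausible transcript by how probable it is as English, and keep the winner. A statistical parser like Charniak’s scores whole syntactic trees; the toy below uses invented bigram counts, but the principle – language structure disambiguating similar-sounding hypotheses – is the same.

```python
import math

# Invented bigram counts standing in for a trained statistical model.
counts = {
    ("hard", "to"): 50, ("to", "recognize"): 12, ("recognize", "speech"): 9,
    ("to", "wreck"): 1, ("wreck", "a"): 1, ("a", "nice"): 40,
    ("nice", "beach"): 6,
}
VOCAB = 5000  # assumed vocabulary size, used for crude add-one smoothing

def log_prob(sentence):
    """Add-one-smoothed bigram log-probability of a word sequence."""
    words = sentence.split()
    score = 0.0
    for prev, cur in zip(words, words[1:]):
        score += math.log((counts.get((prev, cur), 0) + 1) / VOCAB)
    return score

# Two transcripts a recognizer might find acoustically similar:
hypotheses = ["hard to recognize speech", "hard to wreck a nice beach"]
best = max(hypotheses, key=log_prob)
print(best)  # -> hard to recognize speech
```
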

Eliezer Upfal, professor of computer science, has received $524,000 to develop mathematical models and algorithms that are expected to be of great use to those who design search engines, Web crawlers, Web caches and routers. Scientists from two companies that specialize in search engines and related applications – AltaVista and Verity – are collaborating on Upfal’s project.
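One family of Web models such work draws on can be sketched in a few lines: “preferential attachment,” where each new page links to an existing page with probability proportional to the links that page already has. The release does not say which models Upfal’s project uses, so treat this purely as an illustration of why Web link structure is so skewed – a property crawler and search-engine designers must cope with.

```python
import random
from collections import Counter

random.seed(42)

# "Rich get richer" link model: grow a Web of n_pages pages, each new
# page linking to one existing page chosen uniformly from the multiset
# of all link endpoints so far -- i.e., proportionally to degree.
n_pages = 2000
endpoints = [0]                        # flat multiset of link endpoints
for page in range(1, n_pages):
    target = random.choice(endpoints)  # preferential choice
    endpoints.extend([page, target])

degrees = sorted(Counter(endpoints).values())
print("max degree:", degrees[-1], "| median degree:", degrees[len(degrees) // 2])
```

The result is a heavy-tailed degree distribution: a handful of hub pages accumulate far more links than the typical page, which barely has any.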

###