Visionary Computers May Put Hockey On Cell Phones
- Date: May 21, 2004
- Source: University of Calgary
- Summary: They may never appreciate the poetry of a sprawling glove save, or the thrill of an overtime winner, but computers are a step closer to 'seeing' the sport of ice hockey, thanks to research at the University of Calgary.
They may never appreciate the poetry of a sprawling glove save, or the thrill of an overtime winner, but computers are a step closer to ‘seeing’ the sport of ice hockey, thanks to research at the University of Calgary.
U of C computer scientist Dr. Jeffrey Boyd and three of his students are using hockey to test and develop new applications in the fast-growing field of computer vision research. Compared to humans, computers are still extremely primitive when it comes to perceiving visual information that we take for granted: motion, texture, depth perception and colour, for example.
“Computers can do all sorts of things when it comes to recording and manipulating visual data, but to get them to interpret it in a way that approximates ‘vision’ is enormously complex,” Boyd says. The U of C team’s latest innovation is a prototype ‘smart camera’ that could one day be used to transmit live sports events to cell phones or provide unmonitored security surveillance.
Boyd and his students – master’s student Michael Zhang, and undergraduates Luke Olsen and Maxwell Sayles – are continuing to refine their system with the aid of a tabletop hockey rink (1:32 scale) set up in Boyd’s Vision and Motion Analysis Laboratory. In July, they will install a two-camera system in the Olympic Oval at the U of C, which will record action from one of the hockey rinks as well as some of the speed skating.
“I suppose we could have chosen any number of different activities to focus on, but hey – this is Canada,” Boyd says. “Hockey is easily accessible here and something that other Canadian researchers are interested in.” Boyd’s latest project builds on research he conducted with colleagues at the University of British Columbia, Dalhousie University and the University of Waterloo. It is supported by a $180,000 grant from the Institute for Robotics and Intelligent Systems, which is part of the federally funded Networks of Centres of Excellence.
The camera prototype developed by U of C researchers combines sophisticated software with digital video to create a system that converts on-screen movement into a compact, machine-readable description of the action. Until now, converting visual data into that kind of description has required enormous computing power and customized programming. The U of C innovation is to move the computer processing to the front end of the system, so that the resulting data can be shared with ordinary PCs through a network-accessible database.
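To make that idea concrete, here is a minimal sketch in Python. It is not the U of C team's actual software; it simply assumes the camera's front-end processing has already reduced each video frame to a short list of tracked object positions, and shows how those could be written as compact records into a shared database that ordinary PCs can query.

```python
# A minimal sketch, not the research team's code: it assumes front-end
# processing has already turned each video frame into a list of tracked
# object positions, and stores them as compact records instead of raw video.
import sqlite3
from dataclasses import dataclass


@dataclass
class Detection:
    frame: int       # frame number from the camera
    object_id: int   # hypothetical tracker-assigned player/puck ID
    x: float         # position in rink coordinates (assumed units)
    y: float


def store_detections(db_path: str, detections: list[Detection]) -> None:
    """Write front-end tracking output to a database any networked PC can query."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS detections "
        "(frame INTEGER, object_id INTEGER, x REAL, y REAL)"
    )
    conn.executemany(
        "INSERT INTO detections VALUES (?, ?, ?, ?)",
        [(d.frame, d.object_id, d.x, d.y) for d in detections],
    )
    conn.commit()
    conn.close()


if __name__ == "__main__":
    # Two players and the puck in one frame: a few dozen bytes of data,
    # versus many kilobytes for the equivalent compressed video frame.
    frame_data = [
        Detection(frame=1, object_id=1, x=12.4, y=30.1),
        Detection(frame=1, object_id=2, x=45.0, y=22.7),
        Detection(frame=1, object_id=99, x=30.2, y=28.5),  # puck
    ]
    store_detections("rink.db", frame_data)
```

In a sketch like this, only the lightweight position records travel over the network; the heavy image processing stays at the camera, which is the essence of moving the computation to the front end.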
The next step in the development of the system will be to devise a way for archiving and retrieving data. Boyd has approached the university’s commercialization arm, University Technologies International Inc., about eventually bringing the device to the marketplace.
“This could have many different applications, including most situations that require a human to monitor video taken by a remote camera, such as security cameras,” Boyd says. “But it could also be used to deliver a live sports event to your cell phone. Instead of video, which requires a lot of bandwidth, you would get a moving schematic or diagram of the action with, say, Sharks or flaming C’s representing the players.”
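A rough back-of-the-envelope sketch (not from the article) illustrates the bandwidth point behind a "moving schematic": each frame of the schematic is just a handful of hypothetical (id, x, y) tuples, which pack into a few dozen bytes, while even a compressed video frame typically runs to kilobytes.

```python
# Illustrative only: pack one frame's player/puck positions as
# id (1 byte) plus x and y (4-byte floats each) per object.
import struct


def pack_frame(positions: list[tuple[int, float, float]]) -> bytes:
    """Serialize a schematic frame: 9 bytes per tracked object."""
    return b"".join(struct.pack("<Bff", pid, x, y) for pid, x, y in positions)


frame = [(1, 12.4, 30.1), (2, 45.0, 22.7), (99, 30.2, 28.5)]
packed = pack_frame(frame)
print(f"Schematic frame: {len(packed)} bytes")  # 27 bytes for 3 objects
```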
Story Source:
Materials provided by University of Calgary. Note: Content may be edited for style and length.