When humans view the world with two healthy eyes, we see in stereovision, which gives us depth perception and lets us avoid running into the things around us. Scientists have long implemented stereovision in computer systems for various applications, but until recently it wasn’t practical to run such a system in real time because of the large amount of computational power it required.
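For readers curious about the underlying idea, a stereo system recovers distance by comparing where the same object appears in the left and right images. The short sketch below illustrates that textbook triangulation relationship; the focal length, camera baseline, and disparity values are assumed for the example and are not numbers from the team’s system.

```python
# Minimal sketch of standard stereo triangulation (illustrative only, not the team's code).
# For a rectified camera pair: depth = focal_length * baseline / disparity.

def depth_from_disparity(disparity_px: float, focal_length_px: float, baseline_m: float) -> float:
    """Estimate distance in meters to a point from its disparity between two rectified views."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive; zero disparity means the point is at infinity.")
    return focal_length_px * baseline_m / disparity_px

# Example with assumed values: 700 px focal length, 12 cm baseline, 35 px disparity.
print(depth_from_disparity(35.0, 700.0, 0.12))  # -> 2.4 meters
```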
Using today’s more robust technologies, a senior capstone project team in the School of Electrical Engineering and Computer Science accepted the challenge of designing a real-time stereovision application that can be used with programmable hardware and has the potential to help automobiles and other electronically driven gadgets avoid collisions.
Team members Jon Dallas, Collin Keefer, and Matthew Zochert said the goal is to detect imminent danger by identifying objects and determining how far away they are and how fast they’re approaching. They are developing the algorithm so future teams can improve upon it. The corporate sponsor for their project is Aptina Imaging, based in San Jose, Calif., which provided the equipment.
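The article does not detail the team’s algorithm, but the general idea of judging imminent danger can be illustrated with a small sketch: given two successive distance estimates for the same object, one can compute how fast it is approaching and how long until impact. All names, frame rates, and distances below are assumed for illustration rather than taken from the project.

```python
# Illustrative sketch only: estimating how fast an object is approaching from two
# successive stereo depth estimates, and flagging imminent danger via time-to-collision.

def closing_speed_mps(depth_prev_m: float, depth_curr_m: float, dt_s: float) -> float:
    """Positive result means the object is getting closer (meters per second)."""
    return (depth_prev_m - depth_curr_m) / dt_s

def time_to_collision_s(depth_curr_m: float, speed_mps: float) -> float:
    """Seconds until contact at the current closing speed; infinite if not approaching."""
    return depth_curr_m / speed_mps if speed_mps > 0 else float("inf")

# Example with assumed numbers: object at 10 m, then 9.5 m one frame (1/30 s) later.
speed = closing_speed_mps(10.0, 9.5, 1.0 / 30.0)   # 15 m/s closing speed
print(time_to_collision_s(9.5, speed))              # ~0.63 s -> imminent danger
```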
–Marie Oliver