New software speeds data collection from video files for research, security purposes

May 8, 2013

The human eye may not be as fast, accurate or consistent as researchers need it to be when tracking data. But new software created by students from University of Michigan-Dearborn’s College of Engineering and Computer Science leverages computer vision technology to support the data collection process.

BioVision Team
Front row: Joshua Morrison, Molly Pohutski, Daniel Painter
Back row: Dustin Morabito, Nathaniel Dessert, Jacob Boncher

BioVision is a cross-platform application that automates the data collection process for researchers, scanning raw video files for movement. Users set motion-detection parameters to suit their needs and can then export the results to an Excel workbook to quantify and compare data across videos.
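For readers curious what parameterized motion detection looks like in practice, the minimal Python sketch below uses OpenCV frame differencing. It is not BioVision’s actual code: the detect_motion function, its threshold and min_area parameters, and the CSV output (a simple stand-in for the Excel export described above) are illustrative assumptions.

```python
import csv
import cv2  # OpenCV handles video decoding and image operations


def detect_motion(video_path, threshold=25, min_area=50, output_csv="motion.csv"):
    """Scan a video for movement and log the amount of motion per frame.

    `threshold` and `min_area` stand in for the kind of user-set
    sensitivity parameters the article describes; the names are illustrative.
    """
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError(f"Could not read video: {video_path}")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    with open(output_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "moving_pixels"])
        frame_idx = 1
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Difference against the previous frame, then threshold so only
            # pixels that changed by more than `threshold` are kept.
            diff = cv2.absdiff(prev_gray, gray)
            _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
            moving = int(cv2.countNonZero(mask))
            # Record only frames with enough changed pixels to count as motion.
            if moving >= min_area:
                writer.writerow([frame_idx, moving])
            prev_gray = gray
            frame_idx += 1
    cap.release()
```

In this sketch, a call such as detect_motion("trial01.avi", threshold=30) would produce a per-frame movement log that can be opened in a spreadsheet and compared across trials, roughly the workflow the article describes.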

Anne Danielson-Francois, assistant professor of biology, commissioned the project for use in UM-Dearborn’s arachnid lab. For her and her research students, collecting data on spider movement had meant hours of recording behavior by hand as it occurred.

“Collecting accurate data in an efficient manner is imperative to conducting experiments,” said BioVision team member Joshua Morrison. “Having a program to expedite the data collection process could allow a research team to work through more trials and conduct more experiments.”

Researchers in the Danielson-Francois laboratory often spent more time playing back videos to score spider movements than running the actual experiments. Time spent reviewing footage took time away from further research, a common problem for researchers who videotape animal behavior.

“We wanted to help researchers save time and energy,” Morrison said. “No more recording behaviors by hand. No more scrubbing through an entire video file to get the information you need.”

The project began with a chance encounter over coffee in the faculty lounge of Mardigian Library between a biologist and an engineer. Danielson-Francois was reviewing spider behavior videos on her computer when another coffee drinker, Narasimhamurthi Natarajan, professor of electrical and computer engineering, pointed out that computer vision could do a better job of extracting data from video files than reviewing the footage and scoring behaviors by hand.

Building on that insight, the two collaborated with graduate student Raymond Llonillo on an initial program of about 100 lines of code. But using it required computer science knowledge that most researchers lack, and the program was limited in what it could do.

So, with the goal of creating a version of BioVision that anyone could use, with more features and functionality, Danielson-Francois began working with the CECS Senior Design team of Morrison, Nathaniel Dessert, Daniel Painter, Jacob Boncher, Dustin Morabito and Molly Pohutski.

Danielson-Francois met with the students throughout the development process and was pleased with the end result.

The developers “produced professional grade video analysis software that can be used by any researcher to analyze patterns of movement in digitally captured video,” she said in a letter to the team’s supervising faculty. “It not only performs the functions outlined at the start of the project, but it completely exceeded my expectations.”

Although originally designed for behavioral science research, the technology also could support mechanical and security applications. Security officials can set sensitivity parameters to quickly scan video for unusual movement. In the auto industry, BioVision could help measure how much suspension springs move and how they react to differing weights.

“The application could be used with any video where you’re looking for a certain kind of movement,” said team member Daniel Painter.

The BioVision team, led by Dessert, first looked for existing software that could serve as a starting point for the data collection tool. When they couldn’t find a flexible, user-friendly solution, they designed their own from scratch. The software took about 4,000 hours to complete and includes more than 10,000 lines of code.

The team presented the software at the CECS Senior Design Competition on April 19. They won the Department of Computer and Information Science division and shared top honors with the Department of Mechanical Engineering’s autonomous snowplow design project.

The program has been released as an open source project to encourage development and use in the scientific community.