A car driving in low light.

Multimodal sensing and learning project receives Air Force Office of Scientific Research grant

Salman Asif will develop a new framework to co-design sensing and learning algorithms to capture and efficiently process only the most essential data

August 20, 2021
Author: Holly Ober

Salman Asif has received a $300,000, three-year grant from the Air Force Office of Scientific Research, or AFOSR, to develop an integrated sensing and learning framework using distributed and multimodal sensors. Asif is an assistant professor of electrical and computer engineering in the Marlan and Rosemary Bourns College of Engineering.


Modern devices and machines, such as robots, autonomous vehicles, and surveillance systems, rely on dozens of sensors to gather data. Cars, for instance, carry cameras, radar, lidar, and ultrasonic sensors. Each modality has its own advantages and limitations: cameras provide better spatial resolution than radar, radar provides better range information than cameras, and radar outperforms cameras in bad weather.

Processing the diverse, often large, data streams from multiple sensors and modalities to extract useful information, and presenting it in ways that can be quickly understood and acted upon, remains a challenging task. This project will address some of these challenges by developing sensing and processing algorithms that capture and efficiently process only the data essential to the task at hand.

“I am very grateful for the generous support from AFOSR,” said Asif. “This will allow my research group to work on some interesting problems at the intersection of multi-modal sensor fusion, computational imaging, and deep learning.” 

The new AFOSR grant complements the research on computational sensing and learning in Asif's group at UCR. Asif also received the prestigious National Science Foundation CAREER Award earlier this year.

Thumbnail photo: A n v e s h on Unsplash