Automating the Surveillance of Mosquito Vectors from Trapped Specimens Using Computer Vision Techniques

Document Type

Conference Proceeding

Publication Date

2020

Keywords

Human-Computer Interaction, Machine Learning, Mosquitoes, Smartphones, Public Health, Image Processing

Digital Object Identifier (DOI)

https://doi.org/10.1145/3378393.3402260

Abstract

Among all animals, mosquitoes are responsible for the most human deaths worldwide. Notably, not all mosquito species spread diseases; only a select few are competent to do so. During any disease outbreak, an important first step is surveillance of vectors (i.e., those mosquitoes capable of spreading diseases). To do this today, public health workers lay several mosquito traps in the area of interest, which capture hundreds of mosquitoes. Among these hundreds, taxonomists must identify only the vectors to gauge their density. This process is currently manual, requires complex expertise and training, and relies on visual inspection of each trapped specimen under a microscope; it is slow, stressful, and self-limiting. This paper presents an innovative solution to this problem. Our technique assumes the presence of an embedded camera (similar to those in smartphones) that can photograph trapped mosquitoes. The techniques proposed here then process these images to automatically classify the genus and species of each specimen. Our CNN model, based on Inception-ResNet V2 and transfer learning, yielded an overall accuracy of 80% in classifying mosquitoes when trained on 25,867 images of 250 trapped mosquito vector specimens captured with many smartphone cameras. In particular, the accuracy of our model in classifying Aedes aegypti and Anopheles stephensi mosquitoes (both especially deadly vectors) is among the highest. We also present important lessons learned and the practical impact of our techniques.
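
The abstract describes a CNN built on Inception-ResNet V2 with transfer learning. As a minimal sketch of that general approach in Keras (not the authors' exact pipeline), one could freeze an ImageNet-pretrained backbone and train a small classification head on labeled mosquito images. The class count, head layers, hyperparameters, and data paths below are illustrative assumptions, not details from the paper.

```python
# Sketch of transfer learning with Inception-ResNet V2 in Keras.
# All specifics (class count, directory layout, hyperparameters) are
# hypothetical; the paper does not publish its training configuration here.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionResNetV2

NUM_CLASSES = 9  # hypothetical number of genus/species labels

# Load Inception-ResNet V2 pretrained on ImageNet, without its classifier head.
base = InceptionResNetV2(weights="imagenet", include_top=False,
                         input_shape=(299, 299, 3))
base.trainable = False  # freeze the pretrained feature extractor

# Attach a small classification head for mosquito genus/species prediction.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset layout: data/train/<class_name>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(299, 299), batch_size=32,
    label_mode="categorical")

model.fit(train_ds, epochs=10)
```

Freezing the backbone and training only the head is the standard first stage of transfer learning on a small dataset (here, thousands of images of a few hundred specimens); a fine-tuning stage that unfreezes upper backbone layers at a lower learning rate is a common follow-up.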

Was this content written or created while at USF?

Yes

Citation / Publisher Attribution

COMPASS '20: Proceedings of the 3rd ACM SIGCAS Conference on Computing and Sustainable Societies, pp. 105-115
