today, i’m going to riff on a topic i haven’t written about before, but one that has interested me for years: computer vision on mobile phones. i believe advances in computer vision — combined with the compute power we now take for granted on our cell phones — could improve people’s lives in ways most of us haven’t imagined. i’d better explain how i reached this conclusion.
scientists have known for years that an increased blink rate is a reliable predictor of fatigue. that fact came in handy when i wanted to figure out whether my 4-year-old daughter would fall asleep at 8pm or at 9pm, since that usually meant a big difference in her bedtime routine. i decided to build a mobile app that would record her face and let me count how often she blinked, helping me predict when she would fall asleep — and making for a happier evening for parents and child. that’s a pretty simple example of what i mean.
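the counting part of an app like this is simpler than it sounds. assuming you already have a per-frame "eye openness" score from a face-landmark detector (the eye aspect ratio, or EAR, is a common choice), blinks are just short dips below a threshold. here’s a minimal sketch — the function name, threshold, and frame counts are my own illustrative choices, not anything from a shipping app:

```python
def count_blinks(ear_values, threshold=0.2, min_frames=2):
    """Count blinks in a sequence of per-frame eye-openness scores.

    A blink is a run of at least `min_frames` consecutive frames
    where the eye aspect ratio drops below `threshold`.
    """
    blinks = 0
    closed = 0  # length of the current run of "eye closed" frames
    for ear in ear_values:
        if ear < threshold:
            closed += 1
        else:
            if closed >= min_frames:
                blinks += 1  # a completed dip counts as one blink
            closed = 0
    if closed >= min_frames:  # handle a blink still in progress at the end
        blinks += 1
    return blinks
```

divide the count by the recording length to get blinks per minute, which is the number you’d actually track over an evening.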
cardiio is a more sophisticated app that leverages mobile phones’ cameras and compute power. hold up your iPhone to your face in a well-lit area, and cardiio uses the front-facing camera to look at the capillaries in your cheeks. the app then measures the light being reflected to determine your heart rate — useful for tracking fitness levels and calorie burn, and even for estimating your life expectancy.
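cardiio’s exact algorithm isn’t public, but the underlying idea — photoplethysmography — is well known: as blood pulses through the face, the average brightness of the skin oscillates slightly at the heart rate, and you can pick that frequency out with an FFT. a minimal sketch, assuming you’ve already extracted the mean green-channel brightness of the face region for each video frame (function name and band limits are my own):

```python
import numpy as np

def estimate_heart_rate(brightness, fps):
    """Estimate heart rate (bpm) from per-frame face brightness values."""
    x = np.asarray(brightness, dtype=float)
    x = x - x.mean()  # remove the DC component
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    # only consider physiologically plausible rates: 40-180 bpm
    band = (freqs >= 40 / 60.0) & (freqs <= 180 / 60.0)
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0
```

real apps have to work much harder than this — compensating for motion, lighting changes, and camera noise — but the spectral-peak idea is the core of it.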
i should mention i don’t have any investments in cardiio or any other mobile computer vision app. i just find the whole space really interesting. MIT, for example, has developed computer-vision algorithms that can tell the difference between frustrated and pleased smiles. now imagine mobile apps that interpret shoppers’ smiles and help retailers fine-tune their merchandising. retailers could also use mobile apps to analyze foot traffic for optimum cross-selling and impulse buys. and thanks to community efforts like PubFig and Labeled Faces in the Wild, computer vision software can recognize faces — with a high degree of confidence — across a wide variety of poses, expressions and conditions (recent NYT article on the advances). it won’t be long before that capability shows up in commercial-grade mobile apps.
mobile computer vision can also help us model our environment and improve crop yields. for years, scientists have been finding new ways to use near-infrared reflectance spectroscopy to detect mold and fungus contamination and insect infestations in crops. it’s easy to imagine drones fitted with infrared cameras detecting early signs of infestation.
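a simpler cousin of full NIR spectroscopy, and a common first pass for this kind of aerial crop monitoring, is the normalized difference vegetation index (NDVI): healthy vegetation reflects strongly in near-infrared and absorbs red light, so the ratio flags stressed areas. a sketch, assuming you have aligned near-infrared and red bands from the drone’s camera as arrays of reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Per-pixel NDVI: (NIR - Red) / (NIR + Red), roughly in [-1, 1].

    Healthy vegetation scores high (reflective in NIR, absorbent in red);
    stressed or infested patches score noticeably lower.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero
```

flagging every pixel below some empirically chosen NDVI threshold gives the drone a crude but useful "look here first" map for infestation.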
and then there’s augmented reality — potentially giving humans a sixth sense for understanding the world around us. Google Glass may be the best example so far, as developers continually add new apps that overlay information on what the wearer sees. but i wonder about the effect this sort of enhanced vision has on us. if you wear Oculus for seven hours, does it rewire your brain? for better or worse, mobile computer vision could have a dramatic impact on us and the world we live in.
here is a fun application of a convolutional neural net that i set up with Caffe last weekend.
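if you haven’t played with conv nets before, the core operation is easy to demystify: slide a small learned filter over the image and take a weighted sum at each position. this isn’t the Caffe setup itself — just a toy numpy version of a single convolution, with names of my own choosing:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core op of a conv layer."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # weighted sum of the patch under the kernel at (i, j)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```

frameworks like Caffe stack many of these filters, interleaved with nonlinearities and pooling, and learn the kernel weights from data — but every layer bottoms out in this one operation.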