This week I stumbled across some old lemonodor entries that I apparently forgot to post. I'll be running them this week, but don't be confused: I wrote these months ago.
Lately I've been working with Anselm Hook and Paige Saez and others on a project called Imagewiki. Imagewiki is sort of a Wikipedia that's indexed by images: you can take a picture of something with your phone, send it to Imagewiki, and you'll be directed to a page that connects you with other people who have taken a picture of that same thing or place.
Imagewiki uses the same object recognition algorithm as Evolution Robotics' visual pattern recognition technology (ViPR), which was used in their ER1 robot and LaneHawk grocery scanner, and in Sony's Aibo robot dogs. I've mostly been working on the object recognition part of Imagewiki, using Rob Hess's free implementation of the SIFT feature extraction and matching algorithm as the basis for the recognition. It's a challenge, since Imagewiki depends so critically on highly accurate recognition. (Rob Hess's library is written in C; purely as an illustration of the general idea, here's a minimal sketch in Python using OpenCV's SIFT bindings, which is an assumption on my part and not what Imagewiki actually runs. The core loop is the same, though: extract SIFT descriptors from both images, match them with Lowe's ratio test, and decide whether enough features agree to call it the same object.)
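    # Minimal sketch of SIFT-based image matching using OpenCV.
    # This is illustrative only; Imagewiki builds on Rob Hess's C implementation.
    import cv2

    def count_sift_matches(path_a, path_b, ratio=0.75):
        """Extract SIFT features from two images and count the matches
        that survive Lowe's ratio test."""
        img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
        img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        _, desc_a = sift.detectAndCompute(img_a, None)
        _, desc_b = sift.detectAndCompute(img_b, None)

        # For each descriptor in image A, find its two nearest neighbors in B
        # and keep the match only if the best is clearly better than the second.
        matcher = cv2.BFMatcher()
        matches = matcher.knnMatch(desc_a, desc_b, k=2)
        good = [m for m, n in matches if m.distance < ratio * n.distance]
        return len(good)

    # A query photo "matches" a wiki page if enough features agree; the
    # threshold (here 20, a made-up number) is where the accuracy tuning lives.
    # if count_sift_matches("phone_photo.jpg", "page_image.jpg") > 20: ...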
This weekend Anselm and Paige and I were at Google's Mountain View campus for WhereCamp, where we showed off the current state of Imagewiki. We even ran a scavenger hunt—several brave people let us jailbreak their iPhones and install the Imagewiki app, after which they got some clues: