Wednesday, March 3, 2010

Why I created eyeBuy Visual Search for the iPhone/iPod

So "Mobile Visual Search" seems to be the new buzzword these days, right? So why would I want to create an app that competes with...well...not a small market. Precisely because the need for an accurate Mobile Visual Search is so prominent. I've tried them all. And I couldn't find a single app that worked to my liking. Go figure. Basically, they all have the same story - small databases and inaccurate results.

So why is eyeBuy Visual Search so different? Well, I'll tell ya. The database is about as massive as they come, with 500 million images to search against already. Yup, I said 500,000,000. So don't complain when eyeBuy takes 3 seconds to find a match.

So why does eyeBuy sometimes return no results? With over half a billion product images, you might assume an automatic match! However, while I incorporated auto-rotation and auto-leveling image processing as best I could, a photo snapped at a different rotation or from a different angle often registers as a "different" image from the one stored in the database.
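
To illustrate why rotation trips up a matcher, here is a minimal sketch in Python using the open-source OpenCV library. It is purely illustrative and uses hypothetical file names; it is not eyeBuy's actual code or the Image Similarity SDK. The point is that a naive pixel-by-pixel comparison collapses when the photo is rotated, while rotation-invariant local features (ORB here) still find matches.

```python
# Illustrative sketch only -- not the eyeBuy implementation.
# Shows why rotation matters: a pixel-wise comparison of a rotated photo
# scores terribly, while rotation-invariant ORB keypoints still match.
import cv2
import numpy as np

# Hypothetical file names, for illustration only.
catalog = cv2.imread("catalog_product.jpg", cv2.IMREAD_GRAYSCALE)
query = cv2.imread("phone_snapshot.jpg", cv2.IMREAD_GRAYSCALE)

# Naive approach: resize the snapshot and compare pixels directly.
# Any rotation or perspective change makes this score blow up.
q_resized = cv2.resize(query, (catalog.shape[1], catalog.shape[0]))
pixel_diff = np.mean(np.abs(catalog.astype(float) - q_resized.astype(float)))

# Feature approach: ORB keypoints are rotation-invariant,
# so many of them survive a tilted or rotated snapshot.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(catalog, None)
kp2, des2 = orb.detectAndCompute(query, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
good = [m for m in matches if m.distance < 50]  # crude distance threshold

print(f"mean pixel difference: {pixel_diff:.1f}")
print(f"good feature matches:  {len(good)}")
```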

No worries though, because I am fixing that right now! As I type, the eyeBuy database is being updated to include multiple pictures (anywhere from 5 to 10 images) per product. So you'd better believe that unless you snap a picture of a bridge in Tumbleweed, Arizona, we are going to identify it.
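
Conceptually, matching against several reference shots per product just means scoring the query against every stored view and keeping the best one. Here is a hedged sketch of that idea, again in Python with OpenCV; the catalog layout, thresholds, and function names are assumptions for illustration, not the real eyeBuy backend.

```python
# Illustrative sketch of multi-view matching -- not the actual eyeBuy backend.
# Each product stores 5-10 reference images; the query wins against a product
# if it matches ANY of that product's views well enough.
import cv2

def match_score(query_des, ref_des, matcher, max_dist=50):
    """Count 'good' descriptor matches between the query and one reference view."""
    if query_des is None or ref_des is None:
        return 0
    matches = matcher.match(query_des, ref_des)
    return sum(1 for m in matches if m.distance < max_dist)

def best_product(query_path, catalog):
    """catalog: dict mapping product_id -> list of reference image paths (5-10 each)."""
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    _, query_des = orb.detectAndCompute(query, None)

    best_id, best = None, 0
    for product_id, view_paths in catalog.items():
        for path in view_paths:
            view = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            _, ref_des = orb.detectAndCompute(view, None)
            score = match_score(query_des, ref_des, matcher)
            if score > best:
                best_id, best = product_id, score
    # If even the best score is too weak, report "no results" rather than guess.
    return best_id if best >= 25 else None
```

The cutoff at the end is what produces a "no results" answer instead of a wrong guess, which is exactly the trade-off described above.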

By the way, all of the image recognition techniques in eyeBuy Visual Search (sometimes called image registration) were built with the Image Similarity SDK found here: http://sites.google.com/site/imagecomparison/