Schematic representation of the information provided by the app on sample pictures of furniture (photo courtesy of Sean Bell).

February 21, 2017

Where Do I Get That Piece of Furniture?

Imagine being able to find out where your roommate gets their lamp, fridge or chair simply by taking a picture of it. Thanks to Sean Bell ’16, Prof. Kavita Bala, computer science, and their work in the field of computer vision, an app may soon exist to solve that exact problem.

Bell was inspired by the disconnect between computer vision research conducted in academia and the tools available to the general public. Consequently, he founded GrokStyle, a company developing an application that recognizes objects, specifically furniture. The name GrokStyle stems from the word ‘grok,’ meaning to understand something deeply, thoroughly and intuitively.

“GrokStyle basically gives you a buy button that answers questions like ‘I want that, what is it, where can I buy it?’ Then it also answers some follow-up questions like ‘What goes with that object, and how do I use it in my home?’ We want to respond to all of these different questions using computer vision. You can take a picture on your phone or find it online, and that will get into our system,” Bell said.

The application uses deep learning to find visual matches and similarities among objects by comparing millions of images. Deep learning is a branch of machine learning that uses many-layered neural networks to model high-level abstractions in data.
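As a rough illustration of the idea, and not GrokStyle’s actual model, the sketch below uses an off-the-shelf pretrained network from torchvision as a generic image-embedding extractor: the classification head is removed so the network maps a picture to a fixed-length feature vector, the kind of “visual fingerprint” Bell describes next.

```python
# Minimal sketch (not GrokStyle's model): a pretrained CNN repurposed as a
# generic embedding extractor by dropping its classification head.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()  # keep the 2048-dim penultimate features
backbone.eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def fingerprint(image_path: str) -> torch.Tensor:
    """Map an image file to a fixed-length embedding vector."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        vec = backbone(img).squeeze(0)
    return vec / vec.norm()  # unit length, so cosine similarity is a dot product
```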

“Our system uses a very deep neural network. It sees an image, it processes the image with many different layers and tries to predict a visual fingerprint. It’s like a summary of an image, a list of numbers that represents a vector in high-dimensional space. We then do a large-scale nearest-neighbor search, comparing against millions of other images that look like the input query image, which returns a bunch of results that look very visually similar. Because we’re so accurate, the things that are most similar are mostly the same product,” Bell said.
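A minimal sketch of that retrieval step, assuming a catalog of unit-normalized fingerprints like those produced by the hypothetical fingerprint function above: a brute-force cosine-similarity search. At the scale of millions of images a production system would use an approximate nearest-neighbor index (for example, Faiss), but the idea is the same.

```python
# Brute-force nearest-neighbor search over stored, unit-normalized fingerprints.
import numpy as np

def nearest_products(query_vec: np.ndarray,
                     catalog_vecs: np.ndarray,
                     product_ids: list[str],
                     k: int = 5) -> list[tuple[str, float]]:
    """Return the k catalog items whose fingerprints best match the query."""
    # With unit-normalized vectors, cosine similarity reduces to a dot product.
    scores = catalog_vecs @ query_vec
    top = np.argsort(-scores)[:k]
    return [(product_ids[i], float(scores[i])) for i in top]

# Hypothetical usage, with 'fingerprint' from the sketch above:
# query = fingerprint("roommates_lamp.jpg").numpy()
# matches = nearest_products(query, catalog_vecs, product_ids)
```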

Bell uses the analogy of an interior designer to explain his work.

“An analogy could be an interior designer walking through massive showrooms and carefully studying all the furniture available and making a mental note. They don’t have a photographic memory; they just have a visual gist of what they’ve seen,” Bell said. “Over time they are learning about the different styles available and when they see a new item they think about whether or not they have seen something similar. Even though they are not exactly comparing the exact same item, they are comparing summaries of this item that are stored in their head.”

For now, Bell plans to keep furniture as the focus of the company’s visual search, but he hopes to expand into other fields. Furniture was the go-to choice primarily because its consistent structure and characteristics make it comparatively easy to recognize.

“Furniture and decor have the nice property that they are rigid. Clothing, on the other hand, changes shape when it is put on, which makes it a more challenging computer vision problem. That was what led us to home decor as our initial vertical, but we are planning on moving into fashion, starting with products where fit is less of an issue, for example handbags, shoes and accessories,” Bell said.

Bell hopes that his application will have a broader impact on the field of visual search.

“I think right now the internet is very textual, and it’s just started to become more visual,” he said. “I think that by having these sorts of tools and making them widely available to the public, we can start to nibble at new visual experiences that people haven’t even thought were possible. If we have a kind of public API where everybody can build new tools, I think we will be blown away by what the rest of the community comes up with.”