You Can Shazam Your Food With This New App


What if there were a way to know exactly what’s in the food you ordered without asking the server a million questions? Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) might have an answer. They developed a neural network called Pic2Recipe that takes a picture of food and works backward to figure out the recipe behind it. It’s just like using Shazam to identify that song you can’t put your finger on.


According to MIT News, researchers scanned websites like All Recipes and Food.com to build “Recipe1M,” a database of over one million recipes that have been “annotated with information about the ingredients in a wide range of dishes.” That data is used to make connections and find patterns between food photos and their corresponding recipes and ingredients.
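MIT News doesn’t spell out the math, but the general idea behind this kind of system is retrieval in a shared embedding space: a neural network turns both photos and recipes into vectors, and the recipe whose vector sits closest to the photo’s vector wins. Here’s a rough, hypothetical sketch in Python of that last matching step (the toy embeddings and function names are made up for illustration, not MIT’s actual code):

```python
# Hypothetical sketch of image-to-recipe retrieval: embed the photo and every
# recipe into a shared vector space, then return the closest recipe.
import numpy as np

def cosine_similarity(a, b):
    # How "aligned" two embedding vectors are, from -1 to 1.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve_recipe(photo_embedding, recipe_embeddings, recipe_titles):
    """Return the recipe title whose embedding best matches the photo."""
    scores = [cosine_similarity(photo_embedding, r) for r in recipe_embeddings]
    best = int(np.argmax(scores))
    return recipe_titles[best], scores[best]

# Toy stand-ins for embeddings a trained neural network would produce.
rng = np.random.default_rng(0)
recipe_titles = ["blueberry muffins", "california roll", "margherita pizza"]
recipe_embeddings = [rng.normal(size=128) for _ in recipe_titles]
# Pretend this is a muffin photo: close to the muffin recipe, plus some noise.
photo_embedding = recipe_embeddings[0] + rng.normal(scale=0.1, size=128)

print(retrieve_recipe(photo_embedding, recipe_embeddings, recipe_titles))
```

In the real system, the hard part is training the network that produces those embeddings from a million photo-recipe pairs; the lookup itself is the easy bit.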

Yusuf Aytar, an MIT postdoc, says, “Seemingly useless photos on social media can actually provide insight into health habits and dietary preferences.”

The app sounds pretty ideal. Thanks for doing all the dirty work, MIT.

Since it’s a computer and not a human who can actually taste the food, the system is still a little dicey. It can identify baked goods, but it has some difficulty with more complex foods like sushi. Pic2Recipe also gets confused by similar recipes for the same dish.

The system has a lot of kinks to work out, but the end goal is to create something with real-world relevance that can help you recreate restaurant dishes, figure out what’s in that food porn and check the nutritional value of your meal.