Monday 3 September 2012

Augmented reality: cook like a professional

One of the skills I pretend to have, but possess no formal training in, is cooking. Most "how to cook" shows demonstrate the skills of the chef and how well equipped his or her kitchen is compared to most people's; they lack any interactive aspect, and they certainly don't give the novice cook time to pause and consider alternative ingredients. Instead, I tend to gloss over the cookery show and pretend I have a vague knowledge of how to cook an egg, or whatever the show was demonstrating at the time.
That said, a cooking simulator being developed by a research group at the Tokyo Institute of Technology features a force-feedback frying pan and spatula to accurately recreate the sensation of cooking. The simulator calculates the heat transfer from the pan to the meat or vegetables being cooked and displays the visible changes caused by heating. The frying-pan interface allows three-dimensional input: as well as registering how you move the pan during cooking, the simulator feeds back the weight of the ingredients combined with the tactile feeling of them cooking.


When you move the frying pan, the actual movement is input, and you can feel the ingredients through the pan. The upper part of the system is a screen: when you look into the pan, you see what's in it through a half-mirror, so the simulator lets you experience looking into the frying pan while you hold it. The technology combines a rigid-body physics engine library with a heat conduction simulator; the heat conduction state changes in line with the amount of physical contact, and the simulation is achieved by combining the two.
The system also calculates how moisture evaporates or flows as the temperature rises, and it shows how meat changes color from red to brown, or how vegetables darken, by synthesizing textures. The inventor would like to develop the system further so that it's useful in actual cooking at home; it could help you make the meat you're cooking taste even better, perhaps linked to a system that tells you, "In five minutes your food will look like this, and in ten minutes it will look like this," giving you the choice of a rare, medium or well-done steak.
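The sort of calculation such a simulator performs can be sketched very roughly. The toy Python below is my own illustration with invented constants, not the Tokyo Tech system: it steps a single-lump heat-transfer model for meat in a pan and turns the accumulated heat into a rare/medium/well-done estimate.

```python
# Toy sketch (not the Tokyo Tech system): a single-lump heat-transfer
# model for meat in a pan, with a crude doneness estimate.
# All constants here are invented for illustration.

PAN_TEMP = 200.0       # pan surface temperature, deg C
K_CONTACT = 0.05       # heat-transfer coefficient while in contact, 1/s
COOK_THRESHOLD = 60.0  # internal temperature above which the meat "cooks"

def simulate(seconds, dt=1.0):
    """Step the meat's temperature and accumulate 'cooked-ness'."""
    temp = 20.0      # meat starts at room temperature
    doneness = 0.0   # accumulated cooking, arbitrary units
    for _ in range(int(seconds / dt)):
        # Newton's law of heating: flow is proportional to the temperature gap
        temp += K_CONTACT * (PAN_TEMP - temp) * dt
        if temp > COOK_THRESHOLD:
            doneness += (temp - COOK_THRESHOLD) * dt
    return temp, doneness

def label(doneness):
    """Map accumulated cooking to a steak description (thresholds made up)."""
    if doneness < 500:
        return "rare"
    if doneness < 2000:
        return "medium"
    return "well done"

temp, doneness = simulate(120)  # two minutes in the pan
print(round(temp), label(doneness))
```

A real simulator would solve heat conduction across a mesh and couple it to contact data from the physics engine, but the principle, heat flowing in proportion to the temperature gap, is the same.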
Another computer scientist, Yu Suzuki, and colleagues at Kyoto Sangyo University in Japan have kitted out a kitchen with ceiling-mounted cameras and projectors that overlay cooking instructions on the ingredients. This lets you concentrate on slicing and dicing without having to look up at a book or a screen. "Cooks can easily and visually understand how to prepare an ingredient for a recipe even if they have no cooking experience," says Suzuki. Suppose you want to fillet a fish. Lay it down on a chopping board and the cameras will detect its outline and orientation so the projectors can overlay a virtual knife on the fish with a line indicating where to cut. Speech bubbles even appear to sprout from the fish's mouth, guiding you through each step.
If that is not enough, the kitchen also comes equipped with a small robot assistant named Phyno that sits on the countertop. When its cameras detect the chef has stopped touching the ingredients, Phyno asks whether that particular step in the recipe is complete. Users can answer "yes" to move on to the next step or "no" to have the robot repeat the instructions. There are some limitations, however. "Currently we have to develop a system based on a manual analysis of real cooking processes," says Suzuki, so for now the system can only help you prepare fish and slice onions. "In the future, we will automate the analysis process." He will present the work at the Asia Pacific Conference on Computer Human Interaction in Matsue, Japan, later this month.

Meanwhile, Jinna Lei at the University of Washington has also installed cameras in the kitchen to watch over novice chefs. Lei and colleagues used Kinect-like depth-sensing cameras capable of recording both the shape and appearance of kitchen objects, allowing them to track cooking actions, such as whether a particular ingredient has been added to a bowl. The system uses both object- and action-recognition to keep track of what the cook is doing. Each object, such as a bowl or apple, has a number of actions related to it: bowls are generally used for mixing, for example, while apples can't be mixed but can be chopped. Tracking is about 80 per cent accurate, and Lei is investigating ways to improve this, such as adding a thermal camera to better identify the user's hands by their body heat. She will present the work at the UbiComp conference in Pittsburgh, Pennsylvania.
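The object-to-action mapping described here can be pictured as a simple affordance lookup. The sketch below is my own illustration, with invented names and numbers rather than the Washington team's code: it filters an action recogniser's guesses against the actions an object can plausibly undergo.

```python
# Illustrative sketch of object/action compatibility checking, loosely
# inspired by the idea above (table contents and scores are invented).

AFFORDANCES = {
    "bowl": {"mix", "pour", "scrape"},
    "apple": {"chop", "peel", "place"},
    "knife": {"chop", "slice"},
}

def plausible(obj, action):
    """Return True if the recognised action makes sense for this object."""
    return action in AFFORDANCES.get(obj, set())

def best_action(obj, scored_actions):
    """Pick the highest-confidence recognised action the object affords.

    scored_actions: list of (action, confidence) pairs from a recogniser.
    Returns None if nothing plausible was recognised.
    """
    candidates = [(a, c) for a, c in scored_actions if plausible(obj, a)]
    return max(candidates, key=lambda ac: ac[1])[0] if candidates else None

# Say the recogniser is unsure between "chop" and "mix" for a bowl:
print(best_action("bowl", [("chop", 0.6), ("mix", 0.55)]))  # prints "mix"
```

Constraining recognition this way is one reason tying actions to objects helps: a high-confidence but implausible guess such as "chop" for a bowl is simply discarded.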

It's hard to say whether these cooking ideas will ever make it into the real world; it would certainly cost a lot of money to fit special screens and cameras around your kitchen. It's more likely that a personal display or your smartphone will end up providing cooking directions. I've personally gone as far as playing YouTube cooking clips on my phone to help me make my favorite egg tarts. But the future of cooking might not be in augmented reality at all; it's more likely to be an internet TV in the kitchen, for entertainment or even for cooking instructions. Gadgets like these virtual cooks are likely to end up in the homes of the rich, and by then each such kitchen would probably already come with human domestic staff, including a cook.

