[nextpage title=”Introduction”]

The 2010 edition of the Intel Developer Forum (IDF) will officially start tomorrow, but today we were invited to a media-only presentation of the latest research being conducted at Intel Labs, all aimed at new uses of computing.

It is always important to remember that Intel is a chip company and does not develop or sell consumer electronics products. However, it employs a full-time staff of researchers to better understand trends and needs in technology, so it can develop the supporting chips that other companies may later use to bring the ideas presented by Intel to market.

We saw demos for the following ideas:

  • A photo-recognition system for smart phones
  • The ability to run programs (“apps”) on your TV at the same time you watch live shows
  • A projected screen on your kitchen countertop that responds to touch and is capable of recognizing objects placed on it and interacting with them accordingly
  • A face-recognition system inside a car, allowing the car to automatically load the driver’s preferred settings and to detect whether the driver is sleepy or driving unsafely
  • Using the embedded camera of netbooks for activities with early elementary students

Let’s talk about the first three on this list.

[nextpage title=”Mobile Augmented Reality”]

What Intel calls “mobile augmented reality” is really a photo-recognition system for smart phones: you take a picture of a landmark, and the system searches for it in several different systems (Google, Flickr, Wikipedia, etc.). One use of this idea would be for a traveler who wants to know more about a certain place. All he or she would need to do is take a picture of it, and the system would come up with matches.
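The lookup flow described above could be sketched roughly like this. Everything here is hypothetical: the recognizers are stand-ins for the actual image-matching step, which Intel did not detail.

```python
# Hypothetical sketch of the mobile augmented reality lookup: the photo is
# checked against several sources, and every hit is collected as a match.

def match_photo(photo, sources):
    """Return (source, result) pairs for every source that recognizes the photo."""
    matches = []
    for name, recognizer in sources.items():
        result = recognizer(photo)
        if result is not None:
            matches.append((name, result))
    return matches

# Stand-in recognizers: each maps a known photo to a result, or None.
sources = {
    "Flickr": lambda p: "user photos of the landmark" if p == "city_hall.jpg" else None,
    "Wikipedia": lambda p: "encyclopedia article" if p == "city_hall.jpg" else None,
}

print(match_photo("city_hall.jpg", sources))
```

In practice the matching would be done server-side against large photo collections; the point of the sketch is only the fan-out to several systems and the aggregation of whatever matches come back.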

Figure 1: Mobile augmented reality

Of course, you can Google the name of the place using your smart phone. But what if you don’t know the name? Also, you don’t need to type anything; the system does all the work for you.

Another advantage of this system is that you can build a search cache on the phone with places you want to visit during your vacation. This is useful if you don’t want to memorize all the information, if the place you are going to doesn’t have cell phone reception, or if you are traveling overseas and don’t plan to get cell phone coverage in that country.
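The offline-cache idea described above could look something like the following sketch. The class and its methods are purely illustrative; Intel did not describe an API.

```python
# Hypothetical sketch of the offline search cache: before a trip, the phone
# pre-fetches results for landmarks the user expects to visit, so recognition
# still produces an answer with no cell coverage.

class LandmarkCache:
    def __init__(self):
        self._entries = {}  # landmark name -> cached article text

    def prefetch(self, name, article):
        """Store a search result while the phone still has connectivity."""
        self._entries[name] = article

    def lookup(self, name):
        """Answer from the cache when offline; None if not prefetched."""
        return self._entries.get(name)

cache = LandmarkCache()
cache.prefetch("San Francisco City Hall", "Beaux-Arts civic building, opened in 1915.")
print(cache.lookup("San Francisco City Hall"))
```

The design choice is simply to trade storage for connectivity: whatever was fetched at home is still answerable on the road.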

In this demonstration, a picture of San Francisco’s City Hall was taken. The app then asked which website should act as the search engine for that picture. Intel showed us two options: Flickr (which not only recognized the name of the place, but also showed pictures of it taken by other Flickr users) and Wikitravel.

Figure 2: The system in action

Figure 3: Recognizing the place using Flickr

Figure 4: Recognizing the place using Wikitravel

[nextpage title=”Multi App Framework”]

The Multi App Framework is already available for the Intel Media Processor CE 4100 and open to developers who want to build set-top boxes with it, allowing users to run applications (“apps”) while watching TV. With this technology, you can open the website of the show you are watching to learn more about it, or open your Twitter account to let your friends know what you are watching.

Figure 5: Multi App Framework

Figure 6: The technology in action

[nextpage title=”OASIS”]

OASIS is the codename of a technology that transforms an everyday surface into a screen, recognizing hand gestures as commands, identifying objects, and creating a fully interactive experience.

Figure 7: OASIS

Intel presented a kitchen countertop as an example.

Figure 8: A kitchen countertop transformed into a display

Figure 9: The projector and motion sensor that are installed above the countertop

The system can recognize objects on the countertop, as you can see it doing with the banana and the bell pepper in Figure 10. By putting your finger on the icon projected beside an object, you can open an interactive menu with several options, such as showing recipes using that ingredient, adding it to your grocery list, or telling you how many of that item you still have in your fridge or pantry. Once again, all commands are given by simply moving your fingers.

Figure 10: Object recognition

Figure 11: Interacting with the object
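The OASIS interaction loop described above could be sketched roughly as follows. Intel showed the behavior, not the code, so all names, coordinates, and menu entries here are illustrative.

```python
# Hypothetical sketch of the OASIS interaction: a recognized object gets an
# icon projected beside it, and touching that icon opens a menu of actions.

MENU_ACTIONS = ["Show recipes", "Add to grocery list", "Check pantry stock"]

class Surface:
    def __init__(self):
        self.icons = {}  # object name -> projected icon position

    def detect(self, name, position):
        """Camera recognizes an object; project an icon just beside it."""
        self.icons[name] = (position[0] + 1, position[1])
        return self.icons[name]

    def touch(self, point):
        """A fingertip on an icon opens that object's interactive menu."""
        for name, icon in self.icons.items():
            if icon == point:
                return (name, MENU_ACTIONS)
        return None  # touch landed somewhere with no icon

surface = Surface()
icon = surface.detect("banana", (3, 5))
print(surface.touch(icon))
```

The real system would of course track fingertips continuously with the overhead camera and sensor; the sketch only captures the detect-project-touch-menu sequence the demo walked through.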