Wednesday, September 15, 2010

Context awareness to radically change how we interact with technology

IDF 2010, SAN FRANCISCO, USA: The future of computing lies in rich, context-driven user experiences, Justin Rattner, Intel vice president, chief technology officer, Intel Senior Fellow and director of Intel Labs, told his keynote audience at the Intel Developer Forum.

Rattner described how context awareness is poised to fundamentally change how we interact with and relate to information devices and the services they provide. As computing devices gain processing power, improved connectivity and innovative sensing capabilities, Intel researchers are focused on delivering new “context-aware” user experiences.

“Context-aware” devices will anticipate your needs, advise you, and guide you through your day in a manner more akin to a personal assistant than a traditional computer. “Context-aware” computing, via a combination of hard and soft sensors, will open up new opportunities for developers to create the next generation of products on Intel platforms.

Context: Foundation for future experience-driven design
Rattner said “context-aware” computing is fundamentally different from the simple kinds of sensor-based applications we see today.

“My GPS co-ordinates and compass heading don’t tell my smartphone all that much about me,” said Rattner. “Imagine a device that uses a variety of sensory modalities to determine what you are doing at any instant, from being asleep in your bed to being out for a run with a friend. By combining hard-sensor information, such as where you are and the conditions around you, with soft sensors such as your calendar, your social network and past preferences, future devices will constantly learn about who you are and how you live, work and play. As your devices learn about your life, they can begin to anticipate your needs.

“Imagine your PC advising you to leave the house 10 minutes early for your next appointment due to a traffic tie-up on your way to work. Consider a ‘context-aware’ remote control that instantly determines who is holding it and automatically selects the Smart TV preferences for that person. All this may sound like science fiction, but this is the promise of ‘context-aware’ computing and we can already demonstrate much of it in the lab.”
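As a rough, hypothetical sketch of the reasoning Rattner describes (not Intel code; every name and value below is invented for illustration), the Python snippet works backwards from a calendar appointment, a learned commute time and a live traffic delay to a suggested departure time:

from datetime import datetime, timedelta

# Hypothetical context inputs: in a real device these would come from hard
# sensors (GPS, live traffic feeds) and soft sensors (calendar, learned habits).
appointment_time = datetime(2010, 9, 15, 9, 0)   # next calendar entry (soft sensor)
usual_commute = timedelta(minutes=25)            # learned from past trips
live_traffic_delay = timedelta(minutes=10)       # current road conditions (hard sensor)
preparation_buffer = timedelta(minutes=5)        # personal preference

def suggested_departure(appointment, commute, delay, buffer):
    # Work backwards from the appointment to a recommended time to leave.
    return appointment - (commute + delay + buffer)

leave_at = suggested_departure(appointment_time, usual_commute,
                               live_traffic_delay, preparation_buffer)
if live_traffic_delay > timedelta(0):
    minutes_early = int(live_traffic_delay.total_seconds() // 60)
    print(f"Traffic tie-up ahead: leave by {leave_at:%H:%M}, "
          f"about {minutes_early} minutes earlier than usual.")

The point of the sketch is only that the advice falls out of combining sources: the calendar entry alone, or the traffic feed alone, would not be enough to prompt the user.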

To provide an example, Rattner was joined onstage by Tim Jarrell, vice president and publisher of Fodor’s Travel. Jarrell showed Fodor’s experimental Personal Vacation Assistant (PVA), designed in conjunction with Intel and running on a mobile Internet device.

The PVA uses a variety of context sources such as personal travel preferences, previous activities, current location and calendar information to provide real-time travel recommendations to vacationers. The PVA can even generate, at the user’s request, a travel blog with annotated photos and videos of the places visited during the trip.
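The following Python sketch suggests one way such context sources might be weighed against each other; the attraction catalogue and scoring rule are invented for illustration and are not Fodor’s or Intel’s implementation:

from math import hypot

# Invented context for one traveller: a GPS fix (hard sensor), plus stated
# interests and places already visited on this trip (soft sensors).
user_context = {
    "location": (37.79, -122.41),
    "interests": {"food", "architecture"},
    "visited": {"Ferry Building"},
}

# Hypothetical attraction catalogue.
attractions = [
    {"name": "Ferry Building", "pos": (37.80, -122.39), "tags": {"food", "shopping"}},
    {"name": "Palace of Fine Arts", "pos": (37.80, -122.45), "tags": {"architecture"}},
    {"name": "Alcatraz Tour", "pos": (37.83, -122.42), "tags": {"history"}},
]

def score(attraction, ctx):
    # Favour nearby, unvisited places that match the traveller's interests.
    if attraction["name"] in ctx["visited"]:
        return 0.0
    distance = hypot(attraction["pos"][0] - ctx["location"][0],
                     attraction["pos"][1] - ctx["location"][1])
    interest_overlap = len(attraction["tags"] & ctx["interests"])
    return interest_overlap / (1.0 + 100 * distance)

for a in sorted(attractions, key=lambda a: score(a, user_context), reverse=True):
    print(f"{a['name']}: {score(a, user_context):.2f}")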

Rattner also showed the Socially ENabled Services (SENS) research project, which can sense and understand your real-time activities and, if you choose, share that knowledge “live and direct” with networked friends and family through animated avatars on whatever screen is handy, be it PC, smartphone or TV.

“While we’re developing all of these new ways of sensing, gathering and sharing contextual data, we are even more focused on ensuring privacy and security as billions of devices get connected and become much smarter,” Rattner said.

“Our vision is to enable devices to generate and use contextual information for a greatly enhanced user experience while ensuring the safety and privacy of an individual’s personal information. Underlying this new level of security are several forthcoming Intel hardware-enabled techniques that dramatically improve the ability of all computing devices to defend against possible attacks.”

Making context work
Designing compelling user experiences requires deep knowledge and understanding of consumer behavior and preferences. Genevieve Bell, Intel Fellow and head of Interaction & Experience Research at Intel Labs, joined Rattner onstage to talk about the fundamentals of experience design.

“Our goal is to develop experiences that people love,” Bell said. “Randomly applying context can easily result in a negative experience. The key to making context work is people-centered design, and for us, that begins with working out what people love.”

At the end of his keynote, Rattner presented the ultimate example of sensing – a human brain-computer interface. Through the Human Brain project, Intel’s aim is to enable people to one day use their thoughts to directly interact with computers and mobile devices.

In a joint project with Carnegie Mellon University and the University of Pittsburgh, Intel Labs is investigating what can be inferred about a person's cognitive state from their pattern of neural activity.
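Work of this kind is usually framed as pattern classification. As a loose, hypothetical illustration only (not the method used by the Intel, Carnegie Mellon or Pittsburgh researchers), the Python sketch below assigns a cognitive-state label to a vector of made-up activation features using a simple nearest-centroid rule:

def nearest_centroid(sample, centroids):
    # Return the label whose average activation pattern is closest to the sample.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Made-up average activation patterns per cognitive state, as if learned
# from labelled training scans.
centroids = {
    "reading":   [0.8, 0.1, 0.3],
    "listening": [0.2, 0.9, 0.4],
    "resting":   [0.1, 0.1, 0.1],
}

new_scan = [0.75, 0.15, 0.35]  # feature vector from a new observation
print(nearest_centroid(new_scan, centroids))  # prints "reading"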
