Intel regularly kicks off its annual eponymous developer conference, IDF, with an opportunity for the press to gain an insight into current research topics being investigated at Intel Labs. Speaking at a pre-showcase event, Intel futurist Brian David Johnson, charged with developing the framework for upcoming research strategy based on perceived computing needs at least 10 years from now, explained that the inexorable production of ever-smaller chips will fundamentally change how we interact with, and view, our computers.
Predicting the future
This usage and device metamorphosis is already underway; smartphones patently pack in more computing horsepower than full-tower PC boxes of yesteryear. Johnson believes that future ultra-small Intel chips will lead to designers turning 'anything into a computer,' be it the recognisable form factor of a phone, a sensor-driven surface in the kitchen, or even the shirt on your back. Reiterating a cliché bandied about at Intel, the computing possibilities of tomorrow are limited only by our imagination, he enthused.
Trouble is, prognostications are inherently laden with accursed uncertainty. There are more questions than answers. What kind of technology will people actually want (or need) in 10 or 20 years' time? How can future technology uses be monetised, and what role does a silicon company such as Intel play in an age when one needs to provide a complete ecosystem - a la Apple and Google - and not just engineering excellence? This is why Intel, and other large tech companies, employ teams of social anthropologists and out-of-the-silicon-box thinkers, with a view to providing the kinds of products and services that cater for as-yet-unrealised usage scenarios.
The here and now
Segueing nicely, identifying new technology revenue streams and designing products that cater for them is one part of Intel Labs' remit. Following on from previous years' future technology showcases, Intel is treading the same path, that is, focussing on variations of augmented reality and dallying with context-aware computing.
As an example of a technology that may see the retail light of day within two years, and one that took my fancy, Intel demonstrated an early prototype of a disarmingly simple piece of technology that may, in time, become useful to bricks-and-mortar stores. What you're looking at is a number of LEDs on the right-hand side of the above picture - eight in this case - positioned in front of a regular camera.
Designed to form part of a much larger sign displayed in shop windows, each LED, in addition to the basic requirement of light, provides 15 bytes-per-second of data that can be captured by a smartphone camera and translated by an installed app into augmented-reality-like graphics that provide more information to the consumer. It works by the LEDs modulating in a certain pattern, undetectable to the naked eye, and the camera records this pattern for a second or so. Intel uses downsampling to convert the 100Hz-plus LEDs into data that, say, a 30fps camera can capture.
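Intel hasn't published the details of its modulation scheme, but the downsampling idea can be illustrated with a toy sketch: an LED flickering well above 100Hz, when sampled by a 30fps camera, aliases down to a low "beat" frequency that the app can pick out with a simple FFT. Everything here - the 96/99Hz bit mapping, the function names - is a hypothetical illustration of the aliasing principle, not Intel's actual protocol.

```python
import numpy as np

FPS = 30                     # camera frame rate
N = FPS                      # one second of frames -> 1 Hz-wide FFT bins
FREQS = {0: 96.0, 1: 99.0}   # hypothetical bit -> LED modulation frequency

def led_brightness(freq_hz, t):
    # LED intensity, modulated far too fast for the eye to resolve
    return 0.5 + 0.5 * np.sin(2 * np.pi * freq_hz * t)

def capture(freq_hz):
    # the camera samples the fast modulation at only 30 fps,
    # aliasing it down to a low beat frequency (96 Hz -> 6 Hz, 99 Hz -> 9 Hz)
    t = np.arange(N) / FPS
    return led_brightness(freq_hz, t)

def decode(frames):
    # the peak FFT bin (ignoring DC) reveals the aliased frequency in Hz
    spectrum = np.abs(np.fft.rfft(frames - frames.mean()))
    return int(np.argmax(spectrum[1:]) + 1)

for bit, f in FREQS.items():
    print(f"bit {bit}: LED at {f} Hz aliases to {decode(capture(f))} Hz")
```

With the frequencies distinguishable after aliasing, each LED can signal different symbols over successive capture windows, which is one plausible route to the quoted 15 bytes-per-second per LED.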
Assuming that hundreds of LEDs are used in a basic display, which is reasonable, enough data can be passed along to provide augmented information such as, for example, in-depth menus and reservations at a restaurant - all from an LED display that draws a picture of a burger. Intel's research group says it beats superficially similar quick response (QR) codes in many ways, most notably insofar as it works from 'metres' away from an LED sign and, just as importantly, can be accessed by more than one person concurrently.
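The aggregate-bandwidth claim is easy to sanity-check from Intel's own per-LED figure. The LED count and payload size below are illustrative assumptions, not numbers from the demo:

```python
LED_RATE_BPS = 15      # bytes per second per LED, Intel's quoted figure
num_leds = 200         # hypothetical sign built from a couple of hundred LEDs

throughput = LED_RATE_BPS * num_leds   # aggregate bytes per second
payload = 4096                         # e.g. a small text menu, in bytes
seconds = payload / throughput

print(f"{throughput} B/s; a {payload}-byte menu arrives in ~{seconds:.1f} s")
```

At 3,000 bytes per second, a few kilobytes of menu data would indeed arrive within the second or two a shopper might hold a phone up to the sign.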
The only consumer-side requirement is that the camera needs to be able to identify and capture the LED modulation; Intel demonstrated a sub-1MP, 30fps camera receiving simple additional information from the eight-LED display. It's an interesting project that, as far as I can tell, has commercial legs.
Mining new areas
Intel is clearly, sensibly investing in safeguarding its business interests for the future. I feel as if it's now thinking way, way past the x86 server, PC and notebook space, where it continues to hold a dominant position, and is now actively focussing on nascent segments that it can exploit with its chip-making expertise. In this regard, however, Intel is playing catch-up; ARM and its cohort of partners have been resolutely mining this multi-billion-device Promised Land for a while now.