IKEA, in collaboration with IDEO and Lund University, has released its Concept Kitchen 2025 project.
Some personal thoughts
It's not so much a 'future kitchen' as it is a 'future table'. The focus on this single piece of furniture as the centre of all kitchen activity is quite interesting, in that the table simply assumes this role without the concept really explaining why...
It might be fun to brainstorm the reasons why a table SHOULD be the focal point.
- Lack of space in the kitchen, as average home sizes shrink in the future, so the table becomes multi-purpose - like a sofa bed.
- Society acknowledges the need for improved social interaction, so instead of a dining room, a breakfast bar, or dinner on the lap on the couch, everyone in 2025 gets together for family meals at the smart table.
- What do you think?
It's also not particularly futuristic.
The technological suggestion is to use a 'smart light' technology, whereby graphic projections on the table surface provide a dynamic, interactive graphical user interface. This in and of itself is nothing extraordinary; we've seen projection-based UIs many times. Take a look back at this video from the CHI '99 conference (yes, 1999!).
The IKEA video also demonstrates the use of ambient objects that act as UI controls. For instance, a round wooden dial assumes the role of a knob for the timer.
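To make the interaction concrete, here is a minimal sketch (entirely my own, not IKEA's implementation) of how a tangible dial could drive a projected timer. It assumes some hypothetical vision system that reports the dial's absolute angle on the table surface; the class and function names are invented for illustration.

```python
# Illustrative sketch: couple a physical dial (input) detected by a
# camera with a projected timer readout (output). All names are
# hypothetical; the concept video does not reveal how IKEA did this.

def angle_to_minutes(angle_deg: float, max_minutes: int = 60) -> int:
    """Map a dial angle (0-360 degrees) linearly onto 0..max_minutes."""
    angle = angle_deg % 360.0
    return round(angle / 360.0 * max_minutes)


class ProjectedTimer:
    """Holds the timer state that would be projected next to the dial."""

    def __init__(self) -> None:
        self.minutes = 0

    def on_dial_moved(self, angle_deg: float) -> None:
        # Called by the (assumed) recognition layer whenever the
        # dial's rotation changes on the table surface.
        self.minutes = angle_to_minutes(angle_deg)

    def render(self) -> str:
        # In the concept, this text would be projected onto the table.
        return f"Timer: {self.minutes} min"
```

A quarter turn (90 degrees) would set a 15-minute timer; the point is simply that the dial carries no electronics at all, only the recognition layer gives it meaning.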
As far back as 1997, we find the idea of Tangible User Interfaces, or "Tangible Bits". It's a wonderfully nostalgic read.
"Tangible Bits" is an attempt to bridge the gap between cyberspace and the physical environment by making digital information (bits) tangible.
1) Interactive Surfaces: Transformation of each surface within architectural space (e.g., walls, desktops, ceilings, doors, windows) into an active interface between the physical and virtual worlds;
2) Coupling of Bits and Atoms: Seamless coupling of everyday graspable objects (e.g., cards, books, models) with the digital information that pertains to them; and
3) Ambient Media: Use of ambient media such as sound, light, airflow, and water movement for background interfaces with cyberspace at the periphery of human perception.
Figure 3: Center and Periphery of User's Attention within Physical Space
Most of the interactions rely heavily on optical recognition: the ability to 'recognise' something. This is a very interesting area, and one that will be core to many of the Smart Home technologies and future concepts we'll see.
But to 'recognise' is a wonderfully complex thing. Visual recognition alone would struggle; instead, we now think of all the 'senses' available to us: sensors collecting metadata from many different vantage points - in this example, from the table surface, from the appliances, from the air, and from the Person (that's 2025-speak for what today we'd call the User), to name but a few.
All of this data is streamed in real time to somewhere else - a mobile device, a cloud server, a home hub - where it is stored, processed, and analysed: run through learning algorithms, prediction engines, and outcome-optimisation models, resulting in near-immediate responses to the Person. The result is coaching, suggestions, guidance, warnings, alarms, interventions, support - whatever the Person needs.
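The data flow described above can be sketched in a few lines. This is a toy model under my own assumptions, not anything the concept actually specifies: sensor readings from several vantage points feed a hypothetical home hub, which applies simple rules standing in for the "prediction engines" and returns immediate feedback. All class, field, and rule names are invented.

```python
# Illustrative sketch of the sensing pipeline: many sources -> one hub
# -> near-immediate feedback. Real systems would use learned models;
# a hand-written rule stands in for them here.

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class SensorReading:
    source: str    # vantage point, e.g. "table", "hob", "air", "person"
    metric: str    # what was measured, e.g. "temp_c", "weight_g"
    value: float


class HomeHub:
    """Collects readings and runs each one through intervention rules."""

    def __init__(self) -> None:
        self.rules: list[Callable[[SensorReading], Optional[str]]] = []

    def add_rule(self, rule: Callable[[SensorReading], Optional[str]]) -> None:
        self.rules.append(rule)

    def ingest(self, reading: SensorReading) -> list[str]:
        # Each rule may yield a suggestion, warning, or intervention
        # to be relayed back to the Person.
        messages = []
        for rule in self.rules:
            msg = rule(reading)
            if msg:
                messages.append(msg)
        return messages


def overheat_rule(r: SensorReading) -> Optional[str]:
    """A toy intervention: warn when the hob exceeds 250 degrees C."""
    if r.source == "hob" and r.metric == "temp_c" and r.value > 250:
        return "Warning: pan is overheating"
    return None
```

Wiring it up, `hub.ingest(SensorReading("hob", "temp_c", 300))` would return the warning, while a routine reading from the table surface would pass through silently. Swap the rule list for a trained model and you have the shape of the loop the concept imagines.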
What do you think of this concept? Is it really smart or futuristic? Is it practical? What problems does it really solve? What could be different?