
When Objects Talk Back

It is pretty clear that the role of objects is changing, and the way we interact with them is, too. We are still in control, but in a more cooperative and fluid way.

By Azure Yang & Simone Rebaudengo

The Control Dilemma

We expect a lot from anything labeled as smart. But however smart a product may be, data analysis and sensing are not enough to design a fully trustworthy experience. By relying on a product’s smartness, we tend to hide complexity; by focusing on connectedness, we outsource all control to remote applications.

Soon we might find ourselves with objects around our homes that prevent us from making choices, that awkwardly deny any manual control, and that behave in ways we cannot really understand. At that point, who is actually in control?

It is widely accepted that keeping people in control is a classic principle of interface design. In this spirit, objects have been designed to be obedient, responsive, and predictable. We design objects with clear purposes and missions, and we design interfaces to simplify those products and to grant people the best possible experience and control over them.

However, in our daily lives, the constant reframing of expectations around convenience and efficiency pushes products to become more automated, to be somewhat smart, and ultimately to make choices and take actions on our behalf.

They are partly in control.

This is today and not the future; it may already be yesterday.

It’s not a time of anthropomorphic walking robo-maids, but one of ubiquitous and mundane “smart stuff” that roams around networks, equipped with learning, sensing, and some sort of algorithm: objects that have, as Dunne and Raby would say, a life of their own.

In a recent study about the use of Nest, beyond the clear delight brought on by the slick UI design and remote control, what was most undervalued, surprisingly, was its “smartness.” People couldn’t fully rely on its self-setting functionality, as its sensing was not perfectly accurate (Nest relies on sensing presence to trigger routines such as Away mode). The interviewees didn’t fully understand what learning meant, as the Nest seemed simply to repeat what it had been set to do. Most interesting of all, people didn’t trust it because “the Nest is doing its own [thing] and doesn’t tell you what it is doing.”

[Illustration: Interfaces for Engagable Objects]

From Buttons to Conversations

It is pretty clear that the role of objects is changing, and the way we interact with them is, too. We are still in control, but in a more cooperative and fluid way.

For example, a washing machine might not be on or off, but just sleeping and waiting to start a program at the right time, based on whatever rule was set.
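As a minimal sketch of that idea (every name here is hypothetical, not a real appliance API), “off” becomes a sleeping state governed by whatever rule was set:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Callable, List

@dataclass
class Rule:
    """A condition the machine waits for before acting."""
    name: str
    is_time: Callable[[datetime], bool]
    program: str

class WashingMachine:
    def __init__(self) -> None:
        self.state = "sleeping"        # not "off": waiting, not inert
        self.rules: List[Rule] = []

    def tick(self, now: datetime) -> None:
        """Called periodically; wakes the machine when a rule's moment arrives."""
        if self.state != "sleeping":
            return
        for rule in self.rules:
            if rule.is_time(now):
                self.state = "running"
                print(f"{now:%H:%M} starting '{rule.program}' ({rule.name})")
                break

# e.g. start the eco wash once off-peak electricity begins
machine = WashingMachine()
machine.rules.append(Rule("off-peak tariff", lambda t: t.hour >= 23, "eco wash"))
machine.tick(datetime(2015, 6, 1, 22, 0))   # nothing happens: still sleeping
machine.tick(datetime(2015, 6, 1, 23, 5))   # 23:05 starting 'eco wash' (off-peak tariff)
```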

The lights in my house might not be used only by me, but by anyone else who has access to them. How many people have experienced that weird feeling when their geek partner, at work, shows off their latest home automation experiment to colleagues by flicking all the lights in the house?

Objects are given the possibility to make choices and to base them on inputs they are given or can understand. Their reality might be driven by specific goals or ethical principles, and influenced by a partial or erroneous understanding of their context. Will a coffee machine give me a coffee if it knows that my blood pressure is too high? Will it give me a stronger one to boost my productivity if it knew from my Fitbit that I ran a lot?
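A toy sketch of such goal-mediated choices, with entirely made-up thresholds and inputs rather than any real device’s logic, might look like this:

```python
def decide_coffee(blood_pressure: int, km_run_today: float) -> str:
    """Hypothetical logic for a coffee machine that weighs its owner's
    health data against its default goal of simply serving coffee."""
    if blood_pressure > 140:        # assumed hypertension threshold
        return "decaf"              # the machine quietly overrules me
    if km_run_today > 5.0:          # effort reported by a fitness tracker
        return "double espresso"    # reward the run, boost productivity
    return "espresso"               # the default, obedient behavior

print(decide_coffee(blood_pressure=150, km_run_today=0.0))  # -> decaf
```

The interesting question is less the branching itself than who set those thresholds, and whether I can see or contest them.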

From simple control systems, we will move to designing products that need to have a point of view of their own. From silent automation, they will have to give feedback that opens a negotiated discussion. From pushing buttons, we might have to build the tools to have an actual conversation.

What kinds of interfaces will we have to design? What buttons are needed? Should I avoid listening to my Fitbit? What kinds of messages will we receive?

User-centred design is useful for dealing with silent and inert objects, but new intelligent systems and objects surface a new set of issues and constraints. Things have complicated lives and far more complex messages than a simple and slick UI can handle.


Design for Products

In 1996, Weiser and Seely Brown wrote about shifting the focus of design to the periphery of a person’s attention. They pointed at the importance of defining how all these things around us stay out of focus or come on stage when needed, avoiding a cluttered and unlivable home in favour of what they defined as calm technology. The object, in this view, becomes the lens through which to look at an interaction: still looking at the center, the user, but from the perspective of the object.

So what if we actually designed from a product’s perspective?

The ecosystem to consider would not revolve around one person; it would be made of other products, near and far. What would the ecosystem of a coffee machine look like, and how would that ecosystem influence its behavior?

The journey to define would actually be the object’s own, with its touchpoints and its moments to move in and out of focus.

All of this will be mediated by a language that cannot always be articulated as a voice or be as pressing as a tweet. It will be communicated through levers, feedback, and scent: new languages to be found, designed, tested, and interpreted.

Actions will have to be built on strategies and logics that are not only predetermined but also evolve with the information acquired, something close to a mental model that regulates behaviors and sets expectations with the people an object interacts with.

As with pets, we need ways to encourage good behaviors and correct bad ones in the objects we own. Through that process, pets come to understand the limits of what they can do and learn how to communicate with us. Objects will need to be designed just right to learn and adapt across scenarios, to understand our steering, to be sensible when they need to be, and even to be loud or smelly or annoying if they have to. They will expose the pretty and the not-so-pretty if it is in our best interest.
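To make that training loop concrete, here is a toy sketch (hypothetical names, not any real product’s logic) in which behaviors gain or lose weight as an owner encourages or corrects them:

```python
class TrainableObject:
    """Toy model of pet-style training: behaviors gain or lose
    weight as the owner encourages or corrects them."""
    def __init__(self, behaviors):
        self.weights = {b: 1.0 for b in behaviors}

    def encourage(self, behavior: str) -> None:
        self.weights[behavior] += 0.5   # good behavior reinforced

    def correct(self, behavior: str) -> None:
        # bad behavior discouraged, but never below zero
        self.weights[behavior] = max(0.0, self.weights[behavior] - 0.5)

    def act(self) -> str:
        # prefer the most reinforced behavior
        return max(self.weights, key=self.weights.get)

lamp = TrainableObject(["dim at night", "flash notifications"])
lamp.correct("flash notifications")   # too annoying during dinner
lamp.correct("flash notifications")
print(lamp.act())                     # -> dim at night
```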

We might find ourselves needing to define more object-friendly environments in which objects can easily communicate and exchange information with each other and with people, even in cases of conflicting goals and misunderstandings.

Usability will no longer be the only goal of objects: they will absorb, adapt and become “engagable.”

Functions and features make our lives easier; experiences and relationships make us happier. Historically, we have emphasized and focused on the utility of object design. It is time to endow objects with the ability to engage with us and to evolve forward, together.

We used to design interfaces for people to communicate better with objects, mediating our goals with their processes and mechanisms. In this near future, however, we might start to think about interfaces for objects to communicate better with humans, mediating their goals with our lives and routines.


Author
Simone Rebaudengo
Interaction designer, frog Shanghai

Simone is an interaction designer in Shanghai, where he designs digital, tangible, and behavioral interfaces.
