A caring relationship with voice companions to mediate privacy concerns.
As humans, we have an innate ability to care for the things around us, and this care often shapes the relationships we build with them. These relationships took an interesting turn when we started interacting with voice assistants, raising the question of how care defines our relationship with these digital ‘beings.’ Do we really care for them? Or does the relationship evolve into a more cautious one, marked by anxieties and concerns about their presence? This made me wonder whether there is a way to care for the digital things around us, and whether that care is reflected in the relationships we build with them over time. Can we build more intimate and caring relationships with the voice assistants we live with? Will the way we care for them also instill behavior that is more transparent and honest?
In my exploration, I looked into the most integral element of a voice interaction: language. Language plays a significant role in how we make and break connections. We interact with voice assistants in the language we speak, and when we give these objects space in our intimate surroundings, that familiarity can create a sense of intrusion into our private moments. I wondered whether there could be a new type of voice companion with its own language, alien to the one we speak. Would this unfamiliarity make it inherently respectful of our private conversations?
Literature research, Ideation, Experience prototyping, Filming and communication, Research publishing.
"Learning a language (of the voice companion) feels like an act of impacting a behavior without being paternalistic."
- Katherina, Student.
The 'aha' moment for me was the question: what if voice assistants were alien to the language we speak? We learn a language to integrate with the people around us, and learning a language is perceived as a mindset of caring for a culture and people foreign to us. On the flip side, people also use unfamiliar languages to communicate secretly and to mask information from eavesdroppers. Can we use language both as a means to care and as a way to mask information from agents that can listen?
WHAT IS CUBE?
Cube is a voice companion that understands only the Cubish language, creating an opportunity for the user to learn a language in order to interact with it. This act of coming down to the level of the thing and learning its language paves the way to a more intimate and caring human-thing relationship. Cube also has an ingrained respect for privacy due to its ignorance of human languages.
The Cubish app helps you mingle with Cube and build a strong bond with it. It is an extension of Cube and enables you to see and discover the world through its eyes, that is, the language it speaks. It is a camera-based application: you can use it to learn the Cubish words for things in your surroundings, as well as the Cubish words for actions.
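The prototype app itself was not hand-coded, but its core behavior, mapping a recognized object label to its Cubish word, can be sketched as a simple dictionary lookup. The Cubish vocabulary below is invented purely for illustration; it is not the project's actual word list.

```cpp
#include <map>
#include <string>

// Hypothetical lookup table: recognized object label -> Cubish word.
// These words are made up for this sketch; the real prototype's
// vocabulary was inspired by Japanese and Maori.
const std::map<std::string, std::string> cubishNouns = {
    {"cup", "kopu"}, {"lamp", "hikari"}, {"chair", "zasu"},
};

// Returns the Cubish word for a label, or an empty string when the
// word is not yet part of the learner's vocabulary.
std::string toCubish(const std::string& label) {
    auto it = cubishNouns.find(label);
    return it == cubishNouns.end() ? "" : it->second;
}
```

Keeping the vocabulary as data rather than logic mirrors how the app behaves as a living "user manual" that can grow as the learner scans more objects.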
While gathering feedback, I learned that expressing intimacy and care is as important as experiencing it. I explored this by animating the LEDs, resulting in a transition from an ON state (a single dot) to a LISTENING state (a pulsating trail) to a CHEERFUL state showing appreciation (a flickering pattern).
METHODOLOGY: RESEARCH THROUGH MAKING
To challenge the status quo of interacting with voice assistants, I followed a research-through-design methodology. It also gave me first-hand experience of learning a language to engage with an object. I immersed myself in the topic by reading books and academic articles on privacy issues and on expressing care as a form of interaction.
EXPERIENCE PROTOTYPING 1
I used video as an experience prototyping tool to engage with participants. The first video featured a voice assistant named Cube that understands a fictional language, Cubish, inspired by Japanese and Maori. An application served as a user manual containing frequently used Cubish nouns and verbs.
From the feedback I got, I explored how people communicate when they don't have a language in common. This made me realize that an alien language can create moments of active engagement: people used various means, such as actions, gestures, scripts, drawings, and show-and-tell, to communicate.
EXPERIENCE PROTOTYPING 2
I integrated a camera with the app to scan and learn the Cubish words for objects in the surroundings, making the activity involve Cube more directly. Participants felt that the camera integration took away the chore of reading the manual; they now felt the system was supporting them to learn and communicate.
WIZARD OF OZ TESTING
Finally, I took Cube to three users to understand how they interacted with the prototype and how they perceived the experience. Based on their input, I made changes such as adding the ability to understand actions and to show appreciation through light animations.
I made the final application prototype in Protopie to integrate camera and voice interactions. I used an Arduino Nano to animate the LED matrix, in tandem with a Bluetooth speaker, to create a close-enough experience of interacting with a voice companion.
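The firmware itself is not part of this write-up, but the three LED states described earlier (ON, LISTENING, CHEERFUL) can be sketched as a small state machine. The event names and transitions below are my own assumptions for illustration, not the original Arduino code; modeling the logic as a pure function makes it easy to test off-device before wiring it to the LED matrix.

```cpp
// Hypothetical sketch of Cube's LED behavior as a state machine.
// State names follow the prototype description; the events are
// assumed for illustration.
enum class LedState { On, Listening, Cheerful };
enum class Event { WakeWord, UtteranceUnderstood, AnimationDone };

// Pure transition function: the Arduino loop would call this and
// then render the returned state on the LED matrix
// (On = single dot, Listening = pulsating trail, Cheerful = flicker).
LedState next(LedState s, Event e) {
    switch (s) {
        case LedState::On:
            return (e == Event::WakeWord) ? LedState::Listening : s;
        case LedState::Listening:
            return (e == Event::UtteranceUnderstood) ? LedState::Cheerful : s;
        case LedState::Cheerful:
            return (e == Event::AnimationDone) ? LedState::On : s;
    }
    return s;  // unreachable, keeps the compiler happy
}
```

Separating the transition logic from the rendering loop keeps the expressive animations (the part participants responded to) easy to iterate on independently.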
Following a research-through-design process was a new experience, and I learned that we can start prototyping from day one and use the prototype as a probe to drive the research.
During the process, I had to find ways to engage with users remotely. Video as a prototyping tool made it easier to gather initial reactions, and it also gave participants the possibility of revisiting the experience at a later stage.