While discovering the Silicon Valley of Hardware at the Shenzhen Open Innovation Lab in China, I organized a user experience design workshop on interactions with Artificial Intelligence. Inspired by a then-viral hack of a Big Mouth Billy Bass, we explored ways to rapidly prototype playful interactions with the technology, especially in the context of emerging smart hardware that works with either cloud-based or local machine learning software.
Prototype for and with AI
At that time, Artificial Intelligence was widely hyped as heralding the end of the human species. Since China is known for a broader acceptance of new technologies across society, it was a great place to approach this predominant narrative from an optimistic and curious angle.
Hence, the workshop treated AI as a tool that anyone can use to design and prototype delightful and enriching user experiences. To get started, I gave a brief introduction to machine learning and to existing machine-learning libraries such as TensorFlow, as well as web-based natural language processing platforms that require no coding skills. After discussing the participants’ own examples, we jumped straight into new use cases for voice interfaces.
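To give a sense of how little code a first experiment needs, here is a minimal sketch of such a “hello world” in TensorFlow/Keras: a tiny intent classifier that maps a spoken phrase to one of two intents. The training phrases, intent labels and layer sizes are illustrative assumptions of mine, not material from the workshop.

```python
import tensorflow as tf

# Hypothetical training phrases and intent labels (0 = "weather", 1 = "music").
phrases = tf.constant([
    "what is the weather today",
    "will it rain tomorrow",
    "play some jazz",
    "put on my favourite song",
])
labels = tf.constant([0, 0, 1, 1])

# Turn raw text into token ids the model can consume.
vectorize = tf.keras.layers.TextVectorization(output_sequence_length=8)
vectorize.adapt(phrases)

# A deliberately tiny model: embed the tokens, average them, classify.
model = tf.keras.Sequential([
    vectorize,
    tf.keras.layers.Embedding(input_dim=len(vectorize.get_vocabulary()), output_dim=16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(phrases, labels, epochs=30, verbose=0)

# Classify a new utterance and print the intent probabilities.
print(model.predict(tf.constant(["is it going to be sunny"])))
```

With only a handful of example phrases this obviously will not generalize, but it shows the full loop from raw text to a prediction in a couple of dozen lines.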

Using a prepared template, the participants interviewed each other and captured the activities, motivations and challenges of a typical working day.
In larger groups, we discussed the collected insights and identified patterns and the most pressing pain points. Based on those, we extracted addressable problem statements.
The participants then designed a conversation with an AI-enabled voice assistant that could solve the user’s pain points. Depending on the use case and scenario, they also designed a playful voice character.


Later on, some participants prototyped their voice assistants with the conversational interface tool “Sayspring.com” or its Chinese counterpart “DUI.ai”. Others used the “Wizard of Oz” technique, a low-tech approach to testing “intelligent” interactions: a hidden human operator simply pretends to be the AI by manually controlling text-to-speech software, as sketched below.
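As an illustration of how little a Wizard of Oz setup requires, here is a minimal sketch in Python. It assumes the third-party pyttsx3 package for offline text-to-speech; the workshop teams used whatever speech tool was at hand.

```python
# Minimal Wizard-of-Oz console (assumes "pip install pyttsx3"):
# a hidden operator types replies, and the laptop speaks them aloud,
# so the participant hears an apparently "intelligent" voice assistant.
import pyttsx3

engine = pyttsx3.init()          # uses the operating system's speech voices

print("Wizard console - type a reply and press Enter (empty line quits).")
while True:
    reply = input("> ")
    if not reply:
        break
    engine.say(reply)            # queue the typed reply
    engine.runAndWait()          # speak it through the speakers
```

The operator sits out of sight with this console while the participant only hears the synthesized replies, which is enough to test the flow and tone of a conversation before any real AI is involved.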


After the exciting afternoon session, we were inspired to rapidly design, prototype and user-test more interactions with AI-enabled interfaces and hardware, without getting lost in any black box.