I finally worked up the nerve to test something that’s obnoxious and disruptive to public life, and the results were fascinating.
While thinking through everything I’d have to do on a busy day of back-to-back meetings, I remembered something from yesterday that I still had to do, so I raised my wrist to my face and told my watch to remind me.
Now, this isn’t natural with all the devices I’m testing. For many use cases, I have to make a note to remember when and where to try a thing. But I’ve been testing the watch in regular daily use for months now, so it didn’t surprise me that by this point I’ve developed a habit of simply asking the assistant on my watch whenever my phone is inaccessible.
So that’s a thing. Whether it’s because I watched Star Trek as a kid or because it’s just that natural a gesture – we do, after all, raise our wrists to look at our watches – I’ve developed a habit of doing something on my watch that I barely do on my phone, except when I’m driving. In other words, I’ve more or less completely adapted my natural behavior and language for science. It helps, of course, that I developed the habit intentionally, to further and deepen my understanding for my design practice. (Note to self: write later about why I think attempts at behavior modification don’t change people’s behavior most of the time.)
Accessing Siri by wake word on the Apple Watch works surprisingly well, but it’s highly contextual and tied to certain gestures. Raising your wrist to your mouth is basically a requirement, and if you’re walking or in a noisy environment, it’s much less reliable.
Today, though, one thing was different. I was on the bus.
Definitely not okay.
I will do it again, but I’ll have to work up to it.
Everybody looked. Everybody.
Every single person within eyeshot of where I was sitting was looking at me, having recognized the pattern of a person speaking an invocation of a virtual agent.
It was weird.
Siri worked fine though.