Talking to chat bots is so 2017. I’m working on something a little different:
The text on the left-hand side of the screen is a live transcription of the conversation. This happens automagically, powered by AI fine-tuned to recognize and interpret human voices. The stuff that appears on the right-hand side of the screen is not called up by a human. It comes from AI listening to the humans chat and making guesses at what they’re discussing. It’s designed to just listen along and make brief, clickable suggestions.
The key difference here is that my conversation is with a human. (Er…insofar as taking turns with yourself can be called a “conversation”.) Contrast the above with this:
The first prototype blows the second prototype out of the water usability-wise.
To use voice chatbots these days, you still have to recall some set of magic incantations. You still have to know (or spend time discovering) what you can ask for. That's fine in loose, low-stakes scenarios like word games, quick trivia questions, or setting a timer. But when you need to get down to the business of running your business, you have two problems: the bot doesn't know enough about your particular business, and you don't always remember the right magic words to invoke the specific functionality you need. It's still easier to shout over the cubicle wall or blast everyone in your Slack channel.
When the system just listens and unobtrusively offers suggestions, they're easy to swipe away and ignore. If you're talking about invoices but don't actually need to see them, just don't click the little box! On the flip side, if your call-in customer is getting angrier by the minute, how great would it be to have the system running sentiment analysis and suggesting you bring your manager into the call?
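To make that last idea concrete, here's a minimal sketch of how a sentiment-triggered suggestion could work. Everything here is hypothetical: the word list, the window size, and the threshold are illustrative stand-ins, and a real system would use a trained sentiment model on the live transcript rather than a hand-rolled lexicon.

```python
# Hypothetical sketch: lexicon-based sentiment over a rolling window of
# transcript turns. The word list and thresholds are illustrative only;
# a production system would use a trained sentiment model instead.
from collections import deque

NEGATIVE_WORDS = {"angry", "unacceptable", "ridiculous", "refund",
                  "cancel", "terrible", "worst", "frustrated"}

def turn_sentiment(text: str) -> int:
    """Count negative-lexicon hits in one transcript turn."""
    return sum(1 for word in text.lower().split()
               if word.strip(".,!?") in NEGATIVE_WORDS)

def make_suggester(window: int = 5, threshold: int = 3):
    """Return a function that consumes transcript turns and returns a
    suggestion string (or None) when recent negativity crosses a threshold."""
    recent = deque(maxlen=window)
    def suggest(turn: str):
        recent.append(turn_sentiment(turn))
        if sum(recent) >= threshold:
            return ("Customer frustration rising - consider bringing "
                    "a manager into the call.")
        return None
    return suggest

suggester = make_suggester()
print(suggester("I ordered this two weeks ago."))         # None
print(suggester("This is ridiculous, I want a refund!"))  # None (score 2)
print(suggester("Cancel it, this is the worst service.")) # suggestion fires
```

The key design point mirrors the prototype's philosophy: the function never interrupts. It just returns a suggestion the agent can click or ignore.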
Skynet isn’t quite ready to be a conversational companion. But it’s ready to be a really good listener.
View our LinkedIn here.