Sramana Mitra: We work with PaaS companies on their developer ecosystems, running virtual accelerator partnerships with them. This may be something we can talk about.
You said something about empathy. In that interaction, your technology is able to establish empathy. Talk to me more about that from a technical standpoint. How do you achieve it?
Danny Tomsett: It’s one of the machine learning areas of our business. It’s thinking about how we take the multiple inputs and behave in a way that you and I would. We’re listening to tone. We’re looking at a person’s facial expressions.
We think very carefully about how we say something too. If I’m going to decline a loan, I know that when I deliver bad news, I’m going to do it without a smile, or without any emotion at all. The technology behind that comes in multiple forms.
With NLP, the system feeds up the responses, the words to say. We’re able to look at the sentiment of that sentence and the content of the conversation. We have dynamic animation; nothing is motion captured.
In our world, the script is dynamic and the animation is dynamic. When the text is sent to our API, the digital human makes a judgement, based on AI, on how to present it. This is because we don’t know everything and because machine learning improves with more and more data.
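As a rough illustration of that idea, choosing the presentation at request time from the sentiment of the text rather than from pre-captured motion, here is a minimal sketch in Python. The word lists, scoring, and expression names are invented for illustration and are not UneeQ’s implementation, which would use a far richer NLP model.

```python
import re

# Minimal sketch of dynamic presentation: the expression is chosen at
# request time from the sentiment of the text, not motion-captured.
# Word lists and expression names below are illustrative only.

NEGATIVE = {"sorry", "unfortunately", "declined", "unable"}
POSITIVE = {"great", "congratulations", "approved", "welcome"}

def score_sentiment(text):
    # Crude lexicon-based score; a real system would use an NLP model.
    words = set(re.findall(r"[a-z']+", text.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

def choose_expression(text):
    score = score_sentiment(text)
    if score > 0:
        return "smile"
    if score < 0:
        return "neutral"   # bad news is delivered without a smile
    return "attentive"

print(choose_expression("Congratulations, your loan is approved!"))   # smile
print(choose_expression("I'm sorry, your application was declined.")) # neutral
```

The point of the sketch is only the pipeline shape: text in, presentation decision out, made per request rather than pre-authored.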
We also allow our customers to override any AI decisions. In the text, you can use our behavior language. You can say how you want the digital human to behave. That then gives you full control over the experience through the NLP.
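The behavior language described here might look something like markup embedded in the NLP response text. The tag name, attributes, and parser below are purely hypothetical; the interview does not show UneeQ’s actual syntax.

```python
import re

# Hypothetical behavior markup embedded in an NLP response.
# Tag name and attributes are invented for illustration.
response = (
    "I'm sorry, but your application was not approved. "
    '<behave expression="neutral" gaze="direct">'
    "Let me walk you through some alternatives."
    "</behave>"
)

TAG = re.compile(r'<behave\s+([^>]*)>(.*?)</behave>', re.S)

def parse_behaviors(text):
    """Split text into (attrs, segment) pairs; untagged text keeps AI defaults."""
    segments, pos = [], 0
    for m in TAG.finditer(text):
        if m.start() > pos:
            segments.append(({}, text[pos:m.start()]))
        attrs = dict(re.findall(r'(\w+)="([^"]*)"', m.group(1)))
        segments.append((attrs, m.group(2)))
        pos = m.end()
    if pos < len(text):
        segments.append(({}, text[pos:]))
    return segments

for attrs, segment in parse_behaviors(response):
    print(attrs or "AI default", "->", segment.strip())
```

Segments without markup fall through to the AI’s own judgement, which matches the 80/20 split described below: most behavior is automatic, with explicit overrides only where the brand calls for them.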
Second, it gives us more ability to learn around context and how different use cases and conversation styles require a different approach, so we can continue to improve. Most of our customers are 80/20: 80% of what they’re doing is handled entirely by the AI, and the other 20% might use much more expressive behavior because it tends to be more brand-aligned as well. It’s so easy to do.
We’re in the early days of inbound, vision, and sound. We definitely have some clever tech going on. Digital humans are available in kiosks, on websites, and in mobile apps. For kiosks, we need a level of vision to determine whether someone is coming towards the digital human and whether they’re talking to it or to their friends.
We use those technologies on our own stack to help move around that physical environment and make that a natural and engaging experience.
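A toy version of that kiosk logic, deciding from a stream of face detections whether someone is approaching and addressing the screen, might look like the following. The `FaceObservation` fields, thresholds, and helper names are assumptions for illustration, not UneeQ’s stack.

```python
from dataclasses import dataclass

@dataclass
class FaceObservation:
    # One per video frame, produced by an upstream face detector (assumed).
    bbox_area: float   # fraction of the frame covered by the face
    yaw_deg: float     # head yaw; roughly 0 means facing the kiosk

def is_approaching(history, growth=1.3):
    """A face growing across recent frames suggests the person is walking up."""
    if len(history) < 2:
        return False
    return history[-1].bbox_area >= history[0].bbox_area * growth

def is_addressing_kiosk(obs, max_yaw=20.0):
    """Facing the screen while speaking suggests they're talking to the
    digital human, not to a friend standing beside them."""
    return abs(obs.yaw_deg) <= max_yaw

frames = [FaceObservation(0.02, 35.0),
          FaceObservation(0.03, 10.0),
          FaceObservation(0.05, 5.0)]
print(is_approaching(frames), is_addressing_kiosk(frames[-1]))  # True True
```

In practice these signals would come from computer vision models and be fused with audio, but the sketch shows the kind of judgement the kiosk has to make before engaging.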
Sramana Mitra: Very interesting. What’s the genesis of the company? What were the circumstances under which this was founded?
Danny Tomsett: I was the founder of the company. It didn’t generate much excitement in the early days. I love the fact that my shareholders, the Board, and a lot of the key engineers are still with us today. From an engineering perspective, we have some of the best engineers in the world. They come out of Pixar, Disney, and game studios.
We’ve got some pretty phenomenal people, including others who have worked in other complex engineering areas such as computer vision. We’ve got people from Cisco. We have a small team in the UK as well.
Sramana Mitra: Most of these people are of New Zealand origin?
Danny Tomsett: Most of them would be, but not so much the US team.
Sramana Mitra: How big a team do you have and how are they located?
Danny Tomsett: The majority is in New Zealand, where we have 50 staff, mostly engineering and operations. Then there are 10 of us in the US.
Sramana Mitra: Wonderful story. I love AI. I understand a lot of the complexity of what you’re doing and what you’re trying to do. It’s remarkable that you’ve gotten so much of it figured out. Thank you for your time.