We’ve been busy this quarter.
Those following our adventures at the Roger Wilco Agency will know we are working with our friends at Robokind as well as with IBM’s Watson on an exciting, cutting-edge integration. Through our professional relationships and our breadth of experience, we are positioned to lead industry efforts leveraging this cognitive technology. The project is ongoing, but this new integration will bridge the capabilities of each partner to “teach the system how to understand inputs without explicit instructions.”
Robokind created Milo the robot as the center of a curriculum that teaches social and emotional skills to children with Autism Spectrum Disorder (ASD). Milo’s creator Richard Margolin describes Milo as a “facially expressive and socially interactive robot” who is also patient, speaks clearly, and has a screen on his chest to display flashcards and other information. Milo is highly effective in helping students learn to understand and interpret social and emotional communication, and he is fun and engaging, too. We saw Milo as the perfect foundation for this cognitive computing project.
Some time back, we created our first demo with Milo as an avatar for the cognitive services of IBM’s Watson. Using a Raspberry Pi to host Node-RED, the application captured a user’s spoken input, transcribed it with speech-to-text software, and sent the text to the Watson Conversation Service. The conversation stack then used IBM’s natural language processing to progress the conversation, and Milo spoke the response. Despite minimal promotional efforts for this demo, we managed to create a buzz in the marketplace and have been inundated with requests for more demos, including one for AT&T.
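The flow described above can be sketched as a simple pipeline: audio in, transcription, a conversation turn, and a spoken reply from Milo. This is a minimal structural sketch with stubbed stages; the function names and canned responses are illustrative assumptions, not the actual Node-RED flow or the Watson APIs.

```python
# Sketch of the demo pipeline: speech-to-text -> conversation service -> Milo.
# All three stages are stubs standing in for the real services.

def speech_to_text(audio: bytes) -> str:
    """Stub for the transcription stage (e.g. a speech-to-text service)."""
    # In the real flow this would send audio to the transcription service.
    return "hello milo"

def conversation_service(text: str, context: dict) -> str:
    """Stub for one turn of the Watson Conversation Service."""
    # The real service applies natural language processing and returns
    # the next response along with updated dialog context.
    if "hello" in text:
        return "Hi there! Nice to meet you."
    return "Can you tell me more?"

def speak(reply: str) -> str:
    """Stub for Milo voicing the response to the user."""
    return f"Milo says: {reply}"

def handle_turn(audio: bytes, context: dict) -> str:
    """One conversational turn: audio in, spoken reply out."""
    text = speech_to_text(audio)
    reply = conversation_service(text, context)
    return speak(reply)

print(handle_turn(b"<audio frames>", {}))
```

In the actual demo each stage is a node in a Node-RED flow rather than a Python function, but the data handed between stages follows the same shape: raw audio, then text, then a response string for the robot to voice.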
In our demo for AT&T, we illustrated the progress of this cognitive computing project. Early demos had revealed some limitations in the process, such as a delay in the speech-to-text translation between Milo and the conversation module. In the updated demo, which resolved those issues, we also discussed some of the attributes that will be part of a production application, including integration with tablets to display certain outputs from the conversations.
We are excited about the development of this project and the opportunity to demonstrate the advances that are possible with this cognitive technology and Watson. We are showing how a Watson-powered conversation produces identical interaction modes in the software avatar and Milo the robot for a consistent experience. As these services are developed and implemented, the Roger Wilco Agency will continue to be a leader in the field, attracting greater attention to the possibilities of this exciting technology.
By staff writer Keith Long