Using Mycroft to interact with my Cozmo robot

I used Mycroft to chat with the distilgpt2 model I trained and output the speech through my Cozmo robot.
It’s still pretty buggy. The straight Python version without Cozmo works well for back-and-forth chat, but I trigger animations with Cozmo and a few of them tend to hang. It’s all a work in progress, but he’s a sarcastic (obviously), funny, weird little AI model I trained.
There is also a lot going on, so he is a little slow; I need to trim a few things down to make him more responsive. Oh, and the model’s name is Cinder.
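
Roughly, the core loop looks something like this. It’s only a sketch: the model folder name, prompt, and generation settings are placeholders, and the real version runs inside the Mycroft skill rather than a standalone script:

```python
# Rough sketch: generate a reply with a locally fine-tuned distilgpt2 model and
# have Cozmo speak it. MODEL_DIR and the generation settings are assumptions.
import cozmo
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_DIR = "cinder-distilgpt2"  # hypothetical path to the trained model folder

tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR)


def generate_reply(prompt: str) -> str:
    """Generate a short reply and return only the newly generated text."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=40,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()


def cozmo_program(robot: cozmo.robot.Robot):
    # In the real skill the prompt comes from Mycroft's speech-to-text.
    reply = generate_reply("Hello Cinder, how are you?")
    robot.say_text(reply).wait_for_completed()


cozmo.run_program(cozmo_program)
```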

A little clip of his shenanigans

Special thanks to Gez for all his hard work and helping us noobs.

got a git link for this work? Asking for my kid’s cozmo…

This is really cool @Joseph_Flowers. @derick-mycroft has one at the office that I’ve always thought was pretty cool.

If you’ve got source for this checked in to GitHub, we’d love to see a link here!

A little longer clip. He’s a little slow so be patient.

I’ll post the code after work tonight, hopefully. I’m still working on something to post to GitHub. The Cozmo skill I created is usable without the distilgpt2 model and works well with the AIML fallback (which you can customize); that’s how I was running it for a while. My model sometimes outputs too much for Cozmo’s speech string and could use a little cleaning up. I plan on using only the shorter strings and retraining the model for smoother use with Cozmo. I’m still not quite sure where to host the model folder, as it’s around 500 MB I think, so I won’t upload it to GitHub.
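
In the meantime, one way to keep the reply inside Cozmo’s speech string is to cut it down to whole sentences before calling say_text. The 200-character cap below is just a guess to tune, not a documented Cozmo limit:

```python
# Sketch: trim the model's reply so Cozmo's say_text gets a shorter string.
import re

MAX_SPEECH_CHARS = 200  # assumed cap, tune to whatever Cozmo handles reliably


def trim_for_cozmo(text: str, limit: int = MAX_SPEECH_CHARS) -> str:
    """Keep whole sentences from the start of the reply until the cap is reached."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    kept, total = [], 0
    for sentence in sentences:
        if kept and total + len(sentence) + 1 > limit:
            break
        kept.append(sentence)
        total += len(sentence) + 1
    return " ".join(kept)[:limit]

# Usage inside the Cozmo program:
# robot.say_text(trim_for_cozmo(reply)).wait_for_completed()
```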

OK, this is what I have so far.
The readme-install.txt has all the commands I used to install the Cozmo SDK, transformers, adb, Mycroft, and the custom wake word.

There is definitely a lot more that could be done here, and I would also like to test it with my Vector robot, but right now I’m working on using a Google AIY vision HAT (Movidius chip) with him. Any suggestions are welcome.

Love this so much!

It’s also a lot like my 2-year-old lol
