Why not integrate BlenderBot or something similar?

As you may or may not know, there was a chatbot Turing-test tournament in November of last year. One of the entrants was BlenderBot, which is open source. The other was Kuki, made by Pandorabots (if I remember right, Pandorabots started around the time they took A.L.I.C.E. and moved its AIML code to AIML 2.0). Here is the announcement for the challenge as well: [https://home.pandorabots.com/bot_battle.html]

Anyway, some people have an Alexa or Google device at home not just to check the weather, but also as a kind of social companion they can simply chat with, which is a cool idea but not very private.

So why not something like this in Mycroft?

Btw, if you have never talked to BlenderBot and want to try it, you can do so in Google Colab (using a free GPU). Here's a link for setting it up: https://colab.research.google.com/drive/1bRMvN0lGXaTF5fuTidgvlAl-Lb41F7AD?usp=sharing
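If you'd rather poke at it locally, here's a rough sketch of the kind of thing the notebook does, using the Hugging Face transformers library and the public facebook/blenderbot-400M-distill checkpoint. I'm not claiming this is exactly what the Colab runs, just the general idea:

```python
# Minimal sketch: a console chat loop with BlenderBot via Hugging Face transformers.
# Assumes `transformers` and `torch` are installed; the model name below is the
# publicly available 400M distilled checkpoint.
from transformers import BlenderbotTokenizer, BlenderbotForConditionalGeneration

model_name = "facebook/blenderbot-400M-distill"
tokenizer = BlenderbotTokenizer.from_pretrained(model_name)
model = BlenderbotForConditionalGeneration.from_pretrained(model_name)

while True:
    utterance = input("You: ")
    if not utterance:
        break
    inputs = tokenizer([utterance], return_tensors="pt")
    reply_ids = model.generate(**inputs, max_length=60)
    print("Bot:", tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])
```

On a CPU the first response can take a while, but it gives you a feel for the conversational quality without needing a GPU.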

You may ask, why BlenderBot? Its responses can feel very real, and it would let people have a more "social bot" in their house and/or project if they so desire. BlenderBot is much further along than A.L.I.C.E.

So imagine: you have an older family member who lives alone and needs a friend, so to speak. They could buy a Mycroft, enable the skill, and not only get the latest weather but also have someone to chat with.

Thoughts?


I did this for my robot that I'm running Mycroft on. I used BlenderBot as a fallback skill and it's marginally "okay" (I don't have a GPU, so I'm only running the 400M version). It's a hobby robot of mine and I don't mind nonsensical answers and such, but holding a real conversation for more than 30 seconds is hard.
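Roughly, the skill looks something like this. It's a minimal sketch, not my exact code; `query_blenderbot` is a placeholder helper where the actual model call (e.g. the transformers snippet above) would go:

```python
# Sketch of a Mycroft fallback skill that hands unrecognized utterances to a chatbot.
# Assumes mycroft-core is installed; query_blenderbot is a hypothetical helper.
from mycroft import FallbackSkill


class BlenderbotFallback(FallbackSkill):
    def initialize(self):
        # Lower priority numbers run first; register late (90) so normal
        # intent-matching skills get a chance before the chatbot answers.
        self.register_fallback(self.handle_fallback, 90)

    def handle_fallback(self, message):
        utterance = message.data.get("utterance", "")
        reply = self.query_blenderbot(utterance)
        if reply:
            self.speak(reply)
            return True  # tell Mycroft the utterance was handled
        return False

    def query_blenderbot(self, text):
        # Placeholder: run the model here (locally or via a small HTTP service)
        # and return its reply string, or None on failure/timeout.
        return None


def create_skill():
    return BlenderbotFallback()
```

Running the model in a separate process or on another machine and calling it from the skill keeps the slow generation step from blocking Mycroft itself.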

I think BlenderBot 2 could potentially work well enough, because it has some long-term memory (it remembers your previous conversations) and can search the internet, but the computational power required for BlenderBot 2 to be responsive is ridiculous. The low-end 400M version without a GPU takes over a minute to respond to a query. I can't imagine building a machine with multi-thousand-dollar GPU cards just to run a chatbot.

I agree with the concept, but I'm not sure BlenderBot(x) is the solution at this time… maybe it's many years off, when we can get a Raspberry Pi with a 2B-parameter neural network built in. :slight_smile:
