Setting up the Chromecast backend?

I have Mycroft running on an RPi3, which it shares with Home Assistant. HA currently provides spoken output through my Chromecast, which, for various reasons, is the best way forward for my situation. What I’d like to do, and what it looks like I can do based on work done over the past few months, is use Mycroft as a Home Assistant notification platform. So, instead of speaking directly, HA pipes notifications to Mycroft, which sends them to the Chromecast. Likewise, I can eventually hook a mic array up to the RPi and talk directly to Mycroft, using it to drive my various automations.

In looking through the release notes, it seems like there is an experimental Chromecast backend. Am I correct in noting that this receives TTS output as well? If so, and if Mycroft can speak through my Chromecast, are there any setup docs? I’m OK with it being experimental and perhaps not super polished. My needs aren’t too great, with my HA installation only generating a few speech notifications per day.

Thanks!

I made a bit of progress. I added this to my config:

  "Audio": {
    "default-backend": "nolan's chromecast"
  }
}
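
For context, that sits inside the top-level object of mycroft.conf. If the backend ever needs to be declared by hand rather than relying on discovery, I assume the entry would look something like this ("my_chromecast" is a name I made up, and "chromecast" is my guess at the type string the experimental service uses):

    {
      "Audio": {
        "backends": {
          "my_chromecast": {
            "type": "chromecast",
            "active": true
          }
        },
        "default-backend": "my_chromecast"
      }
    }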

And from the logs it does appear to find the Chromecast and use it as a backend, but I hear no speech. Looking at mycroft-audio.log, it seems I’m stuck in the loop where I’m given a code to enter at home.mycroft.ai. I’ve verified that Home Assistant can speak through the Chromecast and that my sound system’s volumes are good. I’m just not hearing anything from Mycroft.

That is probably the pairing code. Did you register your Mycroft at home.mycroft.ai with that pairing code?
After that, it starts the other skills and services, and then you can test your output.

I am actually working on outputting just the music and other non-voice audio to the Chromecast.
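
Under the hood that goes through the pychromecast library, which Mycroft’s Chromecast service is built on. Here’s a minimal standalone sketch of discovery and casting, just to show the moving parts; the device name and stream URL are placeholders, and the API shape here is from recent pychromecast versions:

    import pychromecast

    # Discover Chromecast devices on the local network.
    # Recent pychromecast versions return a (chromecasts, browser) tuple.
    chromecasts, browser = pychromecast.get_chromecasts()
    cast = next(cc for cc in chromecasts if cc.name == "Living Room")
    cast.wait()  # block until the connection to the device is ready

    # Hand the device a URL; the Chromecast fetches and plays it itself.
    mc = cast.media_controller
    mc.play_media("http://example.com/stream.mp3", "audio/mp3")
    mc.block_until_active()

    browser.stop_discovery()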

Yeah, it’s the pairing code for sure, but if I’m not getting speech output, there isn’t a whole lot of value in pairing the device. Am I reading this correctly, in that TTS output isn’t currently streamed to Chromecasts? If that’s the case, then that’s my answer, as I’d assumed it was possible. If it isn’t possible, is it planned? And if so, is there an issue I can watch to know when it works? I’ll happily create one if there’s value in that.

Thanks.

I’m not a mycroft-core developer, so I can’t say for sure that I’m right. But I can see that all my skills get loaded AFTER pairing, so if the Chromecast skill is a skill, it will only start after pairing.

Pairing is something that only has to be done once, so technically it should work after you do that.

    {
      "Audio": {
        "default-backend": "nolan's chromecast"
      }
    }

I will try to test your suggestion tonight; I don’t have a Chromecast in the office to test with.
I have a speaker connected to the Raspberry Pi, and it’s too close to the mic. TTS output is OK, but while music is playing, Mycroft never picks up the stop command. That’s why I’m looking into streaming only the music to the Chromecast.

Hi, the audio services are not intended for TTS. They’re intended to handle playback of audio media (news, podcasts, music), queueing, and such.

I think the best you can do right now is use a PulseAudio plugin to stream to the Chromecast. (http://www.webupd8.org/2016/03/how-to-stream-audio-to-chromecast-or.html)
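
The linked article covers pulseaudio-dlna, which exposes the Chromecast as a regular PulseAudio sink. Roughly like this, though I’m assuming the package is available for your distro (the article has install details) and the sink name will differ on your network:

    # Install and start pulseaudio-dlna (it supports Chromecast as well as DLNA)
    sudo apt-get install pulseaudio-dlna
    pulseaudio-dlna

    # In another shell: list sinks, then route all audio to the Chromecast sink
    pactl list short sinks
    pactl set-default-sink <name_of_the_chromecast_sink>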

You can also override the playback command in the config and use something like catt instead of the default paplay to cast the audio to the Chromecast, but you will likely need to create a small wrapper script.
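
For example, a hypothetical wrapper along these lines (the "Living Room" device name is made up; catt’s -d option selects the target device):

    #!/bin/sh
    # cast_wav.sh -- hypothetical wrapper that casts Mycroft's wav output
    # to a Chromecast via catt instead of playing it locally with paplay
    catt -d "Living Room" cast "$1"

You would then point the play_wav command in mycroft.conf at this script.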

I’ll think some more about this and see if I can come up with a clean solution for this kind of integration…


Thanks. Here’s the blog post that led me to believe that the Chromecast backend would support TTS. Specifically, this text, emphasis mine:

This includes a major unification of audio handling in Mycroft. The AudioService now manages all sound output, *including text to speech*, music files, and audio streams. This enables:

Thanks for the clarification. I’ll try the pulse approach.

I see the confusion, sorry about that.

The TTS was moved over to the audio service, but it runs separately from the playback backends so that it can speak over playing audio.

Yep, technically this does work, but it adds a couple of seconds of extra lag.
I tested with stream2chromecast and changed the play_wav command in /etc/mycroft/mycroft.conf to "/path/to/stream2chromecast %1".
It’s probably better to change it in ~/.mycroft/mycroft.conf.
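
Concretely, that’s the play_wav_cmdline key (which defaults to paplay in mycroft-core’s stock config); the path is wherever stream2chromecast lives on your system:

    {
      "play_wav_cmdline": "/path/to/stream2chromecast %1"
    }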

I also installed avconv:

    sudo apt-get install libav-tools

To stream I used:
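
(The command itself seems to have been dropped from this post. Going by stream2chromecast’s documented transcode option, which is what avconv is needed for, it was presumably something along these lines, with the file path as a placeholder:)

    stream2chromecast.py -transcode /path/to/audio.wav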


Awesome, this worked, thanks! I hadn’t found stream2chromecast at all in my googling, so thanks for the pointer.


ndarilek,
I have been browsing some posts in the forum and stumbled upon yours with the quote “I have Mycroft running on an RPi3, which it shares with Home Assistant.” I’m curious how you implemented both platforms on one RPi3. What is your base platform (HASS or Mycroft)? If you could provide some links on how you set up the two, I would be grateful, as this is something I’m very interested in.
Thanks,

For now it’s just a Raspbian installation with everything installed globally (i.e., I install the Mycroft Debian packages, Home Assistant globally via pip, etc.). I’m not super worried about compartmentalization because I consider the system disposable and build everything via Ansible scripts. Beyond that, there’s nothing special about it.

I did give up on making Mycroft work with the Chromecast for output, because the latency was just too high. It would hear the wake word and send the listening tone to the Chromecast, but by the time the tone played, Mycroft had finished listening. I need to experiment with outputting directly over HDMI, but haven’t done that yet.

Eventually I’ll probably switch to something like Red Hat’s Project Atomic and run everything in Docker containers, but for the moment everything’s just globally installed and the setup scripts are versioned.

Great, I’ve been searching for this for a while. Do you have any pointers for setting this up, or is that all it takes to get it working?

Yes, just installing stream2chromecast and updating mycroft.conf with the correct path will work.

But as @ndarilek also stated, it introduces latency. For instance, you will hear the beep after Mycroft has stopped listening.

I also used a button with an LED that lights up while Mycroft is listening. By watching the light, you can still use it, and the lag is somewhat less irritating.