Setting up the Chromecast backend?


#1

I have Mycroft running on an RPi3, which it shares with Home Assistant. HA currently provides spoken output through my Chromecast, which, for various reasons, is the best way forward for my situation. What I’d like to do, and what it looks like I can do based on work done these past few months, is use Mycroft as a Home Assistant notification platform. So, instead of speaking directly, HA pipes notifications to Mycroft, which sends them to the Chromecast. Likewise, I can eventually hook a mic array to the RPi and talk directly to Mycroft, and use it to drive my various automations.

In looking through the release notes, it seems like there is an experimental Chromecast backend. Am I correct in noting that this receives TTS output as well? If so, and if Mycroft can speak through my Chromecast, are there any setup docs? I’m OK with it being experimental and perhaps not super polished. My needs aren’t too great, with my HA installation only generating a few speech notifications per day.

Thanks!


#2

I made a bit of progress. I added this to my config:

  "Audio": {
    "default-backend": "nolan's chromecast"
  }
}

And from the logs it does appear to find the Chromecast and use it as a backend, but I hear no speech. Looking at mycroft-audio.log, it seems I’m stuck in the loop where I’m given a code to enter at home.mycroft.ai. I verified that Home Assistant can speak through the Chromecast, and my sound system’s volume is fine. I’m just not hearing anything from Mycroft.


#3

That is probably the pairing code. Did you register your Mycroft with that pairing code at home.mycroft.ai?
After that, it starts the other skills and services, and then you can test your output.

I am actually working on outputting just the music and non-voice audio to the Chromecast.


#4

Yeah, it’s the pairing code for sure, but if I’m not getting speech output, then there isn’t a whole lot of value to pairing the device. Am I reading this correctly in that TTS output isn’t currently streamed to Chromecasts? If that’s the case then that is my answer, as I assumed that was currently possible. If it isn’t possible, is it planned, and if so, is there an issue I can watch to know when it works? I’ll happily create one if there is value in that.

Thanks.


#5

I’m not a mycroft-core developer, so I can’t say for sure. But I can see that all my skills load AFTER pairing, so if the Chromecast backend is a skill, it will only start after pairing.

Now, pairing only has to be done once, so technically it should work after you do that.

  "Audio": {
    "default-backend": "nolan's chromecast"
  }

I will try to test your suggestion tonight; I don’t have a Chromecast in the office to test with.
I have a speaker connected to the Raspberry Pi, and it’s too close to the mic. TTS output is OK, but when playing music it never picks up the stop command. That’s why I’m looking into streaming only music to the Chromecast.


#6

Hi, the audio services are not intended for TTS. They’re intended to handle playback of audio media (news, podcasts, music), queueing, and such.

I think the best you can do right now is use a PulseAudio plugin to stream to the Chromecast. (http://www.webupd8.org/2016/03/how-to-stream-audio-to-chromecast-or.html)

You can also override the playback command in the config and use something like catt instead of the default paplay to cast the audio to the Chromecast, but you will likely need to create a wrapper script.
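To make the override idea concrete: Mycroft builds its playback command by substituting the wav path for the %1 placeholder in the configured command line. Here is a minimal sketch of that substitution (my own helper, not Mycroft’s actual code), showing catt swapped in for the default paplay:

```python
import shlex

def build_play_cmd(cmdline_template, wav_path):
    """Substitute the wav path for the %1 placeholder, as Mycroft does
    for its configured play command."""
    return [wav_path if tok == "%1" else tok
            for tok in shlex.split(cmdline_template)]

# Default config plays locally with paplay:
print(build_play_cmd("paplay %1", "/tmp/speech.wav"))
# → ['paplay', '/tmp/speech.wav']

# Swapping in catt would hand the same file to the Chromecast instead:
print(build_play_cmd("catt cast %1", "/tmp/speech.wav"))
# → ['catt', 'cast', '/tmp/speech.wav']
```

A wrapper script is only needed if you want to do more than a simple command substitution (e.g. pick a device or convert the file first).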

I’ll think some more about this and see if I can come up with a clean solution for this kind of integration…


#7

Thanks. Here’s the blog post that led me to believe that the Chromecast backend would support TTS. Specifically, this text, emphasis mine:

  This include a Major unification of audio handling in Mycroft. The AudioService now manages all sound output, including text to speech, music files and audio streams. This enables:

Thanks for the clarification. I’ll try the pulse approach.


#8

I see the confusion, sorry about that.

The TTS was moved over to the audio service, but it runs separately from the playback backends so that it can speak over playing audio.


#9

Yep, technically this does work, but it adds a couple of seconds of extra lag.
I tested with stream2chromecast and changed the play_wav command in /etc/mycroft/mycroft.conf to "/path/to/stream2chromecast %1".
It’s probably better to change it in ~/.mycroft/mycroft.conf instead.
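For reference, assuming the stock play_wav_cmdline key in mycroft.conf (the script path is a placeholder, as above), the override would look like:

```
{
  "play_wav_cmdline": "/path/to/stream2chromecast %1"
}
```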

I also installed avconv:

  sudo apt-get install libav-tools

To stream I used:


#10

Awesome, this worked. Thanks! Hadn’t found stream2chromecast at all in my googling, so thanks for that.