Easiest way to use Mycroft completely offline

I have backed the new Mycroft and would ideally like to use it completely offline.
I have a central home automation server that fetches all the important data from sensors or APIs that should be controlled by Mycroft.

But I want to avoid my speech being sent to a server outside my controlled home subnet.
So the question is: what is the easiest way to make Mycroft work offline (without an extra server, or with a self-hosted one)?

Hi there @skeltob, this is one of our most requested features: completely offline use. Unfortunately we don't have good documentation on how to do this at the moment. I do know that some of our Community members have done this in the past, including @Jarbas_Ai, who may have some guidance.

I've been waiting for a year for this, and it still has not happened, which is a complete shame considering the recent post about Google/Alexa and Siri:
Alexa_Siri_Google Hidden command attacks

Are there plans to implement such functionality in the future (even if it was an optional setting), @KathyReid?

It's definitely something we want to do @gregory.opera, however it requires a bit of work on our side. Our existing home.mycroft.ai platform is scaled to support tens of thousands of users, and runs across several virtual hosts - probably not all that usable as a local / personal backend. So we need to work on scaling that down.

The other layers to this problem are:

  • Speech to text - really this is the biggest blocker at the moment. Until we can get DeepSpeech to a point where it can run (or at least a vocabulary subset can run) on an embedded device, then we're going to be stuck with cloud-based STT, irrespective of which cloud that runs on. There have been some substantive efforts by the DeepSpeech community toward this objective.

  • Skill support - most Skills need some form of internet connectivity as they're connecting to third party APIs.

  • Configuration settings - at the moment, configuration of Devices is done via Skill Settings at home.mycroft.ai so we would need to find a way to do configuration locally.

I've been waiting for a year for this, and it still has not happened, which is a complete shame considering the recent post about Google/Alexa and Siri:
Alexa_Siri_Google Hidden command attacks

If I understand it correctly, that attack could still work on a completely offline solution.

It is possible to run it offline, but there is no out-of-the-box solution…

If you want to get your hands dirty:

  • remove all metrics
  • disable pairing
  • disable remote config
  • find an offline STT (PocketSphinx is not good…)
  • many skills need internet and won't work

Compare the changes from my fork (slightly outdated, no Python 3).
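The first three items in the list above are mostly configuration rather than code changes. As a rough sketch, in older mycroft-core releases they map to overrides in your user mycroft.conf; the exact key names vary between versions, so treat these as assumptions to verify against your own install:

```json
{
  "opt_in": false,

  "server": {
    "update": false,
    "metrics": false
  },

  "skills": {
    "upload_skill_manifest": false
  }
}
```

Note that disabling pairing usually also means removing or stubbing the pairing skill itself, since configuration alone does not stop it from prompting you.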

I have been running DeepSpeech locally with the 'pretrained' model on a separate computer in my house recently.

It was fairly easy to set up and to point Mycroft at it. The server does not have a GPU, so it's not as fast as it could be, but I think the gain in local network speed makes it not that different from the cloud service, which is kind of slow too, in my opinion. I will probably get a GPU-based server at some point, but I don't expect a huge improvement in speed, because non-GPU is already usable for the short commands I use.
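For anyone wanting to reproduce this, the relevant change is Mycroft's STT section in mycroft.conf. A minimal sketch, assuming you run a deepspeech-server instance on a LAN host; the module name reflects older mycroft-core releases, and the IP, port, and path are placeholders for your own setup:

```json
{
  "stt": {
    "module": "deepspeech_server",
    "deepspeech_server": {
      "uri": "http://192.168.1.50:8080/stt"
    }
  }
}
```

After editing mycroft.conf, restart the Mycroft services so the new STT backend is picked up.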

The big hit I'm taking is with accuracy. I have to speak slowly, right in front of the Mycroft, and leave gaps of silence between words.

I'm currently starting to research ways to better train the local service, but I have not gotten very far.

My pipe dream would be for the Mycroft community to be able to share and assimilate incremental training gains without sharing any audio. That's way over my head at this point, though.

Do you know about the DeepSpeech trainer for improving accuracy?
https://home.mycroft.ai/#/deepspeech

I do now, haha - thanks!

Will that training somehow be accessible for local DeepSpeech services to use? Or would I be able to install the trainer software itself locally?

I'll definitely take some time to listen/train whether or not it's doable locally - great project.

If you manage to release something on Windows any time soon, I'm fairly certain that it'd be really easy to do something with C# or VB.NET that uses the System.Speech.Recognition namespace in the Common Language Runtime. Its accuracy isn't the best, but it is definitely a functional baseline. And it doesn't seem to need pauses between each word like DeepSpeech apparently does.

I think a well-trained DeepSpeech doesn't require pauses between words. It's just the combination of my voice and the 'pretrained' model that Mozilla distributes that seems to result in that.

You can also contribute to the Common Voice project. That is the data DeepSpeech is trained on in the end, so this way it will at least get used to your accent and tone of voice.

Yeah, an offline version would be great. My thought is: even if it's a "server" program running on a GPU server in my house, I could link my embedded devices to that server. Skills that need to reach out to the internet still would, but most of the processing happening within my own network would be great!

Having it run on the device itself would be cool, but at least for me, running a central server within my house that those devices connect to, instead of the cloud, would be awesome!

An offline version is on our Roadmap :wink:

Just curious: https://home.mycroft.ai/#/deepspeech is just English, isn't it? Because I've found some sentences in Spanish and in German… and they were transcribed not only correctly, but written in perfect Spanish (e.g. "cómo estás", even with the proper accents!)

Buenos días @malevolent! Great question. DeepSpeech is starting to provide transcriptions for both German and Spanish. If you're confident that the transcription is correct, then you're welcome to tag it :slight_smile:

Wow, that's awesome news! :raising_hand_man:

It would be interesting to tag or filter those new languages somehow, so that people who understand them can flag them as correct and don't create false positives. I mean, English speakers who don't know any other language will most probably flag any other language as "No", because they will think it is not English. Even I, who speak Spanish, doubted whether to mark it or not, because I didn't see any language filter on the site. It would be a shame to have the few sentences in those new languages marked as non-valid when they are really good, don't you think?

Any updates on offline/private/firewalled use?

As written in the other topic, you might want to have a look at this: https://github.com/MycroftAI/personal-backend
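For anyone trying personal-backend, the usual approach is to point the server section of mycroft.conf at your self-hosted instance instead of home.mycroft.ai. A sketch only; the URL and port below are placeholders, and the exact keys and values your setup expects should be checked against the project's README and your mycroft-core version:

```json
{
  "server": {
    "url": "http://192.168.1.50:6712",
    "version": "v1",
    "update": true,
    "metrics": false
  }
}
```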
