What do you want to see from Mycroft?

If you use the Mycroft software, either with our hardware or your own, you might have noticed several recent updates. 18.02b was a major milestone for the project, and we’re working diligently toward our next major release, 18.08b. User input is what fuels progress here at Mycroft, whether that’s through testing, updating skills, or documentation feedback. In fact, a lot has been happening lately. Check out our pieces about Mozilla DeepSpeech, the Pi Day release, and how you can help train a neural network.

We NEED your input, and we’d like to invite you to help us decide on the direction of our next major release. What do you want to see in the 18.08b release?
We are inviting our community to join CEO Joshua Montgomery as he works with Mycroft members and opens up a discussion. What do you want Mycroft to be able to do?

Post your thoughts here or check out the dates listed on the blog.


I’ll start.

  1. Like a couple of users in another thread, my wife has a really hard time triggering the wake word, while Mycroft will respond to me from two rooms away, speaking softly, if the house is quiet. It would be great if making Mycroft respond easily to everyone, whether male, female, child, etc., were a priority.

  2. I’m impressed with the quality of the skills from the Mycroft team, and I would like to see more official Mycroft skills of equal quality.

  3. Some skills that I think should be there out of the box as basic functionality for a voice assistant:

    • Basic Grocery/Shopping List (add/remove things on a list or multiple lists, read the list back to me, maybe send it to me somehow like email or various messaging platforms if requested)

    • Basic music-playing skill (“Mycroft, play some bluegrass.” This shouldn’t require any subscriptions or configuration. There are places like libre.fm and soma.fm that have streams/channels of different genres.)

  4. I envision Mycroft being the assistant that is useful to me no matter where I am thanks to the power of the internet. I would find Mycroft much more indispensable if I could ask it questions, set reminders, etc. when I wasn’t at home. I don’t mean I just want Mycroft on other devices. I want to be able to interact with my Mycroft unit that’s at my house from out and about more easily.


Here are some random thoughts

  1. Mycroft should track the first interaction of the day and the time since the last interaction. I can see that info being of use in certain situations.
User: “Hey Mycroft, what is the weather today”
Mycroft: “The weather today is blah blah blah”
(if first interaction of the day)
    check for scheduled tasks
    report on traffic conditions
    deliver quote of the day
    deliver news headlines
    initiate a conversation with the user
        Mycroft: “Have you taken your medicine today?”
        User: “No, remind me again in half an hour”
        {schedule a reminder for half an hour}
(if >4 hours since last interaction)
    Mycroft: “How has your day been?”
  2. Support for displays. Ideally this would allow configurable sizes to support various enclosures.
  3. Make Mycroft’s volume sensitive to background noise: the louder the background noise, the louder Mycroft should speak, and vice-versa. This may require a calibration step due to varying hardware configurations and room acoustics.
  4. Support for waking on a defined hand signal (using the camera in the Mark II or a webcam). Using this wake method would signal Mycroft to interact via the screen rather than speaking (other people in the house may be sleeping).
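The first-interaction-of-the-day / long-gap logic sketched above could be tracked with a small helper. This is only an illustration: the class, its names, and the four-hour threshold are invented here and are not part of the Mycroft API.

```python
from datetime import datetime, timedelta

class InteractionTracker:
    """Hypothetical helper that classifies each interaction so a skill
    could decide whether to run its morning briefing or a check-in."""

    def __init__(self, gap=timedelta(hours=4)):
        self.gap = gap      # how long counts as a "long gap"
        self.last = None    # time of the previous interaction

    def classify(self, now):
        """Return 'first_of_day', 'long_gap', or 'normal' for an
        interaction at `now`, then record it as the latest one."""
        prev, self.last = self.last, now
        if prev is None or prev.date() != now.date():
            return "first_of_day"   # morning briefing, reminders, news
        if now - prev > self.gap:
            return "long_gap"       # e.g. "How has your day been?"
        return "normal"
```

A skill could call `classify(datetime.now())` at the start of each intent handler and branch on the result.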

I wonder if it would be possible to have Mycroft recognize sign language with the camera and respond via the display. It would be a nice thing for some of my deaf friends.


In the office, we reference the rock, paper, scissors video that Greg did, and talk about how he really opened up the possibilities with that demo of image recognition. Would love to see someone jump on this.


As a developer, it’s really important to me to both be able to reach a wide audience as well as to use the product I’m making myself.

As Mycroft is limited to Linux and a rare home device, I can’t justify learning to develop for Mycroft: it won’t reach anyone, and I can’t use it in practice (I use a Mac and an Android phone, and don’t have a Mark device).

I’d really appreciate it if you prioritized the Android version.

What I would love to be on the roadmap is a distributed Mycroft - or, to put it another way, a way for the user to access the same Mycroft “installation” from multiple devices (including mobile).

In essence, I want to be able to pick where I want to store the knowledge, skills etc, that my Mycroft knows about me.

It could be on a Mycroft Mark II with other Mark IIs placed throughout the house.
It might be on a Mycroft HomeBase device sold by Mycroft (that includes a nice GPU for STT processing) in combination with an app on my phone.
It could be hosted on a virtual host somewhere with good internet access, connected to my home Mark II running the local skills for shutting the blinds and turning on the lights.

What this means, as far as I can understand, is for the Mycroft solution to be structured with distribution in mind. If redundancy was part of it, that would be even better.


Great suggestion, @Kallisti!

Continuing with quality core apps, a first-party calendar app would make Mycroft usable as a daily driver for my mother.
She would like to ask Mycroft what’s on the calendar today, and have it respond with entries from her iCal or CalDAV calendars elsewhere (Google, Cozi, published iCals, Nextcloud, etc). If Mycroft can also create and edit events where applicable, that would smooth the seams of the interaction.


Good integration with CalDAV and CardDAV, and even more specifically with Nextcloud, could be killer!

With Nextcloud Talk doing voice and video calls…Mycroft, call Jane Doe.


Have you considered doing a reddit AMA?


faster review/merging of Pull Requests

easier skill submission process, faster review

better docs, forever out of sync with new features :smiley:

better translation coordinating mechanism between teams

better language support in general; in the long term I would like the language to be identified once speech is recorded, then an appropriate STT engine selected and the lang parameter passed along to the intent parser, so Mycroft would understand several languages at once
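The multilingual pipeline described here could look roughly like the sketch below. The engine names, the mapping table, and the language detector are all placeholders; a real implementation would plug in an actual language-ID model and STT backends.

```python
# Hypothetical pipeline: identify the language first, then pick an
# STT engine, and pass the language code along with the transcript.

STT_ENGINES = {
    "en": "deepspeech-en",   # invented engine names for illustration
    "de": "kaldi-de",
    "fr": "kaldi-fr",
}

def detect_language(audio):
    """Stand-in for a real language-ID model; here we just read a
    tag off a fake audio object (a dict) for demonstration."""
    return audio["lang"]

def transcribe(audio):
    """Route the utterance to a language-appropriate STT engine and
    hand the detected language on to the intent parser."""
    lang = detect_language(audio)
    engine = STT_ENGINES.get(lang, STT_ENGINES["en"])  # fall back to English
    # A real implementation would run `engine` over the audio here.
    return {"engine": engine, "lang": lang, "text": audio["text"]}
```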


On a similar path to @Kallisti, I’m thinking that all Mycroft units in a given installation should effectively be one unit. If I give an order in the upstairs bedroom, I should be able to modify or cancel it in the basement. Work distribution should similarly overlap: if I schedule a task on the basement unit and it fails for some reason, one of the other units should be able to pick up the job and complete it.

Unless I specifically constrain them when setting them, alarms and reminders should play on all units simultaneously to ensure that I don’t miss anything. This should apply to any informational statement Mycroft may make that is not a direct, immediate response to a question, where the unit hearing the question would provide the answer. By way of example, if I ask Mycroft what the weather is like, the nearest unit can answer me; if I ask for severe weather alerts, all units should announce those.

The option for data to be mirrored on all units in a given group should be available. If used, to tell one is to tell all, which can allow for failover if needed.

The above would apply to a Mycroft smart phone app also. I should be able to tell Mycroft on the phone something, and the home units get the instruction also. A phone app would be an exception to announcements coordination, using standard notification logic for the device.
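The "all units act as one" behavior could be modeled as a toy group object: direct answers play only on the unit that heard the question, while announcements fan out to every unit. The class and unit names are invented here purely for illustration.

```python
class UnitGroup:
    """Toy model of a group of Mycroft units that share announcements
    but answer direct questions locally (illustrative only)."""

    def __init__(self, units):
        # Map each unit name to the list of lines it has spoken.
        self.units = {name: [] for name in units}

    def answer(self, heard_by, text):
        """A direct response plays only on the unit that was asked."""
        self.units[heard_by].append(text)

    def announce(self, text):
        """Alarms and severe-weather alerts play on every unit."""
        for spoken in self.units.values():
            spoken.append(text)
```

A phone app could be registered as a unit that opts out of `announce` in favor of standard notifications, matching the exception described above.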

I can see the logic in wanting linking to on-line calendars and reminder systems, but I would like to see a Mycroft-only calendar and reminder system that is strictly among devices and the apps in a group, and does not use cloud storage.

If you’re going to dream, dream big.


I love this idea. I also think that the groupings should be end user configurable. For example, in a family situation you might want Mycroft devices that are grouped as “parent” devices primarily used by parents, and devices that are grouped as “children” devices primarily used by children - because you would want different configurations applied to each group, even though they’re in the same household on the same network.

Thinking enterprise scale, you probably also want the ability to have multiple subgroups on the same local network.


Concept of households built into Mycroft Core.
I think this is something almost all software fails to do (and for good reason, it’s hard). I have a wife and 3 kids. I’ve also lived as a bachelor with 4 other guys before that. Sharing parts of your life is a thing that happens inside the home. There is also a difference in authority (my kids do not have the same level of privileges that my wife and I do).

I think it’s ridiculous that the HomePod or the Echo will let your kid walk into the room and take over whatever music is playing (this has happened at a friend’s house on multiple occasions). Yes, teaching kids etiquette is a must, but at the same time, I want a digital assistant that is hooked into so many parts of my life to understand that when my 4-year-old says something, they are just a 4-year-old and should be treated accordingly. :smile: You can imagine a skill that lets you fire up a Netflix/Prime Video show on the TV. If I’m watching something and my kids walk in, they should not be allowed to just say “Hey Mycroft, play Magic School Bus” and interrupt what I’m watching. Conversely, I should be allowed to interrupt what they put on. Finally, I should be able to cancel commands issued by others. If we have guests over and they intentionally or unintentionally execute a skill, I (as owner/admin) should be able to cancel the command.

In the same way, understanding that my life is tightly coupled to those of my immediate family. For example, calendar events may be significant to me even if they are on my wife’s calendar. Or “Hey Mycroft, where are my kids?” or “When will my wife be home?” or “Are we free Thursday night?”
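The authority levels described above could be sketched with a simple rank table. The role names and rankings below are made up for illustration; a real households feature would need per-skill and per-device policy on top of this.

```python
# Hypothetical household roles, highest authority first.
RANK = {"admin": 3, "parent": 2, "child": 1, "guest": 0}

def may_interrupt(requester, owner):
    """True if `requester` may override/cancel a command issued by
    someone with role `owner`. Admins can cancel anything; everyone
    else needs strictly higher rank than the command's owner."""
    if requester == "admin":
        return True
    return RANK[requester] > RANK[owner]
```

So a parent can interrupt a child's show, a child cannot interrupt a parent's, and the owner/admin can cancel a guest's command.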

Location of people awareness
@mwgardiner brought up the idea that alarms and reminders might go off on all devices to make sure you don’t miss them, but why not instead track my location inside the house using BT, WiFi, sound, etc., so that the nearest Mycroft unit can get my attention without disturbing the entire household? An app (I know there is one for Android, but I use iPhones/Macs) could also have notices pushed to it, as well as keeping Mycroft apprised of my location.
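Picking the nearest unit from Bluetooth or WiFi signal readings could start as simply as the sketch below. The device names and RSSI values are invented; real proximity estimation would need smoothing and calibration.

```python
def nearest_unit(rssi_by_unit):
    """Return the unit with the strongest signal to the user's phone
    or wearable, i.e. the best guess at the nearest device. RSSI is
    in dBm, so higher (less negative) means closer."""
    return max(rssi_by_unit, key=rssi_by_unit.get)
```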

Complete privacy
Another item I’d be really excited about is Mycroft being able to be completely private (disconnected from home.mycroft.ai). No audio leaving your personal network. I’d be ok if that meant installing a server inside my network to act as the hub for the system. External devices (like the one in the truck) could VPN in.

Vehicle enclosure
I am thinking of putting my Picroft in my truck and letting it tether to my phone for network access. Having Mycroft understand that it’s in my truck would be cool.


Just some ideas I have:
make it so I can teach it to do things through voice
have it so it can “learn” on its own without having to program every skill
I want to integrate it into my DIY smart home setup and have child nodes running on all my systems (phones, computers, Xbox, car)


I think a formal, central repository of improvements and suggestions might be helpful.

Right now people have to search GitHub, the forums, and chat rooms to see if a feature has already been suggested. In some cases, the improvement may be on the Mycroft roadmap or may be under development by the community.

I found a software development/project management platform called Taiga (https://taiga.io/), and it ties into GitHub, so it would be useful for end users and developers. Taiga is just one option; there are a ton of software/project management tools out there.

Having one location to search and view the Mycroft project as a whole might be interesting.



First and foremost, proper support for languages.
One point should be an easy switch, including the wake word, from English to any other language.

Another point is that it is not uncommon to have a lot of English words in other languages, which makes the text-to-speech struggle most of the time. Talking to the device should also be easy when mixing native words with properly pronounced foreign words.


I agree! “Roadmap” is my favourite word - we do need a public roadmap that isn’t just text based, but more visual.

Personally I really like the Thoughtworks Tech Radar, but I’ve used taiga.io and that’s good as well. There’s also aha.io which is a bit more commercial. Internally we use Flow, and that has public roadmap type functionality but it means our subscription is more expensive.

But yes, I support us having a public roadmap :wink:


Banking… “Mycroft, pay my electricity bill.”
