What "Skills" Will You Make for Mycroft?


Check this link out. The documentation isn’t great, but it’ll do for now. Once Mycroft starts shipping and I can begin writing new skills, I’ll probably help out with the documentation (since numerous people have been asking about it :smile:).


Some skills that I definitely will make as soon as possible:

  • Email
  • Intercom (a front-door intercom, e.g. for when someone knocks on your door)
  • Touchscreen Display

I’m guessing email is probably already under development, so I’ll leave that to others :wink:.

Are there any plans to make a touchscreen display? For example, I’d really like it if the dining room or living room had a Mycroft with a touchscreen (maybe the official 7" touchscreen?). Then you could do things like viewing all your security cameras, checking that all your Mycroft units are working, changing configuration, etc.


Is there already a skill for Philips Hue?


@Simple-codinger I don’t think so. It may be under development, but I can’t find any Philips Hue skills in the GitHub repository, so I’m assuming not.

EDIT: Never mind. Found a pull request: https://github.com/MycroftAI/mycroft-core/pull/240


I love the idea of an “intercom bot”! Having an AI answer the door and be able to tell the difference between interactions I want (e.g. Girl Scout Cookies or FedEx deliveries) and interactions I don’t want (e.g. religious parties, political pollsters, or people trying to sell me magazine subscriptions) would be amazing!

I really like the idea of touchscreen support too! I recently got my SmartiPi Touch case (https://smarticase.com), and a simple Mycroft GUI that could run on the display would be pretty cool. That type of display could bring “facial” features to Mycroft like we see with the Cozmo bot (https://anki.com/en-us/cozmo). On top of that, just displaying some basic information could make Mycroft accessible to individuals with special needs. It could even be pushed to external displays like TVs with the standard model.


We looked at using https://matrix.org for the intercom stuff.


Thanks! I’ll look into matrix as soon as I can.

I have a basic prototype of the intercom working by manually piping audio data between the Raspberry Pi and the browser. However, this results in latency of around 1.5 seconds (!!!) and also requires an extremely good internet connection.

Furthermore, it doesn’t work on Android or iOS, so there’s still some work to do :smiley:
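For context, the manual piping approach can be sketched roughly like this. This is a minimal, stdlib-only Python sketch, not the actual prototype: the raw-PCM-on-stdin assumption, chunk size, and host/port are all mine.

```python
# Rough sketch of piping raw audio over a TCP socket, assuming raw PCM
# frames arrive on stdin (e.g. from `arecord`). Small chunks and
# TCP_NODELAY keep per-hop buffering down; large buffers are one common
# source of latency like the ~1.5 s mentioned above.
import socket
import sys

CHUNK = 1024  # bytes per send; smaller chunks mean lower buffering latency


def iter_chunks(source, size=CHUNK):
    """Yield fixed-size blocks from a binary file-like object."""
    while True:
        block = source.read(size)
        if not block:
            return
        yield block


def stream_audio(host, port, source=sys.stdin.buffer):
    """Forward audio chunks from `source` to a listening receiver."""
    with socket.create_connection((host, port)) as sock:
        sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
        for block in iter_chunks(source):
            sock.sendall(block)
```

A matching receiver would read from the socket and feed the data to a playback pipeline; the browser and mobile side is the hard part, which is where something like Matrix comes in.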


It does work, I think. Check out https://vector.im/


It seems so obvious for openHAB and Mycroft to work together that I don’t understand why they don’t already. There’s so much duplication of effort here; integrating would let the Mycroft team focus on the product and refinements instead of redoing things openHAB already does. openHAB is amazing and already works with the Amazon Echo. It works with just about everything you can imagine and could talk to Mycroft via HTTP or MQTT.
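To show how thin that bridge could be, here is a hypothetical sketch of driving openHAB’s REST API from Python. openHAB accepts item commands as a plain-text POST to /rest/items/&lt;ItemName&gt;; the host and item name below are made-up examples, not part of any existing skill.

```python
# Hypothetical bridge from a Mycroft intent to openHAB's REST API.
# Sending a plain-text POST body to /rest/items/<ItemName> issues a
# command such as "ON", "OFF", or a setpoint like "21.5".
from urllib import request

OPENHAB_URL = "http://openhab.local:8080"  # assumed host


def build_command(item, command):
    """Return the (url, body) pair for an openHAB item command."""
    url = "%s/rest/items/%s" % (OPENHAB_URL, item)
    return url, command.encode("utf-8")


def send_command(item, command):
    """POST the command; requires a reachable openHAB instance."""
    url, body = build_command(item, command)
    req = request.Request(url, data=body, method="POST")
    req.add_header("Content-Type", "text/plain")
    with request.urlopen(req) as resp:
        return resp.status
```

A Mycroft intent handler would then map an utterance like “turn on the living room light” to something like `send_command("LivingRoom_Light", "ON")`, leaving all the device logic to openHAB.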


It would be great to see support for the Pebble watch (built-in microphone) and openHAB open-source home automation. The Jawbone Jambox (built-in microphone and “Siri” button) seems like a good device to support as well. Cool project!


Hey, my first comment here.

First, I’d like to thank you for the amazing software you’ve brought us.
To answer your question: a skill I’d like to make would let Mycroft give commands to a PLC-based domotics (home automation) system, which would be very awesome to do. However, I’m not yet able to work out what I have to program to let Mycroft communicate with hardware. I know there is documentation about Adapt, Mimic, and mycroft-core.

The thing is, when I read that documentation I feel like I’m diving blind into a deep sea of awesome knowledge about how to program Mycroft. My question to you, and to everyone else active in this community: would you please give me, and others struggling with the same problem, a guiding light and a direction toward understanding the Mycroft programming concept? Even as I write this I have doubts about posting it here, so excuse me.
Thank you for understanding, guys and gals, and keep doing this awesome job making this project beautiful for the world.



Hi Goldsmith,

You can look at https://docs.mycroft.ai/ for more information. :wink:


Thank you very much! I appreciate it.


While on quiet coach on train with one rude person loudly talking on his mobile:
Mycroft, enable signal jammer.
{Mycroft enables a short-range signal jammer that cuts off the rude guy’s mobile connection}

While in my car and not able to use my mobile:
Mycroft, what’s on my calendar for the rest of the week?
{Mycroft reads my events day by day}

While on my way home in car
Mycroft, is my wife home?
Could you let her know I’ll be 10 minutes late because of traffic?

First thing in the morning, I walk into the kitchen and Mycroft sees me via a PIR sensor
Morning sir, I’ve checked all the servers and server 5 seems to be running low on disk space. I estimate it will take 2 days until this becomes a problem.
Thanks Mycroft!

Sat at my desk at work, Mycroft says: Sir, your wife wants you to pick up some milk on your way home
Thanks Mycroft
Should I remind you when you leave work? (GPS tracking)
Yes please Mycroft.

[Security] Parental Controls

I’ve thought about voice authentication (as a user not connected with the company).

I’m thinking of using LDAP (and, once it’s functional, Kerberos support) to create multiple user accounts. For development that would be a private LDAP server in my home lab; for distribution it could be cloud-authenticated, or a private identity server running on a local machine with enough power to handle requests from every Mycroft across the home or business simultaneously. I’d then teach Mycroft to learn a user’s intonation, cadence, rhythm, and frequency range (skill names: mu_add_user, multi_user_retrain_user) so it can estimate the likelihood that the speaker matches a stored profile. Privileged requests would be validated by a spoken passphrase in addition to requiring that the match probability be within the “very high” threshold, which I suppose I’ll determine once I can get it to give me probability percentages (skill names: auth_phrase_verify, auth_phrase_change). Side note: the passphrase is separate from the LDAP password; the password is for key entry only, used to verify the user during manual input and for Mycroft to supply for authentication once the user is verified.

Once you have a way to authenticate users with reasonable probability estimates, it’s safe to build skills that use the authenticate_user and authenticate_privileged_user skills. For instance, take a family with two adults and two children. In what follows, [] denotes Mycroft logic (highly simplified), “” indicates speech (either from a user or from Mycroft), and () indicates a user action that is not spoken per se.

Child: "Hey mycroft, lower the temperature"
Mycroft:[recognizes child 1, probability “very high”] "I’m sorry, you don’t have permission to change the temperature"
Child 2 (attempting to mimic a parent’s voice): "Hey mycroft, lower the temperature"
Mycroft: [threshold for request within “questionable range” proceed with step 2] "Parental controls are enabled. Please authenticate using your passphrase"
Child 2 (still attempting to mimic said parent): "[gives the correct phrase]"
Mycroft: [recognizes probability threshold below requirement, intonations not within threshold for passphrase] "I’m sorry, I didn’t recognize your response. Please try again."
Child 2 (forgets to mimic parent): "[gives the correct phrase]"
Mycroft: [recognizes child 2, very high probability; intonations not within threshold for passphrase] "It’s not polite to pretend to be someone else, [child 2]"
Mycroft: [generates log of interaction, notifies parents using preferred notification method for security notifications]

And because I like to round things out, an authorized use:
Parent 1: "hey mycroft, lower the temperature by one degree"
Mycroft: [Parent 1 probability match “Very High”, short prompt step 2] "pass phrase required to complete request"
Parent 1: "[gives correct pass phrase]"
Mycroft: [Parent 1 match “Very High”, Passphrase threshold met, complete request] “consider it done”
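The decision logic in those dialogues could be sketched like this. This is toy Python: the threshold numbers and return values are made-up placeholders, not part of any real Mycroft API.

```python
# Two-step check: (1) the speaker-match probability decides whether to
# deny or escalate to a passphrase challenge; (2) the passphrase must be
# correct AND spoken in the claimed user's own voice.
VERY_HIGH = 0.90     # assumed "very high" match threshold
QUESTIONABLE = 0.60  # assumed lower bound of the "questionable range"


def check_request(speaker, match_prob, privileged_users):
    """Step 1: deny the request or escalate to a passphrase prompt."""
    if match_prob >= VERY_HIGH:
        # Confident identification: privileged users get challenged,
        # everyone else is refused outright (the "child 1" case).
        return "prompt_passphrase" if speaker in privileged_users else "deny"
    if match_prob >= QUESTIONABLE:
        # Identity unclear (the "child 2 mimicking a parent" case):
        # challenge rather than guess.
        return "prompt_passphrase"
    return "deny"


def check_passphrase(phrase_correct, phrase_match_prob):
    """Step 2: content and voice profile must both pass."""
    return phrase_correct and phrase_match_prob >= VERY_HIGH
```

The key design point is that a correct passphrase alone is never enough; the delivery has to match the claimed speaker’s profile too, which is what trips up the mimicking child in the example.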

Other skills that would be highly useful once user authentication is available:

  • Password manager
  • Advanced diagnostics (beyond basic troubleshooting; scripted procedures run on command)
  • Disable/enable authentication for users
  • Add/change/remove users
  • Parental controls (can be a single skill or command calls baked into other skills)
  • Configure access classes (LDAP)
  • …many others


I got a job, and now I suddenly don’t have time, but I do have more money.
What we currently need as a community is an efficient way to convert money/donations into Mycroft skills.
Something like BountySource or Patreon, where I can throw money at people to encourage them to develop the skills I want to see exist.


I have ported most of Alexa’s skills over. If anyone is interested in helping, just look for my other post; it tells you what I’m doing.


I’m now in talks with a web designer friend; we’ll be making a blog, a YouTube channel, and a Patreon, explaining things, posting tutorials, and showing off skills for Mycroft. We’ll also be taking requests for skills and Mycroft code.

So stay tuned for news, and start getting ideas from Alexa’s skill list!


I’m surprised no one has done SNMP traps as triggers to Mycroft yet. As an Infrastructure / Network Engineer, I definitely see value in using Mycroft to interact with my Network via Voice.

Mycroft: "Andy, the SharePoint site isn’t accepting logins again."
Me: "Thank You, Mycroft. I’ll check it out. Can you ping the server for me?"
Mycroft: "Working… the server is responding as expected: less than 1 millisecond, 4 replies."
Me: "Thank you. Can you check…"

Some of this is quite advanced (mapping “ping the server” to the correct command and IP; deciding the server is “responding as expected”), but that can be added later.

Can Mycroft do triggers from external sources?
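To make the trigger idea concrete, here is a hypothetical sketch of the bridging step: mapping a received trap OID to an utterance and wrapping it in a `speak` message for Mycroft’s websocket messagebus. The phrase table is my own invention (though the two OIDs shown are the standard SNMPv2 linkDown/linkUp trap OIDs), and actually listening for traps would need something like pysnmp or snmptrapd.

```python
import json

# Hypothetical OID-to-phrase table; a real deployment would populate
# this from its own monitoring setup. 1.3.6.1.6.3.1.1.5.3 / .5.4 are
# the standard SNMPv2 linkDown / linkUp trap OIDs.
TRAP_PHRASES = {
    "1.3.6.1.6.3.1.1.5.3": "Link down on {source}",
    "1.3.6.1.6.3.1.1.5.4": "Link up on {source}",
}


def trap_to_speak_message(trap_oid, source):
    """Turn a received trap into a Mycroft messagebus 'speak' payload."""
    phrase = TRAP_PHRASES.get(trap_oid)
    if phrase is None:
        return None  # unknown trap: stay silent
    return json.dumps({
        "type": "speak",
        "data": {"utterance": phrase.format(source=source)},
    })
```

Sending that JSON over a websocket to the core’s messagebus would make Mycroft speak the alert, which answers the external-triggers question in principle: anything that can open a websocket to the bus can inject messages.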


My idea is to create a set of skills with different intents that let Mycroft act as a penetration tester’s assistant, helping to scan hosts and networks and to discover vulnerabilities in web applications.

It would use open-source tools and APIs such as nmap, OpenVAS, ZAP Proxy, w3af, and Metasploit.

I’m a beginner, so I’m just starting to learn how to create my first skill, but so far this is my idea.
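For what it’s worth, the scanning part of such a skill can start very small. A hypothetical sketch (the validation regex and flag choice are mine, and it assumes nmap is installed; since the target comes from speech recognition, validating it before it reaches a subprocess matters):

```python
import re
import subprocess

# Crude allowlist for hostnames / IPv4 addresses, so a transcribed
# target can never smuggle shell metacharacters into the command.
TARGET_RE = re.compile(r"^[A-Za-z0-9.\-]+$")


def build_nmap_args(target, fast=True):
    """Compose an nmap argument list from a (spoken) target string."""
    if not TARGET_RE.match(target):
        raise ValueError("refusing suspicious target: %r" % target)
    args = ["nmap"]
    if fast:
        args.append("-F")  # top 100 ports only, for a quick spoken summary
    args.append(target)
    return args


def run_scan(target):
    """Run the scan; a skill would summarize stdout before speaking it."""
    result = subprocess.run(build_nmap_args(target),
                            capture_output=True, text=True)
    return result.stdout
```

An intent handler could then map “scan host scanme dot nmap dot org” to `run_scan("scanme.nmap.org")` and speak a condensed version of the open-port list.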