What about senses?

Hey there,

Any plans for implementing … let’s call it … senses for Mycroft?
It would be great to fire up a little Pi Zero W with some sensors attached and send those values over to Mycroft, which could then do something with them.

Currently we can “only” implement a new skill and fetch those values, or am I wrong?

best,
Markus

Markus,

I have not yet worked with the Pi Zero; however, I have used the Adafruit Feather ESP8266 and implemented a simple server on it to handle HTTP commands from a Picroft setup. I use the Python requests library to communicate from the Picroft to the ESP board:

I assume the same could be done for fetching sensor values from the ESP board.
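For illustration, here is a minimal sketch of both directions. The board address and endpoint paths are made up; substitute whatever routes your ESP server actually serves:

```python
import requests

ESP_HOST = "http://192.168.1.50"  # hypothetical LAN address of the ESP8266

# Send a command to the board (e.g. switch something on)
requests.get(ESP_HOST + "/led/on", timeout=5)

# Fetch a sensor reading; assumes the server replies with JSON
# such as {"temperature": 21.5}
response = requests.get(ESP_HOST + "/temperature", timeout=5)
print(response.json()["temperature"])
```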

In terms of senses, you can give your Picroft or Mark 1 sight directly with the PiCamera:
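For example, a bare-bones capture with the picamera library (the output path here is just a placeholder):

```python
from time import sleep
from picamera import PiCamera

camera = PiCamera()
camera.start_preview()
sleep(2)  # give the sensor a moment to adjust exposure
camera.capture("/home/pi/image.jpg")
camera.stop_preview()
camera.close()
```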

Again, not exactly what you are aiming at, but it is one of the five!

Greg

I had a look at the core code. From what I see, I would say that senses could be implemented by more or less copying over the skills code. It would be a new WebSocket with control code to receive JSON data through HTTP push, plus a new sense API object that could be extended as easily as skills.
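In the meantime, a sensor node can already push readings onto the existing message bus. A rough sketch, assuming the default bus address ws://localhost:8181/core and a made-up message type "sense.temperature":

```python
import json
import websocket  # pip install websocket-client

# Connect to the Mycroft message bus (default port 8181, path /core)
ws = websocket.create_connection("ws://localhost:8181/core")
ws.send(json.dumps({
    "type": "sense.temperature",  # made-up type; pick your own
    "data": {"value": 21.5, "unit": "celsius"},
    "context": {}
}))
ws.close()
```

A skill should then be able to pick those readings up by registering a handler with self.add_event("sense.temperature", handler).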

I know it’s not that easy :wink:

Haven’t had time to play around with it. I’m struggling to get the VoiceHat running and also don’t have that much time currently. But it’s one enhancement I want to play around with in the next weeks.

I don’t know if this is helpful here, but thought I would post it just in case:

Kind regards,
Kathy

Have you looked into the PubNub or Cayenne APIs?

I have been working on a server kind of Mycroft; I can connect any (hacked) Mycroft instance, and I am now working on some standalone clients (CLI and voice), like endpoints to have Mycroft all over the place :slight_smile: