Anyone have iOS dev experience who can help port Mimic to run on the platform? Our friends over at VocalID are really interested in making this happen so that people with disabilities can use the voices on iOS, and it will also help us port Mycroft to iOS in the future.
I’m not even sure where to start with this, so perhaps someone can point us in the right direction.
Since most of your code is in C, I believe you can use Objective-C to call the C functions directly. Am I right?
C and Objective-C aren’t my primary languages, but it’s my understanding that you can call C functions from Objective-C.
Are you interested in working with us to port Mimic to iOS? As I understand it, iOS is the only major OS that Mimic doesn’t run on at this time.
Thank you for the invite, but I am new to iOS myself; I’ve only been working with Core Audio for three months. I was just looking through your repo to see whether I could create Objective-C wrappers. I want to test it in my app.
Oh, awesome. Would love to see Mycroft implemented in an app.
I got a crude implementation of flite version 2.0 working in my app on iOS, so Mimic shouldn’t be far behind. But there are some issues:
- I implemented a wrapper, since I don’t yet know how to compile the library on its own and then link it in Xcode.
- A few of the library files, such as cst_val.c and cst_tokenstream.c, don’t compile in Xcode for iOS. I had to revert cst_tokenstream.c to an older version to get the app working.
Would it be easier to consider a framework like Ionic, which lets you write iOS/Android/Windows mobile apps from the same (mostly Angular) codebase?
I’m actually looking at writing a Mycroft app for iOS, as I know a bit of Swift. It would probably be smart to use pybeeware’s suite of tools and write something that communicates with the messagebus.