Differential Privacy

Apple is attempting to differentiate themselves by respecting privacy while still processing user data to make their services smarter. (See https://www.wired.com/2016/06/apples-differential-privacy-collecting-data/) The technique is called “Differential Privacy”, which, as I understand it, involves the client devices adding “noise” to the user’s data before it is sent, making it hard to link the reported data back to the user. This is apparently more reliable than simply anonymizing the data.
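
To illustrate what I mean by adding noise, here’s a toy sketch of one classic local differential privacy mechanism, “randomized response”. This is just my own illustration of the general idea, not how Apple actually implements it:

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer only with probability p_truth; otherwise
    report a random coin flip, so any single report is deniable."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Correct for the injected noise to recover the population-level rate."""
    observed = sum(reports) / len(reports)
    # observed is roughly p_truth * true_rate + (1 - p_truth) * 0.5
    return (observed - (1 - p_truth) * 0.5) / p_truth

# 10,000 simulated devices, 30% of which truly have the property being measured
truth = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomized_response(t) for t in truth]
print(estimate_true_rate(reports))  # close to 0.3, without trusting any single report
```

The point is that no individual report can be taken as the user’s true answer, but the aggregate statistic can still be estimated once the noise is accounted for.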

I am absolutely not an expert in either machine learning or Differential Privacy, but it sounds like this technique, among others, could be very important for enhancing user privacy. Is this a technique the team has looked at?


I hadn’t looked at this, but I like the idea - we will look into it more. Thanks @PaulDStevenson!

Great! I know that privacy is very important for many people in this community (open source and privacy often seem to go hand in hand), and Mycroft is well placed to be transparent about how it uses user data and the extent to which privacy is retained. Apple can’t exactly be transparent while their source code is locked away.


I agree 100%. Looking at this, I’m not sure we have the resources to do the same thing, but I think we can take steps to make it clear that users’ privacy is at the top of our priority list.


This looks potentially useful for this, particularly the line “The library comes with tutorials and analysis tools for computing the privacy guarantees provided.”

What do you guys think?