Apple is attempting to differentiate itself by respecting privacy while still processing user data to make its services smarter. (See https://www.wired.com/2016/06/apples-differential-privacy-collecting-data/) The technique is called “Differential Privacy”, which, as I understand it, involves client devices adding “noise” to the user’s data, making it hard to link the real data back to the user. This is apparently more reliable than anonymizing data.
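As a rough illustration of the idea (not Apple's actual implementation, which is not public in detail), here is the classic "randomized response" mechanism, one of the simplest forms of local differential privacy: each device sometimes lies at random, yet the aggregate statistic can still be recovered server-side. The function names and parameters below are just for illustration.

```python
import random

def randomized_response(true_bit: bool, p_truth: float = 0.5) -> bool:
    """Report the true answer with probability p_truth,
    otherwise report a uniformly random bit (the added 'noise')."""
    if random.random() < p_truth:
        return true_bit
    return random.random() < 0.5

def estimate_rate(reports: list, p_truth: float = 0.5) -> float:
    """Recover an unbiased estimate of the true rate from noisy reports.
    The observed rate satisfies: observed = p_truth * true + (1 - p_truth) * 0.5,
    so we invert that relationship."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(0)
truth = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(b) for b in truth]
print(estimate_rate(reports))  # should land close to 0.30
```

No individual report can be confidently traced back to a user (any single answer may be noise), but the population-level statistic survives, which is the property the Wired article describes.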
I am absolutely not an expert in either machine learning or Differential Privacy, but it sounds like this technique, among others, could be very important for enhancing user privacy. Is this a technique the team has looked at?