
The coming wave of Apple-ficial intelligence

 

The depth of Apple’s commitment to AI has started to become clear over the summer of 2016. In a recent essay, Steven Levy lays out those efforts in greater detail than we have seen before.

Back in June, I wrote: “Apple is going to continue its investments in improving UX through technologies in the AI stack… Expect more AI in Apple’s products. But I would be surprised to see large-scale open source efforts, of the kind we have seen from Google or Facebook. Open source has rarely been Apple’s bag.”

What’s new in Levy’s piece is more granular detail, including the following nuggets.

  • Apple runs a neural net locally on the iPhone.
  • This neural net weighs 200MB and trains itself in real time, but especially overnight, using the iPhone’s GPU (a rough sketch of this pattern follows the list).
  • Apple cites owning the silicon design (from the far-sighted acquisition of P.A. Semi, I guess) as a driver of improved learning performance.
  • They replaced oldskool voice recognition (hidden Markov models) with a deep learning approach back in 2014.
  • Apple uses third-party-sourced data to generalise the training of things like photo recognition. The recognition itself happens on-device.
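
To make the on-device point concrete, here is a minimal, hypothetical sketch of the pattern being described: buffer user interactions locally, then run a small training pass when the device is idle and charging, with nothing leaving the device. This is not Apple’s implementation (theirs runs a real neural net on the iPhone’s GPU); the toy model, the buffer and the `device_is_idle_and_charging` check are all invented for illustration.

```python
# Illustrative sketch only -- not Apple's code. A toy "train locally when idle"
# loop: interactions are buffered on the device and learned from overnight.
import random


def device_is_idle_and_charging() -> bool:
    # Hypothetical stand-in for a real OS signal (plugged in, screen off, etc.).
    return True


class TinyLocalModel:
    """A one-weight 'model' standing in for an on-device neural net."""

    def __init__(self) -> None:
        self.w = 0.0

    def predict(self, x: float) -> float:
        return self.w * x

    def train_step(self, x: float, y: float, lr: float = 0.1) -> None:
        # One step of gradient descent on squared error, run entirely on-device.
        error = self.predict(x) - y
        self.w -= lr * 2 * error * x


local_buffer: list = []  # interactions captured during the day, never uploaded


def record_interaction(x: float, y: float) -> None:
    local_buffer.append((x, y))


def overnight_training_pass(model: TinyLocalModel) -> None:
    if not device_is_idle_and_charging():
        return
    random.shuffle(local_buffer)
    for x, y in local_buffer:
        model.train_step(x, y)
    local_buffer.clear()  # raw data stays local and is discarded after use


if __name__ == "__main__":
    model = TinyLocalModel()
    for _ in range(200):
        x = random.uniform(-1.0, 1.0)
        record_interaction(x, 3.0 * x)  # pretend the user's 'true' behaviour is y = 3x
    overnight_training_pass(model)
    print(f"learned weight: {model.w:.2f}")  # should end up close to 3.0
```

The real version obviously involves a proper neural net and the phone’s GPU, but the shape is the same: local data, local compute, deferred updates.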

Given the level of Apple’s acquisitions in this area (see my tweetstorm from May 2016, which covers this and predates the Turi deal), their recent senior hires, and more than 280 job openings for hardware & software engineers with machine learning experience, it is reasonable to say Apple is going for it. (I’m fascinated to learn more about “Proactive Intelligence”, their new AI-enriched interface paradigm, which sounds a bit like Weave.Ai or Google Now.)

One open question raised by the Levy piece is whether Apple’s mental model around privacy is a bug or feature when it comes to artificial intelligence.

Apple doesn’t share user data. Its global models are built not on aggregated user data but on externally & expensively sourced data. And Apple doesn’t seem to send much user data back to the cloud to be learned from by super deep networks running on GPU clusters.

The traditional argument would be that it is a bug. Leveraging data network effects allows you to build better, more defensible products faster. Tesla’s network learning (EV#31) is a great example of this, as is Facebook’s capability in face & object detection. And keeping things on a local GPU denies your neural nets the value of lots of GPUs (particularly for training).

The counterargument would be that user privacy may increasingly be a differentiating feature which allows you to sell more stuff. Apple is wealthy, and paying for tons of training data doesn’t make a dent in its cash pool. And, in any case, model performance often tends to a limit beyond which additional training data doesn’t help you.

Here’s my fast take on this: Apple’s approach to user privacy may start to look more like a bug than a feature, but it may not make a difference right now.

  • Externally sourced training data can’t keep up with novel use cases generated by real users, so training on external data takes a long time to improve your overall performance. An Apple car training locally will generally have worse training data than a networked Tesla whose models draw on edge cases from across the world. Worse performance means a worse product means worse market share means…
  • Their introduction of differential privacy, which Levy discusses, suggests they see the value of data network effects and are finding a way to grab that data while staying true to the user privacy promise. What I don’t know is whether differential privacy provides sufficiently good data (see the sketch after this list for a flavour of how local differential privacy works). I’d recommend reading this essay at High Scalability, which looks in more depth at deep learning in Apple Photos and differential privacy.
  • Andrew Ng, Baidu’s deep learning czar, has pointed out that deep learning performance doesn’t seem to flatten out as you add more data. You can just make the network deeper and the model continues to get more performant.
  • Consumers won’t care. For better or worse, they won’t care enough, especially when given the choice of products that feel more ‘magical’.
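
For a flavour of what grabbing that data while preserving privacy looks like, here is a minimal sketch of randomised response, one of the oldest local differential privacy mechanisms. It is not Apple’s actual scheme (the Levy and High Scalability pieces describe something more elaborate, involving hashing, subsampling and noise injection); it just shows the core idea that each device randomises its own answer before sharing it, so aggregate statistics stay useful while any individual report is deniable. The names and parameters below are purely illustrative.

```python
# Illustrative sketch only -- randomised response, a classic local differential
# privacy mechanism, not Apple's actual implementation.
import random


def randomised_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p, the opposite otherwise."""
    return truth if random.random() < p else not truth


def debiased_estimate(reports: list, p: float = 0.75) -> float:
    """Recover an unbiased estimate of the true 'yes' rate from the noisy reports."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)


if __name__ == "__main__":
    true_rate = 0.30          # fraction of users whose honest answer is "yes"
    n = 100_000

    truths = [random.random() < true_rate for _ in range(n)]
    reports = [randomised_response(t) for t in truths]  # all the server ever sees

    print(f"naive rate from noisy reports: {sum(reports) / n:.3f}")          # pulled towards 0.5
    print(f"debiased population estimate:  {debiased_estimate(reports):.3f}")  # close to 0.30
```

With p = 0.75 each report satisfies differential privacy with epsilon = ln(p / (1 − p)) = ln 3; the open question in the bullet above is whether that kind of noisy signal counts as “sufficiently good data” for training.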

Right now (and perhaps for the next few years) this probably won’t hinder Apple. But over time their approach to user privacy might start to hurt the user experience they strive to deliver.

Now that would be an interesting tension.

Author: Red
