
Enterprise AI Trends

Submitted by Style Pass, 2024-06-11 03:30:03

Apple’s WWDC 2024 started today. In this post, I quickly recap my thoughts on what this all means for Apple’s AI strategy in simple terms, and why you should care.


Apple clearly knows its stuff when it comes to AI - contrary to what many doubters proclaimed. It just announced the best on-device model on the market, as well as its own data center - built from the ground up - that can run Apple-scale inference on its own chips. Apple is already a vertically integrated AI company, and deserves a higher valuation.

Speaking of inference, Apple says “F you” to Nvidia, as it should. Apple’s success story may be a slightly bearish case for Nvidia, especially if Apple also figures out how to use Apple silicon for training.

But let’s quickly recap how Apple classifies its AI workloads, which are grouped into three buckets: on-device, Private Cloud Compute, and 3rd-party model inference (explained below). Note that it’s Apple’s OS that automatically decides whether to run AI locally or remotely, except for ChatGPT, which requires the user to explicitly opt in. The three buckets:
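To make the three-bucket split concrete, here is a minimal, purely illustrative Python sketch of that routing decision. All names and conditions here are my own assumptions for illustration - Apple has not published an API like this, and the real OS-level logic is certainly more involved:

```python
# Hypothetical sketch (not Apple's actual API): how an OS-level router
# might classify an AI request into the three buckets described above.
from dataclasses import dataclass
from enum import Enum


class Backend(Enum):
    ON_DEVICE = "on-device"
    PRIVATE_CLOUD = "private cloud compute"
    THIRD_PARTY = "3rd-party model (e.g. ChatGPT)"


@dataclass
class AIRequest:
    # Assumed signals; the actual criteria Apple uses are not public.
    needs_world_knowledge: bool          # beyond what a small local model covers
    fits_on_device: bool                 # small enough for the on-device model
    user_opted_in_third_party: bool = False


def route(req: AIRequest) -> Backend:
    # 3rd-party models like ChatGPT require an explicit user opt-in;
    # without it, the request stays within Apple's own infrastructure.
    if req.needs_world_knowledge:
        if req.user_opted_in_third_party:
            return Backend.THIRD_PARTY
        return Backend.PRIVATE_CLOUD
    # Otherwise the OS prefers local execution whenever the task fits,
    # falling back to Private Cloud Compute for heavier workloads.
    if req.fits_on_device:
        return Backend.ON_DEVICE
    return Backend.PRIVATE_CLOUD
```

The key design point this sketch captures is that the user never picks a backend directly: the OS routes automatically, and only the ChatGPT path is gated behind consent.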
