Apple just opened the floodgates for third-party developers, in what may be the company's biggest developer move since the App Store launched: its Foundation Models framework, the AI infrastructure powering Apple Intelligence, is now available to app creators. For the first time, developers get direct access to the same on-device language models that run across iOS, macOS, and iPadOS, enabling everything from personalized fitness coaching to intelligent task management across the App Store ecosystem.
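For developers, the entry point is compact. Here's a minimal sketch of what a first call could look like, based on the FoundationModels API Apple has documented; the prompt text is purely illustrative:

```swift
import FoundationModels

// One possible entry point: check availability, then prompt the on-device model.
func quickDemo() async throws {
    // The system model may be unavailable (unsupported device,
    // Apple Intelligence turned off, or the model still downloading).
    guard case .available = SystemLanguageModel.default.availability else {
        print("On-device model not available")
        return
    }

    // A session manages one conversation with the model; nothing leaves the device.
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Suggest a five-minute warm-up for a morning run.")
    print(response.content)
}
```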
The timing couldn't be more strategic. As OpenAI faces increasing competition and Google pushes its own AI developer tools, Apple is taking a distinctly different approach: keeping everything on-device while opening its models up to app creators.
Early adopters are already shipping impressive integrations. SmartGym turns into a personalized AI trainer, generating monthly progress summaries and coaching messages that adapt to each user's preferred style. It even greets users with a dynamic message based on their current fitness data every time they open the app.
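SmartGym hasn't published its integration, but the behavior it describes maps cleanly onto the framework's session instructions. A hedged sketch, with the FitnessSnapshot type and prompt wording as assumptions:

```swift
import FoundationModels

// Hypothetical stand-in for the app's own data; SmartGym's real types aren't public.
struct FitnessSnapshot {
    let streakDays: Int
    let lastWorkout: String
}

func greeting(for snapshot: FitnessSnapshot) async throws -> String {
    // Instructions fix the coaching tone; the prompt carries the live data.
    let session = LanguageModelSession(
        instructions: "You are an encouraging fitness coach. Greet the user in one sentence."
    )
    let response = try await session.respond(
        to: "The user is on a \(snapshot.streakDays)-day streak and last completed \(snapshot.lastWorkout)."
    )
    return response.content
}
```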
But it's the video analysis capabilities that really showcase the framework's power. SwingVision, which helps tennis and pickleball players improve their game, now feeds the output of its Core ML footage analysis into Apple's models, delivering "highly actionable and specific feedback," according to the company's announcement.
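SwingVision hasn't shared implementation details either, but the pipeline it describes, Core ML extracting metrics from footage and the language model turning them into coaching advice, could look roughly like this; the MatchStats type and prompts are assumptions:

```swift
import FoundationModels

// Assumed shape for per-match stats that a Core ML vision pipeline might emit.
struct MatchStats {
    let forehandErrors: Int
    let netApproaches: Int
    let avgRallyLength: Double
}

func feedback(from stats: MatchStats) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You are a tennis coach. Be specific and actionable."
    )
    // The language model never sees video; it reasons over numbers Core ML already extracted.
    let response = try await session.respond(to: """
        Forehand errors: \(stats.forehandErrors)
        Net approaches: \(stats.netApproaches)
        Average rally length: \(stats.avgRallyLength) shots
        Suggest two drills for the next practice session.
        """)
    return response.content
}
```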
The health and fitness category seems particularly well-suited for this AI boost. 7 Minute Workout lets users create custom routines using natural language, saying things like "I want to avoid exercises that would hurt my knee" or "I'm training for a marathon." Meanwhile, the Gratitude journaling app transforms personal entries into context-aware affirmations and generates detailed weekly summaries of challenges and wins.
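A feature like 7 Minute Workout's likely leans on the framework's guided generation, where the model fills in a Swift type directly. A speculative sketch, with the Routine type invented for illustration:

```swift
import FoundationModels

// Invented routine type; @Generable lets the model return structured Swift values.
@Generable
struct Routine {
    let exercises: [String]
    let secondsPerExercise: Int
}

func buildRoutine(from request: String) async throws -> Routine {
    let session = LanguageModelSession()
    // Guided generation constrains decoding so the result always matches the type.
    let response = try await session.respond(
        to: "Create a 7-minute workout routine. User request: \(request)",
        generating: Routine.self
    )
    return response.content
}
```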
Productivity apps are seeing similar transformations. Stuff, a task management app, now understands dates, tags, and lists as users type naturally. Write "Call Sophia Friday" and the app automatically populates the details in the right places. Its Listen Mode converts spoken thoughts into organized tasks, while Scan Mode can read handwritten to-dos from photos.
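Stuff's parsing plausibly uses the same guided-generation mechanism to pull structured fields out of free text; here's a sketch under that assumption, with ParsedTask as an invented schema:

```swift
import FoundationModels

// Invented schema for illustration; Stuff's real data model isn't public.
@Generable
struct ParsedTask {
    @Guide(description: "The action itself, e.g. 'Call Sophia'")
    let title: String
    @Guide(description: "Weekday or date mentioned in the text, or 'none'")
    let due: String
    @Guide(description: "Tags implied by the text")
    let tags: [String]
}

func parseTask(_ text: String) async throws -> ParsedTask {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Extract a task from: \(text)",
        generating: ParsedTask.self
    )
    return response.content
}
```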
The education sector is also getting attention, though Apple's announcement was lighter on specific examples here. The company hints at "unlocking opportunities for education apps" but doesn't detail which apps are already integrating the framework.