(From 9to5Mac)
At this year’s NeurIPS event, Apple will have a booth (#1103) where attendees will be able to interact with live demos of several of the company’s machine learning initiatives, including:
- MLX – an open-source array framework designed for Apple silicon that enables fast, flexible machine learning and scientific computing on Apple hardware. The framework is optimized for Apple silicon’s unified memory architecture and leverages both the CPU and GPU. Visitors will be able to experience two MLX demos:
  - Image generation with a large diffusion model on an iPad Pro with the M5 chip
  - Distributed compute with MLX and Apple silicon: visitors will be able to explore text and code generation with a 1-trillion-parameter model running in Xcode on a cluster of four Mac Studios equipped with M3 Ultra chips, each with 512GB of unified memory.
- FastVLM – a family of mobile-friendly vision language models built using MLX. These models use a hybrid of CNN and Transformer architectures for vision encoding, designed specifically for processing high-resolution images, achieving a strong balance between accuracy and speed. Visitors will get to experience a real-time visual question answering demo on an iPhone 17 Pro Max.