
Speeding up PyTorch inference on Apple devices with AI-generated Metal kernels

simlevesque

Are these kernels available? I'd love to try them!

pbronez

This is pretty cool.

I initially thought they were writing custom kernels for proprietary models like GPT-5. They aren't - they're using proprietary models to write kernels for a set of ~250 open PyTorch modules.
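
For context, a minimal sketch (not from the article) of how generated kernels for such modules are typically checked: run the reference PyTorch module and the candidate kernel on the MPS device, compare outputs, and time both. `candidate_fn` here is a hypothetical stand-in for a compiled custom Metal kernel.

  import time
  import torch
  import torch.nn as nn

  device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

  # Reference module, standing in for one of the open PyTorch modules.
  reference = nn.Sequential(nn.Linear(1024, 1024), nn.GELU()).to(device).eval()

  def candidate_fn(x: torch.Tensor) -> torch.Tensor:
      # Placeholder for the generated Metal kernel; it just calls the
      # reference so the script runs end to end.
      return reference(x)

  def sync():
      # Make sure queued GPU work has finished before reading the clock.
      if device.type == "mps":
          torch.mps.synchronize()

  x = torch.randn(64, 1024, device=device)

  with torch.no_grad():
      ref_out = reference(x)
      cand_out = candidate_fn(x)

  # Correctness check: outputs should agree within a float tolerance.
  assert torch.allclose(ref_out, cand_out, rtol=1e-3, atol=1e-3)

  def bench(fn, iters=100):
      with torch.no_grad():
          for _ in range(10):   # warm-up
              fn(x)
          sync()
          start = time.perf_counter()
          for _ in range(iters):
              fn(x)
          sync()
      return (time.perf_counter() - start) / iters

  print(f"reference: {bench(reference) * 1e3:.3f} ms")
  print(f"candidate: {bench(candidate_fn) * 1e3:.3f} ms")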