✨ About The Role
- Lead critical work on the shared internal inference stack and grow the team
- Focus on achieving state-of-the-art throughput for important research models
- Reduce the time to achieve efficient inference for new model architectures
- Collaborate closely with the Applied AI engineering team to maximize the benefits of the shared internal inference stack
- Create a diverse, equitable, and inclusive culture while enabling radical candor and challenging groupthink
⚡ Requirements
- Experienced engineering manager with a track record of leading teams that build high-scale distributed systems and ML systems
- Familiarity with ML systems, particularly in distributed training or inference for modern LLMs
- Deep care for diversity, equity, and inclusion, with a history of building inclusive teams
- Skilled in hiring top AI systems engineers in a competitive market
- Ability to coordinate and meet the inference needs of research teams at OpenAI