The Motion & Interaction team creates intuitive experiences for our customers through motion sensing. When you simply raise your wrist, shake your head, or move your device to interact, that experience is the work of engineers and scientists on this team. Our fingerprints can be found across core capabilities and experiences on iPhone, Watch, AirPods, Vision Pro, and other Apple products. We are a multidisciplinary team that operates at the intersection of algorithms, software, hardware, and design. We come from diverse backgrounds in signal processing, machine learning, software engineering, statistics, controls, firmware development, and more. As a member of our dynamic group, you will have a unique opportunity to work cross-functionally to develop products and features that impact the lives of millions of users worldwide on a daily basis.
We are seeking an experienced, talented, self-motivated machine learning engineer to build Apple's next-generation features and experiences using multi-modal sensing. In this role, you will ideate, design, and implement models and algorithms while optimizing for power, memory, and performance. You will work on motion-sensing features, including sensor fusion and interactive technologies that impact over a billion customers.
Minimum qualifications:
Preferred qualifications:
Apple is an equal opportunity employer committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or other legally protected characteristics.