At Apple, we believe that novel ideas can quickly become extraordinary products, services and customer experiences. When you bring passion and dedication to your job, there's no limit to what we can accomplish. Are you ready to join an incredible research and engineering team responsible for building the next generation of algorithms for sensing technologies on the iPhone, Apple Watch, iPad and more? Our team has diverse backgrounds in machine learning, statistics, estimation theory, control, physics and human factors. We are looking for inquisitive, creative engineers with expertise in signal processing and/or machine learning algorithms. An understanding of probabilistic modeling, statistics and embedded programming will improve your adaptability and effectiveness in this role.
This is a key position at a focal point of HW/SW/UI integration, with opportunities to collaborate across different fields. You will design algorithms that transform raw image data into interpretable information, feeding applications that delight, connect and enable Apple users all around the world. Starting early in the product lifecycle, you will analyze sensor data, extract features and prototype algorithms. You will collaborate closely with partner teams throughout the product lifecycle: developing figures of merit to validate algorithm performance at each stage, crafting and implementing algorithmic mitigations as needed, and providing support during production. Experience with on-device implementation, prototyping skills to demonstrate proof of concept, and familiarity with user interface design will further broaden your impact in this position.
Apple is an equal opportunity employer that is committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics.