Staff Software Engineer - Data

Develop and optimize real-time data pipelines for healthcare robotics and clinical data
San Francisco Bay Area
Senior
Intuitive

A global technology leader in minimally invasive care and the pioneer of robotic-assisted surgery.

Staff Software Engineer - Data

At Intuitive, we are united behind our mission: we believe that minimally invasive care is life-enhancing care. Through ingenuity and intelligent technology, we expand the potential of physicians to heal without constraints. As a pioneer and market leader in robotic-assisted surgery, we strive to foster an inclusive and diverse team, committed to making a difference. For more than 25 years, we have worked with hospitals and care teams around the world to help solve some of healthcare's hardest challenges and advance what is possible.

Intuitive has been built by the efforts of great people from diverse backgrounds. We believe great ideas can come from anywhere. We strive to foster an inclusive culture built around diversity of thought and mutual respect. We lead with inclusion and empower our team members to do their best work as their most authentic selves. Passionate people who want to make a difference drive our culture. Our team members are grounded in integrity, have a strong capacity to learn, the energy to get things done, and bring diverse, real-world experiences to help us think in new ways.

We actively invest in our team members to support their long-term growth so they can continue to advance our mission and achieve their highest potential. Join a team committed to taking big leaps forward for a global community of healthcare professionals and their patients. Together, let's advance the world of minimally invasive care.

Job Description

We're looking for a Staff Software Engineer who is passionate about building a modern, scalable Data as a Service (DaaS) platform that powers Intuitive's Digital products and supports over 2,000 engineers across the organization. In this role, you will own and evolve critical components of our real-time and micro-batch data pipelines that drive product development, internal tools, and analytics.

Your work will focus on enabling high-throughput, low-latency data delivery through streaming pipelines, dynamic transformations, and APIs, driving application development, AI/ML, and actionable insights. You will help define the architecture and engineering practices that support self-service analytics and operational decision-making at scale.

As a catalyst for change, you will be at the forefront of reimagining how engineering teams consume and interact with data. Long-term success in this role means building robust, efficient systems and replacing legacy processes with modern solutions that allow teams to move faster, with greater confidence and autonomy.

Responsibilities:

  • Build highly scalable distributed systems that leverage event-based and streaming data pipelines to handle ingestion and processing of robot, manufacturing, and clinical data
  • Enable users by providing self-service APIs and applications to access and interact with data
  • Work closely with core engineering teams to consistently evolve data models based on growing business needs
  • Apply and evangelize software development best practices such as CI/CD, automated testing, infrastructure-as-code, and microservice architectures
  • Participate effectively in the team's planning, code reviews, KPI reviews, and design discussions, driving continuous improvement in these areas
  • Act as a technical leader within the data domain, driving best practices, mentoring teammates, and continuously improving how data is produced, shared, and consumed across the organization

Qualifications

Skills, Characteristics, and Technology:

  • Exceptional quantitative background (Computer Science, Math, Physics, and/or Engineering), or 8-10+ years of industry experience in a quantitative role
  • Fluent coding with at least two of the following: Python, Go, Scala, Java, C++
  • Proven experience building data pipelines and working with distributed systems using technologies like Apache Spark, Kafka, Elasticsearch, Snowflake, and Airflow
  • Deep technical knowledge & experience with Data Platform components/design patterns:
    • Event-Driven Pipelines (e.g. AWS Lambda)
    • Message Queues (e.g. Kafka)
    • Container Orchestration (e.g. Kubernetes)
    • Stream Processing (e.g. Flink, Spark)
    • Relational Databases (e.g. Postgres)
    • Data Warehouses (e.g. Snowflake)
    • Analytics Engineering (e.g. DBT)
    • Workflow Orchestration (e.g. Airflow)
    • Search Engines (e.g. Elasticsearch)
  • Proven understanding of CI/CD workflows, unit and integration testing, and deployment patterns
  • Experience with AWS and/or GCP
  • Experience with SQL and relational databases
  • Ability and enthusiasm to work collaboratively and cross-functionally, and take end-to-end ownership to deliver results for customers

Bonus points:

  • Experience on a Platform team
  • Experience with Gitlab CI/CD or other CI tooling
  • Experience with Terraform, Ansible, Packer + general IaC best practices
  • You enjoyed the book Designing Data-Intensive Applications by Martin Kleppmann
  • You're familiar with CNCF projects and have successfully used them in the past

About the Data Services Team:

We are responsible for building, maintaining, and enabling access to high-quality, reliable, and scalable data infrastructure and services. We are looking for exceptional technical leaders who want an opportunity to develop data products and foundational architecture that will shape the future of Intuitive's application development, analytics, and AI/ML capabilities.
