
Senior Data Engineer - Remote Eligible

Own the migration of our enterprise data platform to a Google Cloud Lakehouse and establish best practices.
Remote
Senior
$125,000 – 140,000 USD / year
3 days ago
Akumin

Provides outpatient diagnostic imaging and radiation oncology services through a network of centers across the United States.

Senior Data Engineer

This position offers a base salary range of $125,000 – $140,000 annually; the maximum is not negotiable. Akumin is not able to provide immigration sponsorship for any employment authorization category, including but not limited to H-1B, H-1B1, E-3, TN, O-1, OPT, CPT, STEM OPT, or any other temporary or permanent work authorization. All candidates must have valid, independent authorization to work in the United States for any employer.

The Senior Data Engineer will serve as the initial technical point of contact for the migration of our enterprise data platform into a Google Cloud Lakehouse. This role is primarily focused on the technical succession and handoff from our external vendor partners to our internal team. While you will take point on the initial cloud implementation and Medallion architecture, you will work as part of a collaborative squad. A core responsibility of this role is to share knowledge and establish best practices for HL7/Google Cloud implementations, bringing the rest of the data engineering team—including both Senior and non-senior engineers—up to speed to ensure long-term platform stability and team-wide proficiency.

Specific duties include, but are not limited to:

  • Develop complex SQL queries for data extraction, transformation, and reporting.
  • Design and implement robust ETL/ELT pipelines using custom tooling (Python/Google Cloud) and off-the-shelf tooling, with a focus on monitoring, supportability, and resource stewardship.
  • Contribute to and apply coding standards and best practices to ensure efficient, reusable services and components.
  • Architect, implement, and deploy new data models and data processes in production.
  • Build data pipelines that acquire, cleanse, transform, and publish data from a wide variety of sources.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Partner with data asset managers, architects, and development leads to ensure technical solutions provide data that is fit for use and aligned with architecture blueprints.
  • Identify, document, design, and implement internal process improvements.
  • Other duties as assigned by management.

Position requirements:

  • GCP & BigQuery Mastery: 5+ years of data engineering with expert-level production experience on Google Cloud Platform, including deep command of BigQuery architecture, performance tuning, and cost management.
  • Clinical Ingestion Mastery: Hands-on experience implementing and managing GCP HL7/FHIR Stores. Expertise in parsing, flattening, and modeling hierarchical clinical data for downstream analytics.
  • Medallion Architecture: Proven track record designing and implementing Medallion (Bronze/Silver/Gold) frameworks in a cloud-native environment.
  • Modern Transformation (dbt): Advanced proficiency with dbt (data build tool). Ability to establish and document best practices (macros, testing, snapshots) for the broader engineering team.
  • Dimensional Modeling: Mastery of Kimball/dimensional modeling, with the ability to translate complex business SME knowledge into clean, scalable star schemas.
  • Technical Succession: Demonstrated experience performing technical audits and handoffs from external contractors to internal operations.
  • Peer Knowledge Transfer: Proven ability to upskill and assist a team of engineers in adopting new technologies (HL7/GCP) through documentation, patterns, and collaborative development.
  • SQL & Python: Elite BigQuery SQL (JSON parsing/Analytical functions) and Python for orchestration and automation.

Preferred requirements:

  • GCP Professional Data Engineer or dbt Certification.
  • Experience with Infrastructure as Code (Terraform) for GCP resource management.
  • Knowledge of HIPAA compliance and data security protocols within a cloud-native healthcare setting.
  • Prior experience in "Train-the-Trainer" scenarios where you've helped on-prem engineers transition to cloud-native stacks.

Physical requirements:

Standard office environment. More than 50% of the time:

  • Sit, stand, and walk.
  • Repetitive movement of hands, arms and legs.
  • See, speak and hear to be able to communicate with patients.

Less than 50% of the time:

  • Stoop, kneel or crawl.
  • Climb and balance.
  • Carry and lift 10-20 pounds.

Akumin Operating Corp. and its divisions are an equal opportunity employer and we believe in strength through diversity. All qualified applicants will receive consideration for employment without regard to, among other things, age, race, religion, color, national origin, sex, sexual orientation, gender identity & expression, status as a protected veteran, or disability.
