
Intermediate Data Engineer

Build and optimize scalable data pipelines for institutional investment analysis
Edmonton, Alberta, Canada
Mid-Level
AltaML

About AltaML

A Canadian artificial intelligence and machine learning firm specializing in applied AI solutions for businesses across various industries.

Intermediate Data Engineer

At AlphaLayer, we help institutional investors uncover investment edge at scale with a repeatable research process that leverages core technology, data, and AI to develop differentiated investment strategies and signals. We're looking for an Intermediate Data Engineer with strong foundations in data warehousing, Python, and data workflows—and an interest in growing their skills in software engineering, DevOps, and cloud infrastructure. This role is ideal for someone who enjoys working on real-world problems, has solid technical fundamentals, and is eager to learn. You'll join a cross-functional team that combines AI, quantitative finance, and engineering to build production-grade systems that deliver real business impact.

What You'll Do

  • Design, build, and maintain robust, scalable data pipelines
  • Work with tools like Snowflake, Prefect, and Python to enable reliable and efficient workflows
  • Collaborate with software engineers, infrastructure specialists, and quantitative experts
  • Troubleshoot and optimize data flows in both development and production environments
  • Contribute to internal tooling and automation as the team grows its DevOps/cloud maturity
  • Participate in code reviews, team planning, and knowledge-sharing

Must-Have Skills

  • 2–4 years of experience in a data engineering or related role
  • Hands-on experience with data warehousing platforms (e.g., Snowflake, Databricks, BigQuery)
  • Strong Python skills, including libraries such as Pandas, SQLAlchemy, or equivalent
  • Experience working with workflow orchestration tools like Prefect, Azure Data Factory, or Airflow
  • Solid SQL skills and understanding of performance optimization
  • Familiarity with version control (Git/GitHub) in a collaborative setting
  • Comfort using Unix/Linux systems for development and debugging
  • Experience working with structured and semi-structured data (e.g., JSON, Parquet, CSV)

Nice-to-Have Skills

  • Understanding of data modeling concepts (e.g., star schema, dimensional modeling)
  • Basic understanding of Streamlit and/or React for data presentation or internal tools
  • Exposure to Docker and Kubernetes in local or cloud environments
  • Familiarity with cloud platforms (Azure, AWS, or GCP) and how applications are deployed in them
  • Experience working with CI/CD pipelines (e.g., GitHub Actions, Azure Pipelines)
  • Exposure to Infrastructure as Code tools like ARM templates or Terraform
  • Experience interacting with APIs (REST/GraphQL) for data ingestion or automation

What We're Looking For

  • Edmonton-based candidates willing to work in-office at least once a week
  • Strong communication and collaboration skills
  • A self-starter with a growth mindset, eager to expand into software and cloud engineering
  • Interest in working across disciplines in a production-focused team

Why Join Us

  • Be part of a high-impact team putting cutting-edge AI and quantitative models into production
  • Work with experienced engineers, researchers, and infrastructure experts
  • Grow your skills in cloud, DevOps, and end-to-end systems
  • Competitive compensation and benefits
  • Flexible hybrid work model with an emphasis on collaboration and learning
  • Opportunity to start working with cutting-edge agentic technologies