
Data Engineer

Build and maintain scalable Snowflake-based data pipelines and dbt models for analytics
Tel Aviv
Mid-Level
yesterday
Hewlett Packard Enterprise

Provides enterprise IT solutions including servers, storage, networking, cloud services, and edge computing for businesses and organizations worldwide.


Data Engineer (Mid Level)

This role has been designed as 'Hybrid' with an expectation that you will work on average 2 days per week from an HPE office.

Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. People from varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

About the Role

We are looking for a talented Data Engineer to help build and enhance the data platform that supports analytics, operations, and data-driven decision-making across the organization. You will work hands-on to develop scalable data pipelines, improve data models, ensure data quality, and contribute to the continuous evolution of our modern data ecosystem.

You'll collaborate closely with Senior Engineers, Analysts, Data Scientists, and stakeholders across the business to deliver reliable, well-structured, and well-governed data solutions.

What You'll Do

Engineering & Delivery

  • Build, maintain, and optimize data pipelines for batch and streaming workloads.
  • Develop reliable data models and transformations to support analytics, reporting, and operational use cases.
  • Integrate new data sources, APIs, and event streams into the platform.
  • Implement data quality checks, testing, documentation, and monitoring.
  • Write clean, performant SQL and Python code.
  • Contribute to improving performance, scalability, and cost-efficiency across the data platform.
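To give a flavor of the "data quality checks" mentioned above, here is a minimal sketch in plain Python. The record shape and field names (`order_id`, `amount`) are hypothetical, not taken from the posting; in practice such checks would typically run as dbt tests or pipeline assertions.

```python
# Minimal sketch of a batch data-quality check: count missing fields,
# null values, and duplicate keys in a list of records.
# Field names below are hypothetical examples.

def check_rows(rows, required_fields, key_field):
    """Return a dict of data-quality issue counts for a batch of records."""
    issues = {"missing_field": 0, "null_value": 0, "duplicate_key": 0}
    seen_keys = set()
    for row in rows:
        for field in required_fields:
            if field not in row:
                issues["missing_field"] += 1
            elif row[field] is None:
                issues["null_value"] += 1
        key = row.get(key_field)
        if key in seen_keys:
            issues["duplicate_key"] += 1
        seen_keys.add(key)
    return issues

orders = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 12.5},   # duplicate key
    {"order_id": 2, "amount": None},   # null value
]
print(check_rows(orders, required_fields=["order_id", "amount"], key_field="order_id"))
# {'missing_field': 0, 'null_value': 1, 'duplicate_key': 1}
```

A real pipeline would route these counts to monitoring or fail the run when a threshold is exceeded, rather than just printing them.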

Collaboration & Teamwork

  • Work closely with senior engineers to implement architectural patterns and best practices.
  • Collaborate with analysts and data scientists to translate requirements into technical solutions.
  • Participate in code reviews, design discussions, and continuous improvement initiatives.
  • Help maintain clear documentation of data flows, models, and processes.

Platform & Process

  • Support the adoption and roll-out of new data tools, standards, and workflows.
  • Contribute to DataOps processes such as CI/CD, testing, and automation.
  • Assist in monitoring pipeline health and resolving data-related issues.
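The "monitor pipeline health" bullet is, at its simplest, bounded retries with a record of each attempt, which is roughly what orchestrators such as Airflow, Dagster, or Prefect provide out of the box. A stripped-down sketch (the `flaky_load` task is invented for illustration):

```python
# Hypothetical sketch: retry a flaky pipeline step a bounded number of
# times and record each attempt for later inspection.

def run_with_retries(task, max_attempts=3):
    """Run `task` (a zero-arg callable), retrying on failure.

    Returns (result, attempt_history) on success; raises after exhausting attempts.
    """
    attempts = []
    for attempt in range(1, max_attempts + 1):
        try:
            result = task()
            attempts.append((attempt, "success"))
            return result, attempts
        except Exception as exc:
            attempts.append((attempt, f"failed: {exc}"))
    raise RuntimeError(f"task failed after {max_attempts} attempts: {attempts}")

calls = {"n": 0}

def flaky_load():
    # Fails twice, then succeeds — simulating a transient source outage.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return "loaded 42 rows"

result, history = run_with_retries(flaky_load)
print(result)        # loaded 42 rows
print(len(history))  # 3
```

In an orchestrator this logic is configuration (e.g. a per-task retry count) rather than hand-written code, but the underlying behavior is the same.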

What We're Looking For

  • 2–5+ years of experience as a Data Engineer or similar role.
  • Hands-on experience with Snowflake (mandatory), including SQL, modeling, and basic optimization.
  • Experience with dbt (or similar): model development, tests, documentation, and version control workflows.
  • Strong SQL skills for data modeling and analysis.
  • Proficiency with Python for pipeline development and automation.
  • Experience working with orchestration tools (Airflow, Dagster, Prefect, or equivalent).
  • Understanding of ETL/ELT design patterns, data lifecycle, and data modeling best practices.
  • Familiarity with cloud environments (AWS, GCP, or Azure).
  • Knowledge of data quality, observability, or monitoring concepts.
  • Good communication skills and the ability to collaborate with cross-functional teams.
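The ELT pattern named in the requirements means landing raw data first and transforming it afterward inside the warehouse, which is exactly how a Snowflake + dbt stack divides the work (dbt owns the "T"). A toy illustration with in-memory dicts standing in for schemas; all data and table names are invented:

```python
# Illustrative ELT sketch (hypothetical data, no real warehouse): land raw
# records untransformed, then derive a reporting table from the raw copy.

raw_zone = {}   # stands in for a raw/staging schema
marts = {}      # stands in for a reporting schema

def extract():
    # Pretend this came from an API or event stream.
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "4.0"},
        {"user": "a", "amount": "2.5"},
    ]

def load(records, table):
    # Load first, untransformed, so the raw history stays replayable.
    raw_zone[table] = list(records)

def transform(table):
    # Transform "inside the warehouse": cast types, aggregate per user.
    totals = {}
    for row in raw_zone[table]:
        totals[row["user"]] = totals.get(row["user"], 0.0) + float(row["amount"])
    marts["revenue_by_user"] = totals

load(extract(), "raw_orders")
transform("raw_orders")
print(marts["revenue_by_user"])   # {'a': 13.0, 'b': 4.0}
```

Keeping the raw copy intact is the point of ELT over ETL: transformations can be fixed and re-run without re-extracting from the source.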

Nice to Have

  • Exposure to streaming/event technologies (Kafka, Kinesis, Pub/Sub).
  • Experience with data governance or cataloging tools.
  • Basic understanding of ML workflows or MLOps concepts.
  • Experience with infrastructure-as-code tools (Terraform, CloudFormation).
  • Familiarity with testing frameworks or data validation tools.

Additional Skills: Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, User Experience (UX)

What We Can Offer You:

Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.

Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.

Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. People from varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.
