
Data Engineer

Build scalable IoT data pipelines powering AI-enabled analytics and real-time inference
Atlanta
Mid-Level
Honeywell

Diversified technology and manufacturing conglomerate providing aerospace systems, building technologies, performance materials, and safety and productivity solutions.

Data Engineer

As a Data Engineer, you will be part of a high-performing global team delivering AI- and data-driven solutions for Honeywell's industrial customers, with a focus on IoT and real-time data processing. In this role, you will architect and implement scalable data pipelines and platforms that enable advanced analytics and AI capabilities, including large-scale machine learning models, intelligent automation, and real-time inference. You will work closely with cross-functional engineering and product teams at the intersection of IoT telemetry and modern data technologies to develop reliable, high-impact industrial solutions.

You will report directly to our Data Engineering Manager, and you will work out of our Atlanta, GA location on a hybrid work schedule. Note: for the first 90 days, new hires must be prepared to work 100% onsite, Monday through Friday.

Key Responsibilities

Data Engineering & AI Pipeline Development:

  • Design and implement scalable data architectures to process high-volume IoT sensor data and telemetry streams, ensuring reliable data capture and processing for AI/ML workloads
  • Build and maintain data pipelines across the AI product lifecycle, including training data preparation, feature engineering, and inference data flows
  • Develop and optimize RAG (Retrieval Augmented Generation) systems, including vector databases, embedding pipelines, and efficient retrieval mechanisms
  • Create robust data integration solutions that combine industrial IoT data streams with enterprise data sources for AI model training and inference

DataOps:

  • Implement DataOps practices to ensure continuous integration and delivery of data pipelines powering AI solutions
  • Design and maintain automated testing frameworks for data quality, data drift detection, and AI model performance monitoring
  • Create self-service data assets enabling data scientists and ML engineers to access and utilize data efficiently
  • Design and maintain automated documentation systems for data lineage and AI model provenance

Collaboration & Innovation:

  • Partner with ML engineers and data scientists to implement efficient data workflows for model training, fine-tuning, and deployment
  • Drive continuous improvement in data engineering practices and tooling
  • Establish best practices for data pipeline development and maintenance in AI contexts
  • Drive projects to completion in an agile environment with evolving requirements across the rapidly changing AI landscape

Qualifications

You Must Have:

  • Minimum 3 years of experience in data engineering with a strong grasp of Change Data Capture (CDC), ELT/ETL workflows, streaming replication, and data quality frameworks
  • Deep expertise in building scalable data pipelines using Databricks, including Unity Catalog and Delta Live Tables
  • Strong hands-on proficiency with PySpark for distributed data processing and transformation
  • Solid experience with cloud platforms such as Azure and GCP, as well as Databricks, especially in designing and implementing AI/ML-driven data workflows
  • Proficiency in CI/CD practices using GitHub Actions, Bitbucket, Bamboo, and Octopus Deploy to automate and manage data pipeline deployments

We Value:

  • Experience building solutions on RAG and agentic architectures and working with LLM-powered applications
  • Expertise in real-time data processing frameworks (Apache Spark Streaming, Structured Streaming)
  • Knowledge of MLOps practices and experience building data pipelines for AI model deployment
  • Experience with time-series databases and IoT data modeling patterns
  • Familiarity with containerization (Docker) and orchestration (Kubernetes) for AI workloads
  • Strong background in data quality implementation for AI training data
  • Experience working with distributed teams and cross-functional collaboration
  • Knowledge of data security and governance practices for AI systems
  • Experience working on analytics projects with Agile and Scrum methodologies

Benefits Of Working For Honeywell

In addition to a competitive salary, leading-edge work, and developing solutions side-by-side with dedicated experts in their fields, Honeywell employees are eligible for a comprehensive benefits package. This package includes employer-subsidized Medical, Dental, Vision, and Life Insurance; Short-Term and Long-Term Disability; 401(k) match, Flexible Spending Accounts, Health Savings Accounts, EAP, and Educational Assistance; Parental Leave; Paid Time Off (for vacation, personal business, and sick time); and 12 Paid Holidays.

About Honeywell

Honeywell International Inc. (NYSE: HON) invents and commercializes technologies that address some of the world's most critical challenges around energy, safety, security, air travel, productivity, and global urbanization. We are a leading software-industrial company committed to introducing state-of-the-art technology solutions that improve efficiency, productivity, sustainability, and safety in high-growth businesses across broad-based, attractive industrial end markets. Our products and solutions enable a safer, more comfortable, and more productive world, enhancing the quality of life of people around the globe.

Job Info

Job Identification 145130

Job Category Engineering

Posting Date 04/03/2026, 03:18 PM

Job Schedule Full time

Locations 715 Peachtree Street, N.E., Atlanta, GA, 30308, US (Hybrid)

Hire Eligibility Internal and External

Relocation Package None
