Lockheed Martin is partnering with PG&E, Salesforce, and Wells Fargo to deliver EMBERPOINT™, an initiative designed to transform wildfire prevention, detection, and response across the United States. EMBERPOINT integrates advanced sensing technologies, artificial intelligence, and real-time command-and-control capabilities to help detect wildfires earlier, prevent escalation, and improve coordination between first responders, utilities, and emergency management organizations. To support this mission, we are building a scalable cloud-based data platform that enables rapid ingestion, storage, and analysis of large volumes of operational and sensor data. We are seeking an experienced Data Engineer to drive the orchestration of data ingestion, storage, and movement.
As a Data Engineer, you will design, build, and operate the data pipelines that ingest, transform, catalog, and serve data for the EMBERPOINT data platform and its AI/ML, Unified HMI, and command-and-control (C2) applications. This role works closely with the AWS Infrastructure Architect, AI/ML Engineers, the Software Factory, the MBSE team (Cameo → DOORS NEXT), and the Advisory Board to ensure data quality, security, and performance across the end-to-end mission workflow (Detection → Prediction → Response → Recovery). The platform's databases will integrate with streaming ingestion pipelines and data lake services to enable real-time analytics, operational applications, and predictive wildfire intelligence. You will partner with data engineering, platform engineering, and application development teams to ensure the platform delivers high availability, strong performance, and secure data management.
Our team is building the cloud and data infrastructure that powers next-generation wildfire detection and response technologies. We work at the intersection of cloud engineering, large-scale data platforms, and mission-driven technology designed to protect communities and critical infrastructure. We value engineers who combine strong operational expertise with a systems perspective and who enjoy building reliable, scalable data services.
Who You Are:
- Deep knowledge of AWS data services: S3, Glue, Lake Formation, Kinesis, MSK, Redshift, Athena, QuickSight, Lambda, Step Functions, IAM, and KMS
- Hands-on experience with AWS Glue, Spark, dbt, and Airflow (or Amazon Managed Workflows for Apache Airflow)
- Proficient in Python (PySpark, Boto3), Scala, SQL, and Shell/Bash
- Experience integrating MBSE data from Cameo → DOORS NEXT into data-lake pipelines
- Knowledge of AI/ML data pipelines (SageMaker Feature Store, model-artifact versioning)
- Experience with data ingestion, object storage and data paths, ingest orchestration, ETL, and relational and non-relational databases, including autonomous database services
By applying to this job, you are expressing interest in this position and could be considered for other career opportunities where similar skills and requirements have been identified as a match. Should this match be identified, you may be contacted for this and future openings.
Full-time Remote Telework: The employee selected for this position will work remotely full time at a location other than a Lockheed Martin designated office/job site. Employees may travel to a Lockheed Martin office for periodic meetings.
This position requires a government security clearance; you must be a U.S. citizen to be considered.