Senior Software Engineer
Employer: Optum Services, Inc.
Location: 1 Optum Circle, Eden Prairie, MN 55344 (Telecommuting available from anywhere in the U.S.)
Duties:
- Research, design, and develop computer and network software or specialized utility programs.
- Design and develop ETL/ELT solutions on Azure Databricks, Azure Data Factory (ADF), Snowflake, and Apache Spark.
- Develop, implement, and deploy large-scale data pipelines that power machine learning algorithms, insight generation, business intelligence dashboards, reporting, and new data products.
- Design and build data service APIs.
- Partner with Optum Technology to create and maintain the technical architecture of the Enterprise Delta Lake to consolidate data from many systems into a single source for machine learning and reporting analytics.
- Design, build, optimize, and manage modern large-scale data pipelines.
- Perform ETL/ELT processing to support data integration for analytics, machine learning features, and predictive modeling.
- Participate in architectural evolution of data engineering patterns, frameworks, systems, and platforms including defining best practices and standards for managing data collections and integration.
- Contribute to building an automated end-to-end (E2E) integration testing suite.
- Consume data from a variety of sources (RDBMS, APIs, FTP servers, and cloud storage) and formats (Excel, CSV, XML, JSON, Parquet, and unstructured data).
- Write advanced/complex SQL with performance tuning and optimization.
- Identify ways to improve data reliability, data integrity, system efficiency and quality.
- Assist in mentoring other data engineers and provide significant technical direction by teaching them how to leverage cloud data platforms.
Requirements:
Employer will accept a Master's degree in Computer Engineering, Information Technology, Data Engineering, or a related field and three (3) years of experience in the job offered or in a computer-related occupation. Position requires three (3) years of experience in the following: data engineering, data integration, data modeling, data architecture, and ETL/ELT processes to provide quality data and analytics solutions; Python coding; Apache Spark (PySpark/Spark SQL); building and deploying cloud-based solutions using Azure Databricks, ADF, Snowflake, Azure Functions, and Service Bus; SQL, including designing complex data schemas and query performance optimization; DevOps automation with Terraform; and CI/CD processes and tools such as Jenkins, GitHub Actions, Git, Artifactory, Fortify Scan, or Sonar.
Rate of Pay: $143,260.00 - $191,502.00 per year.
Please email your resume to GlobalRecruitment@uhg.com and reference job #2328449.