Design, develop, and maintain solutions for data generation, collection, and processing. Build data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate data across systems.
Must have skills: PySpark
A minimum of 5 years of experience is required, along with 15 years of full-time education.
Roles and Responsibilities:
Skills and Experience:
- Extensive experience with PySpark
- Strong knowledge of data warehousing principles
- SQL proficiency
- Experience with cloud platforms
- Leadership and communication skills
- Problem-solving and analytical skills
- Proven ability to deliver data solutions
Minimum Experience: At least 8 years of experience in data engineering or a related field.
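The ETL responsibilities described above can be sketched end to end. The snippet below is a minimal illustration in plain Python, with `sqlite3` standing in for the target warehouse (a real PySpark job would run the same extract/transform/load steps on a Spark cluster); the table, column names, and data-quality rule are invented for the example.

```python
import csv
import io
import sqlite3

# Hypothetical sample input standing in for a source-system extract.
RAW_CSV = """order_id,region,amount
1,EMEA,120.50
2,APAC,80.00
3,EMEA,nan
4,AMER,42.25
"""

def extract(text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: apply a data-quality rule and cast column types."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # quality rule: skip rows with unparseable amounts
        if amount != amount:  # NaN never equals itself; drop missing values
            continue
        clean.append((int(row["order_id"]), row["region"], amount))
    return clean

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, region TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # → (3, 242.75): row 3 is dropped by the quality check
```

In a PySpark pipeline the same shape holds: `spark.read` for extract, `DataFrame` filters and casts for transform, and `DataFrame.write` for load.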