Sr. Data Engineer needed to create technical design, source-to-target mapping, and test case documents to reflect the ELT process. Extract data from various source systems such as Oracle, SQL Server, and flat files. Write scripts for data cleansing, data validation, and data transformation across different source systems. Process data and perform testing using Spark SQL, and perform real-time processing with Spark Streaming and Kafka using Python. Script in Python and PowerShell to set up baselines, branching, merging, and automation processes using Git. Improve memory usage and runtime performance of several existing pipelines. Develop data ingestion modules into various layers in S3, Redshift, and Snowflake using AWS Kinesis, AWS Glue, AWS Lambda, and AWS Step Functions. Use Bash shell scripting, Sqoop, Avro, Hive, Impala, HDP, Pig, Python, and MapReduce daily to develop ETL, batch processing, and data storage functionality. Requires a master's degree in Computer Science, Information Technology, or a related field, or a foreign equivalent. Mail resumes to ITech-Go LLC, 6751 Dixie Hwy, Ste 112, Clarkston, MI 48346.