Business Consultant - Niche (Analytics Data Engineer)
Pay rate range: $100/hr. to $110/hr. Hybrid • [Candidate must be able to come into the office for tech issues or special occasions.]
Role Responsibilities Include (but are not limited to):
Data Collection and Preparation: Gather data from multiple sources and preprocess it to eliminate errors and inconsistencies, ensuring data quality.
Statistical Analysis: Apply statistical modeling techniques to identify patterns and relationships, leveraging methods such as hypothesis testing, regression, and clustering.
Leverage SQL, Python, or R to uncover patterns, trends, and relationships (a brief illustrative sketch follows this list).
Interpret these findings to provide insights that address specific business challenges, such as understanding customer behavior or optimizing operations.
Create visualizations using tools like Tableau, Power BI, or matplotlib. These visualizations, along with detailed reports, help stakeholders understand complex data and make data-driven decisions.
Bridge the gap between technical data work and business needs, enabling organizations to improve efficiency, reduce costs, and enhance customer satisfaction.
Incorporate advanced techniques like predictive modeling and machine learning to address complex challenges.
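As a concrete illustration of the analysis and visualization work described above, here is a minimal Python sketch. The customers.csv file, the annual_spend and visit_frequency columns, and the segment count are hypothetical placeholders, not the team's actual data or method.

    # Minimal sketch, assuming a hypothetical customers.csv with
    # "annual_spend" and "visit_frequency" columns.
    import pandas as pd
    import matplotlib.pyplot as plt
    from sklearn.cluster import KMeans

    # Collect and prepare: drop duplicates and rows with missing values
    df = pd.read_csv("customers.csv").drop_duplicates().dropna(
        subset=["annual_spend", "visit_frequency"]
    )

    # Statistical analysis: segment customers with k-means clustering
    features = df[["annual_spend", "visit_frequency"]]
    df["segment"] = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(features)

    # Visualize: scatter plot colored by segment for stakeholder reporting
    plt.scatter(df["annual_spend"], df["visit_frequency"], c=df["segment"])
    plt.xlabel("Annual spend")
    plt.ylabel("Visit frequency")
    plt.title("Customer segments")
    plt.show()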
Must-Have Skills:
Programming & Tools: Strong programming skills in Python, SAS, and SQL.
Experience with Power BI, DAX, and M Code for dashboarding.
Proficiency with the Microsoft 365 suite: Office, Power Automate, SharePoint, and OneDrive.
Database & Data Engineering: Advanced configuration of SQL Server for high-throughput analytical workloads, including memory allocation and parallel query execution.
Design and implementation of partitioned tables, indexed views, and columnstore indexes to support large-scale data operations.
Deep understanding of SQL Server recovery models (Simple, Full, Bulk-Logged), including backup/restore strategies, log management, and disaster recovery planning.
Development of robust ETL pipelines to extract, transform, and load data from IBM Netezza/Hadoop, ensuring efficient handling of large datasets.
Experience with data staging, incremental loads, and change data capture techniques.
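The ETL bullets above can be illustrated with a minimal incremental-load sketch in Python. The connection strings, the sales source table, the updated_at watermark column, and the etl_watermark/stg_sales tables are all hypothetical stand-ins, not the actual Netezza/Hadoop pipeline.

    # Minimal incremental-load sketch with hypothetical connections and tables.
    import pandas as pd
    from sqlalchemy import create_engine, text

    # Hypothetical stand-in for the Netezza/Hadoop source and the SQL Server target
    source = create_engine("postgresql+psycopg2://user:pass@source-host/warehouse")
    target = create_engine(
        "mssql+pyodbc://user:pass@target-host/analytics?driver=ODBC+Driver+17+for+SQL+Server"
    )

    # Read the high-water mark left by the last successful load
    with target.connect() as conn:
        last_load = conn.execute(
            text("SELECT MAX(loaded_through) FROM etl_watermark")
        ).scalar() or "1900-01-01"

    # Extract only rows changed since the previous run (incremental load)
    with source.connect() as conn:
        changed = pd.read_sql(
            text("SELECT * FROM sales WHERE updated_at > :wm"),
            conn,
            params={"wm": last_load},
        )

    # Transform: light cleanup on a hypothetical "amount" column
    changed["amount"] = changed["amount"].fillna(0)

    # Load into a staging table; a downstream MERGE would apply the changes
    changed.to_sql("stg_sales", target, if_exists="append", index=False)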
Data Architecture: Experience with star/snowflake schemas, fact/dimension modeling, and slowly changing dimensions (see the sketch after this list).
Design and implementation of batch processing pipelines using Python and SQL.
Solid understanding of relational (RDBMS) and NoSQL databases, as well as data file formats (CSV, Parquet, JSON).
Proficient in translating business requirements into scalable data models.
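For the slowly-changing-dimension requirement, here is a minimal Type 2 SCD sketch in Python (pandas). The dimension frame, the tracked city attribute, and the dates are hypothetical examples, not a production model.

    # Minimal Type 2 slowly-changing-dimension sketch with hypothetical data.
    import pandas as pd

    dim = pd.DataFrame({
        "customer_id": [1, 2],
        "city": ["Austin", "Dallas"],
        "valid_from": pd.to_datetime(["2023-01-01", "2023-01-01"]),
        "valid_to": [pd.NaT, pd.NaT],
        "is_current": [True, True],
    })
    updates = pd.DataFrame({"customer_id": [2], "city": ["Houston"]})
    today = pd.Timestamp("2024-06-01")

    # Find current rows whose tracked attribute changed
    merged = dim[dim["is_current"]].merge(updates, on="customer_id", suffixes=("", "_new"))
    changed = merged[merged["city"] != merged["city_new"]]

    # Expire the old versions ...
    to_expire = dim["customer_id"].isin(changed["customer_id"]) & dim["is_current"]
    dim.loc[to_expire, "valid_to"] = today
    dim.loc[to_expire, "is_current"] = False

    # ... and append the new current versions
    new_rows = changed.assign(city=changed["city_new"], valid_from=today,
                              valid_to=pd.NaT, is_current=True)[dim.columns]
    dim = pd.concat([dim, new_rows], ignore_index=True)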
Cloud & Big Data: Strong experience with AWS (Redshift, Glue, MLOps).