DataStream, ETL Fundamentals, SQL (Basic + Advanced), Python, Data Warehousing, Time Travel and Fail-safe, Snowpipe, SnowSQL, Modern Data Platform Fundamentals, PL/SQL, T-SQL, Stored Procedures
Job Requirements
Design, develop, and maintain robust ELT/ETL data pipelines to load structured and semi-structured data into Snowflake. Implement data ingestion workflows using tools such as Azure Data Factory, Informatica, dbt, or custom Python/SQL scripts.
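The "custom Python/SQL scripts" part of this duty often amounts to flattening semi-structured records into a tabular shape before staging them for a COPY into Snowflake. A minimal, self-contained sketch (all names illustrative, not tied to any particular pipeline):

```python
import csv
import io
import json

def flatten_record(record: dict, prefix: str = "") -> dict:
    """Flatten nested JSON into a single-level dict with dotted column names."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten_record(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

def to_staged_csv(json_lines: str) -> str:
    """Transform newline-delimited JSON into CSV text ready to upload to a stage."""
    rows = [flatten_record(json.loads(line))
            for line in json_lines.splitlines() if line.strip()]
    fieldnames = sorted({k for row in rows for k in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

events = '{"id": 1, "user": {"name": "ana", "region": "eu"}}\n' \
         '{"id": 2, "user": {"name": "bo", "region": "us"}}'
print(to_staged_csv(events))
```

In practice the CSV (or Parquet) output would be PUT to a Snowflake stage and loaded with COPY INTO; the transform itself stays testable as plain Python.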
Write and optimize complex SQL queries, stored procedures, views, and UDFs within Snowflake.
Use Snowpipe for continuous data ingestion and manage tasks, streams, and file formats for near real-time processing.
Optimize query performance using techniques such as clustering keys, result caching, materialized views, and pruning strategies.
Monitor and tune warehouse sizing and usage to balance cost and performance.
Design and implement data models (star, snowflake, normalized, or denormalized) suitable for analytical workloads.
Create logical and physical data models for reporting and analytics use cases.
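The dimensional split behind a star schema can be illustrated with a toy transform (all table and column names invented): denormalized order rows are separated into a product dimension with surrogate keys and a narrow fact table that references it.

```python
def build_star(rows):
    """Split denormalized order rows into a product dimension and a fact
    table, assigning a surrogate key to each distinct product."""
    product_dim = {}   # (product, category) -> surrogate key
    facts = []
    for row in rows:
        dim_key = (row["product"], row["category"])
        if dim_key not in product_dim:
            product_dim[dim_key] = len(product_dim) + 1
        facts.append({"product_sk": product_dim[dim_key],
                      "order_id": row["order_id"],
                      "amount": row["amount"]})
    dim_table = [{"product_sk": sk, "product": p, "category": c}
                 for (p, c), sk in product_dim.items()]
    return dim_table, facts

orders = [
    {"order_id": 100, "product": "widget", "category": "tools", "amount": 9.5},
    {"order_id": 101, "product": "widget", "category": "tools", "amount": 9.5},
    {"order_id": 102, "product": "gadget", "category": "toys",  "amount": 4.0},
]
dim, fact = build_star(orders)
print(len(dim), len(fact))  # 2 3
```

In a real Snowflake model the same split would be expressed as DDL plus MERGE/INSERT logic (often generated by dbt), but the shape is identical: a small, descriptive dimension joined to a large, append-heavy fact table.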