Required qualifications:
- Formal training or certification in software engineering concepts, with applied experience.
- Hands-on experience in system design, application development, testing, and operational stability.
- Proficiency in one or more modern programming languages; experience developing, debugging, and maintaining code in a large corporate environment, including database query languages.
- Experience with AWS, Python, SQL, and data modelling.
- Experience across the data lifecycle.
- Advanced SQL skills (e.g., joins and aggregations).
- Working understanding of NoSQL databases.
- Significant experience with statistical data analysis, including the ability to select appropriate tools and data patterns for a given analysis.
- Experience customizing tools to generate products.

Nice to have:
- Apache Iceberg, data quality frameworks, Databricks/Snowflake.
- Exposure to cloud technologies.

Responsibilities:
- Builds hybrid on-premises and public cloud data platform solutions.
- Builds end-to-end data pipelines (ETL/ELT) for ingestion, transformation, and distribution, supporting both batch and streaming workloads.
- Designs logical and physical data models.
- Develops data products that are reusable, well documented, and optimized for analytics, BI, and AI/ML consumers.
- Implements modern data lake and lakehouse architectures, including Apache Iceberg table formats.