✨ About The Role
- The Data Ops Engineer will oversee the end-to-end execution of the Deep 6 AI data pipeline, ensuring issues are identified and resolved quickly.
- Responsibilities include building proactive and reactive measures that support pipeline health, such as monitoring and alerting.
- The role involves enhancing the data pipeline in line with established data engineering principles such as observability and resilience.
- A key task is ensuring that data sufficiency and quality metrics are captured and made visible to all stakeholders.
- The engineer will contribute to discussions on data sufficiency across all clients, leveraging various data types.
⚡ Requirements
- Proven success operating data pipelines at a similarly sized or larger company is essential for this role.
- Proficiency in JVM languages, especially Kotlin, is required to contribute effectively to the data pipeline.
- Experience with cloud-based platforms, particularly AWS, is necessary for managing the data infrastructure.
- A positive, collaborative, can-do attitude and a strong sense of ownership are crucial for team dynamics.
- Familiarity with both supervised and unsupervised machine learning methods will be beneficial in enhancing data processes.