We are looking for an experienced DevOps/Platform Engineer with strong expertise in OpenShift, Kafka, CI/CD, Cloud, and distributed systems to join our technology team. The ideal candidate will have hands-on experience building scalable, secure, and cloud-native platforms in high-throughput environments.
Degree or postgraduate qualification in Computer Science or a related field (or equivalent industry experience)
Minimum of 5 years of development and design experience with OpenShift, Jenkins, ArgoCD, and Kafka.
• Containerization and orchestration (Docker and OpenShift/Kubernetes)
• Enterprise and Open Source Redis setup, configuration and support.
• Kafka setup and configuration experience in high-throughput environments.
• Source control (Git) and automated build/deployment pipelines (Jenkins, ArgoCD, Kaniko, Shipwright, etc.)
• Public cloud experience, preferably Azure and OCI
• Linux OS configuration and shell scripting
• Working with streaming data sets at scale
• Public cloud automation toolsets
• Cloud-native applications
• Understanding of GitOps
• Extensive coding experience and knowledge of event-driven and streaming architectures
• Good hands-on experience with design patterns and their implementation
• Experience doing automated unit and integration testing
• Well versed in CI/CD principles (GitHub, Jenkins, etc.) and actively involved in troubleshooting and resolving issues in a distributed-services ecosystem
• Familiar with distributed-services resiliency and monitoring in a production environment.
• Responsible for adhering to established policies and best practices, developing an in-depth understanding of exploits and vulnerabilities, and resolving issues by taking appropriate corrective action.
• High-level knowledge of data compliance and regulatory requirements, including but not limited to encryption, anonymization, data integrity, and policy-control features in large-scale infrastructures
• Understanding of data sensitivity in logging, events, and in-memory data storage, such as ensuring no card numbers or personally identifiable data appear in logs.
• Distributed systems
• Network fundamentals and host-level routing
• Tuning distributed systems
• Automated testing, security engineering practices, and tools
• Event-driven architecture
• Good to have: Flink, Beam/Kafka Streams, Apache Spark, or similar streaming technologies
• Experience in Agile methodology.
• Ensure the quality of technical and application architecture and system design across the organization.
• Analytical thinking
• Effectively research and benchmark technologies against other best-in-class technologies.
• Able to influence multiple teams on technical considerations, increasing their productivity and effectiveness by sharing deep knowledge and experience.
• Self-motivated and a self-starter, able to own and drive initiatives without supervision while working collaboratively with teams across the organization.
• Excellent soft and interpersonal skills for interacting with team members.
• Strategic thinking with a research and development mindset
• Excellent communication skills (verbal, written, and presentation)