As Singapore's longest established bank, we have been dedicated to enabling individuals and businesses to achieve their aspirations since 1932. How? By taking the time to truly understand people. From there, we provide support, services, solutions, and career paths that meet their individual needs and desires.
Today, we're on a journey of transformation, leveraging technology and creativity to become a future-ready learning organisation. For all that change, our strategic ambition remains clear and bold: to be Asia's leading financial services partner for a sustainable future.
We invite you to build the bank of the future. Innovate the way we deliver financial services. Work in friendly, supportive teams. Build lasting value in your community. Help people grow their assets, business, and investments. Take your learning as far as you can. Or simply enjoy a vibrant, future-ready career.
Your Opportunity Starts Here.
Imagine being part of a team that builds innovative digital solutions for one of Asia's leading banks. Here, you'll have the opportunity to work on cutting-edge projects that transform the way we deliver financial services. You'll be part of a collaborative team that values creativity, innovation, and continuous learning.
To succeed in this role, you'll need to be passionate about Big Data Engineering and committed to delivering high-quality solutions. You'll work closely with cross-functional teams to understand business requirements, design and develop software applications, and ensure seamless integration with existing systems. Your ability to collaborate, innovate, and adapt to changing requirements will be key to your success.
- Collaborate with business stakeholders to translate requirements into data solutions.
- Deliver in a multi-stakeholder environment, seeing initiatives through end to end.
- Design and implement data pipelines for batch and real-time/stream processing (a sketch follows this list).
- Troubleshoot and resolve issues across a complex, multi-technology landscape.
- Tune and optimise the performance of data pipelines within the big data ecosystem.
- Drive optimisation and resiliency initiatives, employing automation.
- Communicate clearly and manage stakeholders in an Agile setting.
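To give a flavour of the streaming work described above, here is a minimal PySpark Structured Streaming sketch. It is illustrative only: the Kafka broker, topic, event schema, and output paths are hypothetical placeholders rather than details of our actual platform.

```python
# A minimal sketch of a real-time pipeline; all names and paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("txn-stream-sketch").getOrCreate()

# Hypothetical schema for incoming transaction events.
schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a Kafka topic as a stream (broker and topic names are placeholders).
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "transactions")
       .load())

# Parse the JSON payload and keep only well-formed records.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*")
          .filter(col("txn_id").isNotNull()))

# Write micro-batches with checkpointing so the stream can recover after a restart.
query = (events.writeStream
         .format("parquet")
         .option("path", "/data/curated/transactions")
         .option("checkpointLocation", "/data/checkpoints/transactions")
         .outputMode("append")
         .start())

query.awaitTermination()
```

In practice the sink would typically be a managed table and the checkpoint location would sit on durable storage, which is what makes the pipeline resilient to restarts.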
A degree in Computer Science, Information Technology, or a related field.
3–5 years of hands-on experience in Big Data Engineering within the banking domain.
Expert knowledge of operationalising data pipelines on Hadoop, Hive, Iceberg, and Spark (see the sketch after this list).
Strong ability to translate business requirements into Spark and Spark SQL jobs.
Working experience with PySpark, Kafka, Spark Streaming, and Flink.
Proficiency in Unix shell scripting, Python scripting, and Python frameworks for automation.
Proficiency in DevOps processes and orchestration tools.
Experience deploying applications on containers.
Experience operationalising data pipelines on cloud platforms (Azure, Google Cloud, AWS) is a plus.
Knowledge of data APIs built with GraphQL is a plus.
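As an illustration of operationalising a batch pipeline with Spark and Iceberg, here is a minimal PySpark sketch. The catalog name, database, table names, and partition value are hypothetical placeholders and would depend on the actual environment.

```python
# A minimal batch-pipeline sketch; assumes the Iceberg Spark runtime is on the
# classpath and a Hive metastore is available. All names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("daily-balances-batch-sketch")
         .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
         .config("spark.sql.catalog.lake.type", "hive")
         .getOrCreate())

# Read one partition of a raw Hive table (table and partition value are placeholders).
txns = spark.table("raw.transactions").where(F.col("ds") == "2024-01-01")

# Aggregate to one row per account per day.
daily = (txns.groupBy("account_id", "ds")
         .agg(F.sum("amount").alias("net_amount"),
              F.count("*").alias("txn_count")))

# Append the result to an Iceberg table in the configured catalog.
daily.writeTo("lake.curated.daily_balances").append()
```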
Competitive base salary. A suite of holistic, flexible benefits to suit every lifestyle. Community initiatives. Industry-leading learning and professional development opportunities. Your wellbeing, growth and aspirations are every bit as cared for as the needs of our customers.