Big Data - Senior Engineer
Location: Noida, UP, India
Company: Iris Software
Why Join Iris?
Are you ready to do the best work of your career at one of India's Top 25 Best Workplaces in the IT industry? Do you want to grow in an award-winning culture that truly values your talent and ambitions? Join Iris Software — one of the fastest-growing IT services companies — where you own and shape your success story.
About Us
At Iris Software, our vision is to be our clients' most trusted technology partner, and the first choice for the industry's top professionals to realize their full potential. With over 4,300 associates across India, the U.S.A., and Canada, we help our enterprise clients thrive with technology-enabled transformation across financial services, healthcare, transportation & logistics, and professional services. Our work covers complex, mission-critical applications built with the latest technologies, spanning high-value Application & Product Engineering, Data & Analytics, Cloud, DevOps, Data & MLOps, Quality Engineering, and Business Automation.

At Iris, every role is more than a job — it's a launchpad for growth. Our Employee Value Proposition, "Build Your Future. Own Your Journey.", reflects our belief that people thrive when they have ownership of their career and the right opportunities to shape it. We foster a culture where your potential is valued, your voice matters, and your work creates real impact. With cutting-edge projects, personalized career development, continuous learning, and mentorship, we support you to grow and become your best — both personally and professionally.
Job Description
Skills Required:
- Solid understanding of object-oriented programming and design patterns.
- 7 to 9 years of strong experience with Big Data technologies.
- Comfortable working with large data volumes and able to demonstrate a firm understanding of logical data structures and analysis techniques.
- Experience with Big Data technologies such as HDFS, Hive, HBase, Apache Spark, PySpark, and Kafka.
- Proficient with code versioning tools such as Git and Bitbucket, and with issue tracking in Jira.
- Strong fundamentals in systems analysis, design, and architecture, along with unit testing and other SDLC activities.
- Experience with Linux shell scripting.
- Demonstrated analytical and problem-solving skills.
- Excellent troubleshooting and debugging skills.
- Strong communication skills and aptitude.
- Ability to write reliable, manageable, and high-performance code.
- Good knowledge of database principles, practices, and structures, including SQL development experience, preferably with Oracle.
- Understanding of the fundamental design principles behind scalable applications.
- Basic Unix OS and scripting knowledge.
Good to have:
- A financial markets background is preferred but not required.
- Experience with Jenkins, Scala, and Autosys.
- Familiarity with build tools such as Maven and continuous integration.
- Working knowledge of Docker, Kubernetes, OpenShift, or Mesos is a plus.
- Basic experience with data preparation tools.
- Experience with CI/CD build pipelines.
Mandatory Competencies:
- Big Data - HDFS
- Big Data - Hive
- Big Data - Hadoop
- Big Data - PySpark
- Big Data - HBase
- Data Science and Machine Learning - Apache Spark
- Middleware - Message Oriented Middleware - Messaging (JMS, ActiveMQ, RabbitMQ, Kafka, SQS, ASB, etc.)
- DevOps/Configuration Mgmt - Git
- DevOps/Configuration Mgmt - GitLab, Github, Bitbucket
- Development Tools and Management - JIRA
- Tech - Unit Testing
- Operating System - Linux
- Development Tools and Management - SharePoint - PowerShell Scripting
- Operating System - Unix
- Database - Oracle - Database Design
- Beh - Communication and collaboration