Data Engineers
As a Junior Data Engineer, you will join the newly formed AI, Data & Analytics team, whose mission is to drive increased value from the data InvestCloud captures and enable a smarter financial future for our clients, with a focus on "enhanced intelligence". Ensuring we have fit-for-purpose, modern capabilities is a key goal for the team. This is a unique opportunity to help build the data platform for a greenfield ecosystem and create a next-generation advisor and client experience. We are building for scale: much of what we design and implement today will become the technology and infrastructure serving thousands of clients and petabytes of data. The core stack we use and are building is:
- AWS as our cloud provider
- Oracle as our legacy data warehouse
- Snowflake as our next-gen data warehouse
- Mage AI for data ingestion and processing
- Kafka as our message bus
- Terraform for building infrastructure
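To give candidates a flavor of how these pieces fit together, here is a minimal sketch of a batch load into Snowflake using Snowflake's Python connector (snowflake-connector-python). The account, warehouse, stage, and table names are hypothetical placeholders, not part of InvestCloud's actual platform.

```python
# Illustrative sketch only: batch-loading staged Parquet files into Snowflake.
# All connection details and object names below are hypothetical.
import snowflake.connector


def load_daily_positions(stage_path: str) -> int:
    """Copy one day's staged Parquet files into a reporting table."""
    conn = snowflake.connector.connect(
        account="example_account",    # hypothetical account identifier
        user="example_user",          # in practice, pull credentials from a secrets manager
        password="example_password",
        warehouse="ANALYTICS_WH",     # hypothetical warehouse
        database="ANALYTICS",         # hypothetical database
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # COPY INTO is Snowflake's bulk-load statement; table and stage names are placeholders.
        cur.execute(
            f"COPY INTO positions FROM @{stage_path} "
            "FILE_FORMAT = (TYPE = PARQUET) MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
        )
        return cur.rowcount or 0
    finally:
        conn.close()
```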
Key Responsibilities:
- Build reliable and scalable data pipelines and capabilities across the platform, taking full ownership of your work under the mentorship of talented engineers (a minimal illustrative sketch follows this list).
- Execute the team's technical strategy through prioritization and delivery management.
- Support complex architectures that tie together multiple services, SaaS tooling, and third-party data, leveraging a strong understanding of a cloud-based stack.
- Write clean, reusable, and well-documented code that captures the essence of the solution.
- Apply high standards across documentation, testing, resiliency, monitoring, and code quality.
- Understand data quality, governance, and security across the platform, ensuring compliance with relevant regulations.
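As a hypothetical illustration of the reliability and monitoring standards above, the sketch below shows a minimal Kafka consumer built with the kafka-python package that commits offsets only after successful processing. The topic name, broker address, and consumer group are invented placeholders.

```python
# Illustrative sketch only: a minimal at-least-once Kafka consumer.
# Topic, broker, and group names are hypothetical placeholders.
import json
import logging

from kafka import KafkaConsumer

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("positions-consumer")

consumer = KafkaConsumer(
    "client-positions",                    # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # hypothetical broker
    group_id="positions-loader",           # hypothetical consumer group
    enable_auto_commit=False,              # commit only after successful processing
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    try:
        record = message.value
        # ... transform and write the record to the warehouse here ...
        log.info("processed offset %s on partition %s", message.offset, message.partition)
        consumer.commit()                  # at-least-once delivery
    except Exception:
        log.exception("failed at offset %s; will reprocess on restart", message.offset)
        break
```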
Required Skills:
- At least 2 years of relevant professional experience in Data Engineering or in a related field
- Experience with a mature cloud data platform (AWS, GCP, Azure)
- Hands-on experience in building resilient batch (Airflow, Fivetran, Mage AI, Airbyte) and streaming (Kafka, Kinesis, Flink, Spark) data pipelines
- Experience developing data models in a cloud data warehouse (dbt, BigQuery, Snowflake) and/or working with legacy ecosystems (Oracle, Postgres)
- Experience building platform components with IaC (Terraform, OpenTofu, Ansible), containerization (Docker), and CI/CD (Jenkins, GitHub Actions)
- Have built projects in SQL and Python, and are eager to deepen your expertise (a brief illustrative example follows this list)
- A growth mindset, always on the lookout for stretch challenges
- Curious and collaborative, keen to learn from others, tackle open-ended problems, and grow through feedback
- A love of learning new things and staying up to date on trends in data; AI-proficient or AI-curious, and hungry to embrace what the latest technology has to offer to enhance your productivity
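As a small, hypothetical example of the Python side of this work, here is a pure-Python transformation with a unit test; the column names and aggregation logic are invented purely for illustration.

```python
# Illustrative sketch only: a small, testable transformation.
# The row schema (account_id, trade_date, amount) is hypothetical.
from datetime import date


def to_daily_balance(rows: list[dict]) -> dict[tuple[str, date], float]:
    """Aggregate raw transaction rows into an end-of-day balance per account."""
    balances: dict[tuple[str, date], float] = {}
    for row in rows:
        key = (row["account_id"], row["trade_date"])
        balances[key] = balances.get(key, 0.0) + float(row["amount"])
    return balances


def test_to_daily_balance():
    rows = [
        {"account_id": "A1", "trade_date": date(2024, 1, 2), "amount": 100.0},
        {"account_id": "A1", "trade_date": date(2024, 1, 2), "amount": -25.0},
    ]
    assert to_daily_balance(rows) == {("A1", date(2024, 1, 2)): 75.0}
```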
Don’t meet every requirement? If you believe you’d thrive in this role, we’d still love to hear from you. We’re always keen to speak to people who connect with our mission and values.
Location and Travel: Candidates are expected to work from the office on a regular basis (a minimum of 3 days per week). Occasional travel may be required.
Compensation: The salary range will be determined based on experience, skills, and geographic location.
Equal Opportunity Employer: InvestCloud is committed to fostering an inclusive workplace and welcomes applicants from all backgrounds.