You will join a data team of six people, small and close-knit:
1 Head of Data, who manages the team directly and with whom you will work daily
2 Data Engineers (including this position)
1 Analytics Engineer
2 Data Analysts
We have chosen a modern data platform that we control end to end, with a strong culture of self-hosting and a deep commitment to data sovereignty. Each component of our stack has been chosen and integrated deliberately.
You are joining at an exciting time, with several key projects underway on a modern stack that you will help evolve: overhauling our data ingestion system, optimizing our dbt models, and improving our fraud detection pipeline.
You take on a substantial technical scope with real responsibilities from day one. As a Data Engineer, you are at the heart of our data platform: you run it, evolve it, and act as its guardian day to day.
Concretely, you will work on impactful projects:
The overhaul of our data ingestion system (POC in progress)
The improvement of our fraud detection pipeline
The optimization of our dbt models and Snowflake datamarts
The simplification of self-service BI for our internal teams
You collaborate closely with the rest of the Data team, but also with SREs, Platform developers, SIC teams, and RevOps. This gives you a cross-functional view of the company and its data challenges.
With us, you won't just execute tasks: we will give you the autonomy and space needed to understand the stakes, propose your own solutions, and make technical decisions.
You join a passionate and committed Data team where you will have a direct impact on the quality and reliability of our platform. Concretely, you:
Build and evolve our Data Platform: you are responsible for its robustness on a daily basis. You implement observability metrics, monitor performance, investigate alerts, and optimize our ingestion pipelines. You have the freedom to identify areas for improvement and propose them to the team, whether on architecture, tooling, or processes.
Accelerate the work of Analytics Engineers and Data Analysts: you are their go-to technical partner. You support them in data modeling and transformation, challenge their SQL queries, and help evolve their practices on the platform. Your impact is measured directly in their efficiency and the quality of their deliverables.
Develop concrete solutions: you code, automate workflows, and connect external APIs, with a constant focus on quality, testability, and readability. You use generative AI (Claude Code, Copilot) as a daily lever to accelerate development, improve code quality, and automate repetitive tasks, and you share these practices with your team.
Contribute to engineering end to end: from technical scoping to production deployment, you are involved in the entire cycle, defining suitable architectures, following security and compliance best practices (GDPR), handling critical incidents, and writing actionable post-mortems.
You will work in a modern environment built around industry-standard tools:
Data Warehouse: Snowflake as the central storage and analysis core
Transformation: dbt Core to model our data and make it reliable
Orchestration: Airflow to pilot our pipelines and workflows
Ingestion: dlt (dltHub) to connect and load our data sources
Infrastructure as Code: Terraform/Terragrunt and Permifrost to manage Snowflake resources, Kubernetes with Helm charts (official and custom) for service deployment
Hosting: self-hosted, with managed database services on OVH
Development with continuous deployment: Python, GitLab CI/CD, and GitOps practices (ArgoCD) for maintainable and collaborative code
Visualization: Metabase to make data accessible to all
Generative AI: Claude Code integrated into our daily workflow to accelerate and improve our engineering
What is essential to succeed in this role:
Minimum 3 years of experience in data engineering or a similar role
A Master's-level degree (Bac+4/5) in Computer Science, Engineering, or equivalent
Strong command of Python, with a keen interest in data manipulation
Advanced SQL skills for data transformation and querying
Experience with an orchestrator (such as Airflow) and a transformation framework (such as dbt)
A good level of technical English
Nice-to-haves (we will support you if you don't have them all!):
Experience or knowledge of Kubernetes (k8s) for deployment and infrastructure management
Experience with Terraform or Infrastructure as Code
Knowledge of GitOps practices (ArgoCD, Helm, versioning)
Comfortable with using generative AI tools (Claude Code, Copilot, etc.) as a daily lever
Knowledge of data visualization tools (Metabase or equivalent)
Curious, proactive, and action-oriented: you like to learn, explore, and see your ideas through
A mission that matters in a world challenged by AI-driven fraud
A vision built on integrity
A European & sovereign platform
A certified B Corp
The golden age of Yousign
Salary range: 52k-55k
Meal vouchers (Swile): €10.50/day, 50% covered by Yousign.
Health insurance (Alan): Basic plan at €62.50/month, 50% covered by Yousign.
Life & disability insurance: 100% employer-covered.
Wellbeing: Axomove (4 physio sessions) and Moka.care (4 therapy/coaching sessions).
Transportation: 50% reimbursement for public transport for hybrid workers.
Leeto: Access to numerous employee discounts (cinema, travel, leisure, sports, etc.).
Time off: 10 RTT days/year, plus menstrual leave, parenthood benefits, seniority days, and "journée de solidarité."
Additional benefits: 1 volunteering day/year, learning & development budget, and more.