Data Engineer GCP
Act digital is a technology consulting and expertise company founded in 2011. Our mission is to support our clients in their digital transformation challenges. Our services revolve around the following areas of expertise:
- Software Delivery
- Infrastructure & Cloud Computing
- Agile IT Performance
- Business Performance
We are an international group established in more than a dozen countries, with 5,700 employees.
Our success is driven by the growth and fulfillment of each employee, which is why we place great importance on offering the best possible working conditions:
- Remote work available on a large share of our assignments
- A Flex Office work environment, available to everyone at all times, to promote communication and collaboration
- Expert communities to share and disseminate skills within the group
- Project management and close HR support
- Training and certifications offered annually
- Recognition of our consultants' expertise paths
- A strong openness to international mobility, either short-term or long-term
- Intrapreneurship opportunities
Job Description
You will join our Data & AI Division, a team dedicated to the performance, governance, and optimization of data usage in an advanced cloud environment.
As a data engineer, you will take part in the analysis, development, optimization, and maintenance of end-to-end data solutions, from data ingestion to data presentation, leveraging modern technologies such as GCP, BigQuery, Terraform, and Kafka.
Your responsibilities will include the following:
- Participate in the agile rituals of the team.
- Leverage BigQuery's native and advanced features: quotas, slots, BI Engine, etc.
- Define and implement BigQuery governance adapted to the needs of each project.
- Produce optimized, readable, and maintainable queries, even on large data volumes.
- Maintain the existing data solutions (RUN / day-to-day operations).
- Promote usage best practices to maximize the value extracted from the data while keeping it accessible and understandable.
- Help the teams build their skills on BigQuery and its ecosystem.
Desired Profile
- Master's degree (Bac +5) in computer science, data engineering, or equivalent.
- Expertise in SQL and ETL pipeline development.
- Good command of CI/CD tools, GitHub, Terraform, and Kafka.
- Basic knowledge of data visualization tools such as Power BI or Looker.
- Proficiency with Google Cloud Platform (BigQuery, GCS, etc.) is highly desirable.
- Ability to work in an agile context and to collaborate with multidisciplinary teams.
- English required (reading and writing).