Senior Software Engineer, Datacraft - Remote Eligible

Deliver the first production-grade DWH data pipeline and data model for Bloomreach
Bratislava, Region of Bratislava, Slovakia
Senior
€4,250 / month
5 days ago
Bloomreach

Delivers an AI-powered digital experience platform for personalized e-commerce search, merchandising, and content optimization across channels.

Senior Data Engineer, Datacraft

Join our newly formed Datacraft team — the team building the next-generation data platform that powers our internal DWH, analytics dashboards, and the Loomi Analytics Agent for Bloomreach Engagement. Your engineering work will directly impact how hundreds of enterprise customers access, understand, and activate their Bloomreach data, both internally and through data sharing in Snowflake, BigQuery, Databricks, and beyond. Your starting salary will be from €4,250 per month, along with stock options and other benefits. Working in one of our Central European offices (Bratislava, Prague, Brno) or from home on a full-time basis, you'll become a core part of the Engineering team.

What challenge awaits you?

You will join the Datacraft team as one of its founding engineers. Datacraft is a new team in the Engagement pillar, established to tackle three interconnected domains:

  • Data Warehouses (~60% of team domain) — making Bloomreach data first-class in customer DWHs (Snowflake, BigQuery, Databricks). The strategic goal for 2026–27 is to use DWHs to exponentially accelerate data adoption — both for customers who want their data outside Bloomreach and for Bloomreach itself to build analytics faster.
  • Loomi Analytics Agent (~20%) — evolving Loomi Analytics from a constrained report builder into an agentic analytics assistant that can explore data across systems, explain insights, and eventually act on them. You will help build the data backbone the agent operates on.
  • Dashboards & Analytics Stack (~20%) — moving reporting from the proprietary stack onto DWH-backed, modern analytics stacks (semantic layers, headless BI tools), dramatically speeding up how we ship and iterate on dashboards.

Datacraft is an AI-first team. We believe code is a commodity and expect every engineer to fluently use coding agents (e.g., Cursor, Claude Code, Copilot, Gemini CLI) as a core part of their daily workflow. The ability to leverage AI tooling to accelerate development, prototyping, and problem-solving is not optional — it's foundational. As a P3 (Senior) engineer at Bloomreach you are an independent professional — expert in at least one component, able to decompose objectives into tasks, and lead small projects end-to-end with minimal day-to-day guidance.

Your job will be to:

a. Design and build the DWH data platform (~60%)

  • Design and build robust data pipelines that move and transform Engagement data (events, profiles, campaigns, aggregates) into DWHs (BigQuery, Snowflake, Databricks).
  • Implement and tune batch and streaming ingestion patterns (e.g., Kafka → GCS → Iceberg → DWH) with attention to scalability, cost, and reliability, following medallion architecture principles (see the ingestion sketch after this list).
  • Contribute to data mutation in ETL pipelines, ensuring DWH data correctly reflects Bloomreach's ID resolution and consent semantics.
  • Own and evolve data models that make Bloomreach data easy to use.
  • Build and maintain orchestration and scheduling (Airflow / Cloud Composer) so complex workflows run predictably and are observable (see the DAG sketch after this list).
  • Read and interpret data from our monitoring and alerting systems, and contribute reliability improvements, so we catch issues (missing loads, quota limits, data drift) before customers do.
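
To make the ingestion work concrete: a bronze-layer step of that Kafka → Iceberg flow could look roughly like the PySpark sketch below. It assumes a Spark session already configured with the Iceberg runtime and a catalog named "lakehouse"; the broker, topic, bucket, and table names are placeholders, not the team's actual configuration.

    # A minimal sketch, assuming an Iceberg-enabled Spark session and a
    # catalog named "lakehouse"; all names below are illustrative.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("events-bronze-ingest").getOrCreate()

    raw = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "kafka:9092")  # placeholder broker
        .option("subscribe", "engagement.events")         # hypothetical topic
        .option("startingOffsets", "latest")
        .load()
    )

    # Keep the payload opaque at the bronze layer; parsing and ID resolution
    # belong to the silver layer in a medallion design.
    bronze = raw.select(
        F.col("key").cast("string"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("ingested_at"),
        F.col("topic"),
        F.col("partition"),
        F.col("offset"),
    )

    query = (
        bronze.writeStream
        .format("iceberg")
        .outputMode("append")
        .option("checkpointLocation", "gs://example-bucket/checkpoints/events-bronze")
        .toTable("lakehouse.bronze.engagement_events")
    )
    query.awaitTermination()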

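A scheduled load in Airflow / Cloud Composer might then be structured along these lines; the DAG id, schedule, and task bodies are hypothetical stand-ins rather than real Datacraft workflows.

    # A minimal sketch of a scheduled load; DAG id, schedule, and task
    # bodies are hypothetical placeholders.
    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@hourly", start_date=datetime(2025, 1, 1), catchup=False)
    def dwh_export():
        @task()
        def compact_bronze():
            ...  # e.g., trigger Iceberg compaction / snapshot expiry

        @task()
        def build_silver():
            ...  # e.g., run the Spark transformation job

        @task()
        def publish_to_dwh():
            ...  # e.g., load curated tables into the customer-facing DWH

        compact_bronze() >> build_silver() >> publish_to_dwh()

    dwh_export()
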
b. Shape the data layer for Loomi Analytics Agent (~20%)

  • Help implement evaluation harnesses and analytics skills (trend analysis, driver analysis, experiment analysis) in the Loomi Analytics assistant (see the harness sketch after this list).
  • Fine-tune Loomi system prompts, implement tracing, and enable AI agent federation.
  • Work with Product and applied AI engineers to provide clean, well-modeled data interfaces and MCPs for the Loomi Analytics Agent platform.
  • Contribute to agentic workflows where Loomi collaborates with DWH analytics/AI (e.g., asking DWH agents for segment performance via MCP, combining results across systems).
  • Ensure Loomi's data access patterns are reliable, explainable, and debuggable, minimizing sources of hallucination or data inconsistencies.
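
As a rough illustration of the evaluation-harness idea, the sketch below scores an analytics skill against a small set of golden cases. The run_skill entry point, the cases, and the tolerance are all hypothetical; a real harness would call into the Loomi assistant and check richer outputs than a single number.

    # A minimal sketch of an evaluation harness; run_skill, the golden
    # cases, and the tolerance are hypothetical, not the team's real setup.
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class EvalCase:
        question: str          # natural-language analytics question
        expected_value: float  # numeric answer a correct response must match
        tolerance: float = 0.01

    def evaluate(run_skill: Callable[[str], float], cases: list[EvalCase]) -> float:
        """Return the pass rate of a skill over a set of golden cases."""
        passed = 0
        for case in cases:
            answer = run_skill(case.question)
            if abs(answer - case.expected_value) <= case.tolerance * abs(case.expected_value):
                passed += 1
        return passed / len(cases)

    if __name__ == "__main__":
        cases = [
            EvalCase("What was week-over-week revenue growth?", expected_value=0.042),
            EvalCase("What was the open rate of campaign X?", expected_value=0.31),
        ]
        # A stub standing in for a real call into the analytics assistant.
        pass_rate = evaluate(lambda q: 0.042 if "revenue" in q else 0.31, cases)
        print(f"pass rate: {pass_rate:.0%}")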

c. Co-build the dashboards and analytics stack (~20%)

  • Co-design and maintain canonical metrics and models for Engagement reporting as dashboards move from the current infrastructure onto DWH-backed stacks (see the metric sketch after this list).
  • Support semantic layers and BI tools (Looker, Power BI, etc.) so that multiple personas (analysts, marketers, data scientists) can reliably self-serve analytics.
  • Make sure dashboards and BI assets are tightly integrated with Loomi, so the agent can understand and explain what users see, not just raw tables.
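
As one possible shape for such a canonical metric, the sketch below compiles a metric definition to BigQuery-flavored SQL in plain Python. Real semantic layers (Cube and similar tools) declare this in their own configuration formats; the open_rate metric and the analytics.campaign_events table are illustrative assumptions.

    # A minimal sketch of a canonical metric compiled to SQL; the metric,
    # table, and BigQuery-flavored SQL are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Metric:
        name: str
        expression: str  # SQL aggregate expression defining the metric
        table: str
        time_column: str

        def to_sql(self, grain: str = "DAY") -> str:
            """Render a grouped query for this metric at the given time grain."""
            return (
                f"SELECT DATE_TRUNC({self.time_column}, {grain}) AS period,\n"
                f"       {self.expression} AS {self.name}\n"
                f"FROM {self.table}\n"
                f"GROUP BY period\n"
                f"ORDER BY period"
            )

    open_rate = Metric(
        name="open_rate",
        expression="COUNTIF(action = 'open') / COUNTIF(action = 'delivered')",
        table="analytics.campaign_events",
        time_column="event_time",
    )
    print(open_rate.to_sql(grain="WEEK"))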

What technologies and tools does the Datacraft team work with?

  • Programming languages — Python (primary), Go, SQL
  • Messaging & streaming — Apache Kafka
  • Databases & storage — BigQuery, Apache Iceberg, Google Cloud Storage (GCS), MongoDB, Redis
  • Data processing — Apache Spark, Dataproc, Airflow / Cloud Composer
  • DWH platforms — BigQuery (primary), Snowflake, Databricks (customer-facing)
  • Infrastructure — Google Cloud Platform (GCP), Kubernetes, Terraform
  • AI / Agentic — LLM APIs, agent orchestration frameworks, MCP, evaluation harnesses
  • BI & semantic layers — Cube, Looker Studio
  • Observability & operations — Grafana, Prometheus, PagerDuty, Sentry, OpenTelemetry
  • Software & tools — GitLab (CI/CD), Jira, Confluence
  • AI coding agents — Cursor, Claude Code

The owned area encompasses domains such as DWH data exports and ingestion, data modeling for analytics, Loomi Analytics Agent data layer, Engagement dashboards on DWH, and semantic layers. Experience with data lakehouse architectures, open table formats (Iceberg), and agentic/LLM systems is highly valued.

Your success story will be:

In 30 Days:

  • Gain understanding of company processes, team dynamics, the product, and the Datacraft domain — DWH strategy, Loomi Analytics, dashboards, and key data services.
  • Set up your local and GCP development environment and complete the Engagement engineering onboarding.
  • Understand the current state of EBQ/data export pipelines, ongoing DWH architecture research, and the Loomi Analytics Agent roadmap.

In 90 Days:

  • Deliver your first meaningful contribution to the DWH data platform — a pipeline, data model, or orchestration improvement that ships to production.
  • Become comfortable with the end-to-end data flow from Engagement (Kafka, IMF) through to DWH destinations and BI layers.
  • Participate in architecture discussions and contribute to key design decisions (KDDs) for DWH architecture, Loomi data access, or dashboard data models.
  • Take part in the team's on-call/on-duty rotation.

In 180 Days:

  • Own at least one component or domain within the Datacraft scope — able to independently design, build, and maintain it.
  • Be a trusted contributor in your domain — understanding the data platform deeply enough to make informed trade-off decisions and challenge technical proposals.
  • Contribute to measurable progress on key team goals: first DWH export customers live, first dashboards on DWH, or first Loomi Agent improvements backed by new data interfaces.

You have the following experience and qualities:

Professional experience

  • Solid data engineering background with strong SQL and data modeling skills (star/snowflake schemas, slowly changing dimensions, partitioning/clustering, etc.).
  • Hands-on experience building production-grade data pipelines on GCP, ideally involving BigQuery, Apache Iceberg, Apache Spark on Dataproc, and Airflow (Cloud Composer).
  • Experience with orchestration and workflow tools — specifically Airflow / Cloud Composer — and comfort working with DAG-based systems for scheduled and event-driven jobs.
  • Familiarity with open table formats (Iceberg preferred, Delta Lake / Hudi acceptable) and how they interact with query engines and DWH platforms.
  • Strong programming skills in Python.