We’re recruiting for a forward-thinking tech team that’s building one of the most modern cloud analytics platforms in Europe. As a Data Engineer focused on Cloud Data Warehousing, you’ll help shape a scalable data mesh architecture that powers high-impact AI and business intelligence solutions.
You’ll work with terabyte-scale relational data, enabling advanced data services—from deep learning models that predict customer demand to real-time analytics that drive strategic decisions.
Hybrid Program – Bucharest | GCP Ecosystem | AI & Analytics-Driven
Ready to turn data into decisions? Apply now and let’s build the future together.
What You’ll Do:
- Build and optimize data pipelines on Google Cloud Platform using BigQuery, Dataflow, Python, and Kubernetes (a minimal pipeline sketch follows this list).
- Design and evolve a custom data management framework with a layered architecture (data vault + business layer; see the hub-record sketch below).
- Collaborate with Data Scientists, Analysts, and fellow Engineers to deliver clean, scalable, and business-ready data.
- Contribute to an architecture built in-house on Google APIs/SDKs, GitLab CI/CD, and modern DevOps practices.
- Support streaming and messaging integrations (Pub/Sub, Kafka) for real-time data flows.
- Ensure high data quality and service reliability across the data mesh and stakeholder-facing layers.
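To give a flavor of the day-to-day stack, here is a minimal sketch of the kind of streaming pipeline described above: an Apache Beam (Dataflow) job in Python that reads JSON events from Pub/Sub and appends them to a BigQuery table. The project, topic, table, and schema names are illustrative placeholders, not the team's actual resources.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Pub/Sub delivers raw bytes; we assume JSON-encoded demand events.
    return json.loads(message.decode("utf-8"))


def run() -> None:
    # streaming=True makes this an unbounded (real-time) pipeline; on GCP
    # you would additionally pass DataflowRunner options.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/demand-events"  # placeholder
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.demand_events",  # placeholder table
                schema="event_id:STRING,customer_id:STRING,ts:TIMESTAMP,quantity:INTEGER",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```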
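The data vault layer mentioned above typically keys every hub and satellite row by a deterministic hash of the business key. Here is a minimal sketch of that convention, assuming MD5 hash keys and a "||" delimiter (both common but team-specific choices); the entity and column names are hypothetical.

```python
import hashlib
from datetime import datetime, timezone


def hash_key(*business_keys: str) -> str:
    # Normalize and join the business key parts, then hash them so the
    # same key always maps to the same hub/satellite row.
    joined = "||".join(part.strip().upper() for part in business_keys)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()


# Illustrative hub record for a customer entity (column names are placeholders).
hub_customer = {
    "customer_hash_key": hash_key("C-10042"),
    "customer_business_key": "C-10042",
    "load_dttm": datetime.now(timezone.utc).isoformat(),
    "record_source": "crm_export",
}
```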
Your Profile:
- 5+ years of experience in data engineering within a cloud environment (GCP preferred).
- Strong Python skills and deep understanding of database architectures (MPP experience is a plus).
- Solid background in relational data management and data warehousing at scale.
- Familiarity with streaming/messaging tools (Pub/Sub, Kafka).
- Analytical mindset with a structured, solution-oriented approach.
- Agile, proactive, and accountable in your work style.
- Fluent in English; basic German (A2 or above) is a bonus.