We are seeking a Senior Data Engineer for our business partner, a global company undergoing a major digital transformation. They are building a modern, cloud-based data platform designed to power smarter decisions, operational efficiency, and scalable growth across all business areas.
In this role, you will work across Architecture & Strategy, Data Modeling & Engineering, Governance & Quality, and Collaboration & Enablement: shaping the data platform, building trusted datasets, and delivering meaningful results alongside the Head of Data & BI.
Location: Bucharest (Victoriei) | Contract: B2B or Employment | Model: Remote
If you want to shape a modern data platform and make a direct impact on strategic decisions, we’d love to hear from you!
Key Responsibilities
- Drive impact across Architecture & Strategy by shaping how the Microsoft Fabric/Azure data platform evolves, setting practical standards for ingestion, storage, modeling, and security, and bringing ideas that influence the BI direction.
- Contribute to Data Modeling & Engineering by designing and refining pipelines across diverse data sources, building intuitive star schemas and semantic layers, using SQL and Python (pandas, PySpark, APIs) for efficient transformations, and supporting automation through CI/CD and modern DataOps practices.
- Strengthen Governance & Quality by developing monitoring and observability frameworks that ensure data is fresh, accurate, and traceable, while maintaining consistency across systems so teams can rely on a single source of truth.
- Accelerate delivery through Collaboration & Enablement by partnering with BI Developers on semantic models and Power BI datasets, supporting the transition from SSRS to Power BI, and simplifying integrations to reduce technical debt and create a cleaner, more scalable data landscape.
Must‑Haves
- At least 5 years of experience in data engineering or related technical roles.
- Expertise with cloud data platforms (Azure preferred; AWS/GCP acceptable).
- Strong data modeling capabilities, including star schemas and semantic layer design.
- Advanced SQL and Python skills for large‑scale data processing.
- Hands‑on experience with ETL/ELT tools such as ADF, Fabric Pipelines, Synapse, or Databricks.
- Solid understanding of data governance, data quality, and master data management principles.
- Experience with Git and CI/CD workflows.
- Excellent communication and cross‑functional collaboration skills.