We are seeking a highly skilled and detail-oriented Senior Data Engineer to join our growing team in Birkirkara.
In this role, you will be a key contributor to building and optimizing our data infrastructure, pipelines and analytics systems. You will design, build and maintain highly scalable and secure ETL/ELT data pipelines that support the needs of analytics, data science and business teams. The ideal candidate combines strong technical expertise, problem-solving skills and leadership capabilities to help develop a scalable and robust data engineering ecosystem.
This role offers a hybrid work setup, providing flexibility to work both remotely and in-office, helping you achieve a balanced professional and personal life.
Responsibilities
- Architect and maintain modern data platforms, warehouses and lakes (e.g., Snowflake, BigQuery, Redshift, Databricks)
- Optimize data storage and retrieval for performance and cost efficiency
- Establish processes and systems for monitoring data quality, completeness and reliability
- Automate manual processes and optimize data delivery workflows to reduce latency and improve job reliability
- Implement and maintain Kafka-based streaming data pipelines for real-time data processing and integration with various systems
- Integrate with third-party databases and APIs
- Continuously refine and improve existing data systems and pipelines for scalability
- Implement monitoring and alerting systems for data pipelines
- Ensure the uptime and availability of the data infrastructure
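To give a flavour of the data-quality monitoring mentioned above, here is a minimal sketch of a completeness check. It is illustrative only: the function names, the dict-of-rows representation and the 5% threshold are assumptions for the example, not part of our actual stack.

```python
# Minimal data-quality check: flag columns whose null rate exceeds a
# threshold. Column names and the 5% default threshold are illustrative
# assumptions, not taken from any specific pipeline.

def null_rates(rows, columns):
    """Return the fraction of None values per column across rows (dicts)."""
    total = len(rows)
    return {
        col: sum(1 for r in rows if r.get(col) is None) / total
        for col in columns
    }

def failing_columns(rows, columns, threshold=0.05):
    """Return the columns whose null rate exceeds the threshold."""
    rates = null_rates(rows, columns)
    return sorted(col for col, rate in rates.items() if rate > threshold)
```

In practice a check like this would run as a scheduled task (for example inside an Airflow DAG) and feed the alerting systems described above, rather than as standalone code.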
Requirements
- 5+ years of experience in data engineering or related roles, including experience with large-scale data processing
- Proficiency in programming languages such as Python and SQL
- Expertise in building and maintaining ETL/ELT workflows using tools like Apache Airflow
- Hands-on experience with Big Data technologies like Spark, Hadoop and Kafka
- Working experience with version control systems (Git) and CI/CD pipelines
- Fluency in English
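To make the Python/SQL expectation concrete, here is a minimal, self-contained ETL sketch. The table and field names are hypothetical, and SQLite stands in for a warehouse such as Snowflake, BigQuery or Redshift; it illustrates the shape of the work, not our actual implementation.

```python
# Minimal ETL sketch: extract raw records, transform them (normalize
# emails, drop incomplete rows), and load them into a SQL table.
# The "users" table and its columns are illustrative assumptions.
import sqlite3

def transform(raw_rows):
    """Lower-case and trim emails; drop rows that have no email."""
    return [
        (row["id"], row["email"].strip().lower())
        for row in raw_rows
        if row.get("email")
    ]

def load(conn, rows):
    """Idempotent load into a users table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO users (id, email) VALUES (?, ?)", rows
    )
    conn.commit()
```

Note the use of `INSERT OR REPLACE` keyed on the primary key: it makes reruns of the load step safe, which is one small example of the job-reliability concerns the role involves.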