Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change.
By combining world-class engineering, industry expertise and a people-centric mindset, we consult and partner with leading brands from various industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses.
From prototype to real-world impact: be part of a global shift by doing work that matters.
- Proficiency in Python and SQL.
- Knowledge of sound data design principles (e.g. Kimball dimensional modelling and star schemas), data warehousing, and data modelling.
- Strong experience with relational databases and, ideally, NoSQL databases.
- Strong understanding of ETL/ELT processes.
- Exposure to data monitoring and observability tools.
- Relevant experience with event-driven architecture.
- Good understanding of Agile ways of working.
- Collaborate with data scientists and analysts to understand their data needs and build effective data pipelines.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation using cloud services and other technologies.
- Create and maintain data solutions in Azure, using any of the following: Data Factory, Synapse, Databricks/Spark, Polars/Pandas, Fabric.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and reliability of the data.
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline issues, ensuring data consistency and availability.