We are looking for an exceptional Big Data Engineer to work with our client's cross-functional team and join a world-class community of talented experts.

Responsibilities:
- Develop and deliver end-to-end Big Data solutions in the cloud for major customers in the US and Western Europe;
- Use Data Lake and Data Warehouse models according to business needs;
- Create metadata-driven pipelines using various programming languages;
- Participate in code and design reviews to ensure consistency in architecture and design/code practice;
- Code with performance, scalability and usability in mind;
- Work with new and emerging technologies, prototypes, and engineering process improvements, evaluating new tools in line with leading industry trends;
- Work closely with next-generation architecture development teams using cutting-edge approaches and technologies.

Requirements:
- Cloud expertise with AWS (S3, Redshift, Glue) and/or Azure (ADLS, Synapse, Data Factory, Databricks);
- Strong SQL skills;
- Knowledge of at least one of these programming languages and frameworks: Python, Scala, Spark/PySpark, Kafka;
- ETL background;
- Experience with Airflow or Snowflake is a plus;
- Very good English communication skills (both verbal and written);
- Experience communicating with clients.