EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.
Join our software and system engineering group as a Senior Data Software Engineer working primarily with Google Cloud Platform technologies.
You will develop Python projects and manage data workflows in a big data environment, focusing on data extraction, transformation, and loading with BigQuery. If you have a passion for cloud services and infrastructure as code, we encourage you to apply.
Responsibilities
- Develop and maintain Python projects related to data engineering
- Design and implement ETL pipelines for data extraction, transformation, and loading
- Create and optimize data models in BigQuery for efficient querying
- Deploy and manage applications on Google Cloud Run
- Implement infrastructure as code using Terraform
- Collaborate with cross-functional teams to adopt new GCP services such as Dataproc, Dataflow, and Composer
- Monitor and troubleshoot data workflows and cloud infrastructure
- Ensure data quality and integrity throughout processing stages
- Document solutions and share knowledge with team members
- Stay current with emerging GCP technologies relevant to data engineering
- Support continuous improvement of data processes and tools
- Participate in code reviews and maintain coding standards
- Assist in planning and scoping of new data projects
- Provide technical guidance to junior team members
Requirements
- Proven experience as a Data Engineer with at least 3 years in software development
- Strong knowledge of Google Cloud Platform, including BigQuery and Cloud Run
- Competency in Python programming for data processing tasks
- Experience with infrastructure as code, preferably Terraform
- Background in designing and implementing ETL pipelines
- Understanding of data modeling concepts in big data environments
- Ability to quickly learn and apply new technologies such as Dataproc, Dataflow, and Composer
- Familiarity with cloud-based data processing and orchestration tools
- Good problem-solving skills and attention to detail
- Strong written and verbal English communication skills (B2+)
Nice to have
- Experience with additional GCP services such as Dataproc, Dataflow, and Composer
- Hands-on practice with container orchestration and serverless deployments
- Knowledge of other infrastructure as code tools
- Understanding of scalable data architectures
- Certifications in Google Cloud technologies
We offer
We believe that the greatest strength of the company is its people. EPAM is fully committed to helping its employees reach their full potential and achieve their professional goals through continuous learning. With this in mind, we would like to introduce a few of the many opportunities and services that we believe will help you expand your current knowledge:
- Full access to cutting-edge tools and technologies
- Competitive compensation depending on experience and skills
- All-around social package: professional & soft skills training, medical & family care programs, sports
- Relocation opportunities
- Free English classes
- Unlimited access to LinkedIn Learning solutions
- Continuous experience exchange with experts and professionals worldwide
- Friendly team and comfortable working environment
- Engineering, corporate, and social events within and outside the Company
- Flexible working schedule
- Opportunities for self-realization