Senior Data Engineer - MarAdTech Advertising
Warszawa, PL, 00-841
The salary range for this position (contract of employment) is 18 400 - 25 410 PLN gross.
We work in a hybrid model, with 1 day a week in the office.
We are looking for a Senior Data Engineer who will focus on data processing and preparation as well as the deployment and maintenance of our data projects. Join our team to deepen your skills in deploying data-driven processes and DataOps practices, and to share that knowledge within the team.
Your main responsibilities:
- You will develop and maintain processes for handling large volumes of data.
- You will streamline and develop the data architecture that powers our analytical products, working alongside a team of experienced analysts.
- You will monitor and enhance the quality and integrity of our data.
- You will manage and optimize costs related to our data infrastructure and data processing on GCP.
What we expect from you:
- You have at least 5 years of experience as a Data Engineer working with large datasets.
- You have experience with cloud providers (GCP preferred).
- You are highly proficient in SQL.
- You have a strong understanding of data modeling and cloud data warehouse (DWH) architecture.
- You have experience in designing and maintaining ETL/ELT processes.
- You are able to optimize the cost and efficiency of data processing.
- You are proficient in Python for working with large datasets (using PySpark or Airflow).
- You follow good engineering practices (clean code, code review, CI/CD).
- You have a high degree of autonomy and take responsibility for developed solutions.
- You have English proficiency at B2 level or higher.
- You like to share knowledge with other team members.
We offer:
- Big Data is not an empty slogan for us but a reality: you will work with truly large datasets (petabytes of data).
- You will have a real impact on the direction of product development and technology choices. We use the latest and best available technologies, selected according to our own needs.
- Our tech stack includes: GCP, BigQuery, (Py)Spark, Airflow.
- We are a close-knit team that works well together.
- You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge, including publicly through allegro.tech.