Data Engineer (maternity cover)
Warszawa, PL, 00-841
Important things for you:
- Flexible working hours in the hybrid model (4/1) - working hours start between 7:00 a.m. and 10:00 a.m. We also offer 30 days of occasional remote work.
- The salary range for this position, depending on your skill set, is PLN 14 200 - 20 200 (contract of employment, tax-deductible costs).
- Our team is based in Warsaw.
About the job:
Delivery Experience is one of the fastest-growing areas of Allegro, where we undertake new, complex projects to enhance logistics and warehousing processes. We are building technology that makes Allegro's deliveries easy, cost-effective, fast and predictable. Our team takes care of critical services along the Allegro shopping journey: predicting delivery times using statistical algorithms and machine learning, selecting the best delivery methods tailored to customers, and integrating with carrier companies.
In this position, you will help us grow our cutting-edge cloud analytical platforms, laying the groundwork for the future of logistics intelligence. If you are up for the challenge and would like to help build the future of logistics, make sure to apply!
We are looking for people who:
- Develop and maintain data ingestion and processing pipelines for large volumes of data using BigQuery, Airflow, and dbt.
- Design and streamline the data architecture that powers analytical products.
- Help drive the adoption of AI in analytics and unlock new possibilities on top of our data mesh.
- Develop tooling for monitoring and enhancing data quality and integrity.
- Manage and optimize costs related to data processing and infrastructure.
This is the right job for you if you have:
- At least 3 years of experience as a Data Engineer working with large datasets.
- Expert proficiency in SQL.
- A strong understanding of data modeling and cloud Data Warehouses (DWH).
- Hands-on experience with cloud warehouses such as BigQuery or Snowflake (BigQuery preferred).
- Experience in designing and maintaining ETL/ELT processes.
- Experience working with large datasets in Python, using data processing tools such as PySpark.
- Familiarity with data orchestration tools such as Airflow.
- A high degree of autonomy and a passion for translating complex business requirements into impactful data solutions.
- Familiarity with and adherence to good software engineering practices (clean code, code review, CI/CD).
- A proven ability to optimize the cost and efficiency of data processing.
- A demonstrated ability to leverage AI/ML tooling to enhance data pipelines and analytical products.
- English proficiency at B2 level or higher.
What’s in it for you:
- Well-located offices (with, e.g., fully equipped kitchens, bicycle parking, and terraces full of greenery) and excellent work tools (e.g., height-adjustable desks, ergonomic chairs, and interactive conference rooms).
- A 16" or 14" MacBook Pro and all the necessary accessories.
- A wide selection of fringe benefits in a cafeteria plan - you choose what you like (e.g., medical, sports or lunch packages, insurance, purchase vouchers).
- English classes that we pay for, tailored to the specific nature of your job.
- A training budget, inter-team tourism (see more here), hackathons, and an internal learning platform with multiple trainings.
- An additional day off for volunteering, which you can use alone, with a team, or with a larger group of people connected by a common goal.
- Social events for Allegro people - Spin Kilometers, Family Day, Fat Thursday, Advent of Code, and many other occasions we enjoy. And that's just the beginning! You can read more about the benefits here.
#goodtobehere means that:
- You will join a team you can count on - we work with top-class specialists who have knowledge- and experience-sharing in their DNA.
- You will love our level of autonomy in team organization, the space for continuous development, and the opportunity to try new things. You get to choose which technology solves the problem, and you are responsible for what you create.
- You will value our Developer Experience and the full platform of tools and technologies that make creating software easier. We rely on an internal ecosystem based on self-service and widely used tools such as Kubernetes, Docker, Consul, GitHub, and GitHub Actions. Thanks to this, you can contribute to Allegro from your very first days on the job.
- You will be equipped with modern AI tools to automate repetitive tasks, allowing you to focus on developing new services and refining existing ones (also leveraging AI support).
- You will create solutions that will be used (and loved!) by your friends, family, and millions of our customers.
- You will meet the Allegro Scale, which starts with over 1,000 microservices, an open-source data bus (Hermes) handling 300K+ rps, a Service Mesh handling 1M+ rps, tens of petabytes of data, and machine learning in production.
- You will become part of Allegro Tech - we speak at industry conferences, cooperate with tech communities, run our own blog (for over 10 years now!), record podcasts, lead guilds, and organize our own internal conference, the Allegro Tech Meeting. We create solutions we love (and are able!) to talk about.
Send us your CV and… see you at Allegro!