Junior Data Engineer (maternity cover)
Warszawa, PL, 00-841
Important things for you:
- Flexible working hours in the hybrid model (4/1) - working hours start between 7:00 a.m. and 10:00 a.m. We also have 30 days of occasional remote work.
- The salary range for this position, depending on your skill set, is PLN 10 700 - 14 900 (contract of employment, tax-deductible costs)
- Our team is based in Warsaw.
About the job:
Allegro Pay is the largest fintech in Central Europe – we are growing fast and need engineers who want to learn and develop while solving problems related to serving thousands of requests per second. If, like us, you like flexing your mental muscles to solve complex problems and you would be happy to co-create the infrastructure which underpins our solutions, make sure you apply!
In this role, you will be a contributor, helping us expand our modern cloud-based analytical solutions. We embrace challenging and interesting projects and take quality very seriously. Depending on your preference, your position may be more business-oriented or platform-oriented.
What you will do:
- You will be actively responsible for developing and maintaining processes that handle large volumes of data, using state-of-the-art tooling such as Snowflake, Airflow, and DuckDB.
- You will help drive adoption of AI in analytics, unlocking new possibilities on top of our data mesh. We build and deploy agents for our daily work and to support our products.
- You will streamline and develop the data architecture that powers our analytical products, working alongside a team of experienced analysts.
- You will be developing tooling for monitoring and enhancing the quality and integrity of the data.
- You will manage and optimize costs related to our infrastructure and data processing.
This is the right job for you if:
- You have at least 1 year of experience as a Data Engineer working with large datasets.
- You are proficient in SQL and have a strong understanding of data modeling and cloud DWH architecture.
- You have experience with cloud data warehouses such as Snowflake or BigQuery.
- You have worked with large datasets in Python, using tools like PySpark or Airflow.
- You are familiar with containerization and CI/CD, ideally with GitHub Actions.
- You have a product mindset and you’re not afraid to translate business requirements into software and data solutions.
- You can confidently use AI tools in your daily work.
- You know English at a B2 level or higher.
What’s in it for you:
- Well-located offices (with e.g. fully equipped kitchens, bicycle parking, terraces full of greenery) and excellent work tools (e.g., raised desks, ergonomic chairs, interactive conference rooms).
- A 16" or 14" MacBook Pro or corresponding Dell with Windows (if you don't like Macs) and all the necessary accessories.
- A wide selection of fringe benefits in a cafeteria plan - you choose what you like (e.g., medical, sports or lunch packages, insurance, purchase vouchers).
- English classes, paid for by us and tailored to the specific nature of your job.
- A training budget, inter-team tourism (see more here), hackathons, and an internal learning platform offering a wide range of training courses.
- An additional day off for volunteering, which you can use alone, with a team, or with a larger group of people connected by a common goal.
- Social events for Allegro people - Spin Kilometers, Family Day, Fat Thursday, Advent of Code, and many other occasions we enjoy.
And that's just the beginning! You can read more about the benefits here.
#goodtobehere means that:
- You will join a team you can count on - we work with top-class specialists who have knowledge- and experience-sharing in their DNA.
- You will love our level of autonomy in team organization, the space for continuous development, and the opportunity to try new things. You get to choose which technology solves the problem, and you are responsible for what you create.
- You will value our Developer Experience and the full platform of tools and technologies that make creating software easier. We rely on an internal ecosystem based on self-service and widely used tools such as Kubernetes, Docker, Consul, GitHub, and GitHub Actions. Thanks to this, you can contribute to Allegro from your very first days on the job.
- You will be equipped with modern AI tools to automate repetitive tasks, allowing you to focus on developing new services and refining existing ones (also leveraging AI support).
- You will create solutions that will be used (and loved!) by your friends, family and millions of our customers.
- You will meet the Allegro scale, which starts with over 1,000 microservices, an open-source data bus (Hermes) handling 300k+ RPS, a Service Mesh handling 1M+ RPS, tens of petabytes of data, and machine learning used in production.
- You will become part of Allegro Tech - we speak at industry conferences, cooperate with tech communities, run our own blog (for over 10 years now!), record podcasts, lead guilds, and organize our own internal conference - the Allegro Tech Meeting. We create solutions we love (and are allowed) to talk about!
Send us your CV and… see you at Allegro!