Wroclaw, PL
Description
Ryanair Labs is currently recruiting for a Data Engineer to join Europe’s Largest Airline Group!
Ryanair Labs is the technology brand of Ryanair. Labs is a state-of-the-art digital & IT innovation hub creating Europe's Leading Travel Experience for our customers. The Ryanair platform has over 1 billion visits per year. By joining Ryanair, you will develop cutting-edge tech solutions inside Ryanair, transforming aviation for Pilots, Cabin Crew & Ground Ops, as well as driving the tech experience for our customers on Europe’s largest travel website!
The Role
We are looking for a Data Engineer to join a multi-disciplined team working alongside Developers, Designers and Product Owners. This is a hands-on technical role focused on building and maintaining our data platform, working closely with other engineers to deliver reliable data products.
We are continually gathering information on travel-related events and are looking for someone to help us make the most of our cloud-based data systems. The role offers opportunities to work across various business areas including commercial, marketing, engineering, logistics and more, depending on business needs and your own initiative.
Key Responsibilities
- Design and develop automated ELT processes ensuring a high level of data quality and reliability, using tools such as Spark, Apache Airflow, dbt, Python, Databricks, etc.
- Build and maintain batch and real-time data processing using Apache Kafka and AWS services like SNS, SQS, and Kinesis, supporting microservice architectures.
- Maintain the data lake in AWS and Databricks, building well-governed, query-oriented data models for efficient consumption.
- Implement monitoring and observability for the pipelines you work on.
- Collaborate cross-functionally, assisting other teams in identifying, collecting, and processing valuable information for diverse business use cases.
- Support the Data Science team by helping ensure stable, scalable environments for model deployment and experimentation.
- Build and maintain CI/CD pipelines, following best practices for automated deployment and consistency.
- Help administer and maintain deployed infrastructure.
- Follow and contribute to engineering standards and best practices across the Data Engineering team.
- Deliver work on time, managing your own priorities in line with team commitments.
Our Tech Stack
- Batch: dbt on Databricks, Python on AWS Batch, Python on AWS Lambda
- Containerization: Docker, AWS ECR
- Orchestration: Apache Airflow
- Event-driven: SNS, SQS, DynamoDB, Lambda, Kinesis, Kafka Streams
- Data Warehouse: Databricks
- Monitoring: Grafana, New Relic
- IaC: CloudFormation, Terraform
- CI/CD: Bitbucket, CodeStar, CodeBuild, CodePipeline
- Programming: Python, Scala
Requirements
Skills and Experience
- 3+ years in data engineering or related data processing roles.
- Strong experience working with varied data formats and sources (JSON, CSV, Parquet, APIs, multiple DB engines).
- Hands-on experience with data modelling in Big Data platforms (e.g. Databricks, Glue, Snowflake).
- Familiarity with event-driven architectures and AWS.
- At least 2 years of hands-on experience with AWS, especially:
– Core: IAM, S3, EC2, VPC
– Data Services: EMR, Lambda, Batch, SNS, SQS, DynamoDB, Glue, Athena
- Ability to set up and maintain monitoring, alerting, and debugging solutions for your pipelines.
- Experience collaborating with Data Science teams to support production-grade pipelines.
Knowledge
- Solid understanding of computing systems (OS, memory, networks, etc.).
- Proficient in data analysis and interpretation, with a good grasp of statistics and data quality methods.
- Strong SQL and performance tuning skills.
- Good knowledge of Big Data technologies, Spark, and object storage systems.
- Proficiency in Python (and optionally Scala), with strong coding and testing practices.
- Familiarity with Docker; Kubernetes experience is a plus.
- Experience with Airflow (or a similar orchestrator).
- Good understanding of IaC and CI/CD principles.
- Experience building dashboards and communicating insights through data visualization.
Other
- Strong communication skills.
- Proactive mindset and collaborative team spirit.
- Comfortable working within an agile team setup.
Benefits
Forms of employment
Contract of employment (permanent contract after trial period)
- Possible hybrid model (2 days from the office weekly)
- Option to take part in training courses and conferences
- Staff travel benefits from day one
- Creative work tax deduction
- Multisport card
- Private health care
- Group insurance scheme
or

B2B
Other benefits:
- Possibility of taking part in training courses and certifications
- Great chance to meet your colleagues in other offices
- Annual events (e.g. St. Patrick’s Day 🍀)
- Regular social meetings 🍻
- Paid referral system
- New office building surrounded by great eateries right in the city centre 🌆
Apply today to discuss the role in more detail!