At a Glance
- Tasks: Lead the Data Engineering team to build scalable data pipelines and collaborate with data scientists.
- Company: Join easyJet, the UK's largest airline, connecting millions across Europe with a friendly service.
- Benefits: Enjoy competitive salary, bonus, 25 days holiday, flexible benefits, and excellent staff travel perks.
- Why this job: Be part of an exciting data transformation in a fast-paced environment with a supportive culture.
- Qualifications: Significant experience in data engineering, Python, SQL, and cloud-based systems required.
- Other info: Embrace individuality and inclusivity while working in a dynamic team atmosphere.
The predicted salary is between £43,200 and £72,000 per year.
When it comes to innovation and achievement there are few organisations with a better track record. Join us and you’ll play a big part in a highly successful, fast-paced business that opens up Europe so that people can exercise their get-up-and-go. With almost 300 aircraft flying over 1,000 routes to more than 32 countries, we’re the UK’s largest airline, the fourth largest in Europe and the tenth largest in the world. Set to fly more than 90 million passengers this year, we employ over 10,000 people. It’s big-scale stuff and we’re still growing.
With a big investment in Databricks, and with a large amount of interesting data, this is your chance to come and be part of an exciting transformation in the way we store, analyse and use data in a fast-paced organisation. You will join as a Senior Data Platform Engineer providing technical leadership to the Data Engineering team. You will work closely with our Data Scientists and business stakeholders to ensure value is delivered through our solutions.
Job Accountabilities
- Develop robust, scalable data pipelines to serve the easyJet analyst and data science community.
- Apply highly competent, hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, the Spark API, Python, SQL Server and Scala.
- Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing specific business challenges and opportunities.
- Coach and mentor the team (including contractors) to improve development standards.
- Work with Business Analysts to deliver against requirements and realise business benefits.
- Build a documentation library and data catalogue for developed code/products.
- Oversee project deliverables and code quality going into each release.
Key Skills Required
- Technical Ability: has a high level of current technical competence in relevant technologies and is able to independently learn new technologies and techniques as our stack changes.
- Clear communication: can communicate effectively in both written and verbal forms with technical and non-technical audiences alike.
- Complex problem-solving ability: structured, organised, process-driven and outcome-oriented. Able to draw on past experience to inform future innovations.
- Passionate about data: enjoy being hands-on and learning about new technologies, particularly in the data field.
- Self-directed and independent: able to take general guidance and the overarching data strategy and identify practical steps to take.
Technical Skills Required
- Significant experience designing and building data solutions on a cloud-based, big data distributed system.
- Hands-on software development experience with Python and experience with modern software development and release engineering practices (e.g. TDD, CI/CD), and software deployment automation with GitHub Actions or Azure DevOps.
- Experience in testing automation of data transformation pipelines, using frameworks like pytest or dbt unit tests.
- Comfortable writing and debugging efficient SQL.
- Experience with data warehouse operations and tuning, including schema evolution, indexing and partitioning.
- Hands-on IaC development experience with Terraform or CloudFormation.
- Understanding of ML development workflow and knowledge of when and how to use dedicated hardware.
- Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Hadoop, Beam).
- Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture.
- Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez, and data drift detection and alerting.
- Understanding of Data Management principles (security and data privacy) and how they can be applied to Data Engineering processes/solutions, e.g. access management and the handling of sensitive data under GDPR.
Desirable Skills
- Experience in event-driven architecture, ingesting data in real time in a commercial production environment with Spark Streaming, Kafka, DLT or Beam.
- Understanding of the challenges faced in the design and development of a streaming data pipeline and the different options for processing unbounded data (pub/sub, message queues, event streaming, etc.).
- Understanding of the most commonly used Data Science and Machine Learning models, libraries and frameworks.
- Knowledge of the development lifecycle of analytical solutions using visualisation tools (e.g. Tableau, Power BI, ThoughtSpot).
- Hands-on development experience in an airline, e-commerce or retail industry.
- Experience working within the AWS cloud ecosystem.
- Experience of building a data transformation framework with dbt.
What you’ll get in return
- Competitive base salary
- Up to 20% bonus
- 25 days holiday
- BAYE, SAYE & Performance share schemes
- 7% pension
- Life Insurance
- Work Away Scheme
- Flexible benefits package
- Excellent staff travel benefits
At easyJet our aim is to make low-cost travel easy – connecting people to what they value using Europe’s best airline network, great value fares, and friendly service. It takes a real team effort to carry over 90 million passengers a year across 35 countries. Whether you’re working as part of our front-line operations or in our corporate functions, you’ll find people that are positive, inclusive, ready to take on a challenge, and that have your back. We call that our ‘Orange Spirit’, and we hope you’ll share that too.
We encourage individuality, empower our people to seize the initiative, and never stop learning. We see people first and foremost for their performance and potential, and we are committed to building a diverse and inclusive organisation that supports the needs of all. As such, we will make reasonable adjustments for our candidates from interview through to employment.
Senior Data Platform Engineer employer: Job Traffic
Contact Detail:
Job Traffic Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Platform Engineer role
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as Databricks, Spark, and Python. Having hands-on experience or projects that showcase your skills in these areas will make you stand out during discussions.
✨Tip Number 2
Prepare to discuss your experience with data pipelines and cloud-based systems. Be ready to share examples of how you've designed and built scalable solutions, as this is a key responsibility for the role.
✨Tip Number 3
Highlight your ability to communicate complex technical concepts to non-technical stakeholders. This role requires clear communication, so think of instances where you've successfully bridged the gap between technical teams and business users.
✨Tip Number 4
Show your passion for data by discussing any personal projects or continuous learning efforts related to data engineering. This could include online courses, certifications, or contributions to open-source projects that demonstrate your commitment to staying current in the field.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience and skills that align with the job description. Focus on your hands-on experience with technologies like Databricks, Spark, and Python, as well as your ability to develop scalable data pipelines.
Craft a Compelling Cover Letter: In your cover letter, express your passion for data and innovation. Mention specific projects or experiences that demonstrate your problem-solving abilities and technical competence. Show how you can contribute to the team at easyJet.
Highlight Team Collaboration: Emphasise your experience working with cross-functional teams, such as data scientists and business analysts. Provide examples of how you've successfully collaborated to deliver data solutions that meet business needs.
Showcase Continuous Learning: Mention any recent courses, certifications, or self-directed learning related to data engineering and cloud technologies. This demonstrates your commitment to staying current in a fast-evolving field and aligns with easyJet's values of empowerment and continuous improvement.
How to prepare for a job interview at Job Traffic
✨Showcase Your Technical Skills
Be prepared to discuss your hands-on experience with relevant technologies like Databricks, Spark, and Python. Highlight specific projects where you've designed and built data solutions, as this will demonstrate your technical competence and ability to contribute to the team.
✨Communicate Clearly
Since the role involves working with both technical and non-technical stakeholders, practice explaining complex concepts in simple terms. This will show that you can bridge the gap between different teams and ensure everyone is on the same page.
✨Demonstrate Problem-Solving Skills
Prepare examples of complex problems you've solved in previous roles. Use the STAR method (Situation, Task, Action, Result) to structure your answers, showcasing your structured and outcome-oriented approach to challenges.
✨Express Your Passion for Data
Share your enthusiasm for data and any recent technologies you've explored. Discuss how you stay updated with industry trends and your eagerness to learn, as this aligns with the company's culture of continuous improvement and innovation.