At a Glance
- Tasks: Design and implement data pipelines in AWS for airline customer and revenue products.
- Company: Join a leading company transforming the airline industry with data-driven solutions.
- Benefits: Attractive salary, flexible working options, and opportunities for professional growth.
- Why this job: Make a significant impact by building scalable data solutions in a dynamic environment.
- Qualifications: 7+ years in data engineering, expert in Python and SQL, AWS experience required.
- Other info: Exciting projects with potential for career advancement in a fast-paced industry.
The predicted salary is between £54,000 and £84,000 per year.
Our client seeks Senior Data Engineers to design and implement production-grade data pipelines in an AWS lakehouse supporting Customer, Revenue, and Operations initiatives.
Responsibilities:
- Build scalable data ingestion pipelines from external sources
- Implement CDC patterns for operational system integration
- Develop PySpark transformations for Customer and Revenue products
- Optimize existing Redshift/Matillion workloads
- Create automated data quality monitoring and alerting
- Support Revenue production deployment
Must Haves:
- 7+ years building production data pipelines in cloud environments
- Expert-level Python and SQL for data transformation
- Hands-on experience with AWS Glue, EMR, Lambda, and Step Functions
- Strong background in Spark/PySpark for large-scale data processing
- Experience building CDC pipelines and real-time data ingestion
- Track record delivering data products from ingestion to consumption
Nice to Haves:
- Airline industry experience with reservation, revenue, or operational data
- Infrastructure as Code (Terraform/CloudFormation)
- Experience with streaming platforms (Kafka, Kinesis)
Senior Data Engineer (AWS/Python) to build lakehouse pipelines and revenue data products in London. Employer: S.i. Systems
Contact Detail:
S.i. Systems Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Senior Data Engineer (AWS/Python) to build lakehouse pipelines and revenue data products in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with AWS and Python. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best projects, especially those involving lakehouse pipelines or revenue data products. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for technical interviews by brushing up on your Python and SQL skills. Practice coding challenges related to data transformation and pipeline building. We all know that confidence in your abilities can make a huge difference!
✨Tip Number 4
Don't forget to apply through our website! It's the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive and engaged in their job search.
We think you need these skills to ace Senior Data Engineer (AWS/Python) to build lakehouse pipelines and revenue data products in London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with AWS, Python, and data pipelines. We want to see how your skills match the job description, so don't be shy about showcasing your relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about building lakehouse pipelines and how your background makes you the perfect fit for our team. Keep it engaging and personal!
Showcase Your Projects: If you've worked on any cool data projects, especially in the airline industry or using tools like PySpark, make sure to mention them. We love seeing real-world applications of your skills!
Apply Through Our Website: We encourage you to apply directly through our website. It's the best way for us to receive your application and ensures you don't miss out on any important updates from our team!
How to prepare for a job interview at S.i. Systems
✨Know Your Tech Inside Out
Make sure you're well-versed in AWS services like Glue, EMR, and Lambda. Brush up on your Python and SQL skills, especially around data transformation. Be ready to discuss specific projects where you've built production-grade data pipelines.
✨Showcase Your Problem-Solving Skills
Prepare to talk about challenges you've faced while building scalable data ingestion pipelines. Use the STAR method (Situation, Task, Action, Result) to structure your answers and highlight your experience with CDC patterns and real-time data ingestion.
✨Demonstrate Your Industry Knowledge
If you have experience in the airline industry, be sure to mention it! Discuss how your background with reservation, revenue, or operational data can add value to their team. If not, research the industry trends and challenges to show your interest.
✨Ask Insightful Questions
Prepare thoughtful questions about the company's data architecture and future projects. Inquire about their use of tools like Matillion and Redshift, and how they approach data quality monitoring. This shows your enthusiasm and helps you gauge if the role is a good fit for you.