At a Glance
- Tasks: Build and maintain scalable data pipelines using Python and SQL in a modern cloud environment.
- Company: Join Future, a leading digital media and ecommerce business with a collaborative culture.
- Benefits: Enjoy unlimited annual leave, a bonus scheme, and hybrid working options.
- Other info: Opportunity for career growth and influence in a dynamic tech environment.
- Why this job: Make a real impact on engineering standards and platform direction in a large-scale data environment.
- Qualifications: Strong hands-on Python and SQL experience, with exposure to orchestration tools like Airflow.
The predicted salary is between €50,000 and €60,000 per year.
We’re partnered with Future, a leading digital media and ecommerce business behind a portfolio of well-known consumer and B2B brands, including Go.Compare, as they continue investing heavily in their modern data platform. They are looking for a hands-on Senior Data Engineer to help drive the next phase of their cloud data journey, supporting the migration of legacy workloads into a modern GCP environment while building scalable, reliable data pipelines across the business. This is a strong opportunity for someone who enjoys ownership, solving engineering problems, and working in an environment where you can genuinely influence how things are built.
What you’ll be doing:
- Building and maintaining scalable data pipelines using Python and SQL
- Supporting the migration of legacy SQL workloads into GCP
- Working across cloud-native tooling and modern orchestration environments
- Improving existing pipelines, automation and platform reliability
- Collaborating directly with stakeholders, analysts and engineers across the business
- Taking ownership of delivery from concept through to production
Tech environment:
- Python
- GCP
- BigQuery
- Cloud Functions / Cloud Run / Dataflow
- Airflow / Composer
- SQL
- dbt / Dataform
- CI/CD
What they’re looking for:
- Strong hands-on Python experience within data engineering environments
- Experience building cloud-based data pipelines
- Strong SQL skills
- Exposure to orchestration tooling such as Airflow or Composer
- Someone comfortable working autonomously and owning delivery
- Strong communication and stakeholder skills
Experience with AWS is also welcome if you have worked in similar modern cloud environments.
Why join?
- Large-scale data environment with ongoing investment
- Opportunity to influence engineering standards and platform direction
- Modern cloud stack
- Strong collaborative culture
- Unlimited annual leave
- Bonus scheme
- Hybrid working
Please apply for immediate consideration or reach out directly for a confidential discussion.
Senior Data Engineer employer: Sanderson
Future is an exceptional employer, offering a dynamic work environment in Bristol where innovation thrives. With a strong emphasis on employee growth, you will have the opportunity to influence engineering standards and contribute to a modern cloud data platform, while enjoying benefits such as unlimited annual leave and a collaborative culture that values your input.
StudySmarter Expert Advice🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to people in the industry, especially those at Future or similar companies. A friendly chat can sometimes lead to job opportunities that aren’t even advertised.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data engineering projects, especially those involving Python and GCP. This gives you a chance to demonstrate your hands-on experience and problem-solving abilities.
✨Tip Number 3
Prepare for interviews by brushing up on your SQL and cloud pipeline knowledge. Be ready to discuss how you've tackled challenges in past projects, as this will show your ownership and engineering mindset.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who are proactive about their job search.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Senior Data Engineer role. Highlight your hands-on Python experience and any cloud-based data pipeline projects you've worked on. We want to see how you can contribute to our modern data platform!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our needs. Don’t forget to mention your experience with GCP and orchestration tools like Airflow or Composer.
Showcase Your Problem-Solving Skills: In your application, share specific examples of how you've tackled engineering challenges in the past. We love candidates who take ownership and can demonstrate their ability to improve existing pipelines and platform reliability.
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensure it reaches the right people!
How to prepare for a job interview at Sanderson
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Python, SQL, and GCP. Brush up on your experience with data pipelines and orchestration tools like Airflow or Composer, as these will likely come up during the interview.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific engineering problems you've solved in the past. Think of examples where you took ownership of a project from concept to production, as this aligns perfectly with what they’re looking for in a Senior Data Engineer.
✨Communicate Effectively
Since strong communication skills are key, practice explaining complex technical concepts in simple terms. Be ready to discuss how you’ve collaborated with stakeholders and other teams, as this will demonstrate your ability to work in a collaborative culture.
✨Ask Insightful Questions
Prepare thoughtful questions about their data platform and future projects. This shows your genuine interest in the role and helps you understand how you can contribute to their cloud data journey.