At a Glance
- Tasks: Build and maintain data pipelines that power Bauer’s audio business across Europe.
- Company: Join Bauer Media Audio's innovative Data & Decision Sciences team.
- Benefits: Hybrid work model, competitive salary, and opportunities for professional growth.
- Why this job: Make a real impact on millions of listeners with your data engineering skills.
- Qualifications: Strong Python and SQL skills, plus 2+ years in data/analytics engineering roles.
- Other info: Collaborative environment with mentorship opportunities and a focus on learning.
The predicted salary is between £36,000 and £60,000 per year.
Our Team: How we enrich everyday life
You’ll be joining Bauer Media Audio’s Data & Decision Sciences (DDS) team—a collaborative, cross-functional unit at the heart of our business. Our mission is to leverage data as a strategic enabler across nine European markets, delivering trusted insights and robust data solutions that drive growth, enhance audience engagement, and improve operational efficiency. We work closely with stakeholders across all domains, combining centralized capabilities with localized expertise to ensure data delivers real business value.
The Difference you will make
As a Mid-Level Data Engineer, you’ll play a key role in building and maintaining data pipelines that power Bauer’s audio business across multiple markets. You’ll work within our Joint Capability Team (JCT) to implement the CUBE architecture, develop ETL/ELT processes, and support critical business initiatives. Your work will enable meaningful dashboards, analytics, and data-driven decisions that impact millions of listeners.
Your role
You will design, build, and maintain reliable data pipelines and collaborate with analytics engineers and business stakeholders to deliver impactful solutions. Responsibilities include but are not limited to:
- Build and maintain data pipelines moving data from source systems into S3 and curated layers using Python and Airflow.
- Deliver new ingestion and transformation pipelines for Snowflake while supporting Redshift and BigQuery workloads.
- Develop and maintain dbt models and tests to support BI and analytics use cases.
- Implement and enhance data quality checks and monitoring in Airflow.
- Collaborate with senior engineers on ingestion patterns and migration approaches.
- Participate in Scrum ceremonies, manage work through Jira, and maintain documentation in Confluence.
- Use GitHub for branching, pull requests, and code reviews.
- Engage with business stakeholders to ensure pipelines enable meaningful analytics.
- Support and mentor junior engineers, sharing best practices.
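To make the pipeline responsibilities above concrete, here is a minimal extract-validate-transform sketch in plain Python. All names (`extract_listens`, the station/listener fields, the 50% bad-row threshold) are invented for illustration; in a real deployment each function would typically run as an Airflow task, read from an actual source system, and write to S3 or Snowflake rather than in-memory structures.

```python
# Illustrative extract -> quality check -> transform sketch.
# Every source name and field here is hypothetical; a production pipeline
# would wrap each step in an Airflow task and persist to S3/Snowflake.

def extract_listens():
    """Stand-in for a source-system extract: returns raw listen events."""
    return [
        {"listener_id": 1, "station": "KISS", "seconds": 310},
        {"listener_id": 2, "station": "Absolute", "seconds": None},  # bad row
        {"listener_id": 3, "station": "KISS", "seconds": 95},
    ]

def check_quality(rows):
    """Simple data-quality gate: drop rows with a missing duration and
    fail loudly if too many rows are bad (the same idea as a dbt test
    or an Airflow quality-check task)."""
    good = [r for r in rows if r["seconds"] is not None]
    bad_ratio = 1 - len(good) / len(rows)
    if bad_ratio > 0.5:  # threshold chosen arbitrarily for the sketch
        raise ValueError(f"Too many bad rows: {bad_ratio:.0%}")
    return good

def transform(rows):
    """Curate: aggregate total listening seconds per station."""
    totals = {}
    for row in rows:
        totals[row["station"]] = totals.get(row["station"], 0) + row["seconds"]
    return totals

if __name__ == "__main__":
    curated = transform(check_quality(extract_listens()))
    print(curated)  # {'KISS': 405}
```

The separation into small, testable functions mirrors how Airflow DAGs and dbt models are usually structured: each step does one thing, and the quality gate sits between ingestion and the curated layer so bad data never reaches dashboards.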
The Skills you will bring
Technical must-haves:
- Strong Python and SQL skills.
- Experience with Airflow for workflow orchestration.
- Proven track record delivering end-to-end data pipelines in a cloud environment.
- Hands-on experience with dbt and AWS.
- Familiarity with Git-based workflows and CI/CD practices.
- Experience implementing data quality checks and monitoring.
- 2+ years in data/analytics engineering roles.
Nice-to-haves:
- Experience with Snowflake, Great Expectations, or dbt-expectations.
- Exposure to GCP or Azure.
- Familiarity with Terraform and semantic layers.
Behavioural:
- Clear communicator who engages with technical and business stakeholders.
- Ownership mindset—focused on outcomes, not just tasks.
- Collaborative team player with a documentation-first approach.
- Adaptable and curious, eager to learn new tools and approaches.
Working Pattern/Location
Hybrid role based in London.
Employer: Bauer Media Group
Contact: Bauer Media Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in London
✨Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who works at Bauer Media Audio. Building relationships can open doors that a CV just can't.
✨Show Off Your Skills
When you get the chance to chat with potential employers, don’t hold back! Share your experiences with Python, SQL, and any cool projects you've worked on. Let them see how you can bring value to their data pipelines.
✨Ask Smart Questions
During interviews, come prepared with questions that show you understand their business and the role. Ask about their data challenges or how they use analytics to drive decisions. This shows you're genuinely interested and engaged!
✨Apply Through Our Website
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re serious about joining the team at Bauer Media Audio.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that align with the Data Engineer role. Highlight your Python, SQL, and Airflow expertise, and don’t forget to mention any cloud experience you have!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our mission at Bauer Media Audio. Be specific about your past projects and how they relate to the responsibilities listed in the job description.
Showcase Your Projects: If you've worked on relevant projects, whether in a professional or personal capacity, make sure to include them. We love seeing real-world applications of your skills, especially those involving data pipelines and analytics.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows you’re keen on joining our team!
How to prepare for a job interview at Bauer Media Group
✨Know Your Tech Inside Out
Make sure you brush up on your Python and SQL skills before the interview. Be ready to discuss your experience with Airflow, dbt, and cloud environments like AWS. They’ll likely ask you to explain how you've built and maintained data pipelines in the past, so have some specific examples at the ready.
✨Showcase Your Collaboration Skills
Since this role involves working closely with analytics engineers and business stakeholders, be prepared to talk about your teamwork experiences. Share examples of how you've engaged with others to deliver impactful solutions and how you’ve mentored junior engineers. This will highlight your collaborative spirit and ownership mindset.
✨Demonstrate Your Problem-Solving Abilities
Expect questions that assess your ability to tackle challenges, especially around data quality checks and monitoring. Think of scenarios where you identified issues in data pipelines and how you resolved them. This will show that you’re adaptable and curious, which is exactly what they’re looking for.
✨Familiarise Yourself with Their Tools
Before the interview, take some time to learn about the tools mentioned in the job description, like Snowflake, Redshift, and BigQuery. If you have any experience with Terraform or semantic layers, be sure to mention it. Showing that you’ve done your homework will impress them and demonstrate your genuine interest in the role.