At a Glance
- Tasks: Build and support telemetry pipelines using modern tools like Kafka and OpenShift.
- Company: Forward-thinking tech company in Leeds with a focus on innovation.
- Benefits: Competitive daily rate, contract flexibility, and opportunities for impactful work.
- Why this job: Make a real impact on platform reliability while tackling complex engineering challenges.
- Qualifications: Hands-on experience with Kafka, OpenShift, and strong Python skills.
- Other info: Collaborative environment with opportunities for professional growth.
The predicted salary is between £30,000 and £45,000 per year.
Are you ready to take your Data Engineering skills to the next level? Elevation Recruitment Group is working with a forward-thinking tech company in Leeds that is looking for a Data Engineer to help develop and scale large telemetry pipelines across OpenShift and Kafka. This is your chance to work with modern tools, solve meaningful engineering problems, and make a real impact on platform reliability and insight.
What you'll be doing:
- Building and supporting telemetry pipelines for metrics, logs, and traces
- Developing Kafka producers and consumers, managing schemas and topics
- Contributing to resilient streaming and enrichment services
- Integrating telemetry into Splunk for dashboards and alerting
- Implementing OpenTelemetry for tracing, metrics, and structured logging
- Helping shape schema governance with Avro/Protobuf
- Collaborating with Platform & SRE teams to improve observability and SLOs
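To give a flavour of the day-to-day work described above, here is a minimal, illustrative Python sketch of a validation-and-enrichment step like those found in telemetry pipelines. This is not the employer's actual codebase: the field names and environment label are hypothetical, and a production version would enforce schemas via Avro/Protobuf and a schema registry rather than a hard-coded field set.

```python
import json
from datetime import datetime, timezone

# Hypothetical required fields for a telemetry event; in practice these
# would be governed by an Avro/Protobuf schema in a schema registry.
REQUIRED_FIELDS = {"service", "level", "message"}

def enrich_event(raw: str, environment: str = "dev") -> dict:
    """Validate a raw JSON telemetry event and enrich it with metadata
    before it is forwarded downstream (e.g. to Splunk)."""
    event = json.loads(raw)
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event missing required fields: {sorted(missing)}")
    # Enrichment: stamp ingestion time and deployment environment.
    event["ingested_at"] = datetime.now(timezone.utc).isoformat()
    event["environment"] = environment
    return event

enriched = enrich_event(
    '{"service": "checkout", "level": "INFO", "message": "order placed"}'
)
print(enriched["environment"])  # dev
```

In a real pipeline this function would sit between a Kafka consumer and a producer (or a Splunk HTTP Event Collector sink), with the validation failure routed to a dead-letter topic rather than raised.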
Key Skills & Experience:
- Hands-on Kafka experience (producers/consumers, schema registry)
- Knowledge of OpenShift/Kubernetes telemetry (OTel, Prometheus)
- Experience sending data into Splunk
- Strong Python skills for data processing and validation
- Curious, collaborative, and eager to tackle complex observability challenges
If you're a Data Engineer ready to make an impact, apply today and let's get you on the path to challenging, rewarding work.
Data Engineer in Leeds, employer: Parkinson Lee
Contact Detail:
Parkinson Lee Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Leeds
✨Tip Number 1
Network like a pro! Reach out to your connections in the tech industry, especially those who work with data engineering. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Kafka and OpenShift. This is your chance to demonstrate your hands-on experience and problem-solving abilities in a way that a CV just can't.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering challenges. Be ready to discuss how you've tackled similar issues in the past, especially around telemetry pipelines and observability. Confidence is key!
✨Tip Number 4
Don't forget to apply through our website! We make it easy for you to find roles that match your skills and interests. Plus, it shows you're serious about joining our team and making an impact in the data engineering space.
We think you need these skills to ace the Data Engineer role in Leeds
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your hands-on experience with Kafka and OpenShift. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about this Data Engineer position and how you can contribute to our team. Keep it concise but impactful!
Showcase Your Problem-Solving Skills: In your application, mention specific challenges you've tackled in data engineering. We love hearing about how you’ve solved complex problems, especially those related to observability and telemetry.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Parkinson Lee
✨Know Your Tech Stack
Make sure you brush up on your Kafka and OpenShift knowledge before the interview. Be ready to discuss your hands-on experience with producers, consumers, and schema management. This will show that you're not just familiar with the tools but can also apply them effectively.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've tackled complex engineering problems in the past. Think about challenges related to telemetry pipelines or observability issues you've faced and how you resolved them. This will demonstrate your ability to think critically and work collaboratively.
✨Understand the Company’s Goals
Research the tech company and understand their mission and values. Knowing how they approach platform reliability and insight will help you align your answers with their goals. It shows genuine interest and helps you articulate how you can contribute to their success.
✨Ask Insightful Questions
Prepare a few thoughtful questions to ask at the end of the interview. Inquire about their current projects involving telemetry or how they measure SLOs. This not only shows your enthusiasm for the role but also gives you a better understanding of what to expect if you join their team.