At a Glance
- Tasks: Design and maintain scalable data pipelines for telemetry and observability.
- Company: Dynamic tech company based in Sheffield with a focus on innovation.
- Benefits: Competitive daily rate, flexible contract, and opportunity for remote work.
- Why this job: Join a cutting-edge team and enhance platform reliability with your data engineering skills.
- Qualifications: Experience with Kafka and OpenShift, plus strong Python skills, required.
- Other info: 9+ month contract with excellent career growth potential.
9+ Month Contract based in Sheffield
£395 - £442 per day Inside IR35
BPSS clearance required - candidates must be eligible
My client is seeking a Data Engineer to design and operate large-scale telemetry and observability data pipelines within a modern OpenShift and Kafka ecosystem. This role is central to enabling proactive, Level 4 observability, delivering high-quality metrics, logs, and traces to support platform reliability, operational insight, and automation.
Responsibilities:
- Design, implement and maintain scalable data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces)
- Stream telemetry through Kafka (producers, topics, schemas) and build resilient consumer services for enrichment and transformation (see the consumer sketch after this list)
- Engineer multi-tenant observability data models, ensuring data lineage, quality controls and SLAs across streaming layers
- Integrate processed telemetry into Splunk for dashboards, analytics, alerting and operational insights
- Implement schema management and governance using Avro/Protobuf, including versioning and compatibility strategies
- Build automated validation, replay and backfill mechanisms to ensure data reliability and recovery
- Instrument services using OpenTelemetry, standardising tracing, metrics and structured logging (see the instrumentation sketch after this list)
- Apply LLMs to enhance observability workflows, for example query assistance, anomaly summarisation and runbook generation
- Collaborate with Platform, SRE and Application teams to align telemetry, alerts and SLOs
- Ensure pipelines meet security, compliance and best-practice standards
- Produce clear documentation covering data flows, schemas, dashboards and operational runbooks
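To illustrate the streaming and enrichment responsibilities above, a minimal Python consumer sketch is shown below; it assumes the confluent-kafka client, and the broker address, consumer group and topic name are hypothetical placeholders rather than details of the role.

```python
import json

from confluent_kafka import Consumer

# Hypothetical broker, consumer group and topic - illustrative only
consumer = Consumer({
    "bootstrap.servers": "kafka:9092",
    "group.id": "telemetry-enrichment",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["openshift.telemetry.metrics"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            # Log broker/partition errors instead of silently dropping them
            print(f"Consumer error: {msg.error()}")
            continue
        record = json.loads(msg.value())
        # Example enrichment: tag each record with its source cluster
        # before handing it to the next stage (e.g. Splunk ingestion)
        record.setdefault("cluster", "unknown")
        print(record)
finally:
    consumer.close()
```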
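Similarly, instrumenting a pipeline service with OpenTelemetry might look like the sketch below, assuming the opentelemetry-sdk package; the tracer, span and attribute names are illustrative.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Export spans to the console for demonstration; a real pipeline would
# send them to a collector instead
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("telemetry.pipeline")  # illustrative tracer name

# Wrap a processing step in a span and attach structured attributes
with tracer.start_as_current_span("enrich-metrics") as span:
    span.set_attribute("records.processed", 42)
```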
Skills & Experience:
- Strong hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect, KSQL/KStreams)
- Experience with OpenShift / Kubernetes telemetry, including OpenTelemetry and Prometheus
- Proven capability integrating telemetry into Splunk (HEC, Universal Forwarders, sourcetypes, CIM, dashboards, alerting) - a minimal HEC sketch follows this list
- Solid data engineering skills in Python (or similar) for ETL/ELT, enrichment and validation
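For reference, pushing processed telemetry into Splunk via the HTTP Event Collector generally has the shape below; the endpoint URL, token, sourcetype and index are placeholders, not details of the client's environment.

```python
import requests

# Placeholder HEC endpoint and token
SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

payload = {
    "event": {"metric": "cpu_usage", "value": 0.73},
    "sourcetype": "openshift:metrics",  # illustrative sourcetype
    "index": "telemetry",               # illustrative index
}

response = requests.post(
    SPLUNK_HEC_URL,
    json=payload,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    timeout=10,
)
response.raise_for_status()
```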
Please apply for an immediate interview!
Employer: CBSbutler Holdings Limited trading as CBSbutler
Contact Detail: CBSbutler Holdings Limited trading as CBSbutler Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land this Data Engineer contract in Sheffield
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the lookout for opportunities. You never know who might have a lead or can refer you to a hiring manager.
✨Tip Number 2
Get your hands dirty with some practical projects. Build a portfolio showcasing your skills in designing data pipelines, especially with Kafka and OpenShift. This will not only boost your confidence but also impress potential employers.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've tackled challenges in past projects, particularly around telemetry and observability. Practice makes perfect!
✨Tip Number 4
Don't forget to apply through our website! We’ve got loads of opportunities that might just be the perfect fit for you. Plus, it’s a great way to ensure your application gets seen by the right people.
Some tips for your application 🫡
Read the Job Description Carefully: Before you start your application, make sure to read through the job description thoroughly. We want to see that you understand the role and how your skills align with what we're looking for.
Tailor Your CV: Don’t just send a generic CV! Highlight your experience with data pipelines, Kafka, and OpenShift. We love seeing how your background fits the specific requirements of the Data Engineer role.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about this role and how your skills can help us achieve our goals. Keep it concise but impactful!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy!
How to prepare for a job interview at CBSbutler Holdings Limited trading as CBSbutler
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Kafka, OpenShift, and Python. Brush up on your experience with streaming data pipelines and be ready to discuss specific projects where you've implemented these technologies.
✨Showcase Your Problem-Solving Skills
Prepare to share examples of how you've tackled challenges in data engineering. Think about times when you had to ensure data reliability or implement schema management. Use the STAR method (Situation, Task, Action, Result) to structure your responses.
✨Understand the Business Impact
Be ready to explain how your work as a Data Engineer contributes to operational insights and platform reliability. This shows that you understand the bigger picture and can align your technical skills with business goals.
✨Ask Insightful Questions
Prepare thoughtful questions about the team’s current projects, challenges they face, and how they measure success. This not only demonstrates your interest in the role but also helps you gauge if the company is the right fit for you.