At a Glance
- Tasks: Design and maintain data pipelines for processing OpenShift telemetry at scale.
- Company: Tech-forward company focused on enhancing observability and delivering high-quality products.
- Benefits: Competitive benefits, supportive culture, and opportunities for professional development.
- Why this job: Join a dynamic team and make an impact in data engineering and observability.
- Qualifications: Experience with Kafka, Python, and strong problem-solving skills required.
- Other info: Collaborative environment with a focus on security and best practices.
The predicted salary is between £48,000 and £72,000 per year.
Requirements
- Must have:
  - Hands-on experience building streaming data pipelines with Kafka (producers/consumers, schema registry, Kafka Connect/ksqlDB/Kafka Streams)
  - Proficiency with OpenShift/Kubernetes telemetry (OpenTelemetry, Prometheus) and CLI tooling
  - Experience integrating telemetry into Splunk (HEC, UF, sourcetypes, CIM), building dashboards and alerting
  - Strong data engineering skills in Python (or similar) for ETL/ELT, enrichment, and validation
  - Knowledge of event schemas (Avro/Protobuf/JSON), contracts, and backward/forward compatibility
  - Familiarity with observability standards and practices; ability to drive toward Level 4 maturity (proactive monitoring, automated insights)
  - Understanding of hybrid cloud and multi-cluster telemetry patterns
  - Knowledge of security and compliance for data pipelines: secret management, RBAC, encryption in transit and at rest
  - Strong problem-solving skills and the ability to work in a collaborative team environment
  - Strong communication and documentation skills
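The Python ETL/enrichment/validation skills listed above can be illustrated with a minimal sketch. This is not from the posting itself: the field names (`cluster`, `message`, `timestamp`) and the enrichment metadata are assumptions chosen purely for illustration.

```python
from datetime import datetime, timezone

# Fields a raw telemetry event is assumed to require (illustrative only).
REQUIRED_FIELDS = {"timestamp", "cluster", "message"}

def validate_event(event: dict) -> list[str]:
    """Return a list of validation errors for a raw telemetry event."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - event.keys()]
    if "timestamp" in event:
        try:
            # Accept ISO 8601 timestamps; anything else is flagged.
            datetime.fromisoformat(event["timestamp"])
        except ValueError:
            errors.append("timestamp is not ISO 8601")
    return errors

def enrich_event(event: dict, environment: str) -> dict:
    """Add pipeline metadata without mutating the input event."""
    return {
        **event,
        "environment": environment,
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

raw = {
    "timestamp": "2024-05-01T12:00:00+00:00",
    "cluster": "ocp-prod",
    "message": "pod restarted",
}
assert validate_event(raw) == []
enriched = enrich_event(raw, "prod")
```

In a real pipeline these steps would sit inside a Kafka consumer, with invalid events routed to a dead-letter topic rather than dropped.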
Responsibilities
- Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale
- Stream OpenShift telemetry via Kafka (producers, topics, schemas) and build resilient consumer services for transformation and enrichment
- Engineer data models and routing for multi-tenant observability; ensure lineage, quality, and SLAs across the stream layer
- Integrate processed telemetry into Splunk for visualization, dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights)
- Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events
- Build automated validation, replay, and backfill mechanisms for data reliability and recovery
- Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms
- Use LLMs to enhance observability capabilities (e.g., query assistance, anomaly summarization, runbook generation)
- Collaborate with platform, SRE, and application teams to integrate telemetry, alerts, and SLOs
- Ensure security, compliance, and best practices for data pipelines and observability platforms
- Document data flows, schemas, dashboards, and operational runbooks
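One way to read the schema-management responsibility above: a consumer on the new schema version must still decode events written under the old one, which roughly means any field the new version adds needs a default. A toy check along those lines, using simplified JSON-style schema dicts (an assumption for illustration, not the actual Avro resolution rules):

```python
def is_backward_compatible(old: dict, new: dict) -> bool:
    """Can a reader of `new` still decode events written with `old`?

    Simplified rule: every field the new schema declares must either
    have existed in the old schema or carry a default value.
    """
    old_fields = {f["name"] for f in old["fields"]}
    for field in new["fields"]:
        if field["name"] not in old_fields and "default" not in field:
            return False
    return True

# v2 adds `severity` with a default, so old events still decode.
v1 = {"fields": [{"name": "cluster"}, {"name": "message"}]}
v2 = {"fields": [{"name": "cluster"}, {"name": "message"},
                 {"name": "severity", "default": "info"}]}
assert is_backward_compatible(v1, v2)
```

In practice this check is delegated to the schema registry's compatibility mode rather than hand-rolled, but the underlying rule is the same.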
Company
We are a technology-forward company committed to enhancing our observability capabilities and enabling our teams to deliver high-quality products. Our focus is on building and maintaining scalable data pipelines that ensure reliability and compliance in processing telemetry data. As part of our dynamic team, you will have the opportunity to work alongside skilled platform, SRE, and application teams in a collaborative environment, with a strong emphasis on security and best practices. We offer competitive benefits and a supportive work culture, allowing for professional development and career growth.
Data Engineer Lead in Sheffield employer: Infoplus Technologies UK Ltd
Contact Detail:
Infoplus Technologies UK Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer Lead role in Sheffield
✨Tip Number 1
Network like a pro! Attend industry meetups, webinars, or conferences related to data engineering. It's a great way to connect with potential employers and learn about job openings that might not be advertised.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Kafka, OpenShift, and Python. This gives you a chance to demonstrate your hands-on experience and problem-solving abilities.
✨Tip Number 3
Don’t just apply blindly! Tailor your approach by researching the companies you're interested in. Understand their tech stack and culture, and mention how your experience aligns with their needs when you reach out.
✨Tip Number 4
Use our website to apply! We’re always on the lookout for talented individuals like you. Applying directly through our site can give you an edge, as it shows your genuine interest in joining our team.
We think you need these skills to ace your application for Data Engineer Lead in Sheffield
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your hands-on experience with streaming data pipelines and tools like Kafka. We want to see how your skills align with our needs, so don’t be shy about showcasing relevant projects!
Showcase Your Projects: Include specific examples of your work with OpenShift, Kubernetes, and telemetry integration. We love seeing real-world applications of your skills, especially if you’ve built dashboards or alerting systems in Splunk.
Communicate Clearly: Strong communication is key! When writing your application, be clear and concise about your experiences and how they relate to the role. We appreciate well-documented applications that reflect your attention to detail.
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. We can’t wait to see what you bring to the table!
How to prepare for a job interview at Infoplus Technologies UK Ltd
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially Kafka, OpenShift, and Python. Brush up on your hands-on experience with streaming data pipelines and be ready to discuss specific projects where you've implemented these tools.
✨Showcase Your Problem-Solving Skills
Prepare to share examples of how you've tackled complex data engineering challenges. Think about situations where you had to ensure data quality or implement schema management, and be ready to explain your thought process and the outcomes.
✨Communicate Clearly
Strong communication skills are key for this role. Practice explaining technical concepts in a way that’s easy to understand. You might be asked to describe how you would integrate telemetry into Splunk, so clarity is crucial!
✨Emphasise Collaboration
This position involves working closely with various teams. Be prepared to discuss your experience in collaborative environments and how you’ve successfully integrated feedback from different stakeholders into your projects.