At a Glance
- Tasks: Lead the design and management of our enterprise event streaming platform using Apache Kafka.
- Company: Join a forward-thinking company focused on innovative data solutions and technology.
- Benefits: Enjoy flexible working options, competitive salary, and opportunities for professional growth.
- Why this job: Be at the forefront of event-driven architecture and make a real impact in tech.
- Qualifications: 5+ years in software engineering with 3+ years of hands-on Kafka experience required.
- Other info: Work with cutting-edge technologies like Docker, Kubernetes, and cloud platforms.
The predicted salary is between £48,000 and £84,000 per year.
My client is looking for a Senior Apache Kafka Engineer to lead the design, development, and management of their enterprise event streaming platform. The role requires deep Kafka expertise, strong system design skills, and hands-on experience running large-scale, production-grade deployments.
Key Responsibilities:
- Own and evolve a critical Kafka infrastructure: assess, stabilize, and optimize architecture.
- Design and implement scalable, event-driven systems across environments (dev, staging, prod).
- Develop and maintain Kafka clusters, topics, partitions, schemas (Avro), and connectors.
- Integrate Kafka with external systems and ensure reliability, security, and observability.
- Troubleshoot delivery issues, latency, consumer lag, and performance bottlenecks.
- Drive documentation, training, incident resolution, and continuous improvement.
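Several of the responsibilities above revolve around consumer lag, which is simply the gap between a partition's log-end offset and the offset a consumer group has committed. The sketch below is illustrative only: in production these offsets would come from the Kafka admin API or the `kafka-consumer-groups.sh` tool, so the hard-coded numbers here exist purely to show the calculation.

```python
# Illustrative sketch: consumer lag per partition is
#   lag = log-end offset - committed offset
# Real offsets come from the broker; these are hypothetical values.

def consumer_lag(log_end_offsets, committed_offsets):
    """Return per-partition lag for a consumer group."""
    lag = {}
    for partition, end_offset in log_end_offsets.items():
        committed = committed_offsets.get(partition, 0)
        lag[partition] = end_offset - committed
    return lag

end = {0: 1_500, 1: 2_300, 2: 980}        # broker-side log-end offsets
committed = {0: 1_500, 1: 2_100, 2: 700}  # group's committed offsets

print(consumer_lag(end, committed))  # {0: 0, 1: 200, 2: 280}
```

A growing lag on one partition while the others hold steady usually points at a slow or stuck consumer instance rather than a broker problem, which is why per-partition (not aggregate) lag is the number worth monitoring.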
Qualifications:
- 5+ years in software/data engineering, with 3+ years of hands-on Kafka experience.
- Proven track record managing complex Kafka environments.
- Strong understanding of Kafka internals (brokers, replication, KRaft/ZooKeeper, ISR).
- Experience with Kafka Streams, ksqlDB, and building real-time data pipelines.
- Proficient in Java, Scala, or Python.
- Familiarity with Kafka Connect, Schema Registry, and common integrations (DBs, Elasticsearch, cloud).
- Experience with Docker, Kubernetes, Terraform, and Helm.
- Cloud Kafka experience (Amazon MSK, Confluent, Azure Event Hubs).
- Skilled with monitoring tools (Prometheus, Grafana, Datadog, etc.).
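Interviewers probing "Kafka internals" often start with how keyed records map to partitions. The sketch below is a simplified stand-in (the real Java client hashes keys with murmur2, not CRC32) that shows the guarantee being tested for: a given key always lands on the same partition, which is what preserves per-key ordering.

```python
# Simplified stand-in for Kafka's default partitioner. The actual
# client uses murmur2; CRC32 is used here only to illustrate the
# property that matters: a stable key -> partition mapping.
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    return zlib.crc32(key) % num_partitions

p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2  # same key, same partition, so per-key ordering holds
```

A follow-up worth anticipating: changing `num_partitions` on an existing topic breaks this mapping for old keys, which is why partition counts are chosen carefully up front.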
Apache Kafka Engineer employer: Stott and May
Contact Detail:
Stott and May Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Apache Kafka Engineer role
✨Tip Number 1
Make sure to showcase your hands-on experience with Kafka in any discussions or interviews. Be prepared to discuss specific projects where you've managed Kafka clusters and the challenges you faced, as this will demonstrate your expertise.
✨Tip Number 2
Familiarise yourself with the latest trends and updates in Kafka and related technologies. Being able to discuss recent developments or features can set you apart from other candidates and show your passion for the field.
✨Tip Number 3
Network with professionals in the Kafka community. Attend meetups, webinars, or online forums to connect with others in the industry. This can lead to valuable insights and potentially even referrals for the position.
✨Tip Number 4
Prepare to demonstrate your problem-solving skills during the interview. Be ready to tackle hypothetical scenarios related to Kafka performance issues or system design challenges, as this will highlight your critical thinking abilities.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Apache Kafka and related technologies. Focus on specific projects where you've managed Kafka environments, detailing your role and the impact of your work.
Craft a Strong Cover Letter: In your cover letter, express your passion for event streaming and your expertise in Kafka. Mention how your skills align with the responsibilities listed in the job description, such as designing scalable systems and troubleshooting performance issues.
Showcase Relevant Projects: Include examples of relevant projects in your application that demonstrate your hands-on experience with Kafka, such as building real-time data pipelines or integrating Kafka with external systems. Use metrics to quantify your achievements.
Highlight Continuous Learning: Mention any recent courses, certifications, or workshops related to Kafka or cloud technologies. This shows your commitment to staying updated in the field and your readiness to tackle new challenges.
How to prepare for a job interview at Stott and May
✨Showcase Your Kafka Expertise
Be prepared to discuss your hands-on experience with Apache Kafka in detail. Highlight specific projects where you've managed Kafka clusters, integrated external systems, or optimised performance. This will demonstrate your deep understanding of Kafka internals and your ability to handle complex environments.
✨Demonstrate System Design Skills
Expect questions about designing scalable, event-driven systems. Prepare to explain your approach to architecture, including how you assess and stabilise Kafka infrastructure. Use examples from your past work to illustrate your design process and decision-making.
✨Prepare for Troubleshooting Scenarios
You may be asked to troubleshoot common Kafka issues such as delivery failures or performance bottlenecks. Brush up on your problem-solving techniques and be ready to walk through your thought process when diagnosing and resolving these types of challenges.
✨Familiarise Yourself with Tools and Technologies
Since the role involves working with various tools like Docker, Kubernetes, and monitoring solutions, make sure you're comfortable discussing your experience with these technologies. Be ready to explain how you've used them in conjunction with Kafka to enhance deployment and observability.