Kafka DevOps Engineer in London

London Full-Time £60,000 - £80,000 / year (est.) Home office (partial)
Verifone

At a Glance

  • Tasks: Join our team as a Kafka DevOps Engineer and manage cutting-edge payment solutions.
  • Company: Verifone, a leader in electronic payment technology with a dynamic work culture.
  • Benefits: Competitive pay, career growth, mentorship, and funded certifications.
  • Other info: Enjoy hybrid working flexibility and a collaborative team environment.
  • Why this job: Make a real impact on global payment systems and work with innovative technologies.
  • Qualifications: 4+ years of engineering experience with hands-on Kafka support.

The predicted salary is between £60,000 and £80,000 per year.

For more than 30 years, Verifone has established a remarkable record of leadership in the electronic payment technology industry. Verifone is one of the leading electronic payment solutions brands and among the largest providers of electronic payment systems worldwide. Verifone has a diverse, dynamic, and fast-paced work environment in which employees are focused on results and have opportunities to excel. We take pride in working with leading retailers, merchants, banks, and third-party partners to invent and deliver innovative payment solutions around the world. We strive for excellence in our products and services and are obsessed with customer happiness.

Across the globe, Verifone employees are leading the payments industry through experience, innovation, and an ambitious spirit. Whether it’s developing the next generation of secure payment systems or finding new ways to bring electronic payments to emerging markets, the Verifone team is dedicated to the success of our customers, partners, and investors. It is this passion for innovation that drives every Verifone employee toward personal and professional success. Verifone is proudly an in-office work culture as we see immense benefits to career development and business results from our colleagues being physically co-located.

What’s Exciting About the Role

Verifone is seeking a Kafka DevOps Engineer to join our Platform Engineering team. This is an operations-first role with a strong emphasis on scripting, automation, and pipeline development. You’ll be hands-on with day-to-day Kafka operations, reliability, tuning, and high availability for payment gateway solutions that process billions of transactions annually on-prem and in AWS Cloud. Beyond keeping the lights on, you’ll play a key role in building the data pipelines—of which Kafka is a core component—that power Verifone’s new AI, machine learning, and analytics initiatives. You’ll also be part of an active effort to migrate Kafka and related services to Kubernetes, giving you hands-on experience with a meaningful infrastructure modernization project. The technology footprint is broad: Redis, MongoDB, PostgreSQL, MySQL, Snowflake, and more—so you’ll grow well beyond a single-technology niche.

Key Responsibilities

  • Kafka Operations & Reliability: Manage and support Apache Kafka clusters (including MSK), Kafka Connect ecosystem, and KSQL for high-throughput, fault-tolerant messaging and event streaming pipelines. Monitor system health, set up alerts, and drive incident response and root cause analysis (RCA). Handle day-to-day operational tasks including offset management, lag monitoring, consumer group management, and cluster rebalancing. Perform routine cluster maintenance: upgrades, configuration changes, scaling, and health checks.
  • Scripting, Automation & DevOps: Develop and manage automation scripts using Python and Shell scripting to reduce manual toil and improve operational efficiency. Design, implement, and maintain robust CI/CD pipelines for patching automation. Build and maintain Infrastructure as Code (IaC) using tools such as Terraform, Ansible, or similar. Containerize and orchestrate workloads using Docker and Kubernetes, with the opportunity to help migrate Kafka and related services to Kubernetes.
  • Data Pipeline Development: Build and optimize end-to-end data pipelines, with Kafka as a core streaming component alongside batch and ETL processes, enabling new AI, ML, and analytics use cases. Collaborate with development teams to implement best practices for data flow, security, scalability, and performance. Continuously identify and implement improvements to the software development lifecycle (SDLC).
  • Cloud & Infrastructure: Manage cloud infrastructure on AWS (VMs, networking, storage, IAM, etc.). Support cloud migration tasks and assist in moving workloads between on-prem and cloud environments.
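To make the lag monitoring mentioned above concrete, here is a minimal, hypothetical sketch (not Verifone code): per-partition consumer lag is the broker's log end offset minus the consumer group's committed offset, and alerting fires when lag crosses a threshold. All names and thresholds below are illustrative.

```python
# Illustrative sketch of consumer-lag monitoring logic (hypothetical names).
# Lag per partition = log end offset - committed consumer offset.

def compute_lag(end_offsets: dict, committed: dict) -> dict:
    """Return per-partition lag; a partition with no committed offset
    counts its lag from offset 0."""
    return {tp: end_offsets[tp] - committed.get(tp, 0) for tp in end_offsets}

def partitions_over_threshold(lag: dict, threshold: int) -> list:
    """Partitions whose lag exceeds the alerting threshold, sorted."""
    return sorted(tp for tp, n in lag.items() if n > threshold)

if __name__ == "__main__":
    # Offsets as they might be fetched from a broker and a consumer group.
    end = {("payments", 0): 1_000, ("payments", 1): 2_500}
    commits = {("payments", 0): 990, ("payments", 1): 1_200}
    lag = compute_lag(end, commits)
    print(lag)
    print(partitions_over_threshold(lag, threshold=500))
```

In practice the offsets would come from tooling such as the `kafka-consumer-groups` CLI or a metrics exporter rather than hard-coded dictionaries; the arithmetic above is the core of what those tools report.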

Required Qualifications / Skills

  • 4+ years of overall engineering experience with 2+ years of hands-on experience supporting Kafka infrastructure in a large-scale production environment.
  • Solid understanding of Kafka producer/consumer microservice concepts and Kafka's distributed architecture.
  • Solid Linux fundamentals: networking basics, logs, system troubleshooting, process/memory, disk.
  • Strong scripting and automation skills (Python, Bash) with a track record of reducing operational toil.
  • CI/CD pipeline development and Infrastructure-as-Code experience (Terraform preferred).
  • Cloud engineering skills, preferably AWS (EC2, VPC, IAM, MSK/ElastiCache, CloudWatch).
  • Familiarity with observability tools (metrics/logs/tracing concepts) and incident response practices.
  • Basic understanding of distributed systems tradeoffs (availability, consistency, partitions, backpressure).
  • Strong communication and presentation skills with emphasis on executive communication.
  • Flexible with regard to working shifts, including on-call and weekend cover.
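The backpressure tradeoff named in the qualifications above can be sketched in a few lines, assuming nothing beyond the standard library: a bounded buffer forces a fast producer to block rather than letting memory grow without bound, which is the same tradeoff Kafka makes with bounded producer buffers and consumer fetch limits. This is an illustrative toy, not production code.

```python
# Hypothetical sketch of backpressure via a bounded buffer.
import queue
import threading

def run_pipeline(items, buffer_size=2):
    buf = queue.Queue(maxsize=buffer_size)  # bounded: put() blocks when full
    out = []

    def consumer():
        while True:
            item = buf.get()
            if item is None:  # sentinel: producer is done
                return
            out.append(item)

    t = threading.Thread(target=consumer)
    t.start()
    for item in items:
        buf.put(item)  # blocks if the consumer falls behind (backpressure)
    buf.put(None)
    t.join()
    return out
```

Because the buffer is capped at `buffer_size`, the producer can never run more than a few items ahead of the consumer, trading peak throughput for bounded memory use.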

Preferred Skills (Highly Desired)

  • Data Engineering & Pipelines: Data engineering skills including data analytics, data processing, ETL, and data lake technologies (batch and streaming, file formats like Parquet, table formats like Iceberg/Delta/Hudi, basic orchestration). Experience with AWS data tools (Athena, Glue, Iceberg, Redshift, etc.). Exposure to Kafka Streams, Apache Flink, or similar stream processing frameworks.
  • Database & Caching Technologies: Operate Redis deployments for caching, ephemeral state, queues/streams, and rate limiting use cases. Relational DB experience: PostgreSQL and/or MySQL (indexing basics, vacuum/analyze, query plans, replication fundamentals). MongoDB operational familiarity (replica sets, elections, oplog basics, backup/restore).
  • Infrastructure & Security: Container/Kubernetes familiarity (deployments, stateful workloads, storage classes). On-prem experience (VMware/KVM, storage, networking). Security fundamentals: least privilege, secrets management, encryption-in-transit/at-rest concepts. Experience working with PCI DSS (Payment Card Industry Data Security Standard) requirements.

What We Offer

  • Direct impact on Verifone’s global payment infrastructure—your work keeps billions of transactions flowing reliably every year.
  • Ground-floor involvement in building the data pipelines that will drive Verifone’s AI, machine learning, and analytics strategy.
  • Hands-on participation in a Kafka-to-Kubernetes migration—a resume-building modernization project from day one.
  • Multi-technology exposure—Kafka, Redis, MongoDB, PostgreSQL, MySQL, Snowflake—so you grow as a well-rounded platform engineer, not a single-tool specialist.
  • Mentorship from senior engineers and Kafka architects with deep production experience.
  • Clear career progression path from mid-level to senior and lead engineer.
  • Funded certification paths—Confluent (CCDAK, Certified Administrator), AWS (Solutions Architect, MSK Specialty), CKA—plus budget for courses and technical conferences.
  • Hybrid/remote working flexibility with a collaborative team culture.
  • Competitive compensation with performance-based incentives.

Our Commitment

Verifone is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. Verifone is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

Kafka DevOps Engineer in London employer: Verifone

Verifone is an exceptional employer, offering a dynamic and fast-paced work environment where innovation thrives. As a Kafka DevOps Engineer, you'll have the opportunity to directly impact global payment infrastructure while working alongside experienced professionals in a collaborative culture that prioritises employee growth through mentorship and funded certification paths. With a commitment to diversity and a clear career progression, Verifone stands out as a rewarding place for those seeking meaningful employment in the electronic payment technology industry.

Contact Detail:

Verifone Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Kafka DevOps Engineer in London

✨Tip Number 1

Network like a pro! Reach out to current or former Verifone employees on LinkedIn. Ask them about their experiences and any tips they might have for landing the Kafka DevOps Engineer role. Personal connections can give you insights that job descriptions just can't.

✨Tip Number 2

Prepare for technical interviews by brushing up on your Kafka knowledge and scripting skills. Practice common DevOps scenarios and be ready to discuss how you've tackled challenges in previous roles. We want to see your problem-solving skills in action!

✨Tip Number 3

Showcase your passion for innovation! During interviews, share examples of how you've contributed to projects that improved efficiency or introduced new technologies. Verifone loves candidates who are eager to drive change and enhance customer happiness.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you're genuinely interested in joining the Verifone team. Let’s get you that interview!

We think you need these skills to ace Kafka DevOps Engineer in London

Apache Kafka
Kafka Connect
KSQL
Python
Shell Scripting
CI/CD Pipeline Development
Infrastructure as Code (IaC)
Terraform
Ansible
Docker
Kubernetes
AWS
Linux Fundamentals
Data Engineering
Observability Tools

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Kafka DevOps Engineer role. Highlight your experience with Kafka, scripting, and cloud technologies. We want to see how your skills align with what we're looking for!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for payment technology and how you can contribute to Verifone's mission. Keep it concise but impactful—let us know why you're the perfect fit!

Showcase Your Projects: If you've worked on relevant projects, don’t hold back! Include links or descriptions of your work with Kafka, CI/CD pipelines, or any automation scripts. We love seeing real-world applications of your skills.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands. Plus, it shows us you're serious about joining our team!

How to prepare for a job interview at Verifone

✨Know Your Kafka Inside Out

Make sure you brush up on your Kafka knowledge before the interview. Understand the core concepts like producers, consumers, and distributed architecture. Be ready to discuss your hands-on experience with Kafka clusters and how you've managed them in a production environment.

✨Show Off Your Scripting Skills

Since this role emphasises scripting and automation, prepare to showcase your Python and Shell scripting skills. Bring examples of scripts you've developed to improve operational efficiency or reduce manual tasks. This will demonstrate your practical experience and problem-solving abilities.

✨Get Familiar with CI/CD and IaC

Verifone is looking for someone who can design and maintain CI/CD pipelines and Infrastructure as Code. Be prepared to discuss your experience with tools like Terraform and Ansible. If you have specific examples of how you've implemented these practices, share them!

✨Understand Cloud Infrastructure

Since the role involves managing AWS cloud infrastructure, make sure you know the basics of EC2, VPC, and IAM. Be ready to talk about any cloud migration tasks you've been involved in and how you’ve handled workloads between on-prem and cloud environments.

