At a Glance
- Tasks: Design and manage scalable Azure cloud infrastructure while improving data flow and platform performance.
- Company: Fast-growing SaaS organisation focused on modern, data-driven solutions.
- Benefits: Competitive salary of £75k plus a 10% bonus, with remote work flexibility.
- Other info: Exciting opportunity for career growth in a collaborative and innovative environment.
- Why this job: Join a dynamic team to shape the future of data platforms and enhance user insights.
- Qualifications: Strong Azure experience, DevOps background, and skills in GitHub Actions and Terraform.
The predicted salary is £75,000 per year.
We’re partnering with a fast-growing SaaS organisation building a modern, data-driven platform that powers better insight into behaviour, performance, and collaboration at scale. They’re scaling their platform capability and are now looking for a Platform Engineer to help design, build, and evolve their cloud and data infrastructure.
This is a hands-on role spanning cloud infrastructure, DevOps, and data platform engineering.
You’ll work across Azure cloud infrastructure and data platforms, helping to build scalable, secure systems and improve how data flows across the organisation. You’ll also play a key role in improving deployment processes, system reliability, and overall platform performance, working closely with software and data teams.
Key Responsibilities:
- Design and manage scalable Azure cloud infrastructure
- Own Infrastructure as Code using Terraform
- Build and maintain CI/CD pipelines using GitHub Actions (essential)
- Support GitHub-based release and deployment workflows
- Work with Kafka for event-driven streaming and real-time data movement
- Support and evolve data platforms (Databricks ideal)
- Build and maintain data pipelines (batch + streaming / ETL / ELT)
- Improve platform reliability, observability, and performance
- Collaborate with engineering teams to improve developer experience
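To give a flavour of the CI/CD work described above, here is a minimal, illustrative GitHub Actions workflow that checks Terraform changes on pull requests. It is a sketch only: the file path, directory name, and trigger paths are hypothetical, and a real pipeline would also handle plan/apply stages and backend authentication.

```yaml
# .github/workflows/terraform-check.yml (illustrative sketch; names are hypothetical)
name: terraform-check
on:
  pull_request:
    paths:
      - "infra/**"   # hypothetical directory holding Terraform code

jobs:
  check:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: infra
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Check formatting
        run: terraform fmt -check
      - name: Init (no backend, validation only)
        run: terraform init -backend=false
      - name: Validate configuration
        run: terraform validate
```

Running `init` with `-backend=false` keeps the validation step free of remote-state credentials, which is a common pattern for pull-request checks.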
Key Requirements:
- Strong Azure cloud experience
- Background in Platform Engineering, DevOps, or SRE
- Strong experience with GitHub Actions for CI/CD (essential)
- Kubernetes, Docker, and Terraform experience
- Strong scripting skills (Python, Bash, or PowerShell)
- Experience with Kafka or similar streaming platforms
- Experience with Databricks or modern data platforms (e.g. Snowflake, Synapse, BigQuery)
- Strong understanding of data pipelines and distributed systems
- Focus on automation, scalability, and reliability
Nice to have:
- Lakehouse or large-scale data platform experience
- Observability tooling (Datadog, Grafana, Prometheus)
- SaaS / high-growth product experience
- Strong developer experience mindset
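For the Infrastructure as Code side, the kind of Terraform work listed above might start from a sketch like this. It is illustrative only: the resource names and region are hypothetical, and a real setup would add remote state, networking, and the data platform resources themselves.

```hcl
# Illustrative Terraform sketch for Azure infrastructure work.
# All names and the region are hypothetical examples.
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 4.0"
    }
  }
}

provider "azurerm" {
  features {}
}

# A resource group acting as the root container for platform resources
resource "azurerm_resource_group" "platform" {
  name     = "rg-platform-example"
  location = "uksouth"
}
```

Being able to walk through a module like this, and explain how you would structure state, environments, and CI checks around it, is exactly the sort of discussion this role invites.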
Azure DevOps Engineer (Kafka) in Cambridge. Employer: Digital Waffle
Contact Detail:
Digital Waffle Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Azure DevOps Engineer (Kafka) role in Cambridge
✨Tip Number 1
Network like a pro! Reach out to folks in your industry on LinkedIn or at meetups. A personal connection can often get you a foot in the door faster than any application.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those involving Azure, Terraform, and Kafka. This gives potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by practising common DevOps scenarios. Brush up on your knowledge of CI/CD pipelines and data platforms, as these are hot topics for roles like the Azure DevOps Engineer.
✨Tip Number 4
Don’t forget to apply through our website! We’ve got loads of opportunities that might just be the perfect fit for you. Plus, it’s a great way to get noticed by our hiring team.
We think you need these skills to ace the Azure DevOps Engineer (Kafka) role in Cambridge
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure, Terraform, and CI/CD pipelines. We want to see how your skills align with the role, so don’t be shy about showcasing your relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about the role and how your background in DevOps and data platforms makes you a perfect fit for us. Keep it engaging and personal!
Showcase Your Projects: If you've worked on any cool projects involving Kafka or cloud infrastructure, make sure to mention them! We love seeing real-world applications of your skills, so include links or descriptions of your work.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Digital Waffle
✨Know Your Tech Stack Inside Out
Make sure you’re well-versed in Azure, Terraform, Kubernetes, and Kafka. Brush up on your knowledge of CI/CD pipelines using GitHub Actions, as this is essential for the role. Being able to discuss specific projects where you've used these technologies will really impress.
✨Showcase Your Problem-Solving Skills
Prepare to discuss how you've tackled challenges in previous roles, especially around improving deployment processes or system reliability. Use the STAR method (Situation, Task, Action, Result) to structure your answers and highlight your impact.
✨Demonstrate Collaboration
Since this role involves working closely with software and data teams, be ready to share examples of how you've collaborated in the past. Talk about how you’ve improved developer experience or worked on cross-functional projects to show you’re a team player.
✨Ask Insightful Questions
Prepare thoughtful questions about the company’s data platforms and their approach to cloud infrastructure. This shows your genuine interest in the role and helps you assess if the company aligns with your career goals.