At a Glance
- Tasks: Design and manage scalable Azure cloud infrastructure while improving data flow and platform performance.
- Company: Join a fast-growing SaaS organisation focused on modern, data-driven solutions.
- Benefits: Competitive salary of £75k plus a 10% bonus, with remote work flexibility.
- Other info: Dynamic role with opportunities for growth in a collaborative environment.
- Why this job: Make a real impact by building innovative cloud and data infrastructures.
- Qualifications: Strong Azure experience and background in DevOps or Platform Engineering required.
The predicted salary is £75,000 per year.
We’re partnering with a fast-growing SaaS organisation building a modern, data-driven platform that powers better insight into behaviour, performance, and collaboration at scale. They’re scaling their platform capability and are now looking for a Platform Engineer to help design, build, and evolve their cloud and data infrastructure.
This is a hands-on role spanning cloud infrastructure, DevOps, and data platform engineering.
You’ll work across Azure cloud infrastructure and data platforms, helping to build scalable, secure systems and improve how data flows across the organisation. You’ll also play a key role in improving deployment processes, system reliability, and overall platform performance, working closely with software and data teams.
Key Responsibilities:
- Design and manage scalable Azure cloud infrastructure
- Own Infrastructure as Code using Terraform
- Build and maintain CI/CD pipelines using GitHub Actions (essential)
- Support GitHub-based release and deployment workflows
- Work with Kafka for event-driven streaming and real-time data movement
- Support and evolve data platforms (ideally Databricks)
- Build and maintain data pipelines (batch + streaming / ETL / ELT)
- Improve platform reliability, observability, and performance
- Collaborate with engineering teams to improve developer experience
Requirements:
- Strong Azure cloud experience
- Background in Platform Engineering, DevOps, or SRE
- Strong experience with GitHub Actions for CI/CD (essential)
- Kubernetes, Docker, and Terraform experience
- Strong scripting skills (Python, Bash, or PowerShell)
- Experience with Kafka or similar streaming platforms
- Experience with Databricks or modern data platforms (e.g. Snowflake, Synapse, BigQuery)
- Strong understanding of data pipelines and distributed systems
- Focus on automation, scalability, and reliability
- Lakehouse or large-scale data platform experience
- Observability tooling (Datadog, Grafana, Prometheus)
- SaaS / high-growth product experience
- Strong developer experience mindset
Azure DevOps Engineer (Kafka) in Doncaster. Employer: Digital Waffle
Contact Details:
Digital Waffle Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Azure DevOps Engineer (Kafka) role in Doncaster
✨Tip Number 1
Network like a pro! Reach out to folks in your industry on LinkedIn or at meetups. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those involving Azure, Terraform, and Kafka. This gives potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by brushing up on common DevOps scenarios and challenges. Practice explaining how you've tackled similar issues in the past, focusing on your hands-on experience with CI/CD and cloud infrastructure.
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented individuals like you, and applying directly can sometimes give you an edge over other candidates.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure, Terraform, and CI/CD pipelines. We want to see how your skills align with the role, so don’t be shy about showcasing your relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about the role and how your background in DevOps and data platforms makes you a perfect fit for our team.
Showcase Your Projects: If you've worked on any cool projects involving Kafka or cloud infrastructure, make sure to mention them! We love seeing real-world applications of your skills, so include links or descriptions of your work.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy!
How to prepare for a job interview at Digital Waffle
✨Know Your Tech Stack
Make sure you’re well-versed in Azure, Terraform, Kubernetes, and Kafka. Brush up on your knowledge of CI/CD pipelines using GitHub Actions, as this is essential for the role. Being able to discuss your hands-on experience with these technologies will show that you’re ready to hit the ground running.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific challenges you've faced in previous roles, especially related to cloud infrastructure and data platforms. Think about how you improved deployment processes or enhanced system reliability. Real-world examples will demonstrate your ability to tackle issues effectively.
✨Understand the Company’s Vision
Research the SaaS organisation and understand their mission and the data-driven platform they’re building. Be ready to explain how your skills can contribute to their goals, particularly in improving data flow and collaboration across teams. This shows genuine interest and alignment with their objectives.
✨Ask Insightful Questions
Prepare thoughtful questions about the team dynamics, the tech stack they use, and their approach to scalability and reliability. This not only shows your enthusiasm for the role but also helps you gauge if the company culture and work environment are a good fit for you.