At a Glance
- Tasks: Design and manage scalable Azure cloud infrastructure while improving data flow and platform performance.
- Company: Fast-growing SaaS organisation focused on modern, data-driven solutions.
- Benefits: Competitive salary, 10% bonus, remote work, and opportunities for professional growth.
- Other info: Exciting opportunity for career advancement in a collaborative environment.
- Why this job: Join a dynamic team and make a real impact on innovative cloud and data projects.
- Qualifications: Strong Azure experience, DevOps background, and skills in CI/CD with GitHub Actions.
The predicted salary is around £75,000 per year.
We’re partnering with a fast-growing SaaS organisation building a modern, data-driven platform that powers better insight into behaviour, performance, and collaboration at scale. They’re scaling their platform capability and are now looking for a Platform Engineer to help design, build, and evolve their cloud and data infrastructure.
This is a hands-on role spanning cloud infrastructure, DevOps, and data platform engineering.
You’ll work across Azure cloud infrastructure and data platforms, helping to build scalable, secure systems and improve how data flows across the organisation. You’ll also play a key role in improving deployment processes, system reliability, and overall platform performance, working closely with software and data teams.
Key Responsibilities:
- Design and manage scalable Azure cloud infrastructure
- Own Infrastructure as Code using Terraform
- Build and maintain CI/CD pipelines using GitHub Actions (essential)
- Support GitHub-based release and deployment workflows
- Work with Kafka for event-driven streaming and real-time data movement
- Support and evolve data platforms (ideally Databricks)
- Build and maintain data pipelines (batch + streaming / ETL / ELT)
- Improve platform reliability, observability, and performance
- Collaborate with engineering teams to improve developer experience
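To give a flavour of the event-driven streaming work the responsibilities above describe, here is a minimal Python sketch of the produce/consume pattern. In a real deployment you would use a Kafka client library (such as confluent-kafka) against a named topic; here an in-memory queue stands in for the topic so the sketch stays self-contained, and the event fields are purely illustrative.

```python
import json
import queue

# In-memory queue standing in for a Kafka topic; a real pipeline would
# subscribe to a broker via a client library such as confluent-kafka.
events = queue.Queue()

def produce(topic: queue.Queue, event: dict) -> None:
    """Serialise an event to JSON bytes, as a Kafka producer typically would."""
    topic.put(json.dumps(event).encode("utf-8"))

def consume(topic: queue.Queue) -> list[dict]:
    """Drain the topic and deserialise each message back into a dict."""
    out = []
    while not topic.empty():
        out.append(json.loads(topic.get().decode("utf-8")))
    return out

# Hypothetical events, just to exercise the round trip.
produce(events, {"user": "a", "action": "login"})
produce(events, {"user": "b", "action": "logout"})
processed = consume(events)
print(len(processed))  # 2
```

The same serialise-then-consume shape carries over directly once the queue is swapped for a real broker connection.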
Key Skills & Experience:
- Strong Azure cloud experience
- Background in Platform Engineering, DevOps, or SRE
- Strong experience with GitHub Actions for CI/CD (essential)
- Kubernetes, Docker, and Terraform experience
- Strong scripting skills (Python, Bash, or PowerShell)
- Experience with Kafka or similar streaming platforms
- Experience with Databricks or modern data platforms (e.g. Snowflake, Synapse, BigQuery)
- Strong understanding of data pipelines and distributed systems
- Focus on automation, scalability, and reliability
- Lakehouse or large-scale data platform experience
- Observability tooling (Datadog, Grafana, Prometheus)
- SaaS / high-growth product experience
- Strong developer experience mindset
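The scripting and data-pipeline skills listed above can be illustrated with a small batch ETL sketch in Python. This is not the employer's codebase, just a self-contained example of the extract/transform/load shape: the JSON-lines input and the `scores` table are hypothetical, and an in-memory SQLite database stands in for a real warehouse target.

```python
import json
import sqlite3

# Hypothetical raw input: JSON-lines records, as might land in a staging area.
raw = '{"id": 1, "score": "42"}\n{"id": 2, "score": "17"}\n'

def extract(text: str) -> list[dict]:
    """Parse each non-empty JSON line into a record."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def transform(records: list[dict]) -> list[tuple]:
    """Cast string fields to the types the target table expects."""
    return [(r["id"], int(r["score"])) for r in records]

def load(rows: list[tuple]) -> int:
    """Insert rows into an in-memory table and return a checksum total."""
    conn = sqlite3.connect(":memory:")  # stands in for a warehouse table
    conn.execute("CREATE TABLE scores (id INTEGER PRIMARY KEY, score INTEGER)")
    conn.executemany("INSERT INTO scores VALUES (?, ?)", rows)
    return conn.execute("SELECT SUM(score) FROM scores").fetchone()[0]

total = load(transform(extract(raw)))
print(total)  # 59
```

The same three-stage structure scales up naturally to orchestrated batch jobs or, with a streaming source, to the real-time pipelines the role mentions.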
Azure DevOps Engineer (Kafka) in Edinburgh employer: Digital Waffle
Contact Detail:
Digital Waffle Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Azure DevOps Engineer (Kafka) role in Edinburgh
✨Tip Number 1
Network like a pro! Reach out to folks in your industry on LinkedIn or at meetups. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your projects, especially those involving Azure, Terraform, and Kafka. This gives potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by practising common DevOps scenarios. Brush up on your knowledge of CI/CD pipelines and data platforms, and be ready to discuss how you've tackled challenges in past roles.
✨Tip Number 4
Don’t forget to apply through our website! We’re always on the lookout for talented individuals like you, and applying directly can sometimes give you an edge over other candidates.
We think you need these skills to ace the Azure DevOps Engineer (Kafka) role in Edinburgh
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Azure DevOps Engineer role. Highlight your experience with Azure, Terraform, and CI/CD pipelines. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about this role and how your background in platform engineering and DevOps makes you a perfect fit. Let us know what excites you about working with Kafka and data platforms.
Showcase Your Projects: If you've worked on relevant projects, don’t hold back! Share examples of how you've built scalable systems or improved deployment processes. We love seeing real-world applications of your skills, especially with tools like GitHub Actions and Kubernetes.
Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It helps us keep everything organised and ensures your application gets the attention it deserves. We can't wait to hear from you!
How to prepare for a job interview at Digital Waffle
✨Know Your Tech Stack Inside Out
Make sure you’re well-versed in Azure, Terraform, Kubernetes, and Kafka. Brush up on your knowledge of CI/CD pipelines using GitHub Actions, as this is essential for the role. Being able to discuss specific projects where you've used these technologies will really impress the interviewers.
✨Showcase Your Problem-Solving Skills
Prepare to discuss how you've tackled challenges in previous roles, especially around improving deployment processes or enhancing system reliability. Use the STAR method (Situation, Task, Action, Result) to structure your answers and make them impactful.
✨Demonstrate Your Collaboration Skills
Since this role involves working closely with software and data teams, be ready to share examples of how you've successfully collaborated in the past. Highlight any experiences where you improved developer experience or worked on cross-functional projects.
✨Ask Insightful Questions
Prepare thoughtful questions about the company’s data platforms and their approach to cloud infrastructure. This shows your genuine interest in the role and helps you assess if the company aligns with your career goals. Plus, it gives you a chance to engage with the interviewers on a deeper level.