At a Glance
- Tasks: Build and maintain automated data solutions in Azure, focusing on CI/CD pipelines and MLOps.
- Company: Join a forward-thinking tech company that values innovation and collaboration.
- Benefits: Competitive daily rate, remote work flexibility, and opportunities for career growth.
- Why this job: Make an impact by optimising data workflows and collaborating with diverse teams.
- Qualifications: Experience with Azure DevOps, IaC tools, and scripting languages like PowerShell and Python.
- Other info: Remote role with potential for extension after initial 3-month contract.
The DevOps Engineer plays a critical role in enabling scalable, reliable, and automated data solutions across the Azure ecosystem. This position focuses on supporting end-to-end data and ML pipelines, primarily built on Azure DevOps, Azure Databricks, and modern Infrastructure-as-Code (IaC) practices using Bicep or Terraform. The successful candidate will collaborate closely with the client's Data Engineers, Security, and Platform teams to ensure smooth development workflows, automated deployments, and securely governed cloud environments, all in line with the client's quality standards.
Key Responsibilities
- CI/CD Pipeline Engineering (Azure DevOps)
- Design, develop, and maintain Azure DevOps pipelines for data processing workflows, ML model training, and Databricks deployments.
- Implement pipeline quality gates, automated testing, environment promotion strategies, and artifact management.
- Ensure pipelines are resilient, observable, and aligned with organizational standards.
- Ensure the Software Development Lifecycle is built according to the client's policies and meets their requirements.
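As an illustrative sketch of the pipeline quality gates mentioned above (not part of the client's actual pipeline definitions), a promotion check of this kind can be expressed as a small Python function; the result fields and coverage threshold here are hypothetical:

```python
# Hypothetical quality gate: decide whether a build may be promoted to
# the next environment based on test results and code coverage.

def passes_quality_gate(results: dict, min_coverage: float = 0.80) -> bool:
    """Return True only if no tests failed and coverage meets the bar."""
    all_tests_passed = results.get("failed", 0) == 0
    coverage_ok = results.get("coverage", 0.0) >= min_coverage
    return all_tests_passed and coverage_ok

# Example: a run with one failing test is blocked from promotion.
run = {"passed": 41, "failed": 1, "coverage": 0.86}
print(passes_quality_gate(run))  # → False
```

In a real Azure DevOps pipeline, a check like this would typically run as a script step whose non-zero exit code fails the stage, blocking environment promotion.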
- Infrastructure as Code (IaC)
- Build, maintain, and standardize cloud infrastructure using Bicep or Terraform for Azure resources such as Databricks Workspaces, Storage Accounts, Key Vaults, Networks, and containerized workloads where applicable.
- Ensure infrastructure is modular, reusable, and compliant with enterprise security and governance requirements.
- Automation & Scripting (PowerShell, Python)
- Develop automation for recurring operational tasks (orchestration, monitoring, environment provisioning) using PowerShell and Python.
- Create scripts supporting Databricks job deployments, cluster lifecycle management, ML model registration, and data workflow automation.
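To give a flavour of the Databricks job-deployment scripting described above, the sketch below assembles a job definition whose field names mirror the Databricks Jobs API conventions; the job name, notebook path, and cluster spec are purely illustrative placeholders, and a real script would submit the payload to the workspace's REST endpoint:

```python
import json

def build_job_payload(job_name: str, notebook_path: str,
                      node_type: str = "Standard_DS3_v2") -> str:
    """Assemble a Databricks-style job definition as a JSON string.

    Illustrative sketch only: the cluster spec and paths are
    placeholders, not a definitive deployment configuration.
    """
    payload = {
        "name": job_name,
        "tasks": [
            {
                "task_key": "main",
                "notebook_task": {"notebook_path": notebook_path},
                "new_cluster": {
                    "spark_version": "13.3.x-scala2.12",
                    "node_type_id": node_type,
                    "num_workers": 2,
                },
            }
        ],
    }
    return json.dumps(payload, indent=2)

print(build_job_payload("nightly-etl", "/Repos/data/etl_main"))
```

Keeping the payload builder separate from the HTTP call makes the script easy to unit-test inside a CI/CD pipeline before anything touches the workspace.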
- Databricks & Machine Learning Operations (MLOps)
- Build and maintain deployment mechanisms for Databricks notebooks/Jobs, workflows, ML models, and Delta pipelines.
- Support ML lifecycle automation, including data validation, model packaging, model registry updates, and automated retraining pipelines.
- Collaborate with Data Science teams to operationalize machine learning workflows in production environments.
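The automated retraining pipelines mentioned above usually include a promotion decision between evaluation and a model registry update. A minimal sketch of such a rule, with illustrative metric names and an assumed improvement threshold:

```python
# Hypothetical MLOps promotion rule: a newly trained model replaces the
# registered production model only if it improves the validation metric
# by a minimum margin (the metric and threshold are illustrative).

def should_promote(candidate_auc: float, production_auc: float,
                   min_gain: float = 0.005) -> bool:
    """Promote only when the candidate beats production by min_gain."""
    return candidate_auc - production_auc >= min_gain

print(should_promote(0.912, 0.905))  # → True
print(should_promote(0.906, 0.905))  # → False
```

In practice a check like this would sit inside the retraining pipeline and, on success, trigger a registry update (for example via MLflow's Model Registry, which the role's required skills reference).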
- Cloud Environment Management & Reliability
- Ensure high availability, scalability, and reliability of data and ML workloads in Azure.
- Implement monitoring & alerting for pipelines, clusters, and data workflows.
- Contribute to operational improvements and proactive issue prevention.
- Collaboration & Governance
- Work closely with cross-functional teams and participate in review processes for IaC, pipeline changes, and data platform enhancements.
- Ensure changes follow standardized processes as outlined in the client's Quality Handbook.
- Document designs, processes, and architecture diagrams to support transparency and long-term maintainability.
Required Skills & Experience
- Technical Skills
- Azure DevOps pipelines (YAML), environments, approvals, artifacts.
- Infrastructure as Code: Bicep, Terraform, or Pulumi.
- Scripting Languages: PowerShell and Python.
- Strong understanding of Azure Databricks, Spark fundamentals, and Databricks deployment patterns.
- Experience with Azure Core Services: Key Vault, Storage, VNet, AAD, Monitor, AKS (optional).
- Familiarity with containerization, Git branching strategies, and DevOps best practices.
- Experience with MLOps frameworks (MLflow, Databricks Model Registry).
- Experience deploying large-scale data or ML workloads.
- Knowledge of cloud security best practices and networking in Azure.
- Soft Skills
- Strong communication and ability to collaborate across data, engineering, and infrastructure teams.
- Analytical mindset with a passion for automation and continuous improvement.
- Ability to troubleshoot complex distributed systems and data pipelines.
Additional Information
- Rate offered: £450-475 per day
- IR35 Status: Outside
- Location: Remote
- Start date: March '26
- Duration: 3-month initial contract with significant opportunity for extension.
DevOps Engineer employer: Shareforce
Contact Detail:
Shareforce Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the DevOps Engineer role
✨Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at meetups. A friendly chat can lead to opportunities that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Azure DevOps and IaC. This gives potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by practising common DevOps scenarios. Think about how you’d handle CI/CD pipelines or automate tasks with PowerShell and Python. Confidence is key!
✨Tip Number 4
Don’t forget to apply through our website! We’ve got loads of opportunities waiting for talented DevOps Engineers like you. Let’s get you that dream job!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure DevOps, Bicep, and Terraform. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about DevOps and how you can contribute to our team. Keep it concise but impactful – we love a good story!
Showcase Your Technical Skills: When filling out your application, make sure to mention your proficiency in scripting languages like PowerShell and Python. We’re keen on seeing how you’ve used these skills in real-world scenarios, especially in automation and MLOps.
Apply Through Our Website: We encourage you to apply directly through our website for a smoother process. It helps us keep track of your application and ensures you don’t miss any important updates from us!
How to prepare for a job interview at Shareforce
✨Know Your Tech Inside Out
Make sure you’re well-versed in Azure DevOps, Bicep, and Terraform. Brush up on your scripting skills with PowerShell and Python, as these will be crucial for the role. Be ready to discuss specific projects where you've implemented these technologies.
✨Showcase Your Collaboration Skills
Since this role involves working closely with Data Engineers and Security Teams, prepare examples of how you've successfully collaborated in the past. Highlight any cross-functional projects and how you ensured smooth workflows and communication.
✨Demonstrate Problem-Solving Abilities
Be prepared to tackle hypothetical scenarios or past challenges you've faced in DevOps. Think about how you approached troubleshooting complex systems and what strategies you used to ensure reliability and scalability.
✨Understand the Client's Quality Standards
Familiarise yourself with the client's policies and quality standards mentioned in the job description. Be ready to discuss how you would ensure compliance in your work, especially regarding CI/CD pipelines and Infrastructure as Code.