At a Glance
- Tasks: Lead cloud migration and build modern data platforms using Databricks and Azure.
- Company: Join a forward-thinking tech company focused on transformative data engineering projects.
- Benefits: Competitive day rates, remote work options, and potential for contract extensions.
- Why this job: Make a real impact by shaping engineering standards and delivering innovative solutions.
- Qualifications: Experience with Databricks, Azure, and strong SQL/PySpark skills required.
- Other info: Fast-paced environment with opportunities for growth and collaboration with senior engineers.
We're building a bench of top-tier Databricks and Cloud Data Engineering contractors for a series of high-impact transformation programmes across multiple UK clients. If you specialise in cloud migration, Databricks engineering, and modern data platform builds, there are several projects kicking off now and over the coming weeks. These are hands-on, delivery-focused roles where you'll be shaping modern data platforms, migrating legacy estates, and building scalable engineering foundations that actually get used, not just documented.
What You'll Be Working On
- Migrating enterprise data workloads from Teradata and other legacy systems into Databricks on Azure
- Designing and building high-performance PySpark pipelines, Delta Lake tables, and medallion architectures
- Developing scalable ETL/ELT workflows and optimising existing pipelines
- Implementing Unity Catalog, governance frameworks, and platform standards
- Leading or contributing to cloud-native data platform builds
- Performance tuning, cost optimisation, and FinOps-aligned engineering
- Working closely with analytics, data science, and product teams to deliver real business value
These projects move fast - you'll be expected to deliver clean, scalable, production-ready engineering from day one.
What We're Looking For
- Strong hands-on experience with Databricks, Spark, Delta Lake, and Databricks SQL
- Proven delivery of cloud migration projects (Teradata experience is a big plus)
- Solid Azure knowledge: Data Lake Storage, Data Factory, plus Databricks Unity Catalog
- Advanced SQL and PySpark engineering skills
- Experience building and optimising medallion/lakehouse architectures
- Ability to work autonomously in fast-paced, product-led environments
- A contractor mindset: pragmatic, delivery-driven, and commercially aware
- Leadership experience is welcome but not essential - we have both individual contributor and lead-level opportunities
Contract Details
- Outside IR35
- Competitive day rates
- Remote or hybrid options depending on the client
- Initial 3-6 month contracts with extensions highly likely
- Start dates available immediately and rolling over the next few weeks
Why These Projects Stand Out
- The tech stack is modern and evolving
- The work is genuinely transformative, not BAU
- You have autonomy to shape engineering standards
- Your impact is visible - faster insights, better governance, lower costs
- You're surrounded by senior engineers who know their craft
If you're tired of firefighting legacy systems and want to build platforms that matter, this is the kind of work you'll enjoy.
Role: Data Platform & Engineering Lead, Stoke-on-Trent
Employer: iO Associates
Contact: iO Associates Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Platform & Engineering Lead role in Stoke-on-Trent
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering space, especially those who have worked with Databricks or Azure. A friendly chat can lead to insider info on job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your past projects, especially those involving cloud migration and PySpark. This will give potential employers a taste of what you can bring to the table.
✨Tip Number 3
Be proactive! Don’t just wait for job postings; reach out directly to companies you’re interested in. Express your enthusiasm for their projects and how your experience aligns with their needs.
✨Tip Number 4
Apply through our website! We’ve got some exciting roles lined up, and applying directly can give you a better chance of standing out. Plus, we love seeing candidates who are keen to join us!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV speaks directly to the skills and experiences mentioned in the job description. Highlight your hands-on experience with Databricks, Azure, and cloud migration projects to catch our eye!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how your background aligns with our needs. Share specific examples of your work with PySpark and modern data platforms to show us what you can bring to the table.
Showcase Your Projects: If you've worked on relevant projects, don’t hold back! Include links or descriptions of your past work that demonstrate your ability to deliver clean, scalable engineering solutions. We love seeing real-world applications of your skills.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you're considered for the right roles.
How to prepare for a job interview at iO Associates
✨Know Your Tech Stack
Make sure you’re well-versed in Databricks, Azure, and PySpark. Brush up on your knowledge of Delta Lake and medallion architectures, as these are crucial for the role. Be ready to discuss specific projects where you've used these technologies.
✨Showcase Your Migration Experience
Prepare examples of cloud migration projects you've worked on, especially involving Teradata. Highlight the challenges you faced and how you overcame them. This will demonstrate your hands-on experience and problem-solving skills.
✨Demonstrate Delivery Focus
Since these roles are delivery-focused, be prepared to talk about how you ensure clean, scalable, production-ready engineering from day one. Share your strategies for performance tuning and cost optimisation, as these are key aspects of the job.
✨Emphasise Autonomy and Leadership
Even if you’re applying for an individual contributor role, show that you can work autonomously in fast-paced environments. If you have leadership experience, mention it, but also express your willingness to collaborate with others to achieve project goals.