At a Glance
- Tasks: Design and implement cloud-native data pipelines using Azure and Databricks.
- Company: Join a stealth-mode AI-first consultancy ready to disrupt the Big Four.
- Benefits: Earn up to £600 per day with potential for permanent leadership roles.
- Why this job: Shape the data backbone of a next-gen consultancy while mentoring junior engineers.
- Qualifications: 5+ years in data engineering, strong Python and SQL skills required.
- Other info: This is a contract role outside IR35, perfect for those seeking high ownership.
The predicted salary is between £48,000 and £72,000 per year.
Location: London, Hybrid
Rate: Up to £600 per day
Contract: 6 months (Outside IR35, potential to go perm)
Tech Stack: Azure Data Factory, Synapse, Databricks, Delta Lake, PySpark, Python, SQL, Event Hub, Azure ML, MLflow
We’ve partnered with a new AI-first professional services consultancy that’s taking on the Big Four. Currently in stealth mode and gearing up for launch, they’re assembling a founding team of contractors with a view to scaling into permanent leadership roles as the business grows.
They’re now hiring a Senior Data Engineer to build robust, scalable data solutions powering AI, analytics, and future client delivery. If you love creating high-performance data infrastructure from the ground up — this is your chance to help shape the data backbone of a next-gen consultancy.
What You’ll Be Doing:
- Designing and implementing cloud-native data pipelines using Azure and Databricks
- Building clean, governed data layers for ML, analytics, and application use
- Supporting LLM/AI data workflows through structured and unstructured pipeline design
- Contributing to reusable data engineering components across client and internal use cases
- Collaborating with data scientists, architects, and product teams to enable experimentation and delivery
- Mentoring junior engineers and supporting team capability development
What You Need:
- 5+ years in data engineering or backend cloud development
- Strong Python, SQL, and Databricks skills (especially PySpark & Delta Lake)
- Deep experience with Azure: Data Factory, Synapse, Event Hub, Azure Functions
- Understanding of MLOps tooling such as MLflow and its integration with AI pipelines
- Exposure to data for LLM workflows, unstructured sources, and real-time ingestion
- Comfortable operating in an early-stage environment, with minimal structure and high ownership
This is a contract role outside IR35, UK-based, and ideal for someone excited to help shape a company from the ground up. Apply today!
Contract Senior Data Engineer employer: trg.recruitment
Contact Details:
trg.recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Contract Senior Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as Azure Data Factory, Databricks, and PySpark. Consider building a small project or contributing to open-source projects that utilise these tools to demonstrate your hands-on experience.
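To make that concrete, here is a minimal sketch of the kind of small portfolio project this tip has in mind: a PySpark job that ingests raw events and writes a governed Delta Lake table. It assumes a local Spark session with the delta-spark package installed; the file paths and column names are illustrative placeholders, not anything specified in the job description.

```python
# Minimal PySpark + Delta Lake pipeline sketch: read raw CSV events,
# clean them, and write a Delta table that analytics or ML jobs can query.
# Paths and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("portfolio-pipeline")
    # Enable Delta Lake support (requires the delta-spark package).
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze layer: ingest raw events as-is.
raw = spark.read.option("header", True).csv("data/raw_events.csv")

# Silver layer: deduplicate, type the timestamp column, drop bad rows.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Persist a governed Delta table for downstream use.
clean.write.format("delta").mode("overwrite").save("data/silver/events")
```

Even a small end-to-end example like this gives you something tangible to walk through in an interview when discussing pipeline design choices.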
✨Tip Number 2
Network with professionals in the data engineering field, especially those who work with AI and cloud technologies. Attend meetups or webinars related to Azure and data engineering to connect with potential colleagues and learn about industry trends.
✨Tip Number 3
Prepare to discuss your experience in building scalable data solutions and how you've collaborated with cross-functional teams. Be ready to share specific examples of how you've mentored junior engineers or contributed to team development.
✨Tip Number 4
Research the company’s mission and values, especially since they are an AI-first consultancy. Tailor your conversations during interviews to reflect how your skills and experiences align with their goals and how you can contribute to shaping their data infrastructure.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with Azure, Databricks, and Python. Use specific examples that demonstrate your skills in building data pipelines and working with cloud technologies.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your excitement about the opportunity to work in an AI-first consultancy. Mention how your background aligns with the company's goals and the specific requirements of the role.
Highlight Relevant Projects: In your application, include details about specific projects where you designed and implemented data solutions. Focus on your contributions to cloud-native data pipelines and any experience with MLOps tooling like MLflow.
Showcase Soft Skills: Since this role involves collaboration and mentoring, emphasise your ability to work in teams and support junior engineers. Provide examples of how you've contributed to team development and knowledge sharing in previous roles.
How to prepare for a job interview at trg.recruitment
✨Showcase Your Technical Skills
Make sure to highlight your experience with the specific technologies mentioned in the job description, such as Azure Data Factory, Databricks, and PySpark. Prepare examples of projects where you've successfully implemented these tools to demonstrate your expertise.
✨Understand the Company’s Vision
Since the company is in stealth mode and gearing up for launch, do some research on their mission and values. Be ready to discuss how your skills and experiences align with their goals, especially in building scalable data solutions for AI and analytics.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving abilities in real-world scenarios. Think about challenges you've faced in previous roles, particularly in data engineering, and how you overcame them. This will show your capability to operate in an early-stage environment.
✨Emphasise Collaboration and Mentorship
The role involves collaborating with various teams and mentoring junior engineers. Be prepared to share examples of how you've worked effectively in teams and supported the development of others, showcasing your leadership potential.