At a Glance
- Tasks: Lead the design and delivery of scalable data platforms and pipelines.
- Company: Join a forward-thinking company in the heart of London.
- Benefits: Competitive day rate, hybrid work model, and a dynamic team environment.
- Other info: Opportunity for mentorship and leadership in a collaborative setting.
- Why this job: Make an impact by shaping enterprise-scale data solutions with cutting-edge technology.
- Qualifications: 6-10+ years in data engineering with strong SQL and Databricks skills.
The predicted salary is around £84,000 per year.
Experienced Data Engineer required to join our client on an initial six-month contract to support the design, build, and delivery of enterprise-scale data platform solutions. This is a hands-on technical leadership role responsible for building core platform components while setting and maintaining engineering standards across the wider data engineering team.
The successful candidate will lead the technical design and delivery of data pipelines, transformations, and data models across the platform, ensuring outputs are scalable, reliable, and fit for consumption as governed data products. You will work closely with data consultants, architects, governance teams, and integration engineers to deliver high-quality, production-ready solutions within a modern cloud-based data environment.
This role requires strong technical expertise across Databricks and modern lakehouse architectures, alongside the ability to provide leadership, mentorship, and best practice guidance to Data Engineers within the squad.
Key Requirements
- 6–10+ years’ experience in data engineering, including technical leadership responsibilities
- Strong SQL skills and extensive experience building ETL/ELT pipelines at scale
- Hands-on expertise with Databricks, including Autoloader, Delta Live Tables, Delta Lake, and Unity Catalog
- Strong experience designing and implementing lakehouse and medallion architecture patterns
- Strong data modelling experience, including dimensional modelling, entity-based models, and SCD Type 2 handling
- Experience designing and building scalable data ingestion, transformation, and serving pipelines
- Ability to implement data quality rules, validation frameworks, and testing patterns across data pipelines
- Experience performance tuning pipelines, including partitioning, latency optimisation, and compute efficiency
- Strong understanding of data governance, access control, and regulated data environments
- Experience contributing to or building API-backed data products and analytics solutions
- Proven ability to define engineering standards, conduct code reviews, and mentor engineering teams
- Strong stakeholder management skills with the ability to work across engineering, architecture, governance, and product teams
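One item above, SCD Type 2 handling, is worth unpacking for candidates less familiar with the term: instead of overwriting a changed dimension attribute, you expire the current row and insert a new version, preserving full history. In Databricks this is typically done with a Delta Lake MERGE or Delta Live Tables, but the pattern itself is engine-agnostic. As a rough illustration only (the `valid_from`/`valid_to`/`is_current` field names are illustrative assumptions, not the client's schema):

```python
from datetime import date

def scd2_upsert(dimension, updates, today):
    """Apply incoming records to a dimension using SCD Type 2:
    close the current row when an attribute changes, then insert a
    new current row, so full history is preserved."""
    current = {row["key"]: row for row in dimension if row["is_current"]}
    for rec in updates:
        row = current.get(rec["key"])
        if row is None:
            # Brand-new key: insert an open-ended current row.
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif row["value"] != rec["value"]:
            # Changed attribute: expire the old row, add the new version.
            row["valid_to"] = today
            row["is_current"] = False
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        # Unchanged records are left as-is.
    return dimension

# Usage: customer 1 moves city, customer 2 is new.
dim = [{"key": 1, "value": "London", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = scd2_upsert(dim, [{"key": 1, "value": "Leeds"},
                        {"key": 2, "value": "York"}], date(2024, 6, 1))
```

In a Databricks interview you would more likely express this declaratively (for example with a MERGE statement keyed on the business key and `is_current` flag), but the history-preserving behaviour shown here is the concept being tested.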
Location: London (Hybrid, travel to office when required)
Day rate: £350 p/d (Outside IR35)
Duration: 6 months
Data Engineer employer: Ventula Consulting
Contact details:
Ventula Consulting Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who’s already in the data engineering field. You never know when a casual chat might lead to your next opportunity.
✨Show Off Your Skills
Don’t just talk about your experience; demonstrate it! Create a portfolio showcasing your projects, especially those involving Databricks and ETL pipelines. This will give potential employers a clear view of what you can bring to the table.
✨Ace the Interview
Prepare for technical interviews by brushing up on your SQL skills and understanding of lakehouse architectures. Be ready to discuss your past projects in detail and how you’ve tackled challenges in data engineering. Confidence is key!
✨Apply Through Our Website
Make sure to apply through our website for the best chance at landing that Data Engineer role. We’re always on the lookout for talented individuals like you, and applying directly helps us see your application faster!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Databricks, SQL skills, and any leadership roles you've held. We want to see how your background aligns with what we're looking for!
Showcase Your Projects: Include specific projects where you've built ETL/ELT pipelines or worked with lakehouse architectures. We love seeing real examples of your work, so don’t hold back on the details!
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points for key achievements and avoid jargon unless it's relevant. We appreciate straightforward communication!
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you get all the updates directly from us!
How to prepare for a job interview at Ventula Consulting
✨Know Your Tech Inside Out
Make sure you brush up on your SQL skills and get familiar with Databricks, especially Autoloader and Delta Live Tables. Be ready to discuss your experience with building ETL/ELT pipelines and how you've tackled challenges in data modelling and performance tuning.
✨Showcase Your Leadership Skills
Since this role involves technical leadership, prepare examples of how you've mentored other engineers or led projects. Think about specific instances where you set engineering standards or conducted code reviews, and be ready to share those stories.
✨Understand the Bigger Picture
Familiarise yourself with lakehouse architectures and data governance principles. Be prepared to discuss how your work fits into the wider data ecosystem and how you’ve collaborated with different teams like architects and governance teams to deliver high-quality solutions.
✨Ask Insightful Questions
Prepare thoughtful questions that show your interest in the company’s data strategy and culture. Inquire about their current data challenges, the tools they use, and how they measure success in their data engineering team. This will demonstrate your enthusiasm and strategic thinking.