At a Glance
- Tasks: Lead data engineering projects using Azure Databricks and create scalable data solutions.
- Company: Join a B Corp™ accredited consultancy with a global team of 8,000 specialists.
- Benefits: Competitive salary of €70,000, remote work, and inclusive recruitment processes.
- Why this job: Make an impact by combining human expertise with AI in innovative tech solutions.
- Qualifications: Strong experience with Azure Databricks, Python, and data engineering principles.
- Other info: Dynamic remote environment with opportunities for personal and professional growth.
The predicted salary is between £42,000 and £84,000 per year.
Futureheads have partnered with a consultancy that combines human expertise with AI to deliver scalable tech solutions; over 30 years it has worked with 1,000 clients and built a global team of 8,000 specialists.
As Data Tech Lead (Azure & Databricks), you’ll have end-to-end ownership of data engineering initiatives – from infrastructure and pipelines through to analytics and BI enablement.
What you’ll do:
- Design, build, and maintain scalable data pipelines using Azure Databricks (PySpark, Spark SQL, Delta Lake).
- Implement code-first data engineering solutions, following strong software engineering and DataOps principles.
- Orchestrate data ingestion and transformation using Azure Data Factory.
- Integrate data from APIs, relational databases, event-driven systems, and files.
- Apply ETL/ELT patterns for analytical and operational use cases.
- Ensure data quality, reliability, security, and governance using Azure-native services.
- Collaborate with analytics, BI, and business teams to turn requirements into robust data solutions.
- Support analytics use cases with dbt where relevant and enable downstream consumption (e.g. Power BI).
- Take technical ownership of projects, influencing architecture decisions and best practices.
- Implement and maintain unit, integration, and end-to-end tests, plus CI/CD for data workloads.
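To give a flavour of the testing responsibility above, here is a minimal, hypothetical sketch of a unit-testable data transformation in plain Python. In the role itself this logic would live in PySpark on Databricks; the function name and sample records below are invented purely for illustration.

```python
# Hypothetical example: keeping transformation logic in a small, pure
# function makes it easy to unit-test before wiring it into a pipeline.

def normalise_records(rows):
    """Drop rows missing an id and lowercase the country field."""
    cleaned = []
    for row in rows:
        if not row.get("id"):
            continue  # skip records with no identifier
        cleaned.append({**row, "country": row.get("country", "").lower()})
    return cleaned

# A unit test in the spirit of "implement and maintain unit tests"
raw = [
    {"id": 1, "country": "PT"},
    {"id": None, "country": "GB"},  # dropped: no id
    {"id": 2},                      # missing country -> ""
]
assert normalise_records(raw) == [
    {"id": 1, "country": "pt"},
    {"id": 2, "country": ""},
]
```

The same pattern scales up: the pure function is covered by unit tests, while the Spark job that calls it is exercised by integration and end-to-end tests in CI/CD.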
What you’ll bring:
- Strong experience with Azure Databricks and Spark (PySpark, Spark SQL).
- Advanced Python skills, with clean code practices and dependency management (Poetry, venv, etc.).
- Advanced SQL and solid understanding of distributed data processing and large-scale data architectures.
- Hands-on experience with Azure Data Factory.
- Familiarity with dbt and basic knowledge of Power BI data models.
- Basic understanding of ML concepts as they relate to data preparation and feature engineering.
- Experience with Infrastructure as Code (ideally Terraform on Azure) and Azure DevOps (CI/CD, version control).
- Strong ownership, problem-solving, and communication skills, plus advanced spoken English.
Salary & working options:
In the region of €70,000 (candidates seeking more would still be considered). Remote, based in Portugal.
We encourage applicants from all backgrounds, so if there is anything we can do to make our recruitment processes better for you and allow you to show your best self, let us know. We also understand that some people require extra time to complete assessments, need alternative application methods, or benefit from having interview questions or a guide to the type of questions before the interview. We are open to any suggestions or requests you may have and are always looking for creative ways to assess talent. Our commitment to you is that you should always feel safe and secure when you’re working with us.
Futureheads is a B Corp™ accredited digital recruitment agency based in London. We specialise in recruiting permanent, contract and freelance digital and tech professionals in creative, data, design, digital marketing, engineering, product, project and programme management, UX and service design jobs.
Data Tech Lead in London employer: Futureheads
Contact Detail:
Futureheads Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Tech Lead in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the data field, especially those who work with Azure and Databricks. A friendly chat can lead to insider info about job openings that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data engineering projects, especially those involving Azure Data Factory and Databricks. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Think about how you’d tackle real-world problems using ETL/ELT patterns and Azure-native services. Practising these will help you feel more confident when it’s your turn to shine.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we’re always looking for talented individuals like you to join our team!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Tech Lead role. Highlight your experience with Azure Databricks, Spark, and any relevant projects you've worked on. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include specific examples of data engineering initiatives you've led or contributed to. Whether it's building scalable data pipelines or implementing CI/CD processes, we love seeing real-world applications of your skills.
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points where possible to make it easy for us to read through your experience and skills. We appreciate a well-structured application!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy to do!
How to prepare for a job interview at Futureheads
✨Know Your Tech Inside Out
Make sure you’re well-versed in Azure Databricks, PySpark, and Spark SQL. Brush up on your advanced Python skills and be ready to discuss clean code practices. The interviewers will likely want to see how you can apply these technologies in real-world scenarios.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific challenges you've faced in data engineering and how you tackled them. Use the STAR method (Situation, Task, Action, Result) to structure your answers. This will demonstrate your ownership and problem-solving abilities effectively.
✨Understand DataOps Principles
Familiarise yourself with DataOps principles and be ready to explain how you’ve implemented them in past projects. Highlight your experience with CI/CD processes and Infrastructure as Code, especially if you’ve used Terraform on Azure.
✨Communicate Clearly and Confidently
Since strong communication skills are essential for this role, practise articulating your thoughts clearly. The role involves collaborating with analytics and business teams, so showing that you can convey complex technical concepts in simple terms will be a big plus.