Senior Databricks Data Engineer

Full-Time · £43,200 – £72,000 / year (est.) · Fully remote
Jobgether

At a Glance

  • Tasks: Design and optimise data pipelines on Azure Databricks for enterprise-scale projects.
  • Company: Join a forward-thinking tech company with a fully remote, collaborative culture.
  • Benefits: Competitive pay, comprehensive benefits, and the flexibility of remote work.
  • Why this job: Make a real impact by solving complex data challenges and influencing platform strategy.
  • Qualifications: 5+ years in Azure Databricks, strong Python and SQL skills, and a knack for problem-solving.
  • Other info: Work with expert teams on high-impact projects across diverse industries.

The predicted salary is between £43,200 and £72,000 per year.

We are currently looking for a Senior Databricks Data Engineer in the United Kingdom. In this role, you will design, build, and optimise enterprise‑scale data pipelines on Azure Databricks, supporting structured, semi‑structured, and unstructured data. You will work closely with data architects, security teams, and business stakeholders to implement best practices for data governance, security, and high‑performance data processing. This position involves hands‑on development of Delta Lake architectures, CI/CD pipelines, and orchestrated workflows to deliver reliable, scalable data products.

Operating in a fully remote, collaborative environment, you will have the opportunity to influence the overall data platform strategy while solving complex technical challenges. Your work will enable analytics, reporting, and AI initiatives across the enterprise. The role combines technical depth, architectural insight, and operational excellence, offering strong growth potential in cloud data engineering.

Accountabilities

  • Design, develop, and optimise ETL/ELT data pipelines using Azure Databricks (Python, PySpark, SQL, Delta Lake).
  • Configure and maintain Databricks workspaces, clusters, jobs, repositories, and workflow schedules for multi‑team data product delivery.
  • Implement and enforce data governance and security best practices, including access controls, lineage, and auditing frameworks.
  • Build and maintain Delta Lake architectures with medallion (bronze/silver/gold) layer structures.
  • Integrate Databricks pipelines with Azure Data Platform services, ensuring reliable orchestration, observability, and CI/CD automation.
  • Collaborate with data architects, data owners, and cross‑functional teams to align platform solutions with enterprise standards.
  • Optimise pipeline performance, compute cost, and system efficiency through code‑level and cluster‑level tuning strategies.
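
The medallion pattern above promotes raw (bronze) records into a cleaned, deduplicated silver layer and then aggregates silver into gold reporting tables. As a minimal sketch of that promotion step, using plain Python in place of PySpark/Delta for brevity (the function names, the `order_id` key, and the record shapes are hypothetical, not from the posting):

```python
from collections import defaultdict

def promote_to_silver(bronze_rows, key="order_id", ts="ingested_at"):
    """Clean bronze records and keep the latest version per business key,
    mimicking the dedup/upsert step of a silver-layer Delta pipeline."""
    latest = {}
    for row in bronze_rows:
        if row.get(key) is None:  # drop malformed rows at the bronze/silver boundary
            continue
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row  # later ingestion wins, like a MERGE on the key
    return list(latest.values())

def gold_revenue_by_region(silver_rows):
    """Aggregate the silver layer into a gold-level reporting table."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

bronze = [
    {"order_id": 1, "region": "UK", "amount": 10.0, "ingested_at": 1},
    {"order_id": 1, "region": "UK", "amount": 12.0, "ingested_at": 2},  # later version wins
    {"order_id": 2, "region": "DE", "amount": 5.0, "ingested_at": 1},
    {"order_id": None, "region": "UK", "amount": 99.0, "ingested_at": 3},  # malformed, dropped
]
silver = promote_to_silver(bronze)
gold = gold_revenue_by_region(silver)
print(gold)  # {'UK': 12.0, 'DE': 5.0}
```

In a real Databricks pipeline, the dedup/upsert step would typically be a Delta Lake `MERGE INTO` keyed on the business identifier, with the gold aggregation expressed as a Spark SQL `GROUP BY`.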

Requirements

  • Minimum 5 years of professional experience delivering Azure Databricks solutions in enterprise environments.
  • Strong expertise in Databricks components: Workspaces, Notebooks, Jobs, Workflows, Repos, Unity Catalog, Delta Lake, Delta Live Tables, and MLflow.
  • Solid knowledge of Azure Data Platform services: ADLS Gen2, Azure Key Vault, Azure Monitor, Azure Log Analytics, and Microsoft Entra ID/RBAC; experience with the Databricks Terraform provider is a plus.
  • Experience implementing data security and governance frameworks including access controls, masking, row‑level security, ABAC, governed tags, credential management, lineage, and auditability.
  • Proficiency in Python, SQL, PySpark, Git, Spark performance tuning, and distributed computing concepts.
  • Familiarity with AI/ML lifecycle and MLflow model management.
  • Experience working in Agile or DevOps‑oriented teams, with strong analytical, problem‑solving, and communication skills.
  • Fluency in Portuguese and English.

Benefits

  • Competitive compensation aligned with experience.
  • Fully remote work environment.
  • Delivery of work equipment suited to the role and responsibilities.
  • Comprehensive benefits plan.
  • Opportunity to work with expert teams on high‑impact, large‑scale projects.
  • Exposure to long‑term, strategic client initiatives in diverse industries.

Senior Databricks Data Engineer employer: Jobgether

As a Senior Databricks Data Engineer, you will thrive in a fully remote and collaborative environment that champions innovation and technical excellence. The company offers competitive compensation, comprehensive benefits, and the chance to work on high-impact projects with expert teams, fostering both personal and professional growth. With a focus on data governance and security, you will play a pivotal role in shaping the data platform strategy while enjoying the flexibility and support necessary for a rewarding career.

Contact Detail:

Jobgether Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Databricks Data Engineer role

✨Tip Number 1

Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Azure Databricks. A friendly chat can lead to insider info about job openings or even referrals.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your best projects, especially those involving ETL/ELT pipelines and Delta Lake architectures. This will give potential employers a taste of what you can do.

✨Tip Number 3

Prepare for interviews by brushing up on common technical questions related to Databricks and Azure Data Platform services. Practising your problem-solving approach will help you stand out during technical assessments.

✨Tip Number 4

Don’t forget to apply through our website! We use AI to match your skills with the right roles, making it easier for you to land that Senior Databricks Data Engineer position. Plus, it speeds up the review process!

We think you need these skills to ace the Senior Databricks Data Engineer role

Azure Databricks
ETL/ELT Data Pipelines
Python
PySpark
SQL
Delta Lake
CI/CD Pipelines
Data Governance
Data Security
Azure Data Platform Services
Spark Performance Tuning
Agile Methodologies
Problem-Solving Skills
Communication Skills
Collaboration

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Senior Databricks Data Engineer role. Highlight your experience with Azure Databricks, ETL/ELT pipelines, and any relevant projects that showcase your skills in Python, SQL, and PySpark.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with the responsibilities outlined in the job description. Don’t forget to mention your experience with data governance and security best practices!

Showcase Your Technical Skills: In your application, be sure to highlight your technical expertise with Databricks components and Azure Data Platform services. Mention specific tools and frameworks you've worked with, like Delta Lake and CI/CD pipelines, to demonstrate your hands-on experience.

Apply Through Our Website: We encourage you to apply through our website for a smoother application process. This way, your application will be reviewed quickly and fairly, ensuring you get the best chance at landing an interview. Good luck!

How to prepare for a job interview at Jobgether

✨Know Your Databricks Inside Out

Make sure you brush up on all the Databricks components mentioned in the job description. Be ready to discuss your experience with Workspaces, Notebooks, and Delta Lake architectures. Having specific examples of how you've used these tools in past projects will really impress the interviewers.

✨Showcase Your Problem-Solving Skills

Prepare to talk about complex technical challenges you've faced and how you solved them. Use the STAR method (Situation, Task, Action, Result) to structure your answers. This will help demonstrate your analytical skills and ability to think critically under pressure.

✨Understand Data Governance and Security

Since this role involves implementing data governance and security best practices, be ready to discuss your experience with access controls, lineage, and auditing frameworks. Highlight any specific frameworks you've worked with and how they improved data integrity in your previous roles.

✨Familiarise Yourself with Azure Services

Get comfortable with Azure Data Platform services like ADLS Gen2 and Azure Key Vault. Being able to explain how you've integrated these services into your data pipelines will show that you're not just a Databricks expert but also well-versed in the broader Azure ecosystem.
