Databricks Data Engineer

Full-Time · £36,000 – £60,000 / year (est.) · Home office (partial)

At a Glance

  • Tasks: Build scalable data pipelines and optimise Lakehouse architectures on the Databricks platform.
  • Company: Join a global tech leader transforming industries with innovative data solutions.
  • Benefits: Flexible work options, tailored benefits, and continuous learning opportunities.
  • Why this job: Make a real impact by solving complex data challenges in a collaborative environment.
  • Qualifications: 5-8 years in data engineering with strong Databricks expertise required.
  • Other info: Inclusive culture with various networks supporting diversity and growth.

The predicted salary is between £36,000 and £60,000 per year.

We are seeking a highly skilled Databricks Data Engineer to join our Data & AI practice. The successful candidate will have deep expertise in building scalable data pipelines, optimising Lakehouse architectures, and enabling advanced analytics and AI use cases on the Databricks platform. This role is critical in building and optimising modern data ecosystems that enable data-driven decision-making, advanced analytics, and AI capabilities for our clients. As a trusted practitioner, you will design and implement robust ETL/ELT workflows, integrate real-time and batch data sources, and deliver secure, well-governed data products and pipelines. You will thrive in a collaborative, client-facing environment, bringing a passion for solving complex data challenges, driving innovation, and ensuring the seamless delivery of data solutions.

What you’ll be doing:

  • Client Engagement & Delivery
  • Data Pipeline Development (Batch and Streaming)
  • Databricks & Lakehouse Architectures
  • Data Modelling & Optimisation (Delta Lake, Medallion architecture; see the sketch after this list)
  • Collaboration & Best Practices
  • Quality, Governance & Security
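
The pipeline and Lakehouse bullets above revolve around Delta Lake and the medallion (bronze/silver/gold) pattern. As a rough sketch of what a typical bronze-to-silver hop can look like in PySpark on Databricks, the snippet below assumes hypothetical table and column names (raw.orders_bronze, curated.orders_silver, order_id, order_ts) rather than anything specified in this posting:

```python
# Illustrative bronze -> silver step in a medallion layout; all table and column
# names (raw.orders_bronze, curated.orders_silver, order_id, order_ts) are assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Bronze: raw ingested records, stored as-is in Delta format.
bronze = spark.read.table("raw.orders_bronze")

# Silver: cleansed, de-duplicated, correctly typed records ready for modelling.
silver = (
    bronze
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)

# Persist the curated layer as a managed Delta table.
(silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("curated.orders_silver"))
```

The same cleansed silver table then feeds gold-layer aggregates and downstream analytics or AI use cases, which is the essence of the medallion approach.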

Business Relationships:

  • Solution Architects
  • Data Engineers, Developers, ML Engineers, and Analysts
  • Client stakeholders up to Head of Data Engineering, Chief Data Architect, and Analytics leadership

What experience you’ll bring:

Competencies / Critical Skills:

  • Proven experience in data engineering and pipeline development on Databricks and cloud-native platforms.
  • Strong consulting values with ability to collaborate effectively in client-facing environments.
  • Hands‑on expertise across the data lifecycle: ingestion, transformation, modelling, governance, and consumption.
  • Strong problem‑solving, analytical, and communication skills.
  • Experience leading or mentoring teams of engineers to deliver high‑quality scalable data solutions.

Technical Expertise:

  • Deep expertise with the Databricks platform (Spark/PySpark/Scala, Delta Lake, Unity Catalog, MLflow); a Delta Lake upsert is sketched after this list.
  • Proficiency in ETL/ELT tools such as dbt, Matillion, Talend, or equivalent.
  • Strong SQL and Python (or equivalent language) skills for data manipulation and automation.
  • Hands‑on experience with cloud platforms (AWS, Azure, GCP).
  • Familiarity with Databricks Workflows and other orchestration tools.
  • Knowledge of data modelling methodologies (star schemas, Data Vault, Kimball, Inmon).
  • Familiarity with medallion architectures, data lakehouse principles and distributed data processing.
  • Experience with version control tools (GitHub, Bitbucket) and CI/CD pipelines.
  • Understanding of data governance, security, and compliance frameworks.
  • Exposure to AI/ML workloads desirable.
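
For the Delta Lake item above, a pattern that comes up constantly in this kind of work is an idempotent upsert (MERGE) through the delta-spark Python API. The sketch below uses assumed table names (staging.customer_updates, silver.customers) and a made-up join key (customer_id); it illustrates the technique rather than any specific client solution:

```python
# Illustrative idempotent upsert (MERGE) into a Delta table via the delta-spark API.
# Table names and the join key (customer_id) are assumptions, not from the posting.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical batch of changed records landed by an upstream ingestion job.
updates = spark.read.table("staging.customer_updates")

target = DeltaTable.forName(spark, "silver.customers")

# Update rows that already exist, insert the rest; safe to re-run on the same batch.
(target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute())
```

Because the merge is keyed on a stable identifier, re-running the same batch does not create duplicates, which is what makes pipelines like this safe to retry.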

Qualifications and Education:

  • Experience: 5–8 years in data engineering, data warehousing, or data architecture roles, including at least 3 years working with Databricks.
  • Education: University degree required. Preferred: BSc/MSc in Computer Science, Data Engineering, or related field.
  • Databricks certifications (Data Engineer Professional) highly desirable.

Measures of Success:

  • Delivery of high-performing, scalable, and secure data pipelines aligned to client requirements.
  • High client satisfaction and successful adoption of Databricks-based solutions.
  • Demonstrated ability to innovate and improve data engineering practices.
  • Contribution to the growth of the practice through reusable assets, accelerators, and technical leadership.

We’re a business with a global reach that empowers local teams, and we undertake hugely exciting work that is genuinely changing the world. Our advanced portfolio of consulting, applications, business process, cloud, and infrastructure services will allow you to achieve great things by working with brilliant colleagues and clients on exciting projects. Our inclusive work environment prioritises mutual respect, accountability, and continuous learning for all our people. This approach fosters collaboration, well-being, growth, and agility, leading to a more diverse, innovative, and competitive organisation.

We are also proud to share that we have a range of Inclusion Networks such as: the Women’s Business Network, Cultural and Ethnicity Network, LGBTQ+ & Allies Network, Neurodiversity Network and the Parent Network.

What we’ll offer you: We offer a range of tailored benefits that support your physical, emotional, and financial wellbeing. Our Learning and Development team ensure that there are continuous growth and development opportunities for our people. We also offer the opportunity to have flexible work options.

We are an equal opportunities employer. We believe in the fair treatment of all our employees and commit to promoting equity and diversity in our employment practices. We are also a proud Disability Confident Committed Employer, committed to creating a diverse and inclusive workforce. We actively collaborate with individuals who have disabilities and long-term health conditions that affect their ability to carry out normal day-to-day activities, ensuring that barriers to employment opportunities are removed. In line with our commitment, we guarantee an interview to applicants who declare to us, during the application process, that they have a disability and meet the minimum requirements for the role. If you require any reasonable adjustments during the recruitment process, please let us know. Join us in building a truly diverse and empowered team.

Databricks Data Engineer employer: NTT America, Inc.

At NTT DATA, we pride ourselves on being an exceptional employer, offering a dynamic work culture that fosters collaboration and innovation. As a Databricks Data Engineer, you will benefit from tailored development opportunities, flexible working options, and a commitment to diversity and inclusion, all while contributing to impactful projects that drive data-driven decision-making for our clients. Join us in a supportive environment where your expertise will be valued and your career can flourish.

Contact Detail:

NTT America, Inc. Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Databricks Data Engineer role

✨Tip Number 1

Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Databricks. A friendly chat can lead to insider info about job openings or even referrals.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your best data pipelines and projects on Databricks. This gives potential employers a taste of what you can do and sets you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on common data engineering challenges and solutions. Be ready to discuss how you've tackled complex problems in past roles, especially in client-facing situations.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are genuinely interested in joining our team.

We think you need these skills to ace the Databricks Data Engineer role

Databricks
Data Pipeline Development
ETL/ELT Workflows
Delta Lake
SQL
Python
Cloud Platforms (AWS, Azure, GCP)
Data Modelling
Data Governance
Communication Skills
Problem-Solving Skills
Collaboration
Version Control (GitHub, Bitbucket)
CI/CD Pipelines
AI/ML Workloads

Some tips for your application 🫡

Tailor Your CV: Make sure your CV is tailored to the Databricks Data Engineer role. Highlight your experience with data pipelines, ETL/ELT workflows, and any relevant projects that showcase your skills in Databricks and cloud platforms.

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our needs. Don’t forget to mention your collaborative spirit and problem-solving skills!

Showcase Your Technical Skills: Be specific about your technical expertise in your application. Mention your hands-on experience with tools like Spark, Delta Lake, and any relevant programming languages. This will help us see how you can contribute to our team right away.

Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s the easiest way for us to keep track of your application and ensure it reaches the right people!

How to prepare for a job interview at NTT America, Inc.

✨Know Your Databricks Inside Out

Make sure you brush up on your Databricks knowledge before the interview. Be ready to discuss your hands-on experience with Spark, Delta Lake, and any ETL/ELT tools you've used. Prepare examples of how you've built scalable data pipelines and optimised Lakehouse architectures.

✨Showcase Your Problem-Solving Skills

During the interview, be prepared to tackle some real-world data challenges. Think about specific problems you've solved in previous roles and how you approached them. This will demonstrate your analytical skills and ability to think on your feet.

✨Emphasise Collaboration and Client Engagement

Since this role involves client-facing work, highlight your experience in collaborating with stakeholders. Share examples of how you've successfully engaged with clients and delivered data solutions that meet their needs. This shows you're not just a tech whiz but also a great communicator.

✨Prepare for Technical Questions

Expect technical questions related to data modelling methodologies and cloud platforms. Brush up on your SQL and Python skills, and be ready to discuss version control tools and CI/CD pipelines. Practising coding challenges can help you feel more confident during this part of the interview.
