Databricks Data Engineer Contract

London · Freelance · £110,000 – £130,000 / year (est.) · Home office (partial)

At a Glance

  • Tasks: Build and maintain data lakehouse architectures and develop ETL/ELT pipelines.
  • Company: Join a leading organisation in the UK's energy sector focused on renewable solutions.
  • Benefits: Enjoy a hybrid work model and competitive daily rates.
  • Why this job: Be part of a team driving impactful data solutions for a sustainable future.
  • Qualifications: Extensive Databricks experience, strong analytical skills, and proficiency in Python and SQL required.
  • Other info: 6-month contract with opportunities to enhance your data engineering skills.

The predicted salary is between £110,000 and £130,000 per year.

Location: Hybrid in London, 2 days per week

Duration: 6-month contract

Rate: £550 to £600 per day, Inside IR35

We are working with a leading organisation that plays a key role in the UK's energy sector, driving forward innovative data solutions to support the transition to renewable energy. As part of their strategic initiatives, they are preparing for an upcoming regulatory change that will require detailed reporting on complex data products. This organisation manages a vast amount of critical data, primarily from smart meters and power generators, which is essential for building impactful dashboards and driving operational efficiency. With these developments on the horizon, the team is looking for an experienced contractor to help clear a growing backlog, enhance data pipelines, and improve the overall quality and output of their data operations.

Key Responsibilities
  • Build and maintain scalable data lakehouse architectures using Databricks.
  • Design and develop ETL/ELT pipelines for large-scale data processing.
  • Manage both structured and unstructured datasets, optimising performance and reliability.
  • Set up structured streaming pipelines using Kafka.
  • Support Power BI reporting by preparing and transforming datasets.
  • Deploy infrastructure and manage environments using Terraform and CI/CD practices.
  • Collaborate with the engineering team to resolve backlogs and support data product delivery.
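In practice, the ETL/ELT pipelines described above follow an extract–validate–load shape. The sketch below illustrates that pattern in plain Python with `sqlite3` standing in for a lakehouse table; the smart-meter schema and meter IDs are hypothetical illustrations, not the organisation's real data model, and a production pipeline would use Databricks/Spark rather than the standard library:

```python
import sqlite3

# Hypothetical raw smart-meter readings: (meter_id, timestamp, kwh).
# These are illustrative stand-ins, not the organisation's actual schema.
raw_readings = [
    ("MTR001", "2024-01-01T00:00", "1.2"),
    ("MTR001", "2024-01-01T00:30", "1.5"),
    ("MTR002", "2024-01-01T00:00", "bad"),   # malformed row, should be dropped
    ("MTR002", "2024-01-01T00:30", "0.9"),
]

def extract():
    """Extract step: in production this might read from cloud storage or Kafka."""
    return raw_readings

def transform(rows):
    """Transform step: validate and cast values, dropping malformed readings."""
    clean = []
    for meter_id, ts, kwh in rows:
        try:
            clean.append((meter_id, ts, float(kwh)))
        except ValueError:
            continue  # skip readings whose kwh value cannot be parsed
    return clean

def load(rows, conn):
    """Load step: write cleaned rows to a table (sqlite3 stands in for a Delta table)."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (meter_id TEXT, ts TEXT, kwh REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(total)  # the one malformed reading has been filtered out
```

The same three-stage shape carries over directly to a Spark job: the extract becomes a `spark.read` (or `readStream` for the Kafka case), the transform becomes DataFrame operations, and the load becomes a write to a Delta table.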
Required Experience & Skills
  • Extensive experience with Databricks in production environments.
  • Strong analytical skills with large, complex datasets (structured and unstructured).
  • Proven experience in building ETL/ELT pipelines.
  • In-depth understanding of Azure services (Data Factory, Azure Functions, Synapse, etc.).
  • Advanced SQL skills, including performance tuning and query optimisation.
  • Strong Python programming skills.
  • Experience with big data tools such as Hadoop, Spark, and Kafka.
  • Proficiency in CI/CD processes and version control.
  • Solid experience with Terraform and Infrastructure as Code (IaC).
  • Experience with cloud-based networking and distributed systems.
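The SQL performance-tuning skill above usually comes down to reading query plans and knowing when an index turns a full scan into a seek. A minimal, self-contained illustration using `sqlite3` (the table and data are hypothetical; a warehouse engine such as Databricks SQL exposes the same idea through its own `EXPLAIN`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (meter_id TEXT, ts TEXT, kwh REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [(f"MTR{i % 100:03d}", f"2024-01-01T{i % 24:02d}:00", 0.5) for i in range(1000)],
)

query = "SELECT SUM(kwh) FROM readings WHERE meter_id = ?"

# Without an index on the filter column, the planner must scan the whole table.
plan_before = [row[3] for row in
               conn.execute("EXPLAIN QUERY PLAN " + query, ("MTR007",))]

# Adding an index lets the planner search the index instead of scanning.
conn.execute("CREATE INDEX idx_meter ON readings (meter_id)")
plan_after = [row[3] for row in
              conn.execute("EXPLAIN QUERY PLAN " + query, ("MTR007",))]

print(plan_before)
print(plan_after)
```

Interview questions on tuning often probe exactly this: being able to explain why the second plan is cheaper, and when an index is not worth its write-time cost.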

Databricks Data Engineer Contract employer: Harnham - Data & Analytics Recruitment

Join a pioneering organisation at the forefront of the UK's energy sector, where your expertise as a Databricks Data Engineer will contribute to innovative data solutions that support the transition to renewable energy. With a hybrid work model in London, you will enjoy a collaborative and dynamic work culture that prioritises employee growth and development, alongside competitive daily rates and the opportunity to work on impactful projects that drive operational efficiency. This role not only offers a chance to enhance your technical skills but also to be part of a mission-driven team dedicated to making a difference in the energy landscape.

Contact Detail:

Harnham - Data & Analytics Recruitment Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Databricks Data Engineer Contract

✨Tip Number 1

Familiarise yourself with Databricks and its functionalities. Since the role requires extensive experience in production environments, consider engaging in hands-on projects or contributing to open-source initiatives that utilise Databricks to showcase your skills.

✨Tip Number 2

Brush up on your knowledge of Azure services, particularly Data Factory and Synapse. You could even create a small project that demonstrates how you would use these tools in conjunction with Databricks to solve real-world data challenges.

✨Tip Number 3

Network with professionals in the energy sector and data engineering fields. Attend meetups or webinars focused on data solutions for renewable energy, as this can help you gain insights and potentially connect with decision-makers at the organisation.

✨Tip Number 4

Prepare to discuss your experience with CI/CD processes and Terraform during interviews. Consider creating a portfolio that highlights specific projects where you've successfully implemented these practices, as this will demonstrate your capability to manage infrastructure effectively.

We think you need these skills to ace the Databricks Data Engineer Contract

Databricks
ETL/ELT Pipeline Development
Data Lakehouse Architecture
Structured and Unstructured Data Management
Performance Tuning
SQL Query Optimization
Python Programming
Hadoop
Spark
Kafka
CI/CD Processes
Version Control
Terraform
Infrastructure as Code (IaC)
Azure Services (Data Factory, Azure Functions, Synapse)
Cloud-based Networking
Distributed Systems
Analytical Skills

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Databricks and relevant technologies like Azure, SQL, and Python. Use specific examples that demonstrate your skills in building ETL/ELT pipelines and managing large datasets.

Craft a Compelling Cover Letter: Write a cover letter that connects your background to the job description. Emphasise your experience in the energy sector and your ability to enhance data pipelines and support reporting needs.

Showcase Relevant Projects: If you have worked on projects involving data lakehouse architectures or structured streaming pipelines, mention these in your application. Highlight your role and the impact of your contributions.

Proofread Your Application: Before submitting, carefully proofread your CV and cover letter for any errors. A polished application reflects your attention to detail, which is crucial for a data engineering role.

How to prepare for a job interview at Harnham - Data & Analytics Recruitment

✨Showcase Your Databricks Expertise

Make sure to highlight your extensive experience with Databricks during the interview. Be prepared to discuss specific projects where you've built and maintained data lakehouse architectures, as this will demonstrate your hands-on knowledge and ability to contribute immediately.

✨Demonstrate Your ETL/ELT Skills

Since the role requires designing and developing ETL/ELT pipelines, come ready to share examples of your past work in this area. Discuss the challenges you faced and how you optimised performance and reliability, as this will show your problem-solving skills.

✨Familiarise Yourself with Azure Services

Brush up on your knowledge of Azure services like Data Factory, Azure Functions, and Synapse. Be prepared to explain how you've used these tools in previous roles, as this will be crucial for the organisation's data operations.

✨Prepare for Technical Questions

Expect technical questions related to SQL performance tuning, Python programming, and big data tools like Hadoop and Spark. Practising these topics beforehand will help you feel more confident and articulate your expertise effectively.
