At a Glance
- Tasks: Join us as a Data Engineer, solving complex data challenges and delivering impactful solutions.
- Company: Dufrain is a leading data and analytics company shaping the future of data-driven decision-making.
- Benefits: Enjoy hybrid work options, a collaborative culture, and opportunities for professional growth.
- Why this job: Be part of a dynamic team, contribute to innovative projects, and make a real impact in the industry.
- Qualifications: Strong skills in Databricks, SQL, and data engineering practices are essential; experience with Microsoft Fabric is a plus.
- Other info: We value diversity and encourage applications from all backgrounds; visa sponsorship is not available.
The predicted salary is between £36,000 and £60,000 per year.
At Dufrain, we don’t just build data solutions - we shape the future of data-driven decision-making. Do you thrive on solving complex challenges and turning data into powerful insights that drive transformation? We’re looking for Data Engineers with a broad range of data engineering skills and a particular focus on Databricks. Experience with Microsoft Fabric or Snowflake is also highly desirable.
ABOUT THE ROLE
As a Data Engineer, you will play a key role in delivering high-quality data solutions for our clients. This role offers the opportunity to work across a variety of technologies and contribute to the growth of our Consulting practice.
ROLE RESPONSIBILITIES
- Develop strong working relationships with clients on a project, engaging effectively with both business-focused and technical colleagues.
- Develop performant, end-to-end data solutions as part of a collaborative team.
- Deliver high-quality work, escalating problems to clients or senior team members where necessary and proposing possible solutions.
- Support the growth of the Consulting practice by contributing to ongoing initiatives, including knowledge-sharing activities and data services.
ESSENTIAL TECHNICAL SKILLS REQUIRED
We are looking for a technically skilled data professional with a strong foundation in modern data platforms and engineering practices. Key technical competencies include:
- Databricks Platform Expertise: Proven experience designing and delivering data solutions using Databricks on Azure or AWS.
- Databricks Components: Proficient in Delta Lake, Unity Catalog, MLflow, and other core Databricks tools.
- Programming & Query Languages: Strong skills in SQL and Apache Spark (Scala or Python).
- Relational Databases: Experience with on-premises and cloud-based SQL databases.
- Data Engineering Techniques: Skilled in Data Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data Management, and Business Intelligence.
- Engineering Delivery Practices: Solid understanding of Agile, DevOps, Git, APIs, Containers, Microservices, and Data Pipelines.
- Modern Data Tools: Experience with Kafka, Snowflake, Azure Data Factory, Azure Synapse, or Microsoft Fabric (highly desirable).
- Data Architecture Frameworks: Knowledge of Inmon, Kimball, and Data Vault methodologies.
Nice-to-have certifications:
- Databricks Certified Data Engineer Associate
- Databricks Certified Data Engineer Professional
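For candidates less familiar with the stack, the sketch below illustrates the kind of Spark and Delta Lake work the competencies above describe. It is a minimal, purely illustrative example, assuming a Databricks environment where Spark and Delta Lake are preconfigured; the source path, column names, and table name are hypothetical, not Dufrain specifics.

```python
# Minimal, illustrative sketch only - not a Dufrain deliverable. Assumes a
# Databricks cluster where Spark and Delta Lake are available; the path and
# table names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_example").getOrCreate()

# Read raw files from a (hypothetical) landing zone
raw = spark.read.json("/mnt/landing/orders/")

# Basic cleansing and typing before promoting to the curated layer
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("order_total") > 0)
)

# Persist as a managed Delta table for downstream BI and analytics
spark.sql("CREATE SCHEMA IF NOT EXISTS curated")
curated.write.format("delta").mode("overwrite").saveAsTable("curated.orders")
```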
ABOUT YOU
You’re not just a data expert - you’re a problem solver, a collaborator, and a forward-thinker. You will be someone who:
- Brings energy, focus, and a drive to meet ambitious deadlines.
- Delivers insightful, practical solutions that go beyond the obvious.
- Communicates confidently with senior stakeholders and contributes meaningfully to strategic conversations.
- Thrives in a team environment, sharing knowledge and supporting others.
- Works independently with initiative and resilience under pressure.
- Takes ownership of deliverables and consistently produces high-quality output.
- Adapts easily to working solo or as part of a dynamic, integrated team.
- Stays ahead of industry trends, standards, and regulatory developments.
If you’re passionate about data and looking to join a leading data and analytics company based in the UK, you could find your dream role at Dufrain. Please submit your CV highlighting your relevant experience and certifications. Applicants must have the right to work in the UK; visa sponsorship is not available now or in the future.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, age, or any other status protected by law. All qualified applicants will receive consideration for employment without regard to these factors. We encourage applications from individuals of all backgrounds and experiences.
Department: Consulting
Locations: London, Edinburgh, Manchester
Remote status: Hybrid
Employer: Dufrain
Contact details:
Dufrain Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Databricks role
✨Tip Number 1
Familiarise yourself with the Databricks platform and its components, such as Delta Lake and MLflow. Being able to discuss specific projects or experiences where you've used these tools will demonstrate your expertise during interviews.
✨Tip Number 2
Network with current or former employees of Dufrain on platforms like LinkedIn. Engaging in conversations about their experiences can provide valuable insights into the company culture and expectations, which you can leverage in your application.
✨Tip Number 3
Stay updated on the latest trends in data engineering, particularly around Microsoft Fabric and Snowflake. Showing that you're knowledgeable about emerging technologies can set you apart from other candidates.
✨Tip Number 4
Prepare to discuss your experience with Agile and DevOps practices. Be ready to share examples of how you've successfully collaborated in a team environment, as this role emphasises teamwork and communication.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Databricks, SQL, and Apache Spark. Emphasise any relevant projects or roles that showcase your data engineering skills, particularly in a collaborative environment.
Craft a Compelling Cover Letter: Write a cover letter that reflects your passion for data and problem-solving. Mention specific examples of how you've delivered high-quality data solutions and your ability to communicate effectively with stakeholders.
Highlight Technical Skills: Clearly list your technical competencies related to the role, such as your experience with Delta Lake, MLflow, and any modern data tools like Kafka or Snowflake. This will help demonstrate your fit for the position.
Showcase Soft Skills: In your application, don't forget to mention your interpersonal skills and ability to work under pressure. Provide examples of how you've collaborated with teams and contributed to knowledge-sharing initiatives.
How to prepare for a job interview at Dufrain
✨Showcase Your Databricks Expertise
Make sure to highlight your experience with the Databricks platform during the interview. Be prepared to discuss specific projects where you've used Databricks, focusing on components like Delta Lake and MLflow, as well as how you overcame challenges in those projects.
✨Demonstrate Problem-Solving Skills
Since the role requires a strong problem-solving mindset, be ready to share examples of complex challenges you've faced in data engineering. Discuss how you approached these problems, the solutions you implemented, and the impact they had on your projects.
✨Communicate Effectively with Stakeholders
The ability to communicate confidently with both technical and non-technical stakeholders is crucial. Practice explaining your technical work in simple terms, and prepare to discuss how you've built relationships with clients or team members in previous roles.
✨Stay Updated on Industry Trends
Dufrain values candidates who are proactive about staying ahead of industry trends. Research recent developments in data engineering, particularly around tools like Snowflake and Microsoft Fabric, and be ready to discuss how these trends could influence your work.