At a Glance
- Tasks: Design and build scalable data platforms using Databricks and develop high-performance analytics solutions.
- Company: Join Korn Ferry, a leader in talent management and consulting.
- Benefits: Hybrid work model, competitive pay, and opportunities for professional growth.
- Why this job: Make an impact by optimising data pipelines and mentoring the next generation of engineers.
- Qualifications: Strong experience with Databricks, Python, SQL, and excellent communication skills.
- Other info: Collaborative environment with a focus on innovation and best practices.
The predicted salary is between £48,000 and £72,000 per year.
London based (hybrid) | 6-month initial contract + extension | Outside IR35
We are looking for a Senior Databricks Data Engineer with DPP status (Databricks Partner Program). You will design and build scalable data platforms, develop pipelines, and help deliver high-performance analytics solutions across cloud environments.
Key Responsibilities:
- Develop and optimise data pipelines and lakehouse solutions using Databricks.
- Work across AWS, Azure, or GCP environments.
- Implement best practices in data engineering, CI/CD, and testing.
- Collaborate with partner and client teams on solution design and delivery.
- Mentor junior engineers and contribute to reusable frameworks and accelerators.
Key Skills & Experience:
- Strong hands-on experience with Databricks, Python, SQL, and PySpark (see the sketch after this list).
- Solid understanding of Delta Lake, Unity Catalog, and MLflow.
- Experience with DevOps tools (Git, CI/CD, IaC).
- Excellent communication and stakeholder skills.
- Databricks certification and partner or consulting experience are a plus.
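To give candidates a concrete flavour of the work described above, here is a minimal PySpark sketch of a bronze-to-silver Delta Lake step. Everything specific in it (the source path, column names, and the silver.events table) is an illustrative assumption, not part of the role specification.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal, illustrative bronze-to-silver step. On Databricks, `spark` already
# exists and Delta is preconfigured; locally you would need the delta-spark
# package for the Delta configs below to work.
spark = (
    SparkSession.builder
    .appName("lakehouse-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical raw source: JSON event files landed by an ingestion job.
raw = spark.read.json("/mnt/raw/events")

# Typical cleaning: deduplicate, derive a partition column, drop bad rows.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_type").isNotNull())
)

# Write a managed Delta table (hypothetical name) for downstream analytics.
clean.write.format("delta").mode("overwrite").saveAsTable("silver.events")
```

In a production pipeline, concerns such as partitioning, data quality checks, and Unity Catalog governance would sit on top of a step like this.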
Databricks Data Engineer DPP in London (employer: Lucas Group)
Contact Details:
Lucas Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Databricks Data Engineer DPP role in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Databricks. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects with Databricks, Python, and SQL. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've optimised data pipelines or implemented CI/CD practices. Interviewers will want to see your problem-solving skills in action!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are proactive about their job search.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Databricks, Python, and SQL. We want to see how your skills match the job description, so don’t be shy about showcasing your relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re the perfect fit for the Senior Databricks Data Engineer role. Share specific examples of your work with data pipelines and cloud environments.
Show Off Your Certifications: If you’ve got any Databricks certifications or relevant training, make sure to include them in your application. We love seeing candidates who are committed to their professional development!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you don’t miss out on any important updates from our team!
How to prepare for a job interview at Lucas Group
✨Know Your Databricks Inside Out
Make sure you brush up on your Databricks knowledge. Be ready to discuss your hands-on experience with Databricks, Python, SQL, and PySpark. Prepare examples of how you've developed and optimised data pipelines and lakehouse solutions in previous roles.
✨Showcase Your Cloud Skills
Since the role involves working across AWS, Azure, or GCP environments, be prepared to talk about your experience with these platforms. Highlight specific projects where you implemented best practices in data engineering and CI/CD.
✨Collaboration is Key
This position requires collaboration with partner and client teams, so be ready to share examples of how you've successfully worked in a team setting. Discuss your communication skills and how you've mentored junior engineers or contributed to reusable frameworks.
✨Get Familiar with the Tools
Familiarise yourself with Git, CI/CD pipelines, and Infrastructure as Code (IaC) tooling. Be prepared to explain how you've used these tools in your previous work and how they can strengthen the data engineering process; a short testing sketch follows below.
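As a concrete talking point for that tip, here is a hedged sketch of how a single pipeline transformation might be unit-tested with pytest so it can run inside a CI job. The function, fixture, and sample data are hypothetical, not taken from the job description.

```python
import pytest
from pyspark.sql import SparkSession, functions as F

def deduplicate_events(df):
    """Keep one row per event_id and drop rows with no event_type."""
    return df.dropDuplicates(["event_id"]).filter(F.col("event_type").isNotNull())

@pytest.fixture(scope="module")
def spark():
    # Small local session so the test can run on a CI runner without a cluster.
    return SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()

def test_deduplicate_events(spark):
    rows = [
        ("e1", "click"),
        ("e1", "click"),  # exact duplicate, should collapse to one row
        ("e2", None),     # missing event_type, should be dropped
    ]
    df = spark.createDataFrame(rows, ["event_id", "event_type"])
    out = deduplicate_events(df)
    assert out.count() == 1
    assert out.first()["event_id"] == "e1"
```

Being able to walk through a small test like this tends to land better in interviews than describing CI/CD in the abstract.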