At a Glance
- Tasks: Lead complex data engineering projects using Databricks and cloud platforms.
- Company: Join DATAPAO, a rapidly growing data consulting firm recognised for innovation.
- Benefits: Enjoy remote work, flexible PTO, learning opportunities, and a sign-on bonus.
- Why this job: Be the first UK hire and shape our regional growth while working with top global clients.
- Qualifications: 5+ years in Data Engineering with expertise in Databricks and cloud platforms.
- Other info: Opportunity for personal growth and leadership in a transparent, high-trust culture.
Senior Data Engineer (Databricks) at DATAPAO
At DATAPAO, data ignites passion, community fuels collaboration, and growth knows no bounds. We are a leading Data Engineering and Data Science consulting company, recognised for our innovation and rapid growth. We have been named Databricks EMEA Emerging Business Partner of the Year and have appeared on the Financial Times FT1000 list for a second consecutive year.
We are currently looking for a Senior Data Engineer to join us remotely in the UK. We plan to set up a UK hub over the next 12-18 months to support our go-to-market strategy, and this would be our first hire in the region. You will work with our EMEA and US customers to help them solve their data engineering, ML, MLOps, and cloud migration puzzles.
What will you do?
As a Senior Data Engineer, you will deliver some of our most complex projects across industries, either individually or by leading small delivery teams. Our projects are fast-paced, typically 2 to 4 months long, and primarily use Apache Spark/Databricks on AWS/Azure. You will manage customer relationships, either on your own or alongside a Project Manager, and support our pre-sales, mentoring, and hiring efforts.
What does it take to fit the bill?
Technical Expertise
- 5+ years in Data Engineering, focusing on cloud platforms (AWS, Azure, GCP);
- Proven experience with Databricks (PySpark, SQL, Delta Lake, Unity Catalog);
- Extensive ETL/ELT and data pipeline orchestration experience (e.g., Databricks Workflows, DLT, Airflow, ADF, Glue, Step Functions);
- Proficiency in SQL and Python for data transformation and optimization;
- Knowledge of CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation, Bicep);
- Hands-on experience with Databricks integration with BI tools (Power BI, Tableau, Looker).
Consulting & Client-Facing Skills
- Experience in consulting or product companies with a customer-facing mindset; ability to scope, gather requirements, design solutions, and communicate effectively;
- Successful delivery of Data Lakehouse and cloud migration projects;
- Ability to explain technical concepts to non-technical audiences and facilitate decision-making.
Operational & Soft Skills
- Ready to contribute immediately to live projects;
- Adaptability and problem-solving skills in a fast-paced environment;
- Excellent communication skills in writing and speaking.
What do we offer?
As our first UK hire, you will have a direct impact on our growth and the opportunity to lead our regional expansion. Benefits include:
- Remote work with a one-month onboarding in Budapest and quarterly visits;
- Learning opportunities with Databricks courses, certifications, and internal resources;
- Flexible PTO;
- Paternity leave;
- Employee Assistance Program;
- Pension and private health insurance via Remote.com;
- High-quality equipment and a sign-on bonus of £1,000;
- Transparent, high-trust culture.
The gross salary range is £70,000 to £100,000 per year, based on assessed seniority.
We are a trusted partner for Databricks and Microsoft, working with top global companies. We value diversity and are open to passionate candidates who may not meet every requirement. Your privacy is important to us, and your data will be used solely for recruitment purposes.
Senior Data Engineer (Databricks) employer: DATAPAO
Contact Details:
DATAPAO Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer (Databricks) role
✨Tip Number 1
Familiarise yourself with Databricks and its ecosystem. Since this role heavily involves Databricks, understanding its features like Delta Lake and Unity Catalog will give you an edge in discussions during interviews.
✨Tip Number 2
Brush up on your cloud platform knowledge, especially AWS and Azure. Being able to discuss your experience with these platforms and how you've used them in past projects will demonstrate your technical expertise.
✨Tip Number 3
Prepare to showcase your consulting skills. Since the role requires client-facing interactions, think of examples where you've successfully scoped projects or communicated complex technical concepts to non-technical stakeholders.
✨Tip Number 4
Network with current employees or industry professionals who have experience in data engineering roles. They can provide insights into the company culture and expectations, which can be invaluable during your interview.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in Data Engineering, particularly with Databricks and cloud platforms like AWS or Azure. Use specific examples of projects you've worked on that align with the job description.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your understanding of the role. Mention your experience with ETL/ELT processes and how you can contribute to DATAPAO's growth as their first UK hire.
Showcase Technical Skills: In your application, emphasise your technical expertise, especially in SQL, Python, and Databricks. Provide concrete examples of how you've used these skills in past roles to solve complex data problems.
Highlight Soft Skills: Don't forget to mention your soft skills, such as communication and adaptability. Explain how these skills have helped you in client-facing roles and contributed to successful project deliveries.
How to prepare for a job interview at DATAPAO
✨Showcase Your Technical Expertise
Be prepared to discuss your experience with Databricks, Apache Spark, and cloud platforms like AWS or Azure. Highlight specific projects where you've successfully implemented ETL/ELT processes and data pipeline orchestration.
✨Demonstrate Client-Facing Skills
Since this role involves customer interaction, be ready to share examples of how you've effectively communicated technical concepts to non-technical audiences. Discuss your experience in scoping projects and gathering requirements.
✨Emphasise Adaptability and Problem-Solving
The fast-paced nature of the projects means adaptability is key. Prepare to discuss situations where you've had to quickly adjust to changes or solve unexpected problems during a project.
✨Ask Insightful Questions
Prepare thoughtful questions about DATAPAO's projects, culture, and future plans, especially regarding their expansion in the UK. This shows your genuine interest in the company and the role.