At a Glance
- Tasks: Join a dynamic team to develop and enhance key data platforms like the Data Lakehouse.
- Company: Be part of a vibrant insurance organisation located in the heart of London.
- Benefits: Enjoy a competitive salary of £90K plus bonuses and benefits, with hybrid work options.
- Why this job: This role offers excellent career development in a forward-thinking, people-centric culture.
- Qualifications: You need extensive experience with SQL, Databricks, and Azure Data Factory (ADF).
- Other info: Ideal for those with a strong data management background and project management skills.
The predicted salary is between £54,000 and £126,000 per year.
Data Engineer, Azure, Databricks
London/hybrid
£90K + bonus and benefits
SQL, Databricks, Azure Data Factory (ADF), Data Lakehouse
A Data Engineer with extensive experience in Data Lakehousing practices, using SQL, Databricks, and Azure Data Factory (ADF), is required to join a vibrant and growing insurance organisation with offices in the heart of the city. This is a pivotal role in which you will work collaboratively with stakeholders and wider data teams to develop key data platforms, including a new Data Lakehouse.
This is a fantastic opportunity to join a forward-thinking, people-centric organisation with excellent scope for long-term career development.
Key Responsibilities:
- Develop and enhance the Group Data Lakehouse and other data platforms.
- Collaborate with key business, data and tech teams to understand data requirements and deliver solutions.
- Set and enforce data management standards, ensuring best practices across the organisation.
- Conduct solution reviews and designs, providing assurance on development approaches.
- Manage multiple project deliveries simultaneously with outstanding attention to detail.
- Provide guidance and support to wider data teams as required.
Key Skills and Experience:
- Extensive experience in Data Lakehousing practices, particularly using SQL, Databricks, and Azure Data Factory (ADF).
- Strong data management background with hands-on experience.
- Proven ability to manage multiple projects in a fast-paced, agile, and dynamic working environment.
- Excellent attention to detail and ability to maintain high standards.
- Strong problem-solving skills and ability to provide technical guidance and support.
- Excellent communication skills, both written and verbal, with the ability to influence stakeholders.
- Experience working within an MGA (Managing General Agent) or (re)insurance carrier is highly desirable.
For a full consultation, please send your CV to Arc IT Recruitment.
Employer: ARC IT Recruitment
Contact detail: ARC IT Recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer, Databricks
✨Tip Number 1
Familiarise yourself with the latest trends and best practices in Data Lakehousing, especially focusing on SQL, Databricks, and Azure Data Factory. This knowledge will not only help you during interviews but also demonstrate your commitment to staying up to date in the field.
✨Tip Number 2
Network with professionals in the insurance and data engineering sectors. Attend relevant meetups or webinars where you can connect with potential colleagues or hiring managers from similar organisations.
✨Tip Number 3
Prepare to discuss specific projects where you've successfully managed multiple deliveries in a fast-paced environment. Highlight your attention to detail and problem-solving skills, as these are crucial for the role.
✨Tip Number 4
Showcase your communication skills by practicing how to explain complex technical concepts in simple terms. This will be essential when collaborating with stakeholders and wider data teams.
Some tips for your application 🫡
Understand the Role: Make sure to read the job description for the Data Engineer position thoroughly. Highlight key responsibilities and required skills such as SQL, Databricks, and Azure Data Factory (ADF) to tailor your application.
Tailor Your CV: Customise your CV to emphasise your experience with Data Lakehousing practices and relevant technologies. Include specific projects where you used SQL, Databricks, and ADF, showcasing your problem-solving skills and attention to detail.
Craft a Compelling Cover Letter: Write a cover letter that reflects your enthusiasm for the role and the company. Discuss how your background aligns with the key responsibilities and how you can contribute to developing the Group Data Lakehouse.
Highlight Communication Skills: Since excellent communication skills are essential for this role, give examples in your application of how you have collaborated effectively with stakeholders and teams in previous positions.
How to prepare for a job interview at ARC IT Recruitment
✨Showcase Your Technical Skills
Be prepared to discuss your experience with SQL, Databricks, and Azure Data Factory in detail. Highlight specific projects where you have successfully implemented Data Lakehouse solutions, as this will demonstrate your hands-on expertise.
✨Understand the Business Context
Research the insurance organisation and understand its data needs. Be ready to explain how your technical skills can help it achieve its goals, especially in developing key data platforms.
✨Emphasize Collaboration
Since the role involves working with various stakeholders, share examples of how you have collaborated effectively with different teams in the past. This will show that you can work well in a team-oriented environment.
✨Prepare for Problem-Solving Questions
Expect questions that assess your problem-solving abilities. Think of scenarios where you've faced challenges in data management or project delivery, and be ready to explain how you overcame them.