At a Glance
- Tasks: Design and implement scalable data solutions on Azure, ensuring efficient data processing and storage.
- Company: Join AXIS Capital, a leading global provider of specialty insurance and reinsurance with a strong ethical culture.
- Benefits: Enjoy remote work options, competitive pay, health plans, tuition reimbursement, and wellness programmes.
- Why this job: Be part of a top-tier team, modernise data architecture, and make impactful business decisions.
- Qualifications: 5+ years in Azure Data Engineering, expertise in Databricks, and strong problem-solving skills required.
- Other info: Full-time role with flexible working; in-office presence needed three days a week.
The predicted salary is between £48,000 and £72,000 per year.
Senior Data Engineer – (Azure/Databricks)
AXIS Capital · London, United Kingdom · Posted 15 days ago · Permanent · Competitive
This is your opportunity to join AXIS Capital – a trusted global provider of specialty lines insurance and reinsurance. We stand apart for our outstanding client service, intelligent risk-taking and superior risk-adjusted returns for our shareholders. We also proudly maintain an entrepreneurial, disciplined and ethical corporate culture. As a member of AXIS, you join a team that is among the best in the industry.
At AXIS, we believe that we are only as strong as our people. We strive to create an inclusive and welcoming culture where employees of all backgrounds and from all walks of life feel comfortable and empowered to be themselves. This means that we bring our whole selves to work. All qualified applicants will receive consideration for employment without regard to race, color, religion or creed, sex, pregnancy, sexual orientation, gender identity or expression, national origin or ancestry, citizenship, physical or mental disability, age, marital status, civil union status, family or parental status, or any other characteristic protected by law. Accommodation is available upon request for candidates taking part in the selection process.
Job Family Grouping: Chief Underwriting Officer
Job Family: Data & Analytics
Location: London
How does this role contribute to our collective success?
The Data & Analytics department transforms raw data into actionable insights to drive informed decision-making and optimize business operations. The Senior Azure Data Engineer will support these goals by designing, implementing, and managing scalable data solutions on the Azure platform, ensuring efficient data processing, storage, and retrieval. You will play a key role in modernizing our data architecture, ensuring efficient data integration, and enabling advanced analytics to support critical business decisions. This role will enhance the department's ability to deliver high-quality analytics and maintain robust data infrastructure.
What will you do in this role?
As a Senior Azure Data Engineer, you will be responsible for designing, implementing, and maintaining data storage and processing solutions on the Azure platform. You will work with modern data warehouse (MDW) technologies, big data, and Lakehouse architectures to ensure our data solutions are secure, efficient, and optimized.
Key Responsibilities:
- Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage.
- Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources.
- Automate loads using Databricks Workflows and Jobs.
- Design, build, and test CI/CD pipelines using Azure DevOps to automate deployment and monitoring of data solutions across all environments. Share knowledge with data operations teams on release management and maintenance.
- Manage platform administration, ensuring optimal performance, availability, and scalability of Azure data services.
- Implement end-to-end data pipelines, ensuring data quality, data integrity and data security.
- Troubleshoot and resolve data pipeline issues while ensuring data integrity and quality.
- Implement and enforce data security best practices, including role-based access control (RBAC), encryption, and compliance with industry standards.
- Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
- Monitor and optimize Databricks performance, including cost management guidance and cluster tuning.
- Stay up to date with Azure cloud innovations and recommend improvements to existing architectures.
- Assist data analysts with technical input.
You may also be required to take on additional duties, responsibilities and activities appropriate to the nature of this role.
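The CI/CD responsibility above could, for example, be met with a multi-stage Azure DevOps YAML pipeline. This is a minimal sketch assuming a notebook-based Databricks deployment via the legacy Databricks CLI; the stage names, pool image, environment names, and workspace paths are illustrative, not taken from the posting:

```yaml
# Hypothetical Azure DevOps pipeline: builds/validates, then deploys notebooks
# to a gated "dev" environment. Extend with test/prod stages per environment.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

stages:
  - stage: Build
    jobs:
      - job: Validate
        steps:
          - script: pip install databricks-cli && databricks --version
            displayName: Install Databricks CLI

  - stage: DeployDev
    dependsOn: Build
    jobs:
      - deployment: DeployToDev
        environment: dev          # approvals/gates configured on the environment
        strategy:
          runOnce:
            deploy:
              steps:
                - script: >
                    databricks workspace import_dir
                    $(Pipeline.Workspace)/notebooks /Shared/etl --overwrite
                  displayName: Publish notebooks to the dev workspace
```

Release-management knowledge sharing then becomes a matter of documenting how each stage, gate, and rollback path works.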
About You: We encourage you to bring your own experience and expertise to the table. While there are some qualifications and experiences we need you to have, we are open to discussing how your individual knowledge might lend itself to fulfilling this role and helping us achieve our goals. Required Skills & Experience:
- 5+ years of Azure & Data Engineering expertise:
- Proven experience in designing and managing large-scale data solutions on Microsoft Azure.
- Unity Catalog Mastery:
- In-depth knowledge of setting up, configuring, and utilizing Unity Catalog for robust data governance, access control, and metadata management in a Databricks environment.
- Databricks Proficiency:
- Demonstrated ability to optimize and tune Databricks notebooks and workflows to maximize performance and efficiency. Experience with performance troubleshooting and best practices for scalable data processing is essential.
- Additional Requirements:
- Strong problem-solving skills, ability to work in agile environments, and effective collaboration with cross-functional teams.
- Experience with implementing a Data Lakehouse solution with Azure Databricks, data modeling, warehousing, and real-time streaming.
- Knowledge of developing and processing full and incremental loads.
- Experience automating loads using Databricks Workflows and Jobs.
- Expertise in Azure Databricks, including Delta Lake, Spark optimizations, and MLflow.
- Strong experience with Azure Data Factory (ADF) for data integration and orchestration.
- Hands-on experience with Azure DevOps, including pipelines, repos, and infrastructure as code (IaC).
- Solid understanding of platform administration, including monitoring, logging, and cost management.
- Knowledge of data security, compliance, and governance in Azure, including Azure Active Directory (AAD), RBAC, and encryption.
- Experience working with big data technologies (Spark, Python, Scala, SQL).
- Strong problem-solving and troubleshooting skills.
- Excellent communication skills, with the ability to collaborate with cross-functional teams on requirements, data solutions, data models, and mapping documents.
- Preferred Qualifications:
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect).
- Experience with Terraform, ARM templates, or Bicep for infrastructure automation.
- Experience integrating Azure Data Services with Power BI and AI/ML workflows.
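The full vs incremental load requirement above boils down to an upsert keyed on changed rows since a watermark. Here is a toy sketch of that pattern using Python's built-in sqlite3 — Delta Lake's MERGE INTO performs the same operation at scale; the table and column names are purely illustrative:

```python
import sqlite3

# Existing target table with the result of earlier (full) loads.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, value TEXT, updated_at TEXT)")
conn.execute("INSERT INTO target VALUES (1, 'old', '2024-01-01'), (2, 'keep', '2024-01-01')")

# Incoming incremental batch: only rows changed since the last watermark.
incoming = [(1, "new", "2024-02-01"), (3, "added", "2024-02-01")]

# Upsert: update matched keys, insert unmatched ones (SQLite's MERGE equivalent).
conn.executemany(
    "INSERT INTO target (id, value, updated_at) VALUES (?, ?, ?) "
    "ON CONFLICT(id) DO UPDATE SET value = excluded.value, updated_at = excluded.updated_at",
    incoming,
)

rows = conn.execute("SELECT id, value FROM target ORDER BY id").fetchall()
print(rows)  # [(1, 'new'), (2, 'keep'), (3, 'added')]
```

A full load would instead truncate and reload the whole table; the incremental path touches only the changed keys, which is why it matters for large Lakehouse tables.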
Role Factors: The position is full-time with remote work options, requiring in-office presence three days per week.
What we offer: You will be eligible for a comprehensive and competitive benefits package, which includes medical plans for you and your family, health and wellness programs, retirement plans, tuition reimbursement, paid annual leave, and much more.
Employer: AXIS Capital
Contact Detail:
AXIS Capital Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer – (Azure/Databricks) role in London, UK
✨Tip Number 1
Familiarise yourself with Azure and Databricks by exploring their latest features and updates. This will not only enhance your technical knowledge but also show your genuine interest in the role during discussions.
✨Tip Number 2
Network with current or former employees of AXIS Capital on platforms like LinkedIn. Engaging with them can provide valuable insights into the company culture and expectations, which you can leverage in your conversations.
✨Tip Number 3
Prepare to discuss specific projects where you've implemented Azure solutions or optimised data pipelines. Real-world examples will demonstrate your expertise and problem-solving skills effectively.
✨Tip Number 4
Stay updated on industry trends related to data engineering and analytics. Being able to discuss these trends can set you apart as a candidate who is not only skilled but also forward-thinking.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in Azure and Databricks. Focus on specific projects where you've designed or managed data solutions, and quantify your achievements to demonstrate impact.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role at AXIS Capital. Mention how your skills align with their needs, particularly in data engineering and cloud technologies, and share why you want to be part of their team.
Showcase Relevant Skills: Clearly list your technical skills related to Azure, Databricks, and data engineering in your application. Highlight your experience with ETL/ELT processes, CI/CD pipelines, and any relevant certifications to stand out.
Proofread Your Application: Before submitting, carefully proofread your CV and cover letter for any spelling or grammatical errors. A polished application reflects attention to detail, which is crucial in data engineering roles.
How to prepare for a job interview at AXIS Capital
✨Showcase Your Azure Expertise
Make sure to highlight your experience with Azure services, especially Azure Databricks and Data Factory. Be prepared to discuss specific projects where you've designed and implemented data solutions, as this will demonstrate your hands-on expertise.
✨Demonstrate Problem-Solving Skills
Prepare examples of how you've tackled complex data pipeline issues in the past. Discuss your approach to troubleshooting and ensuring data integrity, as this is crucial for the role.
✨Familiarise Yourself with Unity Catalog
Since knowledge of Unity Catalog is essential, brush up on its features and benefits. Be ready to explain how you've used it for data governance and access control in previous roles.
✨Engage with Cross-Functional Teams
Highlight your collaboration skills by discussing experiences where you've worked with data scientists, analysts, or business stakeholders. This will show that you can effectively communicate and deliver high-quality data solutions.