At a Glance
- Tasks: Engineer data platforms and enhance analytical capabilities for DUAL Personal Lines.
- Company: Join Howden, a global insurance group with a unique employee-owned culture.
- Benefits: Define your career with flexible hours, hybrid working, and a supportive environment.
- Why this job: Make a real impact in data engineering while collaborating with a diverse international team.
- Qualifications: 5+ years in data engineering, strong Python and SQL skills, and Azure experience.
- Other info: Embrace a culture of innovation and continuous learning at Howden.
The predicted salary is between £36,000 and £60,000 per year.
Who are we? Howden is a global insurance group with employee ownership at its heart. Together, we have pushed the boundaries of insurance. We are united by a shared passion and a no-limits mindset, and our strength lies in our ability to collaborate as a powerful international team of 23,000 employees across more than 56 countries. People join Howden for many different reasons, but they stay for the same one: our culture. It’s what sets us apart, and the reason our employees have been turning down headhunters for years. Whatever your priorities – work/life balance, career progression, sustainability, volunteering – you’ll find like-minded people driving change at Howden.
Role responsibilities
- Serve as a key member of the team engineering DUAL Personal Lines' strategic data platforms, chiefly the Data Lakehouse, to enable the continuous enhancement of DUAL Personal Lines' underwriting analytical capabilities.
- Provide technical expertise in data engineering, analysis, orchestration, and enrichment.
- Build data solutions where required to address changing requirements, managing new requests and incidents.
- Build, test, and deploy data products while remaining true to governance principles, including change control, accountability, data quality, and data security.
- Consult with the DUAL Group data team's business analysts, developers, and system owners to avoid divergence from agreed data standards.
- Assess the impact of change on DUAL Personal Lines' data model to determine and avoid any potential issues which may hinder our operational, analytical and reporting environments.
- Actively collaborate with data teams and system owners to maintain the data model and data integrity throughout the DUAL Group data lakehouse.
- Closely support underwriting analytical teams and reporting analysts to ensure data is accurately represented for use in underwriting and performance analysis and reporting.
- Build knowledge and expertise in DUAL Personal Lines' systems alongside other data team members.
- Work with other members of the data team and the wider business to support delivery of additional project components (API interfaces, data models, data sharing, RDM/MDM tools, ML endpoints).
- Work within an Agile delivery/DevOps methodology to make incremental changes to the overall data platform, delivering additional data products that provide value to stakeholders.
- Work with the Group Data Science team to productionise ML pipelines within the overall data platform solution.
- Work collaboratively with internal stakeholders across DUAL Personal Lines including but not limited to Underwriting teams, Operations, Finance, Risk & Compliance and HR.
- Act as a conduit between DUAL Personal Lines and the DUAL Data Team, facilitating the sharing of common approaches, adherence to standards, and advocacy on behalf of DUAL.
- Help drive the success of future investments in data within DUAL Personal Lines, including clear articulation of return on investment (ROI) and ongoing assessment of benefits realisation.
Key requirements
- Strong knowledge of Data Management principles in a Lakehouse architecture.
- At least 5 years' experience in data engineering and building data pipelines.
- Proven track record in data engineering and in supporting the business to gain true insight from data. Experience in data integration and modelling, including ELT pipelines.
- Hands-on experience designing and delivering solutions using Azure services, including Azure Data Factory, Azure Databricks, Azure Storage, and Azure DevOps.
- Strong proficiency in Python and SQL.
- Experience working with Data Architects for technical design.
- Experience in designing data models (e.g. star schema) for use in Power BI.
- Insurance analysis, MI or reporting experience.
- Understanding and adherence to CI/CD principles.
- Nice to have: experience working alongside Data Science teams to assist with building and deploying Machine Learning models.
- Nice to have: experience working with infrastructure as code tools for deploying resources.
- Ability to work quickly, efficiently and methodically.
- A strong team player who is confident in their ability.
- Very strong communication, influencing and negotiation skills. Actively listens to the views of colleagues, but also has the strength of character to challenge where required.
- Has commercial awareness and stays up to date with current issues affecting the industry and its technologies.
- Proactively shares best practice with others across the organisation.
- Strong planning and organisational skills, with the ability to prioritise.
- Good understanding of data operations.
- Broad knowledge and understanding of insurance principles, products and services.
- Self-starter, with a passion and ability for learning new skills and technologies.
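Several of the requirements above centre on ELT pipelines and star-schema modelling for BI tools such as Power BI. As a purely illustrative sketch of that pattern (the table and column names below are invented for the example and are not Howden's or DUAL's actual schema), here is a minimal Python load of raw records into a tiny star schema using the standard-library sqlite3 module:

```python
import sqlite3

# Hypothetical raw extract, e.g. rows landed from a source system.
raw_policies = [
    {"policy_id": "P-001", "product": "Home", "premium": 320.0},
    {"policy_id": "P-002", "product": "Motor", "premium": 540.0},
    {"policy_id": "P-003", "product": "Home", "premium": 410.0},
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension and fact tables in a minimal star schema.
cur.execute(
    "CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT UNIQUE)"
)
cur.execute(
    """CREATE TABLE fact_policy (
           policy_id TEXT PRIMARY KEY,
           product_key INTEGER REFERENCES dim_product(product_key),
           premium REAL)"""
)

# Load: populate the dimension first, then key each fact row to it.
for row in raw_policies:
    cur.execute(
        "INSERT OR IGNORE INTO dim_product (product_name) VALUES (?)",
        (row["product"],),
    )
for row in raw_policies:
    cur.execute(
        "INSERT INTO fact_policy "
        "SELECT ?, product_key, ? FROM dim_product WHERE product_name = ?",
        (row["policy_id"], row["premium"], row["product"]),
    )

# The kind of aggregate a Power BI report would issue: premium by product.
totals = dict(cur.execute(
    """SELECT d.product_name, SUM(f.premium)
       FROM fact_policy f JOIN dim_product d USING (product_key)
       GROUP BY d.product_name"""
))
print(totals)  # {'Home': 730.0, 'Motor': 540.0}
```

In a production lakehouse the same separation of dimensions and facts would typically be expressed in Databricks/Spark rather than sqlite, but the modelling idea is the same.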
What do we offer in return?
A career that you define. At Howden, we value diversity – there is no one Howden type. Instead, we’re looking for individuals who share the same values as us:
- Our successes have all come from someone brave enough to try something new.
- We support each other in the small everyday moments and the bigger challenges.
- We are determined to make a positive difference at work and beyond.
Reasonable adjustments
We’re committed to providing reasonable adjustments at Howden to ensure that our positions align well with your needs. Besides the usual adjustments such as software, IT, and office setups, we can also accommodate other changes, such as flexible hours or hybrid working.
If you’re excited by this role but have some doubts about whether it’s the right fit for you, send us your application – if your profile fits the role's criteria, we will be in touch to help set you up with any reasonable adjustments you may require.
*Not all positions can accommodate changes to working hours or locations. Reach out to your Recruitment Partner if you want to know more.
Permanent
Data Engineer in Mount Pleasant employer: Howden
Contact Detail:
Howden Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in Mount Pleasant
✨Tip Number 1
Network like a pro! Reach out to current employees at Howden on LinkedIn or through mutual connections. A friendly chat can give you insider info and might just get your foot in the door.
✨Tip Number 2
Prepare for the interview by brushing up on your data engineering skills. Be ready to discuss your experience with Azure services and data pipelines, as these are key for the role.
✨Tip Number 3
Show your passion for collaboration! Howden values teamwork, so be sure to highlight any past experiences where you worked closely with others to achieve a common goal.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re serious about joining the Howden family.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV reflects the skills and experiences that align with the Data Engineer role. Highlight your data engineering experience, especially with Azure services and data pipelines, to catch our eye!
Craft a Compelling Cover Letter: Use your cover letter to tell us why you're passionate about data engineering and how you can contribute to our team. Share specific examples of your past work that demonstrate your expertise and problem-solving skills.
Showcase Your Technical Skills: Don’t forget to mention your proficiency in Python, SQL, and any relevant tools you've used. We love seeing candidates who can clearly articulate their technical abilities and how they’ve applied them in real-world scenarios.
Apply Through Our Website: We encourage you to submit your application through our website. It’s the best way for us to receive your details and ensures you’re considered for the role. Plus, it shows you’re keen on joining our team!
How to prepare for a job interview at Howden
✨Know Your Data Inside Out
Make sure you have a solid understanding of data management principles, especially in a Lakehouse architecture. Brush up on your experience with Azure services like Azure Data Factory and Azure Databricks, as well as your proficiency in Python and SQL. Being able to discuss specific projects where you've built data pipelines will really impress the interviewers.
✨Showcase Your Collaboration Skills
Since this role involves working closely with various teams, be prepared to share examples of how you've successfully collaborated in the past. Highlight your ability to communicate effectively with stakeholders and your experience in Agile delivery or DevOps methodologies. This will demonstrate that you're not just a tech whiz but also a team player.
✨Prepare for Technical Questions
Expect technical questions related to data integration, modelling, and ELT pipelines. Review common data engineering challenges and think about how you've tackled similar issues in your previous roles. Practising these scenarios can help you articulate your thought process clearly during the interview.
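One scenario worth rehearsing for such questions (this is a generic illustration, not a question Howden is known to ask) is handling duplicate or late-arriving records in an incremental load, where only the latest version per business key should survive the merge. A minimal hand-rolled sketch in Python:

```python
from datetime import datetime

# Hypothetical change feed: the same policy can arrive more than once,
# and only the most recent version per key should be kept.
updates = [
    {"policy_id": "P-001", "status": "quoted", "updated_at": "2024-01-01T10:00:00"},
    {"policy_id": "P-001", "status": "bound",  "updated_at": "2024-01-02T09:30:00"},
    {"policy_id": "P-002", "status": "quoted", "updated_at": "2024-01-01T11:15:00"},
]

def latest_per_key(rows, key="policy_id", ts="updated_at"):
    """Keep only the most recent row per business key: a hand-rolled
    equivalent of a MERGE/upsert into a lakehouse table."""
    best = {}
    for row in rows:
        when = datetime.fromisoformat(row[ts])
        if row[key] not in best or when > datetime.fromisoformat(best[row[key]][ts]):
            best[row[key]] = row
    return list(best.values())

current = latest_per_key(updates)
print([(r["policy_id"], r["status"]) for r in current])
# [('P-001', 'bound'), ('P-002', 'quoted')]
```

In an interview you might then contrast this with how the same deduplication is expressed declaratively, for instance as a windowed `ROW_NUMBER()` filter in SQL or a `MERGE INTO` on a Delta table.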
✨Understand the Business Context
Familiarise yourself with the insurance industry and Howden's specific business model. Being able to discuss how your data engineering skills can drive insights and support underwriting analytical capabilities will show that you understand the bigger picture and are genuinely interested in contributing to the company's success.