At a Glance
- Tasks: Design and validate data models for enterprise initiatives in the insurance sector.
- Company: Join a leading firm in the London Market insurance industry.
- Benefits: Enjoy hybrid work options and competitive pay up to £644 per day.
- Why this job: Be part of a dynamic team shaping data strategies with real impact.
- Qualifications: Experience in data modelling and London Market insurance is essential.
- Other info: Contract role running until 31/12/2025, ideal for tech-savvy individuals.
The predicted salary is between £50,000 and £90,000 per year.
We are seeking an experienced Data Modeller with proven expertise in the London Market insurance sector. The successful candidate will play a key role in designing and validating data models that support enterprise data initiatives. This includes working closely with data engineers, architects, and business stakeholders to ensure data structures are scalable, accurate, and aligned with business needs.
Key Responsibilities:
- Design and maintain conceptual, logical, and physical data models to support reporting, analytics, and operational systems (a hypothetical SQL sketch follows this list).
- Collaborate with data engineers and analysts to ensure models are implemented correctly and efficiently.
- Translate complex business requirements into scalable and maintainable data structures.
- Ensure data models comply with data governance, compliance, and London Market regulatory standards.
- Document data definitions, relationships, and lineage using industry-standard modelling tools.
- Support data quality initiatives by identifying gaps and inconsistencies in source systems and downstream usage.
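To make the first responsibility concrete, below is a minimal physical-model sketch in SQL. The table and column names (policy, premium_transaction, umr, risk_code, and so on) are illustrative assumptions for a London Market book, not a prescribed schema.

```sql
-- Hypothetical physical model sketch; names, types, and constraints are
-- illustrative assumptions, not a prescribed London Market schema.
CREATE TABLE policy (
    policy_id        BIGINT      NOT NULL PRIMARY KEY,
    umr              VARCHAR(17) NOT NULL,  -- Unique Market Reference
    inception_date   DATE        NOT NULL,
    expiry_date      DATE        NOT NULL,
    risk_code        CHAR(2)     NOT NULL,  -- Lloyd's risk code
    syndicate_number CHAR(4)     NOT NULL
);

CREATE TABLE premium_transaction (
    transaction_id BIGINT         NOT NULL PRIMARY KEY,
    policy_id      BIGINT         NOT NULL REFERENCES policy (policy_id),
    booked_date    DATE           NOT NULL,
    gross_premium  DECIMAL(18, 2) NOT NULL,
    currency_code  CHAR(3)        NOT NULL  -- ISO 4217
);
```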
Qualifications:
- London Market insurance experience is essential, including familiarity with market data structures and regulatory reporting.
- Strong experience in data modelling (conceptual, logical, physical) using tools such as Erwin, ER/Studio, or dbt.
- Solid understanding of data warehousing, data lakes, and enterprise data architecture.
- Proficiency in SQL and experience working with cloud data platforms (e.g., Azure, AWS, GCP); see the example query after this list.
- Familiarity with data governance frameworks, metadata management, and data cataloging tools.
- Excellent communication and documentation skills, with the ability to explain complex data concepts to non-technical stakeholders.
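As a flavour of the SQL the role calls for, the sketch below is a simple data quality check against the hypothetical tables above: it surfaces premium transactions whose policy reference cannot be resolved, the kind of gap a source extract without enforced referential integrity can accumulate.

```sql
-- Hypothetical data quality check against the illustrative tables above:
-- list premium transactions that reference a policy that does not exist
-- (useful on source extracts where no foreign key is enforced).
SELECT t.transaction_id,
       t.policy_id,
       t.booked_date
FROM premium_transaction AS t
LEFT JOIN policy AS p
       ON p.policy_id = t.policy_id
WHERE p.policy_id IS NULL;
```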
Preferred Skills:
- Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems.
- Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools (a brief Delta Lake sketch follows this list).
- Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps.
- Understanding of regulatory data requirements such as Solvency II, Core Data Record (CDR), or Blueprint Two.
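For the Delta Lake point, a Spark SQL sketch is shown below: it registers the illustrative premium table as a partitioned Delta table. `USING DELTA` and `PARTITIONED BY` are standard Spark SQL for Delta Lake; the table name, columns, and partition choice are assumptions carried over from the sketch above.

```sql
-- Hypothetical Spark SQL sketch: register the illustrative premium table
-- as a Delta Lake table so downstream models get ACID semantics and
-- time travel. Table and column names are assumptions.
CREATE TABLE IF NOT EXISTS premium_transaction_delta (
    transaction_id BIGINT,
    policy_id      BIGINT,
    booked_date    DATE,
    gross_premium  DECIMAL(18, 2),
    currency_code  STRING
)
USING DELTA
PARTITIONED BY (booked_date);
```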
All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
Data Modeller employer: Undisclosed
Contact Detail:
Undisclosed Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Modeller role
✨Tip Number 1
Make sure to highlight your experience in the London Market insurance sector during any networking opportunities. Attend industry events or webinars where you can connect with professionals in this field, as personal connections can often lead to job opportunities.
✨Tip Number 2
Familiarise yourself with the specific data modelling tools mentioned in the job description, such as Erwin or dbt. Consider taking a short online course or tutorial to brush up on these tools, which will not only boost your confidence but also demonstrate your commitment to the role.
✨Tip Number 3
Engage with relevant online communities or forums focused on data modelling and the London Market insurance sector. Sharing insights and asking questions can help you stay updated on industry trends and may even lead to referrals for job openings.
✨Tip Number 4
Prepare to discuss your understanding of data governance frameworks and regulatory requirements like Solvency II during interviews. Being able to articulate how you have applied these concepts in past roles will set you apart from other candidates.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience in the London Market insurance sector. Emphasise your data modelling skills and any relevant tools you've used, such as Erwin or dbt.
Craft a Strong Cover Letter: Write a cover letter that specifically addresses the key skills mentioned in the job description. Explain how your background aligns with the requirements, particularly your experience with data governance and regulatory standards.
Showcase Relevant Projects: If you have worked on projects related to data modelling or the insurance sector, include these in your application. Detail your role and the impact of your work on the project's success.
Proofread Your Application: Before submitting, carefully proofread your application for any errors or inconsistencies. A well-presented application reflects your attention to detail, which is crucial for a Data Modeller.
How to prepare for a job interview at Undisclosed
✨Showcase Your London Market Experience
Make sure to highlight your experience in the London Market insurance sector during the interview. Discuss specific projects or roles where you designed data models that comply with regulatory standards, as this is crucial for the role.
✨Demonstrate Technical Proficiency
Be prepared to discuss your experience with data modelling tools like Erwin or dbt. You might be asked to explain how you've used these tools to create conceptual, logical, and physical data models, so have examples ready.
✨Communicate Complex Concepts Simply
Since you'll need to explain complex data structures to non-technical stakeholders, practice simplifying your explanations. Use analogies or straightforward language to convey your points clearly during the interview.
✨Prepare for Scenario-Based Questions
Expect scenario-based questions that assess your problem-solving skills. Think of situations where you identified data quality issues or gaps in source systems, and be ready to discuss how you addressed them effectively.