We are seeking a Senior Data Engineer to join our dynamic Tech teams. The ideal candidate is a self-motivated, problem-solving-oriented individual with strong analytical thinking. You will work as part of a team to build, enhance, and support the SCOR Data ecosystem (SCOR Data Platform, data warehouses…), including but not restricted to the Syndicate Data Hub. The Senior Data Engineer will promote good data hygiene across the Syndicate and the wider SCOR group.
Responsibilities
Under the responsibility of a Data Architect, your mission will be to:
- Build, deliver, test, and maintain data artefacts such as data pipelines, datasets, cubes, models, and services (APIs) to serve data distribution, following standard best practices and state-of-the-art approaches (testing, reconciliation, and documenting changes are key parts of the role).
- Document data artefacts (code, diagrams, wiki-like documentation) to keep your work comprehensible and complete.
- Collaborate within and beyond your squad by participating in workshops and rituals (dailies, sprint reviews, design sessions) and promoting good practices across the SCOR group.
- Support the delivery of trustworthy data pipelines that serve both transactional and analytical needs by implementing adequate tests and controls, and by monitoring scheduled data outputs and ad hoc inputs when required.
- Review and coach data engineers, supporting their growth.
- Contribute to the ICS (Internal Control System) and support audits when required.
Qualifications
- Several years of experience as a data engineer and a data-oriented mindset
- Proven experience in the development and maintenance of data pipelines, preferably in agile projects (Scrum and/or Kanban)
- Good knowledge of the Lloyd's of London market
Technical Skills:
- Strong command of T-SQL and Azure Data Factory; ability to develop data pipelines across various platforms using good practices such as parallelization, distributed programming techniques, dimensional modelling, slowly changing dimensions, change data capture management…
- Good knowledge of CI/CD pipelines and Git flow best practices
- Good knowledge of Power BI / MDX / DAX
- Good knowledge of Python and PySpark
- Experience with Databricks and/or Palantir Foundry is a strong plus
- Knowledge of REST API development is a plus
Behavioral & Management Skills:
- Strong analytical thinking; a rigorous, solution-oriented mindset; proactive in proposing solutions
- Team player with commitment, curiosity, and a willingness to take on challenges
- Ability to navigate an international environment
- Communication and people skills: able to speak to a wide community of stakeholders (business, data, IT) and understand their needs
Required Education
- Bachelor's degree in computer science, software or computer engineering, applied maths, physics, statistics, or a related field, or equivalent experience
Contact Detail:
SCOR Recruiting Team