Senior Data Engineer

Guildford | Full-Time | £54,000 - £84,000 / year (est.) | No home office possible

At a Glance

  • Tasks: Join our team to develop data pipelines and innovative solutions for advanced pricing.
  • Company: Allianz Commercial is a global leader in insurance, fostering innovation and collaboration.
  • Benefits: Enjoy a dynamic work environment with opportunities for remote work and professional growth.
  • Why this job: Make a real impact by optimising data-driven pricing strategies in a collaborative international team.
  • Qualifications: 8+ years of SQL/PySpark experience, degree in a numerical field, and strong problem-solving skills required.
  • Other info: Opportunity to learn machine learning and work with cutting-edge cloud technologies.

The predicted salary is between £54,000 and £84,000 per year.

We are looking for a Senior Data Engineer (m/f/d), based in London/Munich.

Your Team

The Pricing Data Engineering & Automation team is part of the Global Pricing department at Allianz Commercial and is responsible for driving the development of data solutions within Pricing across Allianz Commercial globally. You will join an international department located across London, Munich, Bucharest, Chicago and New York.

The Impact You Will Have

Our global Pricing Data Engineering & Automation team is seeking an experienced Senior Data Engineer for a role within the Pricing Function. In this role, you will work closely with the Head of Pricing Data Engineering & Automation and other key stakeholders to interrogate data and implement pipelines across the various Lines of Business and regions in Allianz Commercial to support advanced pricing.

We are looking for someone who is willing to get hands-on with the company’s data and to develop innovative, structured solutions that bring it together in a way that maximises its potential. This person will need to collaborate internationally with various stakeholders, both within the Pricing Function and across other Allianz Commercial functions.

Some of your specific responsibilities could include:

  • You will develop and maintain data pipelines within our Spark-based data platform to enhance the capabilities of the Pricing function

  • You will drive the design and implementation of solutions that contribute to Global Pricing's data-driven pricing approach, using internal and external data and optimising fuzzy merge processes to release the intrinsic value of the data

  • You will collaborate with Data Engineers, Pricing Actuaries and Predictive Modellers to deliver innovative data solutions for data-driven pricing

  • You will develop knowledge of the company’s IT landscape, data and data systems

  • You will implement solutions that adhere to the Data Engineering team's best practices for continuous improvement and participate in code reviews

  • You will act as an expert to support the wider Pricing team on data processing and coding best practices

  • You will contribute to a culture of results-driven collaboration, support and respect

  • You will support the development of reporting dashboards and applications where necessary
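
The pipeline responsibilities above follow a familiar extract-transform-load pattern: pull data from multiple sources, clean it, and join it into a single view. As a rough, stdlib-only sketch of that pattern (the real platform is Spark-based; the field names policy_id, premium and claim_count are invented for illustration):

```python
import csv
import io

# Two hypothetical source extracts; all field names are invented for illustration.
policies_csv = """policy_id,region,premium
P001,UK,1200
P002,DE, 950
"""
claims_csv = """policy_id,claim_count
P001,2
P002,0
"""

def extract(text):
    """Parse a CSV extract into a list of dicts, trimming stray whitespace."""
    return [
        {k.strip(): v.strip() for k, v in row.items()}
        for row in csv.DictReader(io.StringIO(text))
    ]

def transform_and_join(policies, claims):
    """Fix up types and join the two sources on policy_id."""
    claims_by_id = {c["policy_id"]: c for c in claims}
    merged = []
    for p in policies:
        c = claims_by_id.get(p["policy_id"], {})
        merged.append({
            "policy_id": p["policy_id"],
            "region": p["region"],
            "premium": float(p["premium"]),
            "claim_count": int(c.get("claim_count", 0)),
        })
    return merged

rows = transform_and_join(extract(policies_csv), extract(claims_csv))
```

In a PySpark pipeline the same steps would typically be DataFrame reads, column cleaning and a `join` on the key, but the shape of the work is the same.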

What You’ll Bring to the Role

Essential Skills & Experience:

  • You have a minimum of 8 years' experience using SQL and/or PySpark for data-focused solutions, ideally within an insurance environment

  • You have significant experience in SQL and strong experience using PySpark to build data pipelines

  • You have a BSc- or MSc-level degree in a numerical field, preferably with a strong focus on Computer Science or Data Engineering, or are qualified by experience

  • You have experience in data pipeline/ETL (Extract, Transform, Load) development, cleaning and bringing together multiple sources into a single production warehouse

  • You have experience in analysing, debugging and solving highly complex problems

  • You have experience taking ownership of your own learning and development

  • You have knowledge of engineering principles such as automated testing, code abstraction and performance optimisation

  • You have experience using cloud-based solutions (e.g. Palantir Foundry, AWS, Azure, GCP, Snowflake)

  • You have knowledge of P&C insurance, ideally with some experience of working alongside Pricing teams

  • You should be able to integrate AI-driven insights into your work to optimize outcomes and support broader organizational goals
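
The "fuzzy merge" work mentioned in the responsibilities usually means joining records whose keys do not match exactly, for example the same account name spelled differently across sources. A minimal stdlib sketch of the idea (the account names, suffix list and similarity threshold are all invented for illustration; the real pipelines would run on Spark):

```python
import difflib
import re

# Invented account names from two hypothetical sources.
SOURCE_A = ["Acme Manufacturing Ltd", "Globex Corporation", "Initech PLC"]
SOURCE_B = ["ACME Manufacturing Limited", "Globex Corp.", "Umbrella Group"]

SUFFIXES = {"ltd", "limited", "plc", "corp", "corporation", "inc", "group"}

def normalise(name):
    """Lowercase, strip punctuation, and drop common legal suffixes."""
    tokens = re.sub(r"[^\w\s]", "", name.lower()).split()
    return " ".join(t for t in tokens if t not in SUFFIXES)

def fuzzy_merge(left, right, threshold=0.85):
    """Pair each left-hand name with its best right-hand match above the threshold."""
    pairs = {}
    for a in left:
        best, best_score = None, 0.0
        for b in right:
            score = difflib.SequenceMatcher(None, normalise(a), normalise(b)).ratio()
            if score > best_score:
                best, best_score = b, score
        if best_score >= threshold:
            pairs[a] = best
    return pairs
```

Most of the value in such a merge comes from the normalisation step; the similarity scoring only has to resolve what cleaning alone cannot.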

Nice-to-have skills (that can also be learned on the job!):

  • You have experience in one or more other programming languages (e.g. Python, Java, C#)

  • You have some understanding of Machine Learning and predictive modelling techniques

  • You have some experience producing data visualisations for stakeholders

79924 | IT & Tech Engineering | Professional | Non-Executive | Allianz Commercial | Full-Time | Permanent

Senior Data Engineer employer: Allianz Popular SL.

Allianz Commercial is an exceptional employer, offering a dynamic and collaborative work culture that fosters innovation and professional growth. As part of a global team based in vibrant cities like London and Munich, employees benefit from diverse perspectives and the opportunity to work on cutting-edge data solutions that drive impactful pricing strategies. With a strong commitment to continuous improvement and employee development, Allianz provides a supportive environment where your contributions are valued and your career can flourish.

Contact Detail:

Allianz Popular SL. Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Senior Data Engineer role

✨Tip Number 1

Familiarise yourself with the specific technologies mentioned in the job description, such as SQL and PySpark. Consider working on personal projects or contributing to open-source projects that utilise these technologies to demonstrate your hands-on experience.

✨Tip Number 2

Network with current employees or professionals in the data engineering field, especially those who work in insurance or pricing. Engaging with them on platforms like LinkedIn can provide insights into the company culture and expectations, which can be invaluable during interviews.

✨Tip Number 3

Prepare to discuss your experience with data pipelines and ETL processes in detail. Be ready to share specific examples of challenges you've faced and how you overcame them, as this will showcase your problem-solving skills and technical expertise.

✨Tip Number 4

Stay updated on the latest trends in data engineering and AI-driven insights. Being knowledgeable about current advancements can help you stand out as a candidate who is not only qualified but also passionate about the field.

We think you need these skills to ace the Senior Data Engineer role

SQL
PySpark
Data Pipeline Development
ETL (Extract, Transform, Load)
Data Cleaning
Problem-Solving Skills
Cloud-Based Solutions (e.g. AWS, Azure, GCP, Snowflake)
Automated Testing
Code Abstraction
Performance Optimization
Collaboration Skills
Data Analysis
Knowledge of P&C Insurance
AI Integration
Programming Languages (e.g. Python, Java, C#)
Machine Learning Understanding
Data Visualisation

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with SQL and PySpark, especially in data pipeline development. Emphasise any relevant projects or roles that showcase your ability to work with data in an insurance environment.

Craft a Compelling Cover Letter: In your cover letter, explain why you are passionate about data engineering and how your skills align with the responsibilities outlined in the job description. Mention your experience with cloud-based solutions and your collaborative approach to problem-solving.

Showcase Relevant Projects: If you have worked on specific projects that involved data pipelines, ETL processes, or collaboration with pricing teams, include these in your application. Use quantifiable results to demonstrate your impact.

Highlight Continuous Learning: Mention any ongoing learning or certifications related to data engineering, cloud technologies, or machine learning. This shows your commitment to professional development and staying current in the field.

How to prepare for a job interview at Allianz Popular SL.

✨Showcase Your Technical Skills

Be prepared to discuss your experience with SQL and PySpark in detail. Highlight specific projects where you've built data pipelines or solved complex problems, especially in an insurance context.

✨Demonstrate Collaboration Experience

Since the role involves working with various stakeholders, share examples of how you've successfully collaborated with teams in the past. Emphasise your ability to communicate technical concepts to non-technical colleagues.

✨Understand the Company’s Data Landscape

Research Allianz Commercial's data systems and IT landscape before the interview. Being knowledgeable about their tools and processes will show your genuine interest in the role and help you ask insightful questions.

✨Prepare for Problem-Solving Questions

Expect to face technical challenges during the interview. Practice explaining your thought process when debugging or optimising data pipelines, as this will demonstrate your analytical skills and problem-solving abilities.

