At a Glance
- Tasks: Design and develop data processing modules using BigQuery and Tableau.
- Company: Join Ki, the fastest-growing algorithmic insurance carrier in London.
- Benefits: Competitive pay, flexible working, and opportunities for professional growth.
- Why this job: Be part of a revolutionary team transforming the insurance industry with cutting-edge technology.
- Qualifications: Experience in data modelling, Python, SQL, and GCP required.
- Other info: Dynamic, agile environment with a focus on innovation and collaboration.
The predicted salary is between £36,000 and £60,000 per year.
Who are we? Look at the latest headlines and you will see something Ki insures. Think space shuttles, world tours, wind farms, and even footballers' legs. Ki's mission is simple: digitally transform and revolutionise a 335-year-old market. Working with Google and UCL, Ki has created a platform that uses algorithms, machine learning and large language models to give insurance brokers quotes in seconds, rather than days. Ki is proudly the biggest global algorithmic insurance carrier. It's the fastest-growing syndicate in the Lloyd's of London market, and the first ever to make $100m in profit within three years. Ki's teams have varied backgrounds and work together in an agile, cross-functional way to build the very best experience for its customers. Ki has big ambitions but needs more excellent minds to challenge the status quo and help it reach new horizons.
Where do you come in? While our broker platform is the core technology crucial to Ki's success, this role will focus on supporting the middle/back-office operations that will lay the foundations for further and sustained success. We're a multidisciplinary team, bringing together expertise in software and data engineering, full-stack development, platform operations, algorithm research, and data science. Our squads focus on delivering high-impact solutions, and we favour a highly iterative, analytical approach. You will be designing and developing complex data processing modules and reporting using BigQuery and Tableau. You will also work closely with the Ki Infrastructure/Platform Team, which is responsible for architecting and operating the core of the Ki Data Analytics platform.
What you will be doing:
- Work with business teams (finance and actuarial initially), data scientists, and engineers to design, build, optimise, and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery (a minimal pipeline sketch follows this list).
- Work with finance, actuaries, data scientists and engineers to understand how we can make best use of new internal and external data sources.
- Work with our delivery partners at EY/IBM to ensure robust design and engineering of the data model, MI, and reporting that can support our ambitions for growth and scale.
- Take business-as-usual (BAU) ownership of data models, reporting, and integrations/pipelines.
- Create frameworks, infrastructure and systems to manage and govern Ki's data asset.
- Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schemas, reporting, etc.
- Work with the broader Engineering community to develop our data and MLOps capabilities and infrastructure.
- Ensure data quality, governance, and compliance with internal and external standards.
- Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy.
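To give a flavour of this kind of work, here is a minimal sketch of a batch load into BigQuery using the google-cloud-bigquery Python client. The project, bucket, dataset, and table names are illustrative placeholders, not Ki's actual infrastructure, and a production pipeline would add orchestration (e.g. Dagster), transformation (e.g. dbt), and monitoring around a raw load like this.

```python
from google.cloud import bigquery

# Minimal sketch: load a daily CSV extract from Cloud Storage into BigQuery.
# All project, bucket, dataset, and table names are hypothetical examples.
client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    autodetect=True,      # infer the schema from the file contents
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/finance/daily_extract.csv",
    "example-project.finance.daily_extract",
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table("example-project.finance.daily_extract")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```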
Requirements:
- Experience designing data models and developing industrialised data pipelines.
- Strong knowledge of database and data lake systems.
- Hands-on experience with BigQuery, dbt, and Google Cloud Storage.
- Proficient in Python, SQL, and Terraform.
- Knowledge of Cloud SQL, Airbyte, and Dagster.
- Comfortable with shell scripting in Bash or similar.
- Experience provisioning new infrastructure in a leading cloud provider, preferably GCP.
- Proficient with Tableau Cloud for data visualisation and reporting.
- Experience creating DataOps pipelines (an illustrative data-quality check follows this list).
- Comfortable working in an Agile environment, actively participating in approaches such as Scrum or Kanban.
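To illustrate the data-quality and monitoring side of the role mentioned above, the sketch below runs a simple BigQuery check from Python and raises if the latest snapshot looks wrong. The table and column names are assumptions made up for the example, not a real schema.

```python
from google.cloud import bigquery

# Illustrative data-quality gate for a pipeline run: fail loudly if today's
# snapshot is empty or contains null keys. Table and column names are
# hypothetical examples, not a real schema.
client = bigquery.Client()

QUALITY_SQL = """
    SELECT
      COUNT(*) AS row_count,
      COUNTIF(policy_id IS NULL) AS null_policy_ids
    FROM `example-project.finance.daily_extract`
    WHERE snapshot_date = CURRENT_DATE()
"""

row = next(iter(client.query(QUALITY_SQL).result()))

if row.row_count == 0:
    raise RuntimeError("Quality check failed: no rows for today's snapshot")
if row.null_policy_ids > 0:
    raise RuntimeError(f"Quality check failed: {row.null_policy_ids} null policy IDs")

print(f"Quality check passed: {row.row_count} rows, no null policy IDs")
```

A check like this would typically run as a step in an orchestrator such as Dagster, so that downstream reporting only refreshes when the upstream data passes.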
Temporary GCP Data Engineer in London. Employer: Ki
Contact Details:
Ki Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Temporary GCP Data Engineer role in London
✨Tip Number 1
Network like a pro! Reach out to people in the industry, especially those at Ki or similar companies. A friendly chat can open doors and give you insights that job descriptions just can't.
✨Tip Number 2
Show off your skills! If you've got a portfolio or projects that highlight your experience with GCP, BigQuery, or data pipelines, make sure to share them during interviews. It’s all about proving you can walk the walk!
✨Tip Number 3
Prepare for the unexpected! Brush up on your problem-solving skills and be ready to tackle some technical challenges during interviews. They might throw a scenario your way to see how you think on your feet.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, it shows you’re genuinely interested in being part of the Ki team.
Some tips for your application 🫡
Show Your Passion for Data: When you're writing your application, let your enthusiasm for data engineering shine through! Talk about your experiences and projects that relate to the role, especially those involving GCP, BigQuery, and data pipelines. We love seeing candidates who are genuinely excited about transforming data into actionable insights.
Tailor Your CV and Cover Letter: Make sure to customise your CV and cover letter for this specific role. Highlight your relevant skills in Python, SQL, and any experience with Tableau or data modelling. We want to see how your background aligns with our mission at Ki, so don’t hold back on showcasing your achievements!
Be Clear and Concise: Keep your application clear and to the point. Use bullet points where possible to make it easy for us to read through your qualifications and experiences. Remember, we’re looking for someone who can communicate complex ideas simply, so show us you can do that right from the start!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about Ki and what we stand for before you hit 'send'!
How to prepare for a job interview at Ki
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, especially GCP, BigQuery, and Python. Brush up on your knowledge of data models and pipelines, as you might be asked to discuss your experience with these tools in detail.
✨Understand Ki's Mission
Familiarise yourself with Ki’s mission to revolutionise the insurance market. Be prepared to discuss how your skills can contribute to their goals, particularly in optimising data processes and supporting their ambitious growth plans.
✨Prepare for Scenario Questions
Expect scenario-based questions that assess your problem-solving skills. Think about past experiences where you’ve designed or optimised data pipelines, and be ready to explain your thought process and the impact of your work.
✨Show Your Agile Mindset
Since Ki values an Agile approach, be ready to discuss your experience working in Agile environments. Share examples of how you’ve participated in Scrum or Kanban, and how this has helped you deliver high-impact solutions effectively.