At a Glance
- Tasks: Design and develop data pipelines using GCP, BigQuery, and Tableau for impactful solutions.
- Company: Join Ki, a revolutionary tech company disrupting the insurance market with cutting-edge algorithms.
- Benefits: Enjoy competitive salary, health benefits, remote work options, and opportunities for professional growth.
- Why this job: Be part of a fast-growing team making a real difference in the insurance industry.
- Qualifications: Experience in data engineering, Python, SQL, and cloud technologies is essential.
- Other info: Dynamic, agile environment with excellent career advancement opportunities.
The predicted salary is between £36,000 and £60,000 per year.
Who are we? Look at the latest headlines and you will see something Ki insures. Think space shuttles, world tours, wind farms, and even footballers’ legs. Ki’s mission is simple: digitally disrupt and revolutionise a 335-year-old market. Working with Google and UCL, Ki has created a platform that uses algorithms, machine learning, and large language models to give insurance brokers quotes in seconds, rather than days. Ki is proudly the biggest global algorithmic insurance carrier and the fastest growing syndicate in the Lloyd's of London market, being the first ever to make $100m in profit in 3 years. Ki’s teams have varied backgrounds and work together in an agile, cross-functional way to build the very best experience for its customers. Ki has big ambitions but needs more excellent minds to challenge the status quo and help it reach new horizons.
Where do you come in? While our broker platform is the core technology crucial to Ki's success, this role will focus on supporting the middle/back-office operations that will lay the foundations for further and sustained success. We’re a multi-disciplined team, bringing together expertise in software and data engineering, full stack development, platform operations, algorithm research, and data science. Our squads focus on delivering high-impact solutions, favouring a highly iterative, analytical approach.
You will design and develop complex data processing modules and reporting using BigQuery and Tableau. You will also work closely with the Ki Infrastructure/Platform Team, which is responsible for architecting and operating the core of the Ki Data Analytics platform.
What you will be doing:
- Work with the business teams (initially finance and actuarial), data scientists, and engineers to design, build, optimise, and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery.
- Work with finance, actuaries, data scientists, and engineers to understand how we can make best use of new internal and external data sources.
- Work with our delivery partners at EY/IBM to ensure robustness of design and engineering of the data model/MI and reporting which can support our ambitions for growth and scale.
- BAU ownership of data models, reporting, and integrations/pipelines.
- Create frameworks, infrastructure, and systems to manage and govern Ki’s data asset.
- Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schema, reporting, etc.
- Work with the broader Engineering community to develop our data and MLOps capability infrastructure.
- Ensure data quality, governance, and compliance with internal and external standards.
- Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy.
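The data-quality and monitoring duties above can be sketched as a simple row-level validation gate. This is purely illustrative, not Ki's actual code: the field names, rules, and threshold below are hypothetical examples of the kind of check that might run before a pipeline load.

```python
# Illustrative data-quality gate for a pipeline load step.
# All field names and thresholds here are hypothetical examples.

def validate_rows(rows, required_fields=("policy_id", "premium")):
    """Split rows into valid and rejected, with a reason per rejection."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            rejected.append((row, f"missing fields: {missing}"))
        elif row["premium"] < 0:
            rejected.append((row, "negative premium"))
        else:
            valid.append(row)
    return valid, rejected

def quality_gate(rows, max_reject_rate=0.05):
    """Fail the load if too large a share of rows is rejected."""
    valid, rejected = validate_rows(rows)
    rate = len(rejected) / len(rows) if rows else 0.0
    if rate > max_reject_rate:
        raise ValueError(f"reject rate {rate:.1%} exceeds threshold")
    return valid
```

In a real GCP pipeline a check like this would typically run inside an orchestrator task (for instance Dagster, which the posting mentions) before results are written to BigQuery.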
Experience:
- Designing data models and developing industrialised data pipelines.
- Strong knowledge of database and data lake systems.
- Hands-on experience with BigQuery, dbt, and GCP Cloud Storage.
- Proficient in Python, SQL, and Terraform.
- Knowledge of Cloud SQL, Airbyte, Dagster.
- Comfortable with shell scripting with Bash or similar.
- Experience provisioning new infrastructure in a leading cloud provider, preferably GCP.
- Proficient with Tableau Cloud for data visualisation and reporting.
- Experience creating DataOps pipelines.
- Comfortable working in an Agile environment, actively participating in approaches such as Scrum or Kanban.
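To make the pipeline-engineering requirements concrete, here is a minimal sketch of an idempotent incremental load, the pattern behind "industrialised data pipelines". It runs against SQLite purely so it is self-contained; in the stack described above the same upsert would be expressed as a BigQuery MERGE, often managed via dbt. The table and column names are invented for illustration.

```python
import sqlite3

def incremental_load(conn, batch):
    """Upsert a batch of (id, premium) records — safe to re-run (idempotent).

    Hypothetical target table; in practice this would be a BigQuery table,
    with MERGE replacing SQLite's ON CONFLICT upsert syntax.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS policies (id INTEGER PRIMARY KEY, premium REAL)"
    )
    conn.executemany(
        "INSERT INTO policies (id, premium) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET premium = excluded.premium",
        batch,
    )
    conn.commit()
```

Running the same batch twice leaves the table unchanged, which is what makes retries safe in an orchestrated pipeline.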
Desirable Skills:
- Experience with streaming data systems and frameworks.
- Experience working in a regulated industry, especially financial services.
- Experience creating MLOps pipelines.
Interview Process:
- Initial recruiter screening call.
- Interview with hiring manager.
- Technical Interview (this may vary depending on the role).
- Values Interview.
GCP Data Engineer employer: Ki
Contact Detail:
Ki Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the GCP Data Engineer role
✨Tip Number 1
Get to know Ki and its mission! Research the company’s values and recent projects. This will help you tailor your conversations during interviews and show that you're genuinely interested in being part of their journey.
✨Tip Number 2
Network like a pro! Connect with current employees on LinkedIn or attend industry events. A friendly chat can sometimes lead to insider tips or even a referral, which can give you a leg up in the hiring process.
✨Tip Number 3
Prepare for technical interviews by brushing up on your skills in BigQuery, Python, and SQL. Practise coding challenges and be ready to discuss your past projects. Confidence in your technical abilities can really set you apart!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, it shows you’re serious about joining the team at Ki and ready to contribute to their exciting mission.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the GCP Data Engineer role. Highlight your experience with BigQuery, Python, and data pipelines. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and how you can contribute to Ki's mission. Let us know why you're excited about this opportunity and how you can help us disrupt the insurance market.
Showcase Your Projects: If you've worked on relevant projects, make sure to include them in your application. Whether it's data models or MLOps pipelines, we love seeing real examples of your work that demonstrate your expertise.
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It helps us keep track of applications and ensures you’re considered for the role. Don’t miss out on this opportunity!
How to prepare for a job interview at Ki
✨Know Your Tech Inside Out
Make sure you brush up on your knowledge of GCP, BigQuery, and the tools mentioned in the job description. Be ready to discuss your experience with data pipelines, SQL, and Python. The more specific examples you can provide, the better!
✨Understand Ki's Mission
Familiarise yourself with Ki’s mission and how they’re disrupting the insurance market. This will help you align your answers with their goals and show that you’re genuinely interested in being part of their journey.
✨Prepare for Technical Questions
Expect technical questions during the interview, especially around data modelling and pipeline development. Practise explaining your thought process clearly and concisely, as this will demonstrate your problem-solving skills and technical expertise.
✨Show Your Agile Mindset
Since Ki values an Agile approach, be prepared to discuss your experience working in Agile environments. Share examples of how you've contributed to team success through collaboration and iterative processes, as this will resonate well with their culture.