At a Glance
- Tasks: Build and maintain scalable data pipelines while ensuring data quality and governance.
- Company: Join Artificial, a pioneering tech company transforming the specialty insurance market.
- Benefits: Enjoy private medical insurance, generous holiday allowance, and stock options.
- Why this job: Make a real impact in a collaborative environment with cutting-edge technology.
- Qualifications: Fluency in Python and SQL, with experience in data pipelines and analytical databases.
- Other info: Be part of a diverse team that values curiosity and continuous learning.
The predicted salary is between £28,800 and £48,000 per year.
About Artificial
Help shape the future of specialty insurance. At Artificial, we’re building the next generation of technology for the specialty (re)insurance market. Our mission is to transform how brokers and carriers operate in complex markets by removing operational barriers and enabling smarter, faster decision‑making.

We use modern technology to solve real challenges for some of the world’s leading brokers and insurers. By automating the repetitive and structuring the complex, we help our partners unlock new opportunities for innovation and growth.

You’ll be joining a collaborative team that values curiosity, ownership, and continuous learning. We work in an environment where ideas are heard, support is built‑in, and outcomes matter. Everyone here has the chance to make a tangible impact on our products, our customers, and the industry.
About the role
- Build and maintain scalable data pipelines (batch and streaming)
- Define data contracts for internal and external consumers
- Own data quality and governance: monitoring, alerting, lineage, documentation
- Create the core metrics layer (e.g. speed, conversion, pricing) that drives platform insights
- Optimize query performance and pipeline costs as data volume grows
About you
- You’re fluent in Python and SQL
- You’ve built, shipped and operated data pipelines in production (Airflow, Dagster, etc.)
- You’ve built and maintained dbt models in production
- You’ve worked with analytical databases (ClickHouse, Snowflake, BigQuery, DuckDB)
- You care deeply about data quality
- You’re self‑directed: you identify what to work on rather than waiting for tickets
- You’re proactive and comfortable being part of a small, autonomous team
- Bonus: you know a bit of Haskell (our core platform is written in it)
We especially want to hear from you if you have:
- Collaborative skills with an emphasis on product quality.
- Experience in insurtech, insurance or related industries.
- Strong problem‑solving skills.
- Experience in a distributed work environment.
Benefits
- Private medical insurance
- Income protection insurance
- Life insurance of 4 × base salary
- On‑site gym and shower facilities
- Enhanced maternity and paternity pay
- Team social events and company parties
- Salary exchange on pension and nursery fees
- Access to Maji, the financial wellbeing platform
- Company stock options managed through Ledgy
- Milestone Birthday Bonus and a Life Events leave policy
- Generous holiday allowance of 28 days plus national holidays
- Home office and equipment allowance, and a company MacBook
- Learning allowance and leave to attend conferences or take exams
- YuLife employee benefits, including EAP and bereavement helplines
- For each new hire, we plant a tree through our partnership with Ecologi
- The best coffee machine in London, handmade in Italy and imported just for us!
We’re proud to be an equal opportunities employer and are committed to building a team that reflects the diverse communities around us. If there’s anything you need to make the hiring process more accessible, just let us know; we’re happy to make adjustments. You’re also welcome to share your preferred pronouns with us at any point.

Think you don’t meet every requirement? Please apply anyway. We value potential as much as experience, and we know that raw talent counts.

As part of our hiring process, we’ll carry out some background checks. These may include a criminal record check, reviewing your credit history, speaking with previous employers and confirming your academic qualifications.
Employer: Artificial Labs Limited
Contact: Artificial Labs Limited Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data pipelines and projects. This is your chance to demonstrate your expertise in Python, SQL, and any tools you've used like Airflow or dbt. Make it easy for potential employers to see what you can do!
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and scenarios. Think about how you can discuss your experience with data quality and governance. Practice makes perfect, so get a friend to do mock interviews with you!
✨Tip Number 4
Don't forget to apply through our website! We love seeing applications directly from candidates who are excited about joining us at Artificial. Plus, it shows you're genuinely interested in being part of our team and mission.
Some tips for your application 🫡
Show Your Passion for Data: When writing your application, let us see your enthusiasm for data engineering! Share specific examples of projects you've worked on and how they relate to the role. We love seeing candidates who are genuinely excited about transforming data into insights.
Tailor Your CV and Cover Letter: Make sure to customise your CV and cover letter for the Data Engineer position. Highlight your experience with Python, SQL, and any relevant tools like Airflow or dbt. We want to know how your skills align with our mission at Artificial!
Be Clear and Concise: Keep your application clear and to the point. Use bullet points where possible to make it easy for us to read. We appreciate straightforward communication, so don’t be afraid to show us your personality while keeping it professional.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it shows us that you’re proactive and keen to join our team!
How to prepare for a job interview at Artificial Labs Limited
✨Know Your Tech Inside Out
Make sure you’re well-versed in Python, SQL, and the tools mentioned in the job description like Airflow and dbt. Brush up on your experience with analytical databases too. Being able to discuss your past projects and how you’ve built and maintained data pipelines will show that you’re the right fit.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled complex data challenges in the past. Think about specific instances where you improved data quality or optimised query performance. This will demonstrate your proactive nature and ability to contribute to the team’s goals.
✨Emphasise Collaboration and Communication
Since the role involves working in a collaborative environment, be ready to discuss how you’ve worked with others in previous roles. Highlight any experiences where you’ve contributed to product quality or supported team members, as this aligns with their values.
✨Ask Insightful Questions
Prepare thoughtful questions about the company’s mission, the technology stack, and how they measure success. This shows your genuine interest in the role and helps you assess if the company culture is a good fit for you.