At a Glance
- Tasks: Own and scale our data platform, driving analytics and improving workflows.
- Company: Join a leading consulting firm focused on impactful decision-making.
- Benefits: Enjoy competitive salary, health insurance, remote work flexibility, and generous annual leave.
- Why this job: Shape the future of data analytics and make a real impact in business decisions.
- Qualifications: Strong SQL skills, experience with dbt, and a passion for data quality.
- Other info: Be part of an inclusive team that values innovation and collaboration.
The predicted salary is between £36,000 and £60,000 per year.
Every day, somewhere in the world, important decisions are made. Whether it is a private equity company deciding to invest millions into a business or a large corporation implementing a new strategic direction, these decisions impact employees, customers, and other stakeholders. Consulting and private equity firms come to Sapient when they need to discover knowledge to help them make great decisions and succeed in their goals. It is our mission to support them in their discovery of knowledge.
We help our clients find industry experts who can provide their knowledge via interview or survey; we curate this knowledge in a market-leading software platform; and we help clients surface knowledge they already have through expansive knowledge management.
We’re looking for an Analytics Engineer to take ownership of our data platform within our small, high-impact Data & Analytics team. You’ll work closely with the Head of Data & Analytics and a Commercial Data Analyst, playing a key role across the full data lifecycle – from ingestion and modelling through to self-serve analytics and stakeholder delivery. This is a hands‑on, high‑ownership role.
You’ll own our dbt environment and warehouse, shape how data is modelled and used across the business, and help move us towards a scalable, self‑serve analytics culture. You’ll also have the opportunity to explore how AI and modern tooling can improve both internal workflows and how stakeholders interact with data.
Our data stack:
- PostgreSQL data warehouse (AWS-hosted)
- AWS DMS & Fivetran (data ingestion)
- dbt Cloud (Enterprise)
- Tableau Server
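To give a feel for day-to-day work in a stack like this, a dbt model and its tests might look like the sketch below. This is purely illustrative – the table and column names (`stg_orders`, `fct_orders`, `order_id`) are hypothetical and not taken from the posting:

```sql
-- models/fct_orders.sql
-- Analytics-ready fact model built on a staging layer via dbt's ref().
{{ config(materialized='table') }}

select
    order_id,
    customer_id,
    order_date,
    amount_gbp
from {{ ref('stg_orders') }}
where order_date is not null
```

```yaml
# models/schema.yml
# Generic dbt tests enforcing the data-quality expectations the role describes.
version: 2
models:
  - name: fct_orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
```

Running `dbt build` would materialise the model and run both tests, which is the kind of tested, version-controlled workflow the duties below refer to.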
Key duties in this role will include:
- Own and scale the data platform: Own the day‑to‑day health and performance of our dbt Cloud environment and PostgreSQL warehouse. Design, build, and maintain scalable, analytics‑ready data models using best practices (testing, version control, CI/CD). Improve data reliability, performance, and structure as we scale. Drive data quality and governance across the business.
- Build a self‑serve data culture: Own and develop our data dictionary, making it a genuinely useful tool for the business. Define and standardise key metrics, ensuring a clear single source of truth. Enable non‑technical teams to confidently use data through clear documentation and guidance.
- Partner with the business: Work closely with Product, Client Services, and Finance to deliver high‑quality datasets and dashboards. Translate business questions into well‑structured data models and metrics. Replace manual, Excel‑heavy reporting with scalable, automated pipelines. Support product analytics by improving tracking, modelling, and insight generation.
- Improve tooling & ways of working: Collaborate with Engineering to enhance data pipelines and architecture. Continuously improve how we build, test, and deploy data models. Explore and apply AI tools and automation to improve productivity and the analytics experience.
Why this role?
- High ownership: you’ll shape the data platform, not just contribute to it.
- Real impact: your work will directly influence decision‑making across the business.
- Early data team: opportunity to define standards, tooling, and best practices.
- Modern stack: dbt, cloud warehouse, and a strong foundation to build on.
- Room to innovate: especially around AI and automation.
Requirements:
- Strong SQL skills and experience working with dbt.
- Experience building and maintaining analytics‑ready data models in a warehouse environment.
- Experience with a BI tool (Tableau or similar).
- A strong focus on data quality, testing, and reliability.
- Experience owning or contributing to data infrastructure and tooling.
- Clear, confident communication with both technical and non‑technical audiences.
- Self‑motivated and accountable: you take ownership and follow through.
Benefits:
- Tenure Gifts – Vouchers, extra holiday and sabbaticals for each year of employment.
- Health insurance through Vitality.
- Remote Working – Up to 20 days of remote work each year, so you can tailor your work environment to your needs and embrace a change of scenery.
- Employee Assistance Programme – Access to a health and wellbeing service that offers personalised advice and support from specialist teams.
- Enhanced Maternity & Paternity pay.
- Annual Leave – 25 days + bank holidays, which includes a week's closure over the Christmas period to fully reset.
- MyMindPal app – Online support for mental fitness that helps people to stress less and enjoy life more.
- Corporate Events – From quarterly gatherings to our annual winter & summer parties, we love to celebrate, collaborate and have fun together.
We are committed to building an inclusive workplace – did you know that marginalised groups are less likely to apply to jobs unless they meet every requirement listed? If you are interested in the above role, but don’t necessarily tick every box, we encourage you to apply anyway – this role could still be a great match!
Analytics Engineer (Data Platform) employer: proSapient
Contact Detail:
proSapient Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Analytics Engineer (Data Platform) role
✨Tip Number 1
Network like a pro! Reach out to current employees on LinkedIn or attend industry meetups. A friendly chat can give you insider info and maybe even a referral!
✨Tip Number 2
Prepare for the interview by understanding the company’s data stack. Familiarise yourself with PostgreSQL, dbt, and Tableau. Show them you’re not just a fit, but the perfect fit!
✨Tip Number 3
Don’t just talk about your skills; demonstrate them! Bring examples of your past work, especially any projects that involved data modelling or analytics. Let your experience shine!
✨Tip Number 4
Apply through our website! It’s the best way to ensure your application gets seen. Plus, it shows you’re genuinely interested in being part of our team at StudySmarter.
Some tips for your application 🫡
Show Off Your Skills: When you're writing your application, make sure to highlight your SQL skills and experience with dbt. We want to see how you've built and maintained analytics-ready data models in the past, so don’t hold back!
Tailor Your Application: Make your application stand out by tailoring it to our job description. Mention specific tools like PostgreSQL, Tableau, and any experience you have with AI and automation. This shows us you’ve done your homework and are genuinely interested.
Be Clear and Concise: We appreciate clear communication, especially when it comes to technical details. Keep your application straightforward and avoid jargon unless necessary. Remember, we want to understand your experience without getting lost in the details!
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about what we do at StudySmarter.
How to prepare for a job interview at proSapient
✨Know Your Data Stack
Familiarise yourself with the tools mentioned in the job description, like PostgreSQL, dbt, and Tableau. Be ready to discuss how you've used these technologies in past projects and how they can be applied to improve data reliability and performance.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled complex data challenges in previous roles. Highlight your experience in building scalable data models and automating reporting processes, as this will demonstrate your ability to drive a self-serve data culture.
✨Communicate Clearly
Practice explaining technical concepts in simple terms. Since you'll be working with both technical and non-technical teams, being able to translate business questions into structured data models is crucial. Think of examples where you've successfully communicated complex ideas.
✨Emphasise Ownership and Initiative
This role requires a high level of ownership, so be prepared to discuss times when you've taken charge of a project or initiative. Share how you’ve improved processes or contributed to data infrastructure, showcasing your self-motivation and accountability.