At a Glance
- Tasks: Lead data engineering projects using Snowflake to transform complex data into actionable insights.
- Company: Keyrus, a global leader in data intelligence with a focus on innovation and diversity.
- Benefits: Competitive salary, private medical insurance, gym access, and career development opportunities.
- Other info: Hybrid work model with a focus on sustainability and positive societal impact.
- Why this job: Join a dynamic team making a real impact through data-driven solutions in a collaborative environment.
- Qualifications: 8+ years in data engineering with expertise in Snowflake and strong analytical skills.
Keyrus is a global consulting and technology company focused on making data matter from a human perspective. Founded in 1996, Keyrus operates in 28+ countries across 5 continents, with more than 3,300 people worldwide. Our strength comes from combining deep expertise in Data & Analytics, AI, Digital, and Management Consulting with a strong understanding of business realities. We actively contribute to sustainability, inclusion, and positive societal impact.
The role involves working as a Senior Data Engineer, focused on Snowflake. You will work at the intersection of data, technology, and business, helping our clients turn complexity into clarity and insights into impact. You will collaborate with multidisciplinary teams in an international environment and contribute to projects that drive performance, innovation, and sustainable growth.
Job location: London, UK (Hybrid, 3 days on-site)
Contract type: Permanent
Target start date: May 2026
Working hours: Full-time (40h/week)
Compensation: £58k to £87k gross/year
Role Responsibilities:
- Understand client needs and lead delivery end-to-end — proposing solutions that solve real business problems across ingestion, transformation, governance, and activation.
- Build and manage scalable pipelines using Snowflake, Fivetran, Gong REST APIs, and Snowpipe.
- Configure AWS S3 raw landing zones and Snowflake External Stages to ensure a durable, immutable raw data layer.
- Handle semi-structured data, managing JSON/VARIANT payloads in Snowflake and implementing 'schema-on-read' patterns to protect against upstream API changes.
- Architect complex dbt models to parse, flatten, and normalise VARIANT data into structured tables.
- Implement logic to merge issuer entities, contact records, and learner profiles across disparate systems to create a 'Golden Record.'
- Build semantic layers for predictive metrics, including Engagement Scores, Sentiment Trends, and Churn Risk.
- Leverage Snowflake Cortex AI for predictive scoring and surfacing insights.
- Maintain governance and security by implementing Role-Based Access Control (RBAC), PII masking (Dynamic Data Masking), and Row-Level Security (RLS) across the Snowflake environment.
- Support clients in their data and digital transformation journeys.
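To make the 'schema-on-read' responsibility above concrete: in Snowflake this is typically done with VARIANT columns and LATERAL FLATTEN, but the underlying pattern can be sketched in plain Python. The payload shape and field names below are invented for illustration; the point is reading nested fields defensively so new or missing upstream keys don't break the pipeline.

```python
import json

# Hypothetical raw payload, as it might land from a REST API
# into a raw landing zone before being parsed.
raw = json.loads("""
{
  "call_id": "c-101",
  "participants": [
    {"email": "ana@example.com", "role": "host"},
    {"email": "ben@example.com", "role": "guest"}
  ],
  "metrics": {"duration_sec": 1830, "sentiment": 0.42}
}
""")

def flatten_call(payload):
    """Flatten one nested payload into one row per participant.

    Fields are read with .get() (schema-on-read) so an upstream
    API adding or dropping keys degrades to NULLs rather than
    failing the load.
    """
    base = {
        "call_id": payload.get("call_id"),
        "duration_sec": payload.get("metrics", {}).get("duration_sec"),
        "sentiment": payload.get("metrics", {}).get("sentiment"),
    }
    for participant in payload.get("participants", []):
        yield {**base,
               "email": participant.get("email"),
               "role": participant.get("role")}

rows = list(flatten_call(raw))
```

In a dbt/Snowflake stack the same flattening would live in a staging model over the VARIANT column; the sketch only shows the shape of the transformation.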
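The 'Golden Record' responsibility amounts to a survivorship rule: when the same entity appears in several systems, decide field by field which value wins. A minimal sketch, assuming a shared key and a last-updated-non-null rule (all record shapes and values here are hypothetical; production matching would add fuzzy keys, source priorities, and audit trails):

```python
# Hypothetical contact records for the same person from two systems.
crm = {"email": "ana@example.com", "name": "Ana Silva",
       "phone": None, "updated": "2025-01-10"}
lms = {"email": "ana@example.com", "name": None,
       "phone": "+44 20 7946 0000", "updated": "2025-03-02"}

def merge_golden(records):
    """Merge duplicate records into one golden record.

    Survivorship rule: for each field, keep the most recently
    updated non-null value. Sorting oldest-first means later
    (newer) records overwrite earlier ones where they have data.
    """
    golden = {}
    for record in sorted(records, key=lambda r: r["updated"]):
        for field, value in record.items():
            if value is not None:
                golden[field] = value
    return golden

golden = merge_golden([crm, lms])
```

The same logic maps naturally onto a dbt model using window functions over a deduplicated staging table; the choice of survivorship rule is the design decision, not the mechanics.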
What Makes This Role Challenging:
- You will work in complex, evolving client environments.
- You are expected to balance delivery quality with pragmatism.
- You will collaborate with diverse profiles across functions and cultures.
- Autonomy increases with seniority — so does responsibility.
What We’re Looking For:
Must-haves:
- 8+ years of experience in data engineering.
- Proven experience in Snowflake (expert level: Snowpipe, Access History, Lifecycle Tags, Cortex AI, External Stages).
- Strong hands-on experience with dbt (expert in modular modelling, documentation, testing).
- Ability to integrate with deployment pipelines using Git-based tools.
- Experience in building ingestion pipelines using AWS S3, Fivetran, Snowpipe, and REST APIs.
- Expertise in cloud-based architecture (e.g. Azure, AWS).
- Strong analytical and problem-solving skills, especially in handling JSON/VARIANT and schema-on-read patterns.
- Advanced SQL skills (Jinja2 templating is a plus) and working experience with Python for API interactions.
- Strong ownership and the ability to work independently, taking responsibility for end-to-end project delivery.
- Ability to communicate clearly with both technical and non-technical stakeholders.
- Fluency in English.
Nice-to-haves:
- Familiarity with Databricks for data engineering, notebook workflows, or distributed processing.
- Experience handling CI/CD pipelines for data workloads (e.g., GitHub Actions, GitLab, Azure DevOps, or similar).
- Knowledge of governance practices, including PII handling, RBAC, RLS, and data cataloguing.
- Understanding of reverse ETL concepts and tools.
- Experience with Data Viz tools for analytics and dashboarding (e.g. Qlik, Power BI).
- Experience in consulting environments and mentoring.
- Ability to champion best practices and keep up with innovations and trends across the data landscape.
Salary and Benefits:
Salary ranges reflect different levels of mastery and impact within the same role — not different job titles. Final offers are based on experience, autonomy, scope, and market context.
What We Offer at Keyrus UK:
- Competitive holiday allowance.
- Private Medical & Dental Insurance (Bupa).
- Group Life Insurance.
- Gym & fitness discounts via Pluxee (Sodexo).
- On-site gym access at our London office.
- Access to lifestyle discounts (travel, retail, entertainment & more) via Pluxee (Sodexo).
- Auto-enrolment pension scheme with Aegon.
- Training & Development via KLX (Keyrus Learning Experience).
- Strong focus on career development and internal mobility.
- Electric & hybrid car scheme via Tusker.
- Annual discretionary bonus, based on individual and company performance.
- Referral bonus for introducing new colleagues.
Why Keyrus:
- A market leader in Data Intelligence.
- A company where people, trust, and diversity are core values.
- An environment that values ownership, flexibility, and innovation.
- A place where different backgrounds and perspectives are not just welcomed — they are essential.
- We believe diversity drives better thinking, stronger teams, and better outcomes. Everyone belongs at Keyrus.
Equal Opportunity Statement:
We are committed to building an inclusive workplace and encourage applications from all backgrounds, regardless of race, ethnicity, gender identity, sexual orientation, age, disability, or any other protected characteristic.
Lead - Senior Data Engineer - Snowflake employer: Keyrus
Contact Detail:
Keyrus Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Lead - Senior Data Engineer - Snowflake
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Snowflake. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Snowflake and dbt. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've tackled complex problems, especially with JSON/VARIANT data. We want to see your thought process in action!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are proactive about their job search.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV speaks directly to the role of Senior Data Engineer. Highlight your experience with Snowflake, dbt, and any relevant projects that showcase your skills in data engineering. We want to see how you can turn complexity into clarity!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with our mission at Keyrus. Let us know how you can contribute to our clients' data transformation journeys.
Showcase Your Problem-Solving Skills: In your application, don’t just list your skills—give us examples of how you've tackled complex data challenges in the past. We love seeing how you’ve turned insights into impact, so share those success stories!
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you’re considered for the right role. Plus, it’s super easy—just a few clicks and you’re done!
How to prepare for a job interview at Keyrus
✨Know Your Snowflake Inside Out
Make sure you brush up on your Snowflake skills before the interview. Be ready to discuss your experience with Snowpipe, External Stages, and how you've managed JSON/VARIANT data. Prepare specific examples of how you've used these features to solve real business problems.
✨Showcase Your Problem-Solving Skills
During the interview, highlight your analytical abilities by discussing past projects where you tackled complex data challenges. Use the STAR method (Situation, Task, Action, Result) to structure your answers and demonstrate how you turned complexity into clarity.
✨Communicate Clearly with All Stakeholders
Since you'll be working with both technical and non-technical teams, practice explaining your technical knowledge in simple terms. Think about how you can convey complex ideas clearly and concisely, ensuring everyone understands your approach and solutions.
✨Emphasise Your Collaborative Spirit
Keyrus values teamwork, so be prepared to discuss how you've collaborated with diverse teams in the past. Share examples of how you’ve contributed to multidisciplinary projects and how you’ve adapted to different working styles and cultures.