At a Glance
- Tasks: Transform raw data into decision-grade assets and optimise cloud data warehousing.
- Company: Join a forward-thinking company leading in cloud-native data solutions.
- Benefits: Enjoy a competitive salary, bonus, and a full benefits package.
- Why this job: Be part of innovative projects that challenge legacy thinking and drive impactful change.
- Qualifications: Mastery of SQL, proven Snowflake engineering, and hands-on DBT Cloud experience required.
- Other info: Location flexibility in Manchester or Greater London; UK work eligibility needed.
The predicted salary is between £72,000 and £108,000 per year.
Base Salary: Up to £95,000 + bonus & full benefits package
Location: England (UK work eligibility required) - Manchester or Greater London
To scale a cloud-native data estate, this role needs an energetic engineer who can exploit Snowflake’s Zero-Copy Cloning and Time Travel, hard-wire best-practice ETL/ELT, and push CI/CD out of slideware and into production. If you can transform raw data into decision-grade assets at enterprise scale, keep reading.
PROJECTS TO DO
- Cloud data warehousing – Build and optimise Snowflake environments, leveraging Zero-Copy Cloning and Time Travel for instant, cost-neutral cloning and point-in-time recovery.
- Deep ingestion pipelines – Orchestrate high-volume, low-latency data flows that meet strict SLAs.
- DBT Cloud modelling – Design modular DBT models, enforce version control, and automate testing.
- CI/CD for data – Embed automated build, test, and deploy in every pipeline; no manual touches.
- Governance & quality – Enforce lineage, metadata, and security controls as code—no exceptions.
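The Snowflake features named above can be sketched in a few statements. This is a minimal, hedged illustration; the schema and table names (`analytics.orders` and friends) are hypothetical.

```sql
-- Zero-Copy Cloning: create a full, writable copy of a production table
-- instantly; both copies share the underlying micro-partitions, so no
-- extra storage is used until either side changes.
CREATE TABLE analytics.orders_dev CLONE analytics.orders;

-- Time Travel: query the table as it looked one hour ago (offset in seconds)...
SELECT COUNT(*) FROM analytics.orders AT (OFFSET => -3600);

-- ...or recover a table dropped by mistake, within the retention window.
UNDROP TABLE analytics.orders;
```

Patterns like these are what make cloning "cost-neutral" and recovery "back-dated" in practice: environments can be duplicated for testing without copying data, and mistakes can be unwound without restoring from backup.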
REQUIREMENTS
- Mastery of SQL on multi-terabyte datasets.
- Proven Snowflake engineering, ideally across multiple business domains.
- Hands-on DBT Cloud experience (macros, tests, deployments).
- Python/Scala or equivalent for orchestration and tooling.
- CI/CD tooling (Git, Docker, Azure DevOps/GitHub Actions, or similar).
- A willingness to challenge legacy thinking and mentor stakeholders.
- Direct ownership of the data-platform roadmap.
Senior Data Engineer employer: Intelix.AI
Contact Details:
Intelix.AI Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Familiarise yourself with Snowflake's features, especially Zero-Copy Cloning and Time Travel. Being able to discuss these in detail during your interview will show that you have the hands-on experience we’re looking for.
✨Tip Number 2
Brush up on your SQL skills, particularly with multi-terabyte datasets. Prepare to demonstrate your ability to handle large volumes of data efficiently, as this is crucial for the role.
✨Tip Number 3
Gain a solid understanding of CI/CD practices, especially in relation to data engineering. Be ready to share examples of how you've implemented automated testing and deployment in previous projects.
✨Tip Number 4
Showcase your experience with DBT Cloud, particularly in creating modular models and enforcing version control. Highlight any specific projects where you’ve successfully automated testing to demonstrate your expertise.
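To make Tip 4 concrete in an interview, it helps to have a small modular dbt model in mind. The sketch below is illustrative only; the model, column names, and `loaded_at` timestamp are hypothetical.

```sql
-- models/marts/fct_orders.sql (hypothetical dbt model)
-- Builds on a staging model via ref(), so dbt tracks lineage and orders
-- the DAG automatically; config() pins the materialisation strategy.
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    order_id,
    customer_id,
    order_total,
    loaded_at
FROM {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded.
  WHERE loaded_at > (SELECT MAX(loaded_at) FROM {{ this }})
{% endif %}
```

Pairing a model like this with schema tests (`unique`, `not_null`) and a CI job that runs `dbt build` on every pull request is the kind of end-to-end story that demonstrates both modular modelling and automated testing.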
We think you need these skills to ace the Senior Data Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Snowflake, SQL, and DBT Cloud. Use specific examples of projects where you've implemented ETL/ELT processes or worked with cloud data warehousing.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention how your skills in Python/Scala and CI/CD practices align with their needs, and provide examples of how you've challenged legacy thinking in previous roles.
Showcase Relevant Projects: If possible, include links to any relevant projects or portfolios that demonstrate your expertise in data engineering, particularly with Snowflake and DBT Cloud. This can help set you apart from other candidates.
Proofread Your Application: Before submitting, carefully proofread your application materials. Check for any spelling or grammatical errors, and ensure that all technical terms are used correctly. A polished application reflects your attention to detail.
How to prepare for a job interview at Intelix.AI
✨Showcase Your Technical Skills
Be prepared to discuss your mastery of SQL and your experience with Snowflake. Bring examples of how you've optimised data environments and handled multi-terabyte datasets in previous roles.
✨Demonstrate Your Problem-Solving Abilities
Expect to face scenario-based questions that assess your ability to orchestrate high-volume data flows and manage strict SLAs. Think of specific challenges you've overcome and how you approached them.
✨Highlight Your CI/CD Experience
Discuss your hands-on experience with CI/CD tools like Git, Docker, and Azure DevOps. Be ready to explain how you've embedded automated processes in your data pipelines to enhance efficiency.
✨Prepare for Governance and Quality Questions
Understand the importance of data governance and quality controls. Be ready to talk about how you've enforced lineage, metadata, and security measures in your past projects.