At a Glance
- Tasks: Design and optimise data pipelines using Snowflake and dbt for a leading travel company.
- Company: Join an innovative travel brand transforming its data landscape near West London.
- Benefits: Competitive pay, collaborative culture, and exciting technical challenges await you.
- Other info: Enjoy a dynamic work environment with strong leadership support.
- Why this job: Make a real impact on customer experiences through data-driven insights.
- Qualifications: Experience with Snowflake, SQL, and cloud data architecture is essential.
This Data Engineer (Snowflake, dbt) position offers the opportunity to join a leading travel company based near West London, currently undergoing a major cloud data transformation.
This established travel brand is known for its innovation and commitment to delivering seamless customer experiences through data. With a focus on modernising its data platforms, the company is investing in Snowflake and cloud-native tooling to better understand customer journeys, improve operations, and fuel growth. Contractors and permanent hires we've placed here consistently highlight the collaborative culture, exciting technical challenges, and strong support from leadership.
As a Data Engineer, you will play a crucial role in a data transformation programme, focusing on optimising data pipelines and enabling cloud-driven insights. You will also be responsible for post-project documentation, ensuring clear communication with non-technical stakeholders.
Role type: 6-MONTH CONTRACT
Location: LONDON (1 day in office)
Pay: £500 - £550 per day (Outside IR35)
Key responsibilities:
- Designing and implementing a Snowflake-based data warehouse, ensuring scalability and efficiency.
- Developing and optimising ELT pipelines, leveraging Snowflake best practices for data ingestion and transformation.
- Enhancing performance and query optimisation, ensuring cost-effective and high-performing workloads.
- Working with dbt and Airflow to transform raw data into structured datasets for analytical consumption.
- Creating clear documentation to communicate technical processes to non-technical teams.
Key Skills and Requirements:
- Strong commercial experience with Snowflake and its ecosystem.
- Proficiency in SQL for data modelling and transformation within Snowflake.
- Experience with dbt to develop scalable data pipelines.
- Knowledge of ELT processes and best practices for cloud-based data architecture.
- Hands-on experience with performance tuning and query optimisation in Snowflake.
- Familiarity with cloud platforms (AWS/GCP/Azure) and their integration with Snowflake.
How to Apply: Please register your interest by sending your CV via the apply link on this page.
Data Engineer (Snowflake, DBT) employer: Harnham
Contact Detail:
Harnham Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer (Snowflake, DBT)
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, especially those already working at the company you're eyeing. A friendly chat can give you insider info and maybe even a referral!
✨Tip Number 2
Show off your skills! If you’ve got a portfolio or GitHub with projects related to Snowflake or dbt, make sure to share it. It’s a great way to demonstrate your expertise beyond just your CV.
✨Tip Number 3
Prepare for the interview by brushing up on common data engineering scenarios. Think about how you’d tackle challenges like optimising ELT pipelines or enhancing query performance. We want to see your problem-solving skills in action!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen. Plus, we love seeing candidates who take that extra step to connect with us directly.
We think you need these skills to ace Data Engineer (Snowflake, DBT)
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Snowflake, dbt, and SQL. We want to see how your skills match the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about this opportunity and how you can contribute to our data transformation journey. Keep it concise but impactful!
Showcase Your Technical Skills: When detailing your experience, focus on specific tools and techniques you've used in data engineering. We love seeing hands-on experience with ELT processes and performance tuning, so make sure to include those!
Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It helps us keep everything organised and ensures your application gets the attention it deserves!
How to prepare for a job interview at Harnham
✨Know Your Snowflake Inside Out
Make sure you brush up on your Snowflake knowledge before the interview. Be ready to discuss your experience with its ecosystem, including any specific projects you've worked on. Highlight how you've optimised data pipelines and improved performance in past roles.
✨Show Off Your SQL Skills
Since SQL is crucial for this role, prepare to demonstrate your proficiency. You might be asked to solve a problem or optimise a query on the spot, so practice common SQL scenarios and be ready to explain your thought process clearly.
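One scenario worth practising is deduplicating with a window function, which comes up often in Snowflake work. The sketch below is purely illustrative (the `bookings` table and its data are made up), and uses SQLite so it runs anywhere; the same `ROW_NUMBER()` pattern works unchanged in Snowflake:

```python
import sqlite3

# Hypothetical interview-style exercise: keep only the most recent
# booking per customer using ROW_NUMber() -- demonstrated with an
# in-memory SQLite database for portability.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bookings (customer_id INTEGER, booked_at TEXT, amount REAL);
    INSERT INTO bookings VALUES
        (1, '2024-01-05', 120.0),
        (1, '2024-03-10', 200.0),
        (2, '2024-02-01',  80.0);
""")

query = """
    SELECT customer_id, booked_at, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id
                   ORDER BY booked_at DESC
               ) AS rn
        FROM bookings
    )
    WHERE rn = 1
    ORDER BY customer_id;
"""
latest = conn.execute(query).fetchall()
print(latest)  # one row per customer: their most recent booking
```

Being able to explain why a window function beats a self-join here (single pass over the data, clearer intent) is exactly the kind of reasoning interviewers look for.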
✨Familiarise Yourself with dbt and ELT Processes
Get comfortable discussing how you've used dbt to develop scalable data pipelines. Be prepared to explain the ELT processes you've implemented and how they fit into cloud-based architectures. This will show that you understand the technical requirements of the role.
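If you want something concrete to talk through, a minimal dbt staging model is a good prop. The model and column names below are hypothetical, but the shape (Snowflake-style casts, a `source()` reference, clean renames for downstream marts) is the standard pattern:

```sql
-- models/staging/stg_bookings.sql (hypothetical model and source names)
-- Illustrative dbt staging model: select from the raw source, then cast
-- and rename columns so downstream marts consume a clean, typed dataset.
with source as (
    select * from {{ source('raw', 'bookings') }}
)
select
    booking_id::integer       as booking_id,
    customer_id::integer      as customer_id,
    booked_at::timestamp_ntz  as booked_at,
    amount::number(10, 2)     as amount
from source
```

Walking an interviewer through where a model like this sits in an ELT flow (raw layer → staging → marts) shows you understand the architecture, not just the syntax.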
✨Communicate Clearly with Non-Technical Stakeholders
Since part of your role involves creating documentation for non-technical teams, think about how you can convey complex ideas simply. Prepare examples of how you've successfully communicated technical processes in the past, as this will demonstrate your ability to bridge the gap between tech and business.