At a Glance
- Tasks: Build and maintain data pipelines, optimise Snowflake for reporting and analytics.
- Company: Leading financial services organisation with a collaborative team culture.
- Benefits: Competitive salary, hybrid working model, and opportunity to work on strategic initiatives.
- Other info: 5+ years of experience in data engineering with excellent career growth potential.
- Why this job: Join a dynamic team and make an impact on data-driven transformation projects.
- Qualifications: Bachelor's degree in Computer Science or related field; Snowflake expertise preferred.
The predicted salary is circa £95,000 per year.
A leading financial services organisation is seeking a skilled Data Engineer to join their Data Team. This is a backfill fixed-term contract (FTC) position to maintain and enhance data engineering capabilities while the primary role holder focuses on a major Workday HR and finance systems implementation.
You'll play a critical role in ensuring business-as-usual operations, maintaining robust data pipelines, and optimising the Snowflake environment to support reporting, analytics, and other strategic data initiatives.
Key Responsibilities:
- Data Pipeline Development: Build and maintain pipelines to support smooth data flows into Snowflake.
- Data Modelling & Warehousing: Design, optimise, and scale data models to meet organisational needs.
- Performance Optimisation: Monitor and fine-tune data pipelines and Snowflake performance.
- Collaboration: Partner with stakeholders across teams to understand and deliver on data requirements.
- Governance & Security: Adhere to data governance policies and maintain robust security measures.
- Documentation & Support: Keep processes well-documented and ensure seamless data operations during the systems rollout.
Requirements:
- Bachelor's degree in Computer Science, Data Engineering, or a related field (Snowflake certification desirable).
- Advanced skills in Transact-SQL and ETL/ELT tools such as Azure, Airflow, or Qlik Replicate.
- Strong experience in data warehouse modelling and pipeline builds.
- Expertise in Snowflake (or strong knowledge of cloud-based databases such as AWS, Azure, or GCP).
- Experience integrating data from HR & Finance systems, particularly Workday.
- Solid grasp of SDLC and commercial data engineering practices.
- 5+ years of industry experience in designing, building, and optimising data platforms, lakes, or warehouses.
- Excellent problem-solving ability and a passion for leveraging data to drive insight and innovation.
What's on offer:
- Competitive FTC salary of circa £95,000 p/a
- Opportunity to work on strategic transformation initiatives
- Collaborative and inclusive team environment
- Hybrid working model for flexibility
- October start
Data Engineer - Snowflake Specialist in London. Employer: ABC
Contact Detail:
ABC Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Snowflake Specialist role in London
✨Tip Number 1
Network like a pro! Reach out to folks in the industry, especially those already working with Snowflake or in data engineering roles. A friendly chat can lead to insider info about job openings and even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data pipeline projects or any Snowflake optimisations you've done. This gives potential employers a taste of what you can bring to the table.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and Snowflake specifics. Practise explaining your past projects and how you tackled challenges – it’s all about demonstrating your problem-solving prowess!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Snowflake, data pipelines, and any relevant projects. We want to see how your skills match what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our team. Keep it concise but impactful – we love a good story!
Showcase Your Technical Skills: Don’t forget to mention your technical skills, especially in Transact SQL and ETL/ELT tools. We’re keen on seeing how you’ve used these in past roles, so give us some examples that demonstrate your expertise.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you don’t miss out on any important updates from us!
How to prepare for a job interview at ABC
✨Know Your Snowflake Inside Out
Make sure you brush up on your Snowflake knowledge before the interview. Be ready to discuss how you've optimised Snowflake environments in the past and any specific challenges you've faced. This will show that you're not just familiar with the tool, but that you can leverage it effectively.
✨Showcase Your Data Pipeline Skills
Prepare to talk about your experience in building and maintaining data pipelines. Have examples ready that demonstrate your ability to ensure smooth data flows into Snowflake. Highlight any ETL/ELT tools you've used, like Azure or Airflow, and how they contributed to your success.
✨Collaboration is Key
Since this role involves working with various stakeholders, be prepared to discuss how you've collaborated with different teams in the past. Share specific instances where you understood and delivered on data requirements, as this will illustrate your teamwork skills and adaptability.
✨Emphasise Governance and Security
Data governance and security are crucial in this role, so make sure you can speak to your experience in these areas. Discuss how you've adhered to data governance policies and maintained robust security measures in previous positions. This will demonstrate your commitment to best practices in data engineering.