At a Glance
- Tasks: Lead the design and implementation of data warehousing solutions using Snowflake and AWS.
- Company: Join a dynamic team in the heart of London, focused on innovative data solutions.
- Benefits: Enjoy hybrid working, competitive salary, and a 15% bonus.
- Why this job: Be part of a collaborative culture that values mentorship and innovation in data engineering.
- Qualifications: Bachelor’s degree in a technical field and proven experience with Snowflake and AWS.
- Other info: Insurance experience is a plus but not required; certifications are advantageous.
The predicted salary is between £56,000 and £84,000 per year.
Salary - £70-80k with a 15% bonus
Hybrid working – a couple of days per week in the office, City of London
We are looking for:
- Good understanding of data engineering principles
- A strong technical grasp of Snowflake, including automation and the transformation of complex datasets
- SnowPro core certification
- AWS skill set
- Delivery experience
- Building solutions in Snowflake
- Insurance experience – advantageous but not necessary
Key Responsibilities:
- Lead the design and implementation of Snowflake- and Redshift-based data warehousing solutions within an AWS environment
- Mentor team members through code reviews and pair programming
- Build and support new AWS native cloud data warehouse solutions
- Develop and optimize ETL processes using AWS services (e.g. AWS Glue, Lambda) to ensure efficient data ingestion, transformation, storage, and cost optimization
- Deliver data to presentation layers to power reporting and analytics
- Collaborate with stakeholders, product owners, and developers
- Create and maintain data models that serve as a foundation for reporting and analytics
- Monitor performance metrics and troubleshoot issues within the data architecture, ensuring optimal health and efficiency
Skills and Experience:
- Bachelor’s degree or higher in a technical discipline
- Proven experience as a data engineer with a strong focus on Snowflake and AWS services in large-scale enterprise environments
- Extensive experience with AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway
- Strong SQL skills for complex data queries and transformations
- Python programming for data processing and analysis is a plus
- A strong focus on application health through performance monitoring, logging, and debugging
- AWS or Snowflake certifications are a plus
Contact Detail:
JobFlurry Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the AWS/Snowflake Specialist role
✨Tip Number 1
Make sure to showcase your hands-on experience with Snowflake and AWS in any conversations you have. Be prepared to discuss specific projects where you've implemented data warehousing solutions, as this will demonstrate your practical knowledge.
✨Tip Number 2
Networking is key! Connect with professionals in the data engineering field, especially those who work with Snowflake and AWS. Attend relevant meetups or webinars to expand your network and gain insights into industry trends.
✨Tip Number 3
Brush up on your SQL and Python skills, as these are crucial for the role. Consider working on personal projects or contributing to open-source projects that involve data processing to keep your skills sharp and relevant.
✨Tip Number 4
Prepare to discuss your approach to mentoring and collaborating with team members. Highlight any experiences where you've led code reviews or pair programming sessions, as this will show your leadership potential and teamwork skills.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Snowflake and AWS services. Include specific projects where you've implemented data warehousing solutions and any relevant certifications, such as SnowPro or AWS.
Craft a Compelling Cover Letter: In your cover letter, emphasise your understanding of data engineering principles and your ability to mentor team members. Mention any experience you have with ETL processes and how you've optimised them in previous roles.
Showcase Relevant Projects: If you have worked on significant projects involving AWS and Snowflake, describe these in your application. Focus on the challenges you faced, the solutions you implemented, and the outcomes achieved.
Highlight Collaboration Skills: Since the role involves collaboration with stakeholders and product owners, mention any experience you have working in cross-functional teams. Provide examples of how you effectively communicated technical concepts to non-technical stakeholders.
How to prepare for a job interview at JobFlurry
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Snowflake and AWS in detail. Highlight specific projects where you've implemented data warehousing solutions, and be ready to explain the technical challenges you faced and how you overcame them.
✨Demonstrate Your Problem-Solving Abilities
Expect questions that assess your ability to troubleshoot and optimise ETL processes. Prepare examples of how you've monitored performance metrics and resolved issues in previous roles, showcasing your analytical thinking.
✨Emphasise Collaboration and Mentoring
Since mentoring is a key responsibility, share experiences where you've guided team members through code reviews or pair programming. This will demonstrate your leadership skills and ability to work collaboratively within a team.
✨Prepare for Scenario-Based Questions
Anticipate scenario-based questions related to data architecture and stakeholder collaboration. Think about how you would approach designing a solution for a given problem, and be ready to articulate your thought process clearly.