At a Glance
- Tasks: Join our team to build and optimise data pipelines using Snowflake and cloud platforms.
- Company: Global tech company focused on innovation and collaboration.
- Benefits: Flexible work options, tailored benefits, and continuous learning opportunities.
- Other info: Inclusive culture with various support networks for diverse backgrounds.
- Why this job: Make a real impact by solving complex data challenges in a dynamic environment.
- Qualifications: 5-8 years in data engineering with strong Snowflake expertise required.
The predicted salary is between £60,000 and £80,000 per year.
The team you'll be working with: We are seeking an accomplished and detail‑oriented Snowflake Data Engineer to join our Data & AI practice. The successful candidate will bring deep expertise in data engineering, ETL/ELT pipelines, and cloud‑native data platforms, with a strong focus on Snowflake. This role is critical in building and optimising modern data ecosystems that enable data‑driven decision making, advanced analytics, and AI capabilities for our clients.
As a trusted practitioner, you will collaborate with architects, developers, and analysts to design, implement, and maintain secure and high‑performing data pipelines. You will thrive in a collaborative, client‑facing environment, with a passion for solving complex data challenges, driving innovation, and ensuring the seamless delivery of data solutions.
What you'll be doing:
- Client Engagement & Delivery
- Data Pipeline Development (Batch and Streaming)
- Snowflake & Cloud Data Platforms
- Data Architecture & Modelling
- Collaboration & Best Practices
- Quality, Governance & Security
Business Relationships:
- Solution Architects
- Data Engineers, Developers, ML Engineers and Analysts
- Client stakeholders up to Head of Data Engineering, Chief Data Architect, and Analytics leadership
What experience you'll bring:
Competencies / Critical Skills:
- Proven experience in data engineering and pipeline development on Snowflake and cloud‑native platforms.
- Strong consulting values with ability to collaborate effectively in client‑facing environments.
- Hands‑on expertise across the data lifecycle: ingestion, transformation, modelling, governance, and consumption.
- Strong problem‑solving, analytical, and communication skills.
- Experience leading or mentoring teams of engineers to deliver high‑quality scalable data solutions.
Technical Expertise:
- Deep expertise with Snowflake features (warehouses, Snowpark, data sharing, performance tuning).
- Proficiency in ETL/ELT tools such as dbt, Matillion, Talend, or equivalent.
- Strong SQL and Python (or equivalent language) skills for data manipulation and automation.
- Hands‑on experience with cloud platforms (AWS, Azure, GCP).
- Knowledge of data modelling methodologies (star schemas, Data Vault, Kimball, Inmon).
- Familiarity with data lake architectures and distributed processing frameworks (e.g., Spark, Hadoop).
- Experience with version control tools (GitHub, Bitbucket) and CI/CD pipelines.
- Understanding of data governance, security, and compliance frameworks.
- Exposure to AI/ML workloads desirable.
Experience, Qualifications, and Education:
- Experience: Minimum 5–8 years in data engineering, data warehousing, or data architecture roles, with at least 3 years working with Snowflake.
- Education: University degree required.
- Preferred: BSc/MSc in Computer Science, Data Engineering, or related field.
- Snowflake certifications (SnowPro Core, Advanced) highly desirable.
Measures of Success:
- Delivery of high‑performing, scalable, and secure data pipelines aligned to client requirements.
- High client satisfaction and successful adoption of Snowflake‑based solutions.
- Demonstrated ability to innovate and improve data engineering practices.
- Contribution to the growth of the practice through reusable assets, accelerators, and technical leadership.
Who we are: We’re a business with a global reach that empowers local teams, and we undertake hugely exciting work that is genuinely changing the world. Our advanced portfolio of consulting, applications, business process, cloud, and infrastructure services will allow you to achieve great things by working with brilliant colleagues and clients on exciting projects.
Our inclusive work environment prioritises mutual respect, accountability, and continuous learning for all our people. This approach fosters collaboration, well‑being, growth, and agility, leading to a more diverse, innovative, and competitive organisation.
We are also proud to share that we have a range of Inclusion Networks such as: the Women’s Business Network, Cultural and Ethnicity Network, LGBTQ+ & Allies Network, Neurodiversity Network and the Parent Network.
We are an equal opportunities employer. We believe in the fair treatment of all our employees and commit to promoting equity and diversity in our employment practices. We are also a proud Disability Confident Committed Employer, committed to creating a diverse and inclusive workforce. We actively collaborate with individuals who have disabilities and long‑term health conditions that affect their ability to carry out normal daily activities, ensuring that barriers to employment opportunities are eliminated.
Snowflake Data Engineer in London. Employer: NTT America, Inc.
Contact Detail:
NTT America, Inc. Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Snowflake Data Engineer role in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Snowflake. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best data pipelines and projects. This is your chance to demonstrate your hands-on expertise with Snowflake and cloud platforms, making you stand out to potential employers.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering challenges and solutions. Be ready to discuss your experience with ETL/ELT processes and how you've tackled complex data problems in the past.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are proactive about their job search!
We think you need these skills to ace the Snowflake Data Engineer role in London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to highlight your experience with Snowflake and data engineering. Use keywords from the job description to show that you’re a perfect fit for the role.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Share your passion for data engineering and how your skills align with our needs. Don’t forget to mention any relevant projects or achievements!
Showcase Your Technical Skills: Be specific about your technical expertise in Snowflake, ETL/ELT tools, and cloud platforms. We want to see your hands-on experience, so include examples of how you've used these skills in past roles.
Apply Through Our Website: We encourage you to apply through our website for a smoother application process. It’s the best way for us to receive your application and get you on our radar quickly!
How to prepare for a job interview at NTT America, Inc.
✨Know Your Snowflake Inside Out
Make sure you brush up on your Snowflake knowledge before the interview. Be ready to discuss its features like warehouses, Snowpark, and data sharing. Having hands-on experience with performance tuning will definitely give you an edge!
✨Showcase Your Data Pipeline Skills
Prepare to talk about your experience with ETL/ELT tools such as DBT or Matillion. Bring examples of data pipelines you've developed, focusing on both batch and streaming processes. This will demonstrate your practical skills and problem-solving abilities.
✨Emphasise Collaboration and Communication
Since this role involves working closely with architects, developers, and clients, highlight your collaborative experiences. Share specific instances where your communication skills helped solve complex data challenges or improved project outcomes.
✨Be Ready for Technical Questions
Expect technical questions related to SQL, Python, and data modelling methodologies. Brush up on your knowledge of data governance and security frameworks too. Being well-prepared will show that you're serious about the role and understand the technical landscape.