Data Engineer - Snowflake, dbt, AWS, Python - Remote

Manchester · Full-Time · £50,000 - £65,000 / year (est.) · Fully remote

At a Glance

  • Tasks: Join a dynamic team to build data pipelines and support innovative data products.
  • Company: A growing consultancy delivering cutting-edge data solutions for high-profile clients.
  • Benefits: Enjoy a competitive salary, fully remote work, and opportunities for continuous learning.
  • Why this job: Be part of impactful projects in AI and automation while working with a modern tech stack.
  • Qualifications: Experience as a Data Engineer, strong skills in Snowflake, dbt, Python, and SQL required.
  • Other info: This role is open to UK residents only; no sponsorship available.

The predicted salary is between £50,000 and £65,000 per year.

A small, growing consultancy delivering cutting-edge data solutions for high-profile clients is looking for a hands-on Data Engineer to join its talented delivery team. This is a fantastic opportunity to work with a modern data stack, including Snowflake, dbt, AWS, Python, and SQL, on impactful projects that power reporting, automation, predictive analytics, and Artificial Intelligence. The role is fully remote and therefore open to candidates across the UK. It would suit someone with a self-starter mentality who wants to be part of a company that is growing and maturing whilst continually learning.

Key responsibilities:

  • Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake (see the sketch after this list)
  • Developing scalable, testable, and maintainable code
  • Collaborating with analytics, product, and client teams to deliver high-quality data solutions
  • Supporting the development of data products like dashboards, APIs, and predictive models
  • Driving automation and operational efficiency through data tooling
  • Contributing to internal best practices and agile delivery processes
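
To give a flavour of that first responsibility, here is a minimal sketch of a pipeline step run from Python against Snowflake. Every specific name in it (the TRANSFORM_WH warehouse, the ANALYTICS.STAGING schema, and the RAW_EVENTS and CLEAN_EVENTS tables) is a hypothetical placeholder, not something taken from the job description.

```python
# Hypothetical pipeline step: deduplicate raw events into a clean staging
# table. All object names and credentials are illustrative assumptions.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",  # placeholder warehouse
    database="ANALYTICS",      # placeholder database
    schema="STAGING",          # placeholder schema
)

try:
    cur = conn.cursor()
    # The kind of transformation that, in a dbt project, would live in a
    # model file; raw connector calls are used here only to keep the
    # sketch self-contained.
    cur.execute(
        """
        CREATE OR REPLACE TABLE CLEAN_EVENTS AS
        SELECT DISTINCT event_id, user_id, event_type, event_ts
        FROM RAW_EVENTS
        WHERE event_ts IS NOT NULL
        """
    )
finally:
    conn.close()
```

In a real dbt project that CREATE TABLE would be a model with a table materialisation, versioned in Git and covered by dbt tests, which is exactly the "scalable, testable, and maintainable code" the list above describes.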

Experience required:

  • Proven experience as a Data Engineer or Analytics Engineer
  • Strong experience with Snowflake
  • Hands-on experience with dbt
  • Proficiency in Python and SQL
  • Solid understanding of Git and development lifecycle best practices
  • Experience integrating APIs or working with event/log data streams (see the sketch after this list)
  • Ability to manage multiple priorities in a fast-paced environment
  • Experience with AWS would be highly desirable
  • An interest in AI
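
As a hedged illustration of the API and event-data point above, the sketch below pulls one page of events from a hypothetical REST endpoint (api.example.com and its response shape are assumptions, not a real service) and lands them as newline-delimited JSON, a common staging format before loading into Snowflake with COPY INTO.

```python
# Hypothetical API-to-landing-zone step: fetch events and write them as
# newline-delimited JSON. The endpoint and its JSON-array response shape
# are illustrative assumptions.
import json

import requests

API_URL = "https://api.example.com/v1/events"  # placeholder endpoint

resp = requests.get(API_URL, params={"limit": 1000}, timeout=30)
resp.raise_for_status()

# Assumes the endpoint returns a JSON array of event objects.
with open("events.ndjson", "w") as fh:
    for event in resp.json():
        fh.write(json.dumps(event) + "\n")
```

From there, a COPY INTO from a stage (or a tool such as Snowpipe) would typically take over, which is where the Snowflake experience listed above comes in.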

Benefits:

  • Salary between £50,000 and £65,000, depending on experience

Please note: this is a role for UK residents only and does not offer sponsorship; you must have the right to work in the UK with no restrictions. Some of our roles may be subject to background checks, including a DBS check and a credit check.

About the employer: Jefferson Frank

Join a dynamic and innovative consultancy that values growth, collaboration, and continuous learning. As a Data Engineer, you'll work remotely with a talented team on cutting-edge data solutions for high-profile clients, enjoying a supportive work culture that encourages professional development and the exploration of new technologies. With competitive salaries and the opportunity to contribute to impactful projects, this is an excellent place for self-starters looking to make a difference in the data landscape.

Contact Details:

Jefferson Frank Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Data Engineer - Snowflake, dbt, AWS, Python - Remote role

✨Tip Number 1

Familiarise yourself with the specific tools mentioned in the job description, such as Snowflake, dbt, and AWS. Consider building a small project or contributing to an open-source project that uses these technologies to showcase your hands-on experience.

✨Tip Number 2

Network with professionals in the data engineering field, especially those who work with the modern data stack. Join relevant online communities or forums where you can ask questions and share insights about Snowflake, Python, and other tools relevant to the role.

✨Tip Number 3

Prepare to discuss your previous projects and experiences in detail during interviews. Be ready to explain how you've designed and built data pipelines, collaborated with teams, and contributed to automation and operational efficiency in your past roles.

✨Tip Number 4

Stay updated on the latest trends in data engineering and AI. Follow industry blogs, attend webinars, or take online courses to enhance your knowledge, which will not only help you in interviews but also demonstrate your commitment to continuous learning.

We think you need these skills to ace the Data Engineer - Snowflake, dbt, AWS, Python - Remote role

Data Pipeline Design
SQL Proficiency
Snowflake Expertise
dbt Experience
Python Programming
API Integration
Event/Log Data Stream Handling
Git Version Control
Agile Methodologies
Automation and Operational Efficiency
Collaboration Skills
Problem-Solving Skills
Ability to Manage Multiple Priorities
Interest in Artificial Intelligence
AWS Knowledge

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Snowflake, dbt, Python, and SQL. Use specific examples of projects where you've designed data pipelines or developed scalable code to demonstrate your skills.

Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your self-starter mentality and how you align with their focus on continuous learning and growth in the data engineering field.

Showcase Relevant Projects: If you have worked on any relevant projects, especially those involving AWS or AI, be sure to include them in your application. Highlight your contributions and the impact these projects had on the business.

Proofread Your Application: Before submitting, carefully proofread your CV and cover letter for any spelling or grammatical errors. A polished application reflects your attention to detail, which is crucial for a Data Engineer role.

How to prepare for a job interview at Jefferson Frank

✨Showcase Your Technical Skills

Make sure to highlight your experience with Snowflake, dbt, Python, and SQL during the interview. Be prepared to discuss specific projects where you used these technologies and how they contributed to successful outcomes.

✨Demonstrate Problem-Solving Abilities

Expect questions that assess your problem-solving skills. Prepare examples of challenges you've faced in previous roles and how you approached them, particularly in building data pipelines or automating processes.

✨Emphasise Collaboration

Since the role involves working with various teams, be ready to discuss your experience collaborating with analytics, product, and client teams. Share examples of how you effectively communicated and contributed to team success.

✨Express Your Interest in Continuous Learning

This consultancy values a self-starter mentality and continuous learning. Be sure to convey your enthusiasm for staying updated with industry trends, especially in AI and data engineering practices.

Application deadline: 2027-08-17