Data Engineer

Cambridge · Freelance · £48,000 – £72,000 / year (est.) · Home office (partial)

At a Glance

  • Tasks: Design and optimise ETL/ELT pipelines using Snowflake, DBT, and Python.
  • Company: Join a forward-thinking tech company in Cambridge/Luton, embracing innovation and collaboration.
  • Benefits: Enjoy hybrid work flexibility, competitive pay, and opportunities for professional growth.
  • Why this job: Be part of a dynamic team that values creativity and problem-solving in data engineering.
  • Qualifications: 5+ years' experience with Snowflake, DBT, Python, and AWS; strong analytical skills required.
  • Other info: Long-term B2B contract with a focus on data quality and collaboration.

The predicted salary is between £48,000 and £72,000 per year.

Position: Data Engineer

Location: Cambridge / Luton, UK (hybrid, 2–3 days onsite per week)

Duration: Long-term B2B contract

Job Description:

The ideal candidate will have at least 5 years of experience working with Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines.

  1. Proficiency in Snowflake data warehouse architecture; design, build, and optimise ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.
  2. Experience with DBT for data transformation and modelling; implement data transformation workflows using DBT Core or DBT Cloud.
  3. Strong Python programming skills for automation and data processing; leverage Python to create automation scripts and optimise data processing tasks (see the sketch after this list).
  4. Proficiency in SQL performance tuning and query optimisation techniques in Snowflake.
  5. Troubleshoot and optimise DBT models and Snowflake performance.
  6. Knowledge of CI/CD and version control (Git) tools; experience with orchestration tools such as Airflow.
  7. Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment.
  8. Ensure data quality, reliability, and consistency across different environments.
  9. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
  10. Certification in AWS, Snowflake, or DBT is a plus.
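
To make the day-to-day work concrete, below is a minimal sketch of the kind of automation described in items 2, 3 and 8: it triggers a dbt build and then runs a basic data-quality check against Snowflake via snowflake-connector-python. The model selector, table, warehouse and database names, and the environment variables are hypothetical placeholders, and the sketch assumes dbt-core with the dbt-snowflake adapter configured through profiles.yml.

```python
"""Minimal sketch: trigger a dbt build, then run a basic data-quality check in Snowflake.

The selector, table name, warehouse/database names, and credentials below are
placeholders; adjust them to your own profiles.yml and environments.
"""
import os
import subprocess

import snowflake.connector  # pip install snowflake-connector-python


def run_dbt(selector: str = "marts.orders") -> None:
    # Invoke the dbt CLI; assumes dbt-core and the dbt-snowflake adapter are
    # installed and profiles.yml points at the target warehouse.
    subprocess.run(["dbt", "run", "--select", selector], check=True)
    subprocess.run(["dbt", "test", "--select", selector], check=True)


def check_row_count(table: str = "ANALYTICS.MARTS.ORDERS") -> int:
    # Hypothetical credentials read from environment variables.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {table}")
            (row_count,) = cur.fetchone()
    finally:
        conn.close()
    if row_count == 0:
        raise ValueError(f"{table} is empty after the dbt run")
    return row_count


if __name__ == "__main__":
    run_dbt()
    print(f"Row count check passed: {check_row_count()} rows")
```

In practice a script like this would usually be triggered from CI or an orchestrator rather than run by hand; the Airflow sketch further down shows one way to schedule the same dbt commands.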

Data Engineer employer: Axiom Software Solutions Limited

Join a forward-thinking company that values innovation and collaboration, offering a dynamic work culture in the vibrant cities of Cambridge and Luton. With a strong focus on employee growth, we provide ample opportunities for professional development and training, ensuring you stay at the forefront of technology. Enjoy the flexibility of a hybrid working model, competitive benefits, and a supportive environment that encourages creativity and teamwork.

Contact Detail:

Axiom Software Solutions Limited Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Data Engineer role

✨Tip Number 1

Familiarise yourself with the latest features and updates in Snowflake, DBT, and AWS. Being knowledgeable about recent advancements can give you an edge during discussions with our team.

✨Tip Number 2

Engage with the data engineering community online. Join forums or LinkedIn groups where professionals discuss challenges and solutions related to ETL/ELT pipelines. This can help you gain insights and make valuable connections.

✨Tip Number 3

Prepare to showcase your problem-solving skills through practical examples. Think of specific instances where you've optimised data processes or resolved issues using Python or SQL, as these will be key discussion points.

✨Tip Number 4

Brush up on your knowledge of CI/CD practices and orchestration tools like Airflow. Understanding how these fit into the data engineering workflow will demonstrate your readiness for the role and your ability to collaborate effectively.
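
As a concrete talking point, here is a minimal sketch of how dbt runs are commonly scheduled with Airflow. It assumes Airflow 2.x with the dbt CLI available on the worker; the DAG id, cron schedule, and project path are placeholders, not details from this role.

```python
"""Minimal sketch: an Airflow DAG that orchestrates dbt (Airflow 2.x assumed)."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt_project"  # hypothetical project location

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00; older 2.x releases use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test",
    )

    # Run the models first, then execute dbt tests against the results.
    dbt_run >> dbt_test
```

A common CI/CD companion is running the same dbt commands against a disposable schema on each pull request, which ties the Git and CI/CD points in the job description together.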

We think you need these skills to ace the Data Engineer role

Snowflake Data Warehouse Architecture
ETL/ELT Pipeline Development
DBT (Data Build Tool)
Python Programming
SQL Performance Tuning
Query Optimisation Techniques
Data Transformation Workflows
CI/CD Practices
Version Control (Git)
Orchestration Tools (e.g., Airflow)
Analytical Skills
Problem-Solving Skills
Data Quality Assurance
Collaboration with Stakeholders
AWS Certification
Snowflake Certification
DBT Certification

Some tips for your application 🫡

Tailor Your CV: Make sure your CV highlights your experience with Snowflake, DBT, Python, and AWS. Use specific examples of projects where you've built ETL/ELT pipelines and mention any relevant certifications.

Craft a Compelling Cover Letter: In your cover letter, explain why you're passionate about data engineering and how your skills align with the job requirements. Mention your experience with SQL performance tuning and any orchestration tools like Airflow.

Showcase Problem-Solving Skills: Provide examples in your application that demonstrate your analytical and problem-solving abilities. Discuss how you've tackled challenges in previous roles, particularly in an agile development environment.

Highlight Collaboration Experience: Emphasise your ability to work with other data engineers, analysts, and stakeholders. Include examples of how you've translated data needs into engineering solutions, showcasing your teamwork skills.

How to prepare for a job interview at Axiom Software Solutions Limited

✨Showcase Your Technical Skills

Be prepared to discuss your experience with Snowflake, DBT, Python, and AWS in detail. Highlight specific projects where you've built ETL/ELT pipelines and be ready to explain the challenges you faced and how you overcame them.

✨Demonstrate Problem-Solving Abilities

Expect questions that assess your analytical skills. Prepare examples of how you've troubleshot issues with DBT models or optimised SQL queries in Snowflake. This will show your ability to think critically and solve problems effectively.

✨Familiarise Yourself with CI/CD Practices

Since knowledge of CI/CD and version control is important, brush up on your understanding of these concepts. Be ready to discuss how you've implemented CI/CD in past projects and how it improved your workflow.

✨Collaborate and Communicate

As collaboration is key in this role, prepare to talk about your experiences working with data analysts and business stakeholders. Highlight how you translated their data needs into engineering solutions, showcasing your communication skills.
