Remote Data Engineer - ETL/dbt Pipelines & Quality

Full-Time · £50,000 - £60,000 / year (est.) · Home office possible
Mercor

At a Glance

  • Tasks: Design and build ETL/ELT pipelines and dbt models in a remote setting.
  • Company: Join Mercor, a forward-thinking company focused on data engineering excellence.
  • Benefits: Enjoy flexible remote work, competitive salary, and opportunities for professional growth.
  • Other info: Be part of a dynamic remote team with exciting challenges.
  • Why this job: Make an impact by ensuring high-quality data processes in innovative projects.
  • Qualifications: 3+ years of experience in data engineering and strong communication skills required.

The predicted salary is between £50,000 and £60,000 per year.

Mercor is seeking a Data Engineering Expert to design and build ETL/ELT pipelines, dbt models, and Airflow/Dagster DAGs. The ideal candidate holds a BS or MS in Computer Science or a related field and has over three years of experience in data engineering or analytics engineering.

As part of a remote team, you'll create challenging scenarios for agent evaluation and ensure high-quality data processes. Strong written communication skills and familiarity with data engineering artifacts are essential.

Remote Data Engineer - ETL/dbt Pipelines & Quality employer: Mercor

Mercor is an exceptional employer that values innovation and collaboration within a fully remote work environment. We offer competitive benefits, a supportive culture that encourages professional growth, and opportunities to work on cutting-edge data engineering projects. Join us to be part of a dynamic team where your contributions directly impact our success and the quality of our data processes.

Contact Details:

Mercor Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land Remote Data Engineer - ETL/dbt Pipelines & Quality

✨Tip Number 1

Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the hunt for a role. You never know who might have the inside scoop on job openings or can refer you directly.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your ETL/ELT pipelines, dbt models, and any projects you've worked on. This will give potential employers a taste of what you can do and set you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on your technical knowledge. Be ready to discuss your experience with Airflow/Dagster and how you've tackled challenges in data quality. Practice makes perfect, so consider mock interviews with friends or mentors.

✨Tip Number 4

Don't forget to apply through our website! We love seeing candidates who are genuinely interested in joining our team. Tailor your application to highlight your relevant experience and skills that match the job description.

We think you need these skills to ace Remote Data Engineer - ETL/dbt Pipelines & Quality

ETL/ELT Pipeline Development
dbt Models
Airflow
Dagster
Data Engineering
Analytics Engineering
Data Quality Assurance
Written Communication Skills
Familiarity with Data Engineering Artifacts
Problem-Solving Skills
Collaboration in Remote Teams
Technical Aptitude

Some tips for your application 🫡

Show Off Your Skills: Make sure to highlight your experience with ETL/ELT pipelines and dbt models. We want to see how your skills align with what we’re looking for, so don’t hold back!

Tailor Your Application: Customise your CV and cover letter to reflect the job description. Mention specific projects or experiences that relate to data engineering and quality processes, as this will catch our eye.

Be Clear and Concise: Strong written communication is key! Keep your application clear and to the point, showcasing your technical expertise without overwhelming us with jargon.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role!

How to prepare for a job interview at Mercor

✨Know Your ETL/ELT Inside Out

Make sure you brush up on your knowledge of ETL and ELT processes. Be ready to discuss your experience with building pipelines and how you've tackled challenges in the past. Having specific examples from your previous roles will show that you’re not just familiar with the concepts, but that you can apply them effectively.
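If you want a concrete talking point, the extract-transform-load pattern interviewers usually probe can be sketched in a few lines of plain Python. This is a minimal illustration, not Mercor's actual stack; the record fields and function names are made up:

```python
# Minimal ETL sketch: extract raw records, transform them, load into a "warehouse".
# All names and fields here are illustrative.

def extract():
    # In a real pipeline this would read from an API, a file, or a source database.
    return [
        {"order_id": 1, "amount": "19.99", "country": "gb"},
        {"order_id": 2, "amount": "5.00", "country": "GB"},
        {"order_id": 3, "amount": None, "country": "de"},  # dirty row
    ]

def transform(rows):
    # Cast types, normalise values, and drop rows that fail basic quality checks.
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # a real pipeline might route this to a quarantine table instead
        clean.append({
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "country": row["country"].upper(),
        })
    return clean

def load(rows, warehouse):
    # Idempotent load keyed on order_id, so reruns don't duplicate data.
    for row in rows:
        warehouse[row["order_id"]] = row

warehouse = {}
load(transform(extract()), warehouse)
print(len(warehouse))  # prints 2: the dirty row was filtered out
```

Being able to explain why the load step is idempotent, and where a rejected row should go, is exactly the kind of detail that turns a generic answer into a strong one.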

✨Showcase Your dbt Skills

Since the role involves working with dbt models, be prepared to talk about your experience with dbt. Share any projects where you’ve implemented dbt, focusing on how you structured your models and ensured data quality. This will demonstrate your hands-on expertise and understanding of best practices.
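At its core, a dbt model is a SELECT statement materialised as a view or table, and a dbt test (such as `not_null` or `unique`) is a query that must return zero rows. A rough sketch of that idea using Python's built-in sqlite3; the table and column names are invented for illustration:

```python
import sqlite3

# Sketch of the dbt idea: a "model" is a SELECT materialised as a view,
# and a "test" passes when its query returns no rows. Names are illustrative.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_payments (id INTEGER, amount_pence INTEGER)")
conn.executemany("INSERT INTO raw_payments VALUES (?, ?)",
                 [(1, 1999), (2, 500), (3, 1250)])

# "Model": a staging view that renames and converts columns.
conn.execute("""
    CREATE VIEW stg_payments AS
    SELECT id AS payment_id, amount_pence / 100.0 AS amount_gbp
    FROM raw_payments
""")

# "Tests": analogous to dbt's not_null and unique; empty results mean they pass.
not_null_failures = conn.execute(
    "SELECT * FROM stg_payments WHERE payment_id IS NULL").fetchall()
unique_failures = conn.execute("""
    SELECT payment_id FROM stg_payments
    GROUP BY payment_id HAVING COUNT(*) > 1
""").fetchall()

print(not_null_failures, unique_failures)  # both empty lists => tests pass
```

Walking through how you layered staging models over raw sources, and which tests guarded each layer, demonstrates the structure-plus-quality mindset this role asks for.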

✨Communicate Clearly and Effectively

Strong written communication skills are a must for this position. Practice explaining complex data engineering concepts in simple terms. You might be asked to describe your work to someone without a technical background, so being able to articulate your thoughts clearly will set you apart.

✨Familiarise Yourself with Airflow/Dagster

If you have experience with Airflow or Dagster, make sure to highlight it during the interview. If not, take some time to learn the basics and understand how they fit into the data engineering landscape. Being able to discuss these tools will show your commitment to staying current in the field.
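The "DAG" in Airflow and Dagster is simply a directed acyclic graph of tasks: the scheduler runs each task only after all of its upstream tasks have succeeded (in Airflow itself you would declare this with operators and the `>>` dependency syntax). A toy sketch of that ordering using Python's standard library; the task names are hypothetical:

```python
from graphlib import TopologicalSorter  # standard library since Python 3.9

# Toy dependency graph shaped like a typical pipeline:
# extract -> transform -> {load, quality_check}. Task names are illustrative.
deps = {
    "transform": {"extract"},
    "load": {"transform"},
    "quality_check": {"transform"},
}

# A scheduler must run every task after its upstreams: a topological order.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Being able to explain why the graph must be acyclic, and how a failed upstream task blocks its downstreams, covers most of the conceptual ground an interviewer will test before diving into tool-specific details.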
