At a Glance
- Tasks: Design and build scalable data pipelines using modern ELT principles.
- Company: Join a growing, data-driven organisation with a collaborative culture.
- Benefits: Remote work flexibility, competitive salary, and opportunities for professional growth.
- Other info: Work within a dynamic team focused on developing a scalable Data Mesh architecture.
- Why this job: Make an impact by contributing to innovative data solutions in a modern tech environment.
- Qualifications: 3-5 years of Data Engineering experience with strong SQL and Python skills.
The predicted salary is between £50,000 and £60,000 per year.
Data Engineer (Mid-Level) – Remote (1–2 days per month in London, Leeds or Preston)
Tech Stack: GCP (BigQuery, Dataflow), DBT, Terraform, Airflow (Composer), Python, SQL
VIQU are working with a growing, data-driven organisation building out a modern Data Platform following a successful migration from on-premise to Google Cloud. Sitting within a well-established Data Office, the team is now focused on developing a scalable Data Mesh architecture. This is a strong opportunity for a mid-level Data Engineer to join a collaborative product-led environment, working with modern tooling and contributing to the end-to-end delivery of data products.
- Design, build, and maintain scalable data pipelines and data products using modern ELT principles
- Work closely with product managers, architects, and engineers to deliver data solutions aligned to business needs
- Contribute across the full data product lifecycle, from design and development through to deployment and optimisation
- Ensure high-quality, well-documented code and maintain strong engineering standards
- Support CI/CD processes, environment management, and deployment pipelines
- 3–5 years’ experience in Data Engineering, with strong exposure to ETL/ELT pipelines
- Strong SQL and Python skills, with hands-on experience building data pipelines
- Cloud experience in GCP (BigQuery preferred) or Azure/AWS environments
- Understanding of modern data concepts including Data Mesh, Agile delivery, and test-driven development
Apply now to speak with VIQU IT in confidence.
Data Engineer needed – Remote (London) | Employer: VIQU IT Recruitment
Contact Detail:
VIQU IT Recruitment Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land this remote Data Engineer role in London
✨Tip Number 1
Network like a pro! Reach out to folks in the industry on LinkedIn or at meetups. A friendly chat can lead to opportunities that aren’t even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your data projects, especially those using GCP, Python, and SQL. This gives potential employers a taste of what you can do.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering questions and practical tests. We recommend practising coding challenges to keep your skills sharp!
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to get noticed and ensures your application lands directly in the right hands.
We think you need these skills to ace this Data Engineer application: strong SQL and Python, hands-on ETL/ELT pipeline experience, and familiarity with GCP (BigQuery, Dataflow), DBT, Terraform, and Airflow.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with the tech stack mentioned in the job description. We want to see your skills in GCP, Python, and SQL shine through, so don’t hold back!
Craft a Compelling Cover Letter: Your cover letter is your chance to tell us why you’re the perfect fit for this role. Share specific examples of your past projects and how they relate to building scalable data pipelines and products.
Showcase Your Problem-Solving Skills: In your application, give us a glimpse of how you tackle challenges. We love seeing candidates who can think critically and adapt to new technologies, especially in a collaborative environment.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way to ensure your application gets into the right hands and shows us you’re serious about joining our team!
How to prepare for a job interview at VIQU IT Recruitment
✨Know Your Tech Stack
Make sure you’re familiar with the tech stack mentioned in the job description. Brush up on GCP, BigQuery, Dataflow, DBT, Terraform, Airflow, Python, and SQL. Being able to discuss your experience with these tools will show that you’re a great fit for the role.
✨Showcase Your Projects
Prepare to talk about specific projects where you've designed, built, or maintained data pipelines. Highlight your contributions across the full data product lifecycle and how you ensured high-quality, well-documented code. Real examples will make your skills stand out.
✨Understand Data Mesh Concepts
Since the company is focusing on a Data Mesh architecture, it’s crucial to understand this concept. Be ready to discuss how you’ve applied modern data principles in your previous roles and how they can benefit the organisation.
✨Ask Insightful Questions
Prepare some thoughtful questions about the team’s current projects, their approach to CI/CD processes, and how they manage environment deployments. This shows your genuine interest in the role and helps you assess if it’s the right fit for you.