At a Glance
- Tasks: Design and optimise data platforms while collaborating with clients on innovative solutions.
- Company: Global consultancy offering a dynamic remote work environment.
- Benefits: Competitive salary of £60,000 plus benefits and flexible remote working.
- Why this job: Join a cutting-edge team and make an impact in the data engineering field.
- Qualifications: Experience with Azure, Databricks, and Python; client-facing skills are a plus.
- Other info: Opportunity for career growth in a supportive and collaborative setting.
The predicted salary is between £60,000 and £84,000 per year.
Data Engineer for a global consultancy working with various clients. Please apply only if you are eligible for UK security clearance (5+ years' residency in the UK).
Role - Data Engineer
Type - Perm
Location - Remote, with occasional office visits in the UK
Salary - £60,000 + benefits
Role
- This role sits at the intersection of data engineering and consulting, supporting clients in designing, building, and optimising modern data platforms.
- You will work across the full data lifecycle, from ingestion and transformation to modelling and delivery, with a strong focus on Azure and Databricks.
- The ideal candidate combines hands-on technical capability with the confidence to engage directly with clients and guide them through data-driven solutions.
Key Responsibilities
- Design, build, and maintain scalable data pipelines using Azure and Databricks.
- Develop robust ETL/ELT processes to support analytics, reporting, and AI/ML workloads.
- Implement data models, transformation logic, and quality frameworks across the data lifecycle.
- Write clean, maintainable Python code for data engineering and automation tasks.
- Work with Delta Lake and modern lakehouse patterns to optimise data storage and performance.
- Contribute to CI/CD and DataOps practices to improve deployment reliability and workflow efficiency.
- Collaborate with clients to understand requirements, propose solutions, and deliver technical outcomes.
- Support best practice data governance, documentation, and platform standards.
Required Skills & Experience
- Strong experience with Azure cloud services and Databricks.
- Solid understanding of the end-to-end data lifecycle, from ingestion to consumption.
- Proficiency in Python for data engineering and automation.
- Hands-on experience building ETL/ELT pipelines in production environments.
- 4-5 years' experience in data engineering, data consulting, or a similar technical role.
Nice to Have Skills
- Exposure to Microsoft Fabric.
- Experience with Delta Lake and lakehouse architectures.
- Familiarity with CI/CD, DataOps, or modern deployment pipelines.
Soft Skills
- Confident in client-facing interactions, including workshops, requirements gathering, and solution walkthroughs.
- Consulting background is beneficial but not essential.
- Strong communication skills and the ability to translate technical concepts for non-technical audiences.
- Proactive, adaptable, and comfortable working across varied client environments.
Data Engineer Role - Perm - Fully Remote - Consultancy
Employer: GCS
Contact Details:
GCS Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer Role - Perm - Fully Remote - Consultancy
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field and let them know you're on the hunt for a new role. You never know who might have a lead or can put in a good word for you.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Azure and Databricks. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your client-facing skills. Practice explaining complex data concepts in simple terms, as you'll need to engage with clients and guide them through solutions.
✨Tip Number 4
Don't forget to apply through our website! We make it easy for you to find roles that match your skills and experience. Plus, it shows you're serious about joining our team!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure, Databricks, and Python. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re excited about the Data Engineer role and how your background makes you a perfect fit. Remember, we love a good story!
Showcase Your Client Interaction Skills: Since this role involves working directly with clients, make sure to mention any experience you have in client-facing situations. We want to know how you can engage and guide clients through data-driven solutions.
Apply Through Our Website: We encourage you to apply directly through our website for a smoother process. It helps us keep track of applications and ensures you don’t miss out on any important updates!
How to prepare for a job interview at GCS
✨Know Your Tech Inside Out
Make sure you’re well-versed in Azure and Databricks, as these are key to the role. Brush up on your Python skills too, especially for data engineering tasks. Being able to discuss your hands-on experience with ETL/ELT processes will definitely impress.
✨Showcase Your Client Interaction Skills
Since this role involves direct client engagement, prepare examples of how you've successfully communicated technical concepts to non-technical audiences. Think about times you've led workshops or gathered requirements, and be ready to share those stories.
✨Understand the Data Lifecycle
Familiarise yourself with the end-to-end data lifecycle, from ingestion to consumption. Be prepared to discuss how you’ve designed and optimised data pipelines in previous roles, and how you can apply that knowledge to support clients effectively.
✨Be Ready for Problem-Solving Questions
Expect questions that assess your problem-solving abilities, especially in relation to data governance and quality frameworks. Think of specific challenges you've faced in past projects and how you overcame them, as this will demonstrate your proactive approach.