At a Glance
- Tasks: Design and build scalable data platforms and pipelines using cutting-edge cloud technologies.
- Company: Join a leading tech consultancy with a collaborative and innovative culture.
- Benefits: Competitive salary, comprehensive benefits, training opportunities, and certification bonuses.
- Why this job: Work on exciting data projects and enhance your skills in a supportive environment.
- Qualifications: Experience in data engineering, cloud platforms, and strong Python skills required.
- Other info: Great career growth potential and a chance to work with skilled professionals.
The predicted salary is between £36,000 and £60,000 per year.
An established technology consultancy is looking to hire an experienced Data Engineer to work on large-scale, customer-facing data projects while also contributing to the development of internal data services. This role blends hands-on engineering with architecture design and technical advisory work, offering exposure to enterprise clients and modern cloud platforms.
The Role
As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data platforms and pipelines. You will support and lead technical workshops, contribute to architecture decisions, and act as a trusted technical partner on complex data initiatives.
Key Responsibilities
- Designing and building scalable data platforms and ETL/ELT pipelines in Azure and AWS
- Implementing serverless, batch, and streaming data architectures
- Working hands-on with Spark, Python, Databricks, and SQL-based analytics platforms
- Designing Lakehouse-style architectures and analytical data models
- Feeding behavioural and analytical data back into production systems
- Supporting architecture reviews, design sessions, and technical workshops
- Collaborating with engineering, analytics, and commercial teams
- Advising customers throughout the full project lifecycle
- Contributing to internal data services, standards, and best practices
Qualifications
- Proven experience as a Data Engineer working with large-scale data platforms
- Strong hands-on experience in either Azure or AWS, with working knowledge of the other
- Azure experience with Lakehouse concepts, Data Factory, Synapse and/or Fabric
- AWS experience with Redshift, Lambda, and SQL-based analytics services
- Strong Python skills and experience using Apache Spark
- Hands-on experience with Databricks
- Experience designing and maintaining ETL/ELT pipelines
- Solid understanding of data modelling techniques
- Experience working in cross-functional teams on cloud-based data platforms
- Ability to work with SDKs and APIs across cloud services
- Strong communication skills and a customer-focused approach
Desirable Experience
- Data migrations and platform modernisation projects
- Implementing machine learning models using Python
- Consulting or customer-facing engineering roles
- Feeding analytics insights back into operational systems
Certifications (beneficial but not required)
- AWS Solutions Architect – Associate
- Azure Solutions Architect – Associate
What's on Offer
- The opportunity to work on modern cloud and data projects using leading technologies
- A collaborative engineering culture with highly skilled colleagues
- Structured learning paths and access to training and certifications
- Certification exam fees covered and certification-related bonuses
- Competitive salary and comprehensive benefits package
- A supportive and inclusive working environment with regular knowledge sharing and team events
This role would suit a Data Engineer who enjoys combining deep technical work with customer interaction and wants to continue developing their expertise across cloud and data platforms. If you would like to find out more, please get in touch with Jack at jharding@weareninetwenty.com.
Seniority Level: Mid-Senior level
Employment Type: Full-time
Job Function: Information Technology
Industries: IT Services and IT Consulting; Technology, Information and Media
Employer: Nine Twenty Recruitment (Data Engineer, Glasgow)
Contact Detail: Nine Twenty Recruitment Recruiting Team
StudySmarter Expert Advice
We think this is how you could land the Data Engineer role in Glasgow
Network Like a Pro
Get out there and connect with people in the industry! Attend meetups, webinars, or even just grab a coffee with someone who's already in the data engineering game. You never know when a casual chat could lead to your next big opportunity.
Show Off Your Skills
Don't just tell them what you can do; show them! Create a portfolio of projects that highlight your experience with Azure, AWS, and all those cool tools like Spark and Databricks. A well-documented GitHub repo can really make you stand out.
Ace the Interview
Prepare for technical interviews by brushing up on your problem-solving skills and understanding of data architectures. Practice common interview questions and be ready to discuss your past projects in detail. Confidence is key!
Apply Through Our Website
Make sure to apply through our website for the Data Engineer role! It's the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive about their job search.
Some tips for your application
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Azure, AWS, and any relevant projects you've worked on. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include specific examples of data platforms or pipelines you've designed or built. We love seeing hands-on experience, so don't hold back on the details that show off your technical prowess!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our team. We appreciate a personal touch that reflects your enthusiasm for the role.
Apply Through Our Website: We encourage you to apply directly through our website. It's the best way for us to receive your application and ensures you're considered for the role. Plus, it makes the process smoother for everyone involved!
How to prepare for a job interview at Nine Twenty Recruitment
Know Your Tech Inside Out
Make sure you brush up on your technical skills, especially in Azure and AWS. Be ready to discuss your hands-on experience with tools like Spark, Python, and Databricks. Prepare to explain how you've designed and built scalable data platforms or ETL/ELT pipelines in previous roles.
Showcase Your Problem-Solving Skills
During the interview, be prepared to tackle real-world scenarios. Think about how you would approach designing a Lakehouse architecture or implementing a serverless data solution. This will demonstrate your ability to think critically and apply your knowledge practically.
Communicate Clearly and Confidently
Strong communication skills are key for this role. Practice explaining complex technical concepts in simple terms, as you'll need to collaborate with cross-functional teams and advise clients. Being able to articulate your thoughts clearly can set you apart from other candidates.
Prepare Questions for Them
Interviews are a two-way street! Prepare insightful questions about their data projects, team culture, and the technologies they use. This shows your genuine interest in the role and helps you assess if it's the right fit for you.