At a Glance
- Tasks: Design and develop scalable data pipelines using AWS services and Python.
- Company: Join a leading Blue Chip client in a dynamic tech environment.
- Benefits: Competitive daily rate, flexible remote work, and opportunities for professional growth.
- Why this job: Make an impact by optimising data platforms and collaborating with top professionals.
- Qualifications: 8+ years of AWS Data Engineering experience and strong Python skills required.
- Other info: Onsite in Glasgow 2-3 days a week; immediate interviews available.
The predicted salary is between £60,000 and £84,000 per year.
We are seeking a highly skilled Senior AWS Data Engineer with strong hands-on experience building scalable, secure, and automated data platforms on AWS. The ideal candidate will have deep expertise in AWS CloudFormation, data ingestion and transformation services, Python-based ETL development, and orchestration workflows. This role will focus on designing, implementing, and optimising end-to-end data pipelines, ensuring data quality, reliability, and governance across cloud-native environments.
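To give a flavour of the Python-based ETL work involved, here is a minimal PySpark sketch of a single pipeline step: it reads raw JSON events from S3, drops malformed rows, derives a partition key, and writes compressed, partitioned Parquet for downstream Athena queries. The bucket names and event schema are hypothetical, and a real Glue job would receive its paths via job arguments rather than hard-coding them.

```python
# Minimal Glue-style PySpark ETL sketch; bucket names and schema are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Read raw, semi-structured events from the landing zone.
raw = spark.read.json("s3://example-raw-bucket/events/")

# Basic cleansing and enrichment: drop malformed rows, derive a partition key.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write partitioned, compressed Parquet for efficient Athena queries.
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .option("compression", "snappy")
      .parquet("s3://example-curated-bucket/events/"))
```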
Key Responsibilities
- Design, develop, and maintain large-scale data pipelines using AWS services such as Glue, Lambda, Step Functions, EMR, DynamoDB, S3, Athena, and other ETL/ELT components.
- Build automated ingestion, transformation, and enrichment workflows for structured and unstructured datasets.
- Implement reusable data engineering frameworks and modular components using Python, PySpark, and AWS-native tooling.
- Develop and manage AWS CloudFormation templates for provisioning secure, scalable data engineering infrastructure.
- Optimise data storage strategies (S3 layouts, partitioning, compression, lifecycle rules).
- Configure and maintain compute services for data workloads (Lambda, ECS, EC2, EMR).
- Build and enhance orchestration flows using AWS Step Functions, EventBridge, and Glue Workflows (see the sketch after this list).
- Implement CI/CD practices for data pipelines and infrastructure automation.
- Apply strong authentication/authorization mechanisms using IAM, KMS, access policies, and data access controls.
- Ensure compliance with enterprise security standards, encryption requirements, and governance frameworks.
- Implement data quality checks, schema validation, lineage tracking, and metadata management.
- Work with data architects, platform engineers, analysts, and cross-functional stakeholders to deliver high-quality datasets.
- Troubleshoot pipeline issues, optimise performance, and improve reliability and observability across the data platform.
- Drive continuous improvement in automation, monitoring, and operational efficiency.
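For the orchestration bullet above, here is a minimal boto3 sketch of one common pattern: an S3-triggered Lambda that starts a Step Functions execution for each newly landed object. The state machine ARN is hypothetical; in practice it would come from configuration or an environment variable.

```python
# Minimal sketch: S3-event Lambda kicking off a Step Functions ETL run.
# The state machine ARN below is a hypothetical placeholder.
import json
import boto3

sfn = boto3.client("stepfunctions")

STATE_MACHINE_ARN = "arn:aws:states:eu-west-2:123456789012:stateMachine:etl-pipeline"

def handler(event, context):
    """Start one ETL execution per newly created S3 object."""
    for record in event.get("Records", []):
        payload = {
            "bucket": record["s3"]["bucket"]["name"],
            "key": record["s3"]["object"]["key"],
        }
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps(payload),
        )
    return {"status": "started"}
```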
Required Skills & Experience
- 8+ years of hands-on experience as a Data Engineer with strong AWS expertise.
- Expert-level proficiency in AWS CloudFormation (mandatory).
- Strong experience with AWS data and compute services: Glue, Lambda, Step Functions, EMR, S3, DynamoDB, Athena, and ECS/EC2 for data workloads where relevant.
- Solid experience building ETL/ELT pipelines using Python (and ideally PySpark).
- Strong knowledge of IAM, KMS, encryption, and AWS security fundamentals.
- Ability to design and implement authentication/authorization patterns (OAuth2, API security, IAM roles & policies).
- Strong understanding of distributed systems, data modelling, modern data architectures, and cloud-native design.
- Experience deploying pipelines using CI/CD practices and automated workflows.
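Because CloudFormation proficiency and CI/CD deployment both appear in the list above, here is a minimal boto3 sketch of a CI/CD step that validates a template and provisions a stack, waiting for completion so the pipeline fails fast. The template file, stack name, and tag are hypothetical.

```python
# Minimal CI/CD deployment sketch using boto3; template file and stack name are hypothetical.
import boto3

cfn = boto3.client("cloudformation")

with open("data-platform.yaml") as f:
    template_body = f.read()

# Validate before provisioning so template errors surface early.
cfn.validate_template(TemplateBody=template_body)

cfn.create_stack(
    StackName="data-platform-dev",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # needed when the template creates IAM resources
    Tags=[{"Key": "team", "Value": "data-engineering"}],
)

# Block until provisioning completes so the CI job fails fast on errors.
cfn.get_waiter("stack_create_complete").wait(StackName="data-platform-dev")
```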
Good to Have
- Experience with monitoring and observability tools (CloudWatch, Prometheus, Grafana); see the sketch after this list.
- Exposure to serverless data architectures.
- Hands-on experience in cloud migration, legacy-to-cloud data movement, or enterprise-scale transformations.
- Familiarity with data catalogues, lineage tools, and governance frameworks.
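For the monitoring bullet above, here is a minimal boto3 sketch that publishes a custom pipeline-health metric to CloudWatch, the kind of signal that dashboards and alarms can be built on. The namespace, metric name, and pipeline label are hypothetical.

```python
# Minimal observability sketch: custom CloudWatch metric per pipeline run.
# Namespace and metric name are hypothetical placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

def record_rows_processed(pipeline: str, count: int) -> None:
    """Publish a per-pipeline row count for dashboards and alarms."""
    cloudwatch.put_metric_data(
        Namespace="DataPlatform/Pipelines",
        MetricData=[
            {
                "MetricName": "RowsProcessed",
                "Dimensions": [{"Name": "Pipeline", "Value": pipeline}],
                "Value": float(count),
                "Unit": "Count",
            }
        ],
    )

record_rows_processed("events-etl", 125_000)
```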
Please send your CV for full details and immediate interviews.
AWS Data Engineer - Glasgow and remote - 10 months+ employer: OCTOPUS COMPUTER ASSOCIATES
Contact Detail:
OCTOPUS COMPUTER ASSOCIATES Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land AWS Data Engineer - Glasgow and remote - 10 months+
✨ Tip Number 1
Network like a pro! Reach out to your connections on LinkedIn or at industry events. We all know that sometimes it's not just what you know, but who you know that can help you land that AWS Data Engineer gig.
✨ Tip Number 2
Prepare for those interviews by brushing up on your technical skills and AWS services. We recommend doing mock interviews with friends or using online platforms to get comfortable talking about your experience with data pipelines and AWS tools.
✨ Tip Number 3
Showcase your projects! If you've built any data pipelines or worked on AWS projects, make sure to have them ready to discuss. We love seeing real-world applications of your skills, so don't be shy about sharing your successes.
✨ Tip Number 4
Apply through our website! It's the best way to ensure your application gets seen. Plus, we're always looking for talented individuals like you to join our team, especially for roles like the AWS Data Engineer position in Glasgow.
We think you need these skills to ace AWS Data Engineer - Glasgow and remote - 10 months+
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your AWS experience and data engineering skills. We want to see how your background aligns with the role, so don't be shy about showcasing relevant projects and technologies you've worked with!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're the perfect fit for this AWS Data Engineer role. Share your passion for data engineering and how you can contribute to our team at StudySmarter.
Showcase Your Technical Skills: Be specific about your technical expertise in AWS services like Glue, Lambda, and CloudFormation. We love seeing concrete examples of how you've built scalable data pipelines or automated workflows in your previous roles.
Apply Through Our Website: We encourage you to apply directly through our website. It's the best way for us to receive your application and ensures you're considered for this exciting opportunity. Don't miss out!
How to prepare for a job interview at OCTOPUS COMPUTER ASSOCIATES
✨ Know Your AWS Inside Out
Make sure you brush up on your AWS knowledge, especially around CloudFormation, Glue, and Lambda. Be ready to discuss specific projects where you've built data pipelines or automated workflows using these services.
✨ Showcase Your Python Skills
Prepare to talk about your experience with Python-based ETL development. Have examples ready that demonstrate how you've used Python and PySpark to build scalable data solutions, and be prepared to solve a coding challenge if asked.
✨ Understand Data Governance
Familiarise yourself with data quality checks, schema validation, and compliance standards. Be ready to explain how you've implemented security measures like IAM roles and encryption in your previous roles.
✨ Collaboration is Key
Highlight your experience working with cross-functional teams. Be prepared to discuss how you've collaborated with data architects and analysts to troubleshoot issues and improve data reliability in past projects.