At a Glance
- Tasks: Design and optimise cloud-based data platforms using AWS, Airflow, and Python.
- Company: Join Triad Group Plc, an award-winning digital consultancy with a supportive culture.
- Benefits: Enjoy 25 days annual leave, private healthcare, and continuous training opportunities.
- Why this job: Make a real impact on meaningful projects while working with cutting-edge technology.
- Qualifications: Strong experience in data engineering, AWS, and Python coding required.
- Other info: Collaborative environment with excellent career growth and a commitment to diversity.
Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices. Salary up to £65k plus company benefits.
About Us
Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years' experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers. At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you're valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you'll be trusted, challenged, and empowered to grow. We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you're passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.
Role Summary
Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.
- Create and manage data models that support efficient storage, retrieval, and analysis of data.
- Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB, and Lambda to architect and maintain cloud data solutions.
- Maintain modular, Terraform-based infrastructure as code (IaC) for reliable provisioning of AWS infrastructure.
- Develop, optimise and maintain robust data pipelines using Apache Airflow.
- Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.
- Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.
- Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.
- Identify data quality issues and implement data validation and cleansing processes.
- Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.
- Work within a continuous integration environment with automated builds, deployments, and testing.
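The pipeline responsibilities above follow a classic extract-transform-load shape. As a purely illustrative sketch (plain Python, no AWS or Airflow dependencies; all function names here are hypothetical, not part of Triad's stack), the flow looks like this:

```python
# Illustrative ETL sketch of the kind of pipeline described above.
# In practice each step would typically be an Airflow task reading from and
# writing to AWS services such as S3 or Redshift; here everything is in-memory.

def extract() -> list[dict]:
    # Stand-in for reading raw records from a source system.
    return [
        {"id": "1", "name": " Alice ", "score": "82"},
        {"id": "2", "name": None, "score": "91"},   # dirty record: missing name
        {"id": "3", "name": "Carol", "score": "77"},
    ]

def transform(records: list[dict]) -> list[dict]:
    # Clean, validate, and enrich: drop records that fail a simple
    # data-quality rule and normalise field types.
    cleaned = []
    for r in records:
        if not r.get("name"):  # validation: name must be present
            continue
        cleaned.append({
            "id": int(r["id"]),
            "name": r["name"].strip(),
            "score": int(r["score"]),
        })
    return cleaned

def load(records: list[dict], sink: list) -> None:
    # Stand-in for writing to a warehouse table.
    sink.extend(records)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(warehouse)
```

In an Airflow deployment, each of these functions would map naturally to its own task, with the scheduler handling retries, ordering, and monitoring.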
Skills and Experience
- Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
- Excellent proficiency in developing ETL processes and data transformation workflows.
- Strong SQL skills (PostgreSQL) and advanced Python coding capability (essential).
- Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB, and Lambda (essential).
- Ability to work with Terraform codebases to create and manage AWS infrastructure.
- Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
- Familiarity with distributed data processing systems such as Spark or Databricks.
- Experience working with high-performing, low-latency, or large-volume data systems.
- Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
- Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.
Qualifications & Certifications
- A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable).
- Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK.
Triad's Commitment to You
- Continuous Training & Development: Access to top-rated Udemy Business courses.
- Work Environment: Collaborative, creative, and free from discrimination.
Benefits:
- 25 days of annual leave, plus bank holidays.
- Matched pension contributions (5%).
- Private healthcare with Bupa.
- Gym membership support or Lakeshore Fitness access.
- Perkbox membership.
- Cycle-to-work scheme.
What Our Colleagues Have to Say
See for yourself on Glassdoor, and watch our "Day in the Life" videos at the bottom of our Careers page.
Our Selection Process
After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:
- A technical test including numerical, logical and verbal reasoning.
- A technical interview with our consultants.
- A management interview to assess cultural fit.
We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.
Other Information
If this role is of interest to you or you would like further information, please contact Ryan Jordan and submit your application now. Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.
Senior Data Engineer (AWS, Airflow, Python) in Milton Keynes
Employer: Triad
Contact: Triad Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer (AWS, Airflow, Python) role in Milton Keynes
✨Tip Number 1
Network like a pro! Reach out to your connections on LinkedIn or attend industry meetups. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving AWS, Airflow, and Python. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by practising common technical questions related to data engineering. Brush up on your SQL and Python skills, and be ready to discuss your experience with cloud platforms like AWS.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people at Triad. Plus, it shows you’re genuinely interested in joining our awesome team!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with AWS, Airflow, and Python, and don’t forget to showcase any relevant projects that demonstrate your skills in building data pipelines.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how your background aligns with Triad's mission. Be sure to mention specific experiences that relate to the job description.
Showcase Your Technical Skills: In your application, make sure to highlight your technical skills, especially your proficiency in SQL and Python. Mention any experience with AWS services and Terraform, as these are key for the role. We want to see what you can bring to the table!
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of your application and ensures you get all the updates directly from us. Plus, it’s super easy!
How to prepare for a job interview at Triad
✨Know Your Tech Inside Out
Make sure you brush up on your AWS services, especially S3, EC2, and Glue. Be ready to discuss how you've used these tools in past projects, as well as your experience with Apache Airflow and Python for data transformation.
✨Showcase Your Problem-Solving Skills
Prepare examples of how you've tackled data quality issues or optimised data pipelines in previous roles. Triad values creativity and innovation, so don't hesitate to share your thought process and the impact of your solutions.
✨Understand the Company Culture
Familiarise yourself with Triad's mission and values. They emphasise collaboration and a supportive environment, so be ready to discuss how you can contribute to this culture and work effectively within cross-functional teams.
✨Prepare for Technical Assessments
Since the selection process includes technical tests, practice numerical, logical, and verbal reasoning questions. Brush up on your SQL skills and be prepared to demonstrate your coding abilities in Python during the technical interview.