At a Glance
- Tasks: Design and implement robust ETL processes and scalable data structures.
- Company: Join an award-winning tech firm revolutionising the industry with their SaaS platform.
- Benefits: Enjoy a fully remote role, competitive salary, and potential for a 4-day work week.
- Why this job: Be part of a talented team driving innovation in data engineering and making a global impact.
- Qualifications: Expertise in Python, data processing, MongoDB, and AWS required; Agile mindset preferred.
- Other info: Work-life balance is a priority, with a strong emphasis on quality and delivery.
The predicted salary is between £42,000 and £84,000 per year.
Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within a truly enterprise-scale environment? If so - look no further. This is your chance to help my client build their new Data Infrastructure and help them leverage the power of data.
They launched their award-winning, enterprise-scale SaaS platform under 5 years ago and it has since revolutionised its industry and is being utilised by some of the biggest companies and brand names in the WORLD. They have grown quickly and have truly made an impact in their market. They currently have HUNDREDS of clients globally and 3 million+ users across the globe too.
They built the platform from scratch several years ago using absolutely no legacy code and only the very best technology stack and enterprise tools within a true Microservices environment. They have since grown their Engineering function significantly and are proud to have a talented team of Data experts who work together collaboratively to deliver robust Data solutions to their extensive client base.
They show no signs of slowing down and are consistently onboarding new customers - because of this, we're looking for more Data Engineers to join them and play a key role in driving the platform's evolution forward. I am looking for someone to take holistic ownership of data modelling and engineering across the business.
You will have a robust Infrastructure background and a good understanding of the different complexities that come with moving data from one system to another. Technically, you will be a Python expert with solid exposure to data processing and automation. You will have solid MongoDB Atlas exposure including monitoring, DB design and optimisation - ideally with some understanding of Kafka. You will be able to develop and maintain ETL/ELT pipelines and bring a solid understanding of data warehousing concepts and best practice. Naturally, you will have a good understanding of AWS.
I’d love you to be an advocate of Agile too - these guys are massive on Agile Delivery and Scrum - so it’s important you share a similar mindset and appreciate continuous integration and deployment.
This is a great chance to join a firm who do things the right way. They absolutely do NOT compromise on quality - they take it very seriously. They place a huge emphasis on delivery so it’s naturally critical we find someone that shares and owns this mindset!
They have awesome offices here in Nottingham but they're remote-first, so you can work from home if you wish. They treat their staff super well, are massive believers in giving their people a good work/life balance, and are even trialling a 4-day week! Salary to £70k plus amazing benefits.
Data Engineer (Python, Snowflake, ETL) REMOTE UK, £70k employer: Akkodis
Contact Detail:
Akkodis Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (Python, Snowflake, ETL) REMOTE UK, £70k role
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as Python, Snowflake, and Apache Airflow. Having hands-on experience or projects that showcase your skills in these areas will make you stand out.
✨Tip Number 2
Network with current or former employees of the company on platforms like LinkedIn. Engaging with them can provide insights into the company culture and expectations, which can be invaluable during interviews.
✨Tip Number 3
Prepare to discuss your experience with data modelling and ETL processes in detail. Be ready to share specific examples of how you've successfully implemented these in previous roles, as this is a key focus for the position.
✨Tip Number 4
Show your enthusiasm for Agile methodologies. Research Agile principles and be prepared to discuss how you've applied them in your work, as the company values a mindset aligned with continuous integration and delivery.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Python, ETL processes, and data modelling. Use specific examples from your past work that demonstrate your skills in these areas, especially in an enterprise-scale environment.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company. Mention your understanding of Agile methodologies and how you can contribute to their data infrastructure. Be sure to align your values with their emphasis on quality and delivery.
Showcase Relevant Projects: If you have worked on projects involving Snowflake, Apache Airflow, or AWS, be sure to include these in your application. Describe your role in these projects and the impact they had on the overall success of the initiatives.
Highlight Soft Skills: Don’t forget to mention your soft skills, such as teamwork and communication. Since the company values collaboration within their engineering team, showcasing your ability to work well with others will strengthen your application.
How to prepare for a job interview at Akkodis
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Python, ETL processes, and data modelling in detail. Bring examples of past projects where you've successfully implemented these skills, especially in a cloud environment like AWS.
✨Understand the Company’s Tech Stack
Familiarise yourself with the technologies mentioned in the job description, such as Snowflake, Apache Airflow, and DBT. Demonstrating knowledge of these tools will show that you're serious about the role and can hit the ground running.
✨Emphasise Agile Methodologies
Since the company values Agile delivery, be ready to discuss your experience with Agile practices and how you’ve applied them in previous roles. Highlight any experience you have with Scrum and continuous integration/deployment.
✨Prepare Questions About the Role
Have thoughtful questions ready about the company's data infrastructure and future projects. This shows your genuine interest in the position and helps you assess if the company is the right fit for you.