At a Glance
- Tasks: Join us as a Data Engineer to implement and deploy Azure Data Factory pipelines.
- Company: Be part of an award-winning tech scale-up recognised for rapid growth and innovation.
- Benefits: Enjoy flexible remote or hybrid work options and competitive salary packages.
- Why this job: Work on exciting projects, collaborate with talented teams, and make a real impact in tech.
- Qualifications: At least two years of Azure Data Factory experience and strong communication skills are essential.
- Other info: Opportunity to work with cutting-edge technology and support production systems.
The predicted salary is between £32,000 and £44,000 per year.
Location: Fully Remote or Hybrid / Edinburgh
Salary: £38,000 - £44,000
An award-winning enterprise software scale-up with high ambitions for growth is on the lookout for another Data Engineer to join the team. They recently won the ScotlandIS Digital Tech Scale-up Business of the Year award and have previously been recognised as Scotland’s fastest-growing tech company in the Deloitte Technology Fast 50 for three consecutive years.
What you’ll do:
- Implement, test and deploy Azure Data Factory (ADF) pipeline definitions within version control to customer environments (see the sketch after this list).
- Work with our Site Reliability Engineering team to ensure your solutions are observable, reliable and performant.
- Work with our software implementation consultants (SICs) to define and verify specification documents for ETL processes.
- Work with customer IT teams to test data source endpoints and ensure they meet specification.
- Work with our Engineering teams to ensure end-to-end capability for integrated data.
- Support cutover to production systems (this can fall outside normal working hours).
- Identify improvements to existing Azure Data Factory processes to make them more maintainable across a growing set of customers.
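As a flavour of the day-to-day work, here is a minimal sketch of the kind of pre-deployment check a team might run over ADF pipeline definitions kept in version control. The `pipelines/*.json` layout and the required keys are illustrative assumptions, not the employer's actual tooling.

```python
# Hypothetical CI check: validate exported ADF pipeline definitions held in
# version control before they are deployed to a customer environment.
import json
import pathlib
import sys

REQUIRED_TOP_LEVEL = {"name", "properties"}   # minimal shape of an exported ADF pipeline
REQUIRED_PROPERTIES = {"activities"}          # a pipeline without activities does nothing

def validate_pipeline(path: pathlib.Path) -> list[str]:
    """Return a list of problems found in one pipeline definition file."""
    try:
        definition = json.loads(path.read_text())
    except json.JSONDecodeError as exc:
        return [f"{path}: invalid JSON ({exc})"]
    problems = []
    missing = REQUIRED_TOP_LEVEL - set(definition)
    if missing:
        problems.append(f"{path}: missing top-level keys {sorted(missing)}")
    if not REQUIRED_PROPERTIES <= set(definition.get("properties", {})):
        problems.append(f"{path}: pipeline defines no activities")
    return problems

if __name__ == "__main__":
    # Assumed repo layout: pipelines/*.json exported from the ADF authoring UI.
    failures = [problem
                for file in pathlib.Path("pipelines").glob("*.json")
                for problem in validate_pipeline(file)]
    for problem in failures:
        print(problem)
    sys.exit(1 if failures else 0)
```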
About you:
- You must have at least two years' experience with Azure Data Factory and be comfortable building transparent, easy-to-support pipelines.
- Experience building and maintaining data integrations with a variety of external systems.
- Good understanding of the ETL (extract, transform, load) process (a minimal example follows this list).
- Comfortable being in a client-facing role.
- Excellent communication skills: you can clearly explain technical matters to any audience.
- Confident working with complex referential data.
- Knowledge of REST APIs, SQL databases and other data sources.
- A team player, with experience collaborating with other departments.
- You demonstrate good attention to detail and enjoy breaking complex problems down into simple steps.
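For context on the ETL skills listed above, here is a minimal, self-contained sketch of the extract-transform-load pattern: pull records from a REST endpoint, keep only the specified fields, and load them into a SQL table. The endpoint URL, field names, and SQLite target are hypothetical stand-ins for the real systems the role works with.

```python
# Minimal ETL sketch: REST API -> transform -> SQL table.
import sqlite3
import requests

SOURCE_URL = "https://api.example.com/v1/customers"  # hypothetical endpoint

def extract(url: str) -> list[dict]:
    """Fetch raw records from a customer data source endpoint."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()                      # fail fast on a bad endpoint
    return response.json()

def transform(records: list[dict]) -> list[tuple]:
    """Keep only the fields the specification document calls for."""
    return [(r["id"], r["name"].strip(), r.get("region", "UNKNOWN"))
            for r in records]

def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Write the transformed rows into the target SQL table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS customers "
                     "(id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
        conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract(SOURCE_URL)))
```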
Applying for the opportunity:
If you feel you have the required experience and would like to be considered for the opportunity, please forward an up-to-date CV; if you meet the requirements of the role, someone will contact you within 48 hours.
Data Engineer (Azure Data Factory) employer: Digital Waffle
Contact Details:
Digital Waffle Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (Azure Data Factory) role
✨Tip Number 1
Familiarise yourself with Azure Data Factory's latest features and updates. Being well-versed in the most recent functionalities will not only boost your confidence but also demonstrate your commitment to staying current in the field.
✨Tip Number 2
Network with professionals in the data engineering community, especially those who work with Azure. Engaging in discussions or attending webinars can provide insights into industry trends and may even lead to referrals.
✨Tip Number 3
Prepare to discuss specific projects where you've implemented Azure Data Factory. Be ready to explain your role, the challenges you faced, and how you overcame them, as this will showcase your problem-solving skills.
✨Tip Number 4
Practice explaining technical concepts in simple terms. Since the role requires excellent communication skills, being able to convey complex ideas clearly will set you apart during interviews.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure Data Factory and any relevant projects you've worked on. Use specific examples that demonstrate your skills in building and maintaining data integrations.
Showcase Communication Skills: Since the role requires excellent communication skills, consider including a section in your CV or cover letter that illustrates how you've effectively communicated technical concepts to non-technical audiences.
Detail Your Technical Expertise: Clearly outline your knowledge of ETL processes, REST APIs, SQL databases, and other data sources in your application. This will help the employer see that you have the necessary technical background for the role.
Express Your Team Collaboration Experience: Mention any past experiences where you collaborated with different departments or teams. Highlighting your ability to work as a team player can set you apart from other candidates.
How to prepare for a job interview at Digital Waffle
✨Showcase Your Azure Data Factory Experience
Make sure to highlight your two years of experience with Azure Data Factory during the interview. Be prepared to discuss specific projects where you implemented, tested, and deployed ADF pipelines, as this will demonstrate your hands-on expertise.
✨Communicate Clearly About Technical Concepts
Since excellent communication skills are essential for this role, practice explaining technical concepts in simple terms. Think about how you would describe the ETL process or data integrations to someone without a technical background.
✨Prepare for Client-Facing Scenarios
As this position involves client interaction, be ready to discuss your experience in client-facing roles. Prepare examples of how you've successfully communicated with clients or collaborated with other departments to solve problems.
✨Demonstrate Problem-Solving Skills
Be prepared to discuss how you approach complex problems, especially in relation to data integration and pipeline maintenance. Share specific examples of challenges you've faced and how you broke them down into manageable steps to find solutions.