At a Glance
- Tasks: Design and optimise data pipelines using Azure Data Factory and advanced SQL.
- Company: Join a rapidly growing global workforce solutions company with a diverse culture.
- Benefits: Enjoy a hybrid work model, competitive salary, and opportunities for professional growth.
- Why this job: Make an impact by working on innovative data projects with cutting-edge technologies.
- Qualifications: Experience in data engineering, SQL, Python, and Azure Data Factory is essential.
- Other info: Dynamic team environment with potential for career advancement and skill development.
The predicted salary is between £60,000 and £84,000 per year.
Net2Source Inc. is one of the fastest-growing diversity-certified global workforce solutions companies, with year-over-year growth of over 100% for the last six years. It works with Fortune 1000/Global 2000 clients across 34 countries spanning North America, South America, Europe, Asia, Australia, and the Middle East.
About the Role
- Location – London, UK
- Mode of Work – Hybrid (3 days onsite per week)
- Type of Hiring – Fixed Term Employment (FTE)
- Project Duration – 12 Months (Possible Extension)
Must-have Skills:
- Data-Focused Essentials
- Advanced SQL expertise including stored procedures, indexing, and performance tuning
- Hands-on experience with Python and PowerShell for ETL, ELT, and automation
- Experience with Azure Data Factory, ADLS, Blob Storage, and Azure SQL (see the Python sketch after this list)
- Expertise in data modelling (logical and physical), data quality frameworks, and optimisation
- Ability to work with structured, semi-structured, and unstructured data formats
- Strong knowledge of audit logging, lineage, cataloguing, metadata management, and security
- Demonstrated automation mindset, including the use of AI agents
- Hands-on experience with, or a strong understanding of, modern cloud data warehousing, including Snowflake fundamentals such as virtual warehouses, micro-partitioning, query optimisation, and role-based access control
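To ground the Azure Data Factory requirement, here is a minimal sketch of triggering and monitoring a pipeline run with the azure-mgmt-datafactory Python SDK. The subscription ID, resource group, factory name, pipeline name, and parameters are all hypothetical placeholders.

```python
# Minimal sketch: trigger and monitor an Azure Data Factory pipeline run.
# Requires azure-identity and azure-mgmt-datafactory; all resource names
# below are hypothetical placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # assumption: your Azure subscription
RESOURCE_GROUP = "rg-data-platform"    # hypothetical resource group
FACTORY_NAME = "adf-ingestion"         # hypothetical data factory

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a named pipeline with runtime parameters.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, "ingest_daily_sales",
    parameters={"load_date": "2024-01-01"},
)

# Poll until the run leaves the queued/in-progress states.
status = "Queued"
while status in ("Queued", "InProgress"):
    time.sleep(30)
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status

print(f"Pipeline run {run.run_id} finished with status: {status}")
```

In production this kind of script typically runs from an orchestrator or CI/CD pipeline rather than interactively.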
Good to Have Skills:
- Infrastructure & DevOps
- Experience with Terraform, Azure DevOps, YAML pipelines, and cloud automation
- Exposure to Azure Function Apps, serverless compute, and orchestration
- Understanding of infrastructure as code and cloud deployment patterns
- Exposure to Docker, Kubernetes
- Advanced Data Platform Tools
- Deep or hands-on exposure to Snowflake, including creating and managing Snowflake objects such as databases, schemas, and roles
- Using Snowpipe for automated ingestion
- Performance tuning using clustering, caching, and micro-partitioning (see the sketch after this list)
- Understanding Snowflake cost optimisation and storage/compute separation
- Experience with Databricks for advanced data engineering workflows
- Familiarity with Denodo or Microsoft Purview
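To give the Snowflake bullets above some shape, the following minimal sketch uses snowflake-connector-python to set a clustering key, cap idle warehouse cost, and grant access through a role. The connection details and object names (WH_ETL, ANALYTICS, RAW, SALES, ANALYST) are hypothetical placeholders.

```python
# Minimal sketch: routine Snowflake tuning and governance tasks via
# snowflake-connector-python. All credentials and object names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>",
    user="<user>",
    password="<password>",
    warehouse="WH_ETL",     # hypothetical virtual warehouse
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Define a clustering key so date filters prune micro-partitions.
    cur.execute("ALTER TABLE SALES CLUSTER BY (SALE_DATE)")
    # Auto-suspend an idle warehouse after 60s to control compute cost.
    cur.execute("ALTER WAREHOUSE WH_ETL SET AUTO_SUSPEND = 60")
    # Grant read access through a role rather than to users directly (RBAC).
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA RAW TO ROLE ANALYST")
finally:
    conn.close()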
Responsibilities:
- Design and implement scalable data ingestion pipelines from diverse sources including APIs, SharePoint, on-premises systems, and file-based sources
- Perform data cleansing, validation, and transformation to produce high-quality, reliable datasets
- Develop and maintain data migration and archival strategies ensuring accuracy, integrity, and compliance
- Build and optimise logical and physical data models
- Handle diverse data structures and formats, including BAK, MDF, CSV, JSON, XML, and Parquet (see the sketch after this list)
- Automate ingestion and processing workflows using Python, PowerShell, and orchestration tools
- Apply an automation-first mindset, including experience integrating AI agents for workflow automation
- Build and maintain solutions using Azure Data Factory, Blob Storage, ADLS, and Azure SQL
- Implement data lineage, cataloguing, and governance capabilities
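As an illustration of handling mixed file formats, here is a minimal pandas sketch that dispatches on file extension. BAK and MDF are SQL Server backup/data files restored through the database engine rather than read directly, so the sketch covers only the file-based formats; the file names are hypothetical, and Parquet support assumes pyarrow is installed (pandas 1.3+ for read_xml).

```python
# Minimal sketch: normalising CSV, JSON, XML, and Parquet inputs into
# pandas DataFrames before loading. File names are hypothetical.
from pathlib import Path

import pandas as pd

READERS = {
    ".csv": pd.read_csv,
    ".json": pd.read_json,
    ".xml": pd.read_xml,        # pandas 1.3+
    ".parquet": pd.read_parquet,  # requires pyarrow or fastparquet
}

def load_any(path: str) -> pd.DataFrame:
    """Dispatch to the right pandas reader based on file extension."""
    suffix = Path(path).suffix.lower()
    try:
        reader = READERS[suffix]
    except KeyError:
        raise ValueError(f"Unsupported format: {suffix}")
    return reader(path)

frames = [load_any(p) for p in ("sales.csv", "orders.json", "stock.parquet")]
```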
Data Quality, Security & Compliance:
- Oversee data quality frameworks ensuring accuracy, consistency, and integrity
- Implement audit logging, data lineage, and compliance practices
- Maintain strong security and governance controls
Performance & Scalability:
- Optimise pipelines, SQL queries, and storage layers (see the sketch after this list)
- Troubleshoot performance issues across compute and storage
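One common query-tuning workflow, sketched here under the assumption of an Azure SQL target reached via pyodbc: time a filtered query, add a covering index, and time it again. The connection string, table, and column names are hypothetical, and warm caches will blur simple wall-clock comparisons.

```python
# Minimal sketch: measuring the effect of a covering index on a slow query
# with pyodbc against Azure SQL. Connection string, table, and column
# names are hypothetical placeholders.
import time

import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<server>.database.windows.net,1433;"
    "Database=analytics;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)
cur = conn.cursor()

def timed(sql: str) -> float:
    """Run a query to completion and return elapsed wall-clock seconds."""
    start = time.perf_counter()
    cur.execute(sql).fetchall()
    return time.perf_counter() - start

query = "SELECT order_id, total FROM dbo.orders WHERE customer_id = 42"
before = timed(query)

# Covering index: keyed on the filter column, including projected columns
# so the query can be satisfied without touching the base table.
cur.execute(
    "CREATE NONCLUSTERED INDEX ix_orders_customer "
    "ON dbo.orders (customer_id) INCLUDE (order_id, total)"
)
conn.commit()

after = timed(query)
print(f"before: {before:.3f}s  after: {after:.3f}s")
```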
Certifications:
- Azure or Snowflake certifications are strong advantages
Personal Attributes:
- Analytical thinker with strong problem-solving capabilities
- Proactive, adaptable, and committed to continuous learning
- Excellent communication and collaboration skills
Senior Azure Data Factory Engineer employer: Net2Source (N2S)
Contact Detail:
Net2Source (N2S) Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Azure Data Factory Engineer role
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, attend meetups, and join online forums. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Azure Data Factory, SQL, and Python. This gives potential employers a tangible look at what you can do.
✨Tip Number 3
Prepare for interviews by practising common questions related to data engineering and cloud technologies. We recommend doing mock interviews with friends or using online platforms to boost your confidence.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are proactive about their job search.
We think you need these skills to ace the Senior Azure Data Factory Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Azure Data Factory Engineer role. Highlight your experience with Azure, SQL, and any relevant projects that showcase your skills in data modelling and automation.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about this role and how your background aligns with our needs. Don’t forget to mention your hands-on experience with Python and PowerShell!
Showcase Your Projects: If you've worked on any relevant projects, be sure to include them in your application. We love seeing real-world examples of your work, especially those involving data ingestion pipelines or cloud data warehousing.
Apply Through Our Website: We encourage you to apply through our website for the best chance of getting noticed. It’s super easy, and we can’t wait to see what you bring to the table!
How to prepare for a job interview at Net2Source (N2S)
✨Know Your Tech Inside Out
Make sure you brush up on your advanced SQL skills, especially stored procedures and performance tuning. Be ready to discuss your hands-on experience with Azure Data Factory and how you've used Python and PowerShell for ETL processes.
✨Showcase Your Automation Mindset
Prepare examples of how you've implemented automation in your previous roles. Highlight any experience with AI agents or orchestration tools, as this will demonstrate your proactive approach to problem-solving.
✨Understand the Data Landscape
Familiarise yourself with different data formats and structures, including JSON, XML, and Parquet. Be prepared to discuss how you've handled structured, semi-structured, and unstructured data in past projects.
✨Communicate Clearly and Confidently
Practice explaining complex technical concepts in simple terms. Good communication is key, so be ready to collaborate and share your ideas effectively during the interview.