At a Glance
- Tasks: Lead the design and implementation of data systems using Azure and Databricks.
- Company: Join a forward-thinking company that empowers organisations with data and AI solutions.
- Benefits: Enjoy a hybrid work model, competitive daily rates, and opportunities for professional growth.
- Why this job: Be at the forefront of data analytics and AI, mentoring others while making a real impact.
- Qualifications: Experience in Azure services, Python, SQL, and data platform development is essential.
- Other info: Active SC Clearance required; contract starts on 1st May for an initial 6 months.
This is an exciting contract opportunity for an SC Cleared Azure Data Engineer to join an experienced team on a new customer engagement at the forefront of data analytics and artificial intelligence. Our client equips ambitious organisations with enduring control of their data and AI evolution. This role is an opportunity to lead the build of bespoke data systems for our clients.
Responsibilities:
- Design and implement scalable data pipelines and ETL processes using Azure and Databricks technologies including Delta Live Tables.
- Lead technical discussions with clients and stakeholders to gather requirements and propose solutions.
- Help clients realise the potential of data science, machine learning, and scaled data processing within the Azure/Databricks ecosystem.
- Mentor junior team members and support their personal development.
- Take ownership of the delivery of core solution components.
- Support planning, requirements refinement and work estimation.
Skills and Experience:
- Experience designing and developing end-to-end data solutions leveraging Azure services for batch, real-time and streaming workloads (including data ingestion, cleansing, modelling and integration).
- Strong background in enterprise data platform development, concepts and methods such as data warehouses and data lakehouses, with the ability to adapt and tailor based on requirements.
- Experience with Azure Synapse Analytics and/or Azure Databricks, Microsoft Fabric, and Data Factory.
- Expertise in Python, SQL and developer tooling such as Visual Studio Code and Azure DevOps.
- Good experience of CI/CD practices and tools for data platforms using Azure DevOps.
- Good knowledge of how to leverage AI to increase development productivity and quality.
- Good understanding of data governance and data architecture principles.
- Excellent communication skills.
- Desirable: certifications in Azure Data Engineering and Databricks Engineering.
Additional Information:
- Rate offered: £500-550 per day.
- Location: Hybrid with travel to client site 1 day/week.
- Start date: 1st May.
- Duration: initial 6-month term with significant opportunity for extension.
- Required: Active SC Clearance.
Employer: Shareforce
Contact Detail:
Shareforce Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Azure Data Engineer - SC Cleared role
✨Tip Number 1
Make sure to brush up on your Azure and Databricks skills, especially around Delta Live Tables. Being able to demonstrate your hands-on experience with these technologies during discussions will set you apart from other candidates.
✨Tip Number 2
Prepare to discuss real-world examples of how you've designed and implemented data pipelines. Having specific scenarios ready will help you showcase your problem-solving abilities and technical expertise.
✨Tip Number 3
Familiarise yourself with the latest trends in data governance and architecture principles. Being knowledgeable about these topics will not only impress your interviewers but also show that you're committed to staying current in the field.
✨Tip Number 4
Since mentoring is part of the role, think about your past experiences where you've guided others. Be ready to share how you can contribute to the development of junior team members, as this will highlight your leadership potential.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in Azure Data Engineering, particularly with Azure services and Databricks. Use keywords from the job description to demonstrate your fit for the role.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data analytics and AI. Mention specific projects where you've designed scalable data pipelines or led technical discussions, as these are key responsibilities of the role.
Showcase Your Technical Skills: In your application, emphasise your expertise in Python, SQL, and CI/CD practices. Provide examples of how you've used these skills in previous roles, especially in relation to data platforms and governance.
Highlight Your Communication Skills: Since excellent communication is crucial for this role, include examples of how you've effectively communicated with clients or mentored junior team members. This will help demonstrate your ability to lead discussions and support others.
How to prepare for a job interview at Shareforce
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Azure services, Databricks, and data engineering concepts. Highlight specific projects where you've designed and implemented data pipelines or ETL processes, as this will demonstrate your hands-on expertise.
✨Prepare for Client Interaction Scenarios
Since the role involves leading technical discussions with clients, practice articulating how you gather requirements and propose solutions. Think of examples where you've successfully communicated complex technical ideas to non-technical stakeholders.
✨Emphasise Mentorship Experience
If you've mentored junior team members before, be ready to share those experiences. Discuss how you supported their development and what strategies you used to help them grow, as this shows your leadership potential.
✨Understand Data Governance Principles
Brush up on data governance and architecture principles, as these are crucial for the role. Be prepared to discuss how you ensure data quality and compliance in your projects, which will reflect your understanding of best practices in data management.