At a Glance
- Tasks: Design and manage data pipelines using Azure and Databricks technologies.
- Company: Join a leading Microsoft partner in the tech industry.
- Benefits: Competitive daily rate, remote work, and exposure to cutting-edge technologies.
- Why this job: Make an impact in big data and AI applications while working with top-tier tools.
- Qualifications: Experience with Big Data technologies and coding in Python is essential.
- Other info: Exciting contract role with opportunities for professional growth.
The predicted salary is between £44,000 and £66,000 per year.
We are currently recruiting for an experienced Data Engineer skilled in Microsoft Azure and cloud computing concepts.
As the Azure Data Engineer, you will work closely with a Microsoft and Databricks partner, taking responsibility for the end-to-end design, build, and deployment of industry-leading big data and AI applications.
Your work will be varied and your responsibilities will include:
- Designing, developing, and managing end-to-end data pipelines
- Processing and transforming data using Spark
- Providing technical governance to enhance ways of working
- Championing DevOps and CI/CD methodologies to ensure agile collaboration and robust data solutions
- Engineering and orchestrating data models and pipelines
- Leading development activities using Python, PySpark, and other technologies
- Writing high-quality code that contributes to a scalable and maintainable data platform (an illustrative sketch follows below)
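For illustration only, the minimal PySpark sketch below shows the kind of batch transformation step a pipeline like this might include; the paths, column names, and table layout are hypothetical rather than taken from this role.

```python
# Illustrative batch transformation step; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Read raw files landed in the lake (illustrative path).
orders = spark.read.parquet("/mnt/raw/orders")

# Clean and enrich: typed timestamps, deduplication, derived columns.
curated = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

# Persist to a Delta table for downstream consumers
# (requires Delta Lake, e.g. on Databricks).
curated.write.format("delta").mode("overwrite").save("/mnt/curated/orders")
```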
To be successful in this role, you will need to have the following qualifications:
- Experience with big data technologies such as Spark and Kafka, gained in a customer-facing post-sales, technical architecture, or consulting role
- Experience working independently on big data architectures
- Comfortable writing code in Python
- Experience across Azure services, including Azure Data Factory, Azure Synapse, Azure Data Lake Storage, and Delta Lake
- Experience with governance tools such as Microsoft Purview and Unity Catalog
- Experience with streaming data in Kafka, Event Hubs, Stream Analytics, etc. (a brief sketch follows after this list)
- Experience working in the Databricks ecosystem
- Familiarity with MLOps practices
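As a hedged illustration of the streaming experience listed above, the sketch below reads events from Azure Event Hubs via its Kafka-compatible endpoint and appends them to a Delta table with Spark Structured Streaming; the namespace, topic name, paths, and the omitted authentication settings are placeholders, not details of this engagement.

```python
# Illustrative Structured Streaming job: Event Hubs (Kafka endpoint) -> Delta.
# Namespace, topic, and paths are placeholders; auth options are omitted.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("telemetry-stream").getOrCreate()

raw_events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
    .option("subscribe", "telemetry")
    .option("kafka.security.protocol", "SASL_SSL")  # SASL config omitted for brevity
    .load()
)

# Kafka delivers the payload as bytes; cast to string for downstream parsing.
events = raw_events.selectExpr("CAST(value AS STRING) AS body", "timestamp")

# Append to a bronze Delta table, with a checkpoint for recovery on restart.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/telemetry")
    .outputMode("append")
    .start("/mnt/bronze/telemetry")
)
query.awaitTermination()
```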
This is an exciting contract opportunity working at the cutting edge of technology.
To be considered, please click apply.
Senior Azure Data Engineer in London employer: Primus Connect
Contact Detail:
Primus Connect Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Azure Data Engineer role in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the industry, especially those who work with Azure or Databricks. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving data pipelines and big data technologies. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for technical interviews by brushing up on your coding skills in Python and PySpark. Practice common data engineering problems and be ready to discuss your past experiences with Azure and Databricks in detail.
✨Tip Number 4
Don't forget to apply through our website! It's the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who are proactive and engaged in their job search.
We think you need these skills to ace the Senior Azure Data Engineer role in London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Azure, Databricks, and big data technologies. We want to see how your skills align with the role, so don't be shy about showcasing relevant projects!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background makes you a perfect fit for our team. Keep it concise but impactful!
Showcase Your Technical Skills: When detailing your experience, focus on specific tools and technologies like Spark, Python, and CI/CD methodologies. We love seeing concrete examples of how you've used these in past roles!
Apply Through Our Website: We encourage you to apply directly through our website. It's the best way for us to receive your application and ensures you're considered for this exciting opportunity. Don't miss out!
How to prepare for a job interview at Primus Connect
✨Know Your Tech Stack
Make sure you brush up on your knowledge of Azure, Databricks, and the specific tools mentioned in the job description. Be ready to discuss your experience with Spark, Delta Live Tables, and Unity Catalog, as well as how you've used these technologies in past projects.
✨Showcase Your Problem-Solving Skills
Prepare to share examples of how you've tackled challenges in data engineering. Think about specific instances where you designed and implemented data pipelines or improved processes using DevOps and CI/CD methodologies. Real-world examples will make your answers stand out.
✨Demonstrate Your Coding Proficiency
Since coding is a big part of this role, be prepared to discuss your experience with Python and PySpark. You might even be asked to solve a coding problem during the interview, so practice writing clean, efficient code that adheres to best practices.
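For instance, one small exercise you might practise is keeping only the most recent record per key using a window function; the sketch below uses made-up data and column names purely for rehearsal.

```python
# Practice exercise: keep only the latest record per customer (made-up data).
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("interview-practice").getOrCreate()

df = spark.createDataFrame(
    [
        ("c1", "2024-01-01", 10.0),
        ("c1", "2024-02-01", 20.0),
        ("c2", "2024-01-15", 5.0),
    ],
    ["customer_id", "updated_at", "balance"],
)

# Rank rows per customer by recency, then keep the most recent one.
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    df.withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)
latest.show()
```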
✨Ask Insightful Questions
Interviews are a two-way street, so come prepared with questions that show your interest in the company and the role. Ask about their current projects, team dynamics, or how they implement MLOps within their workflows. This not only shows your enthusiasm but also helps you gauge if the company is the right fit for you.