At a Glance
- Tasks: Join our team to build and optimise data pipelines on Azure Databricks for economic data.
- Company: We are a leading firm in financial services, focusing on innovative data solutions.
- Benefits: Enjoy flexible working with 3 days in the office, competitive salary, and professional development opportunities.
- Why this job: Be part of a cutting-edge platform that impacts monetary analysis and forecasting.
- Qualifications: 8+ years in data engineering, strong skills in Azure Databricks, Python, and Spark required.
- Other info: SC clearance is necessary; experience in financial services is a plus.
The predicted salary is between £48,000 and £84,000 per year.
Job Title: Senior Data Engineer
Location: London, UK (3 days in the office)
SC Cleared: Required
Job Type: Full-Time
Experience: 8+ years
Job Summary:
We are seeking a highly skilled and experienced Senior Data Engineer to join our team and contribute to the development and maintenance of our cutting-edge Azure Databricks platform for economic data. This platform is critical for our Monetary Analysis, Forecasting, and Modelling activities. The Senior Data Engineer will be responsible for building and optimising data pipelines, implementing data transformations, and ensuring data quality and reliability. This role requires a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets.
Key Responsibilities:
Data Pipeline Development:
- Design, develop, and maintain robust and scalable data pipelines for ingesting, transforming, and loading data from various sources (e.g., APIs, databases, financial data providers) into the Azure Databricks platform.
- Optimise data pipelines for performance, efficiency, and cost-effectiveness.
- Implement data quality checks and validation rules within data pipelines.
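The kind of validation rules described above can be sketched in plain Python. In a Databricks pipeline these would typically be expressed as PySpark column expressions or Delta Live Tables expectations; the rule names and field names here ("series_id", "value", "observed_at") are illustrative assumptions, not part of the role's actual schema:

```python
# Minimal sketch of row-level data-quality validation rules.
# Field names and rules are hypothetical examples.
from datetime import datetime

def _parses_iso(s):
    """True if s is an ISO-8601 date/timestamp string."""
    try:
        datetime.fromisoformat(s)
        return True
    except (TypeError, ValueError):
        return False

# Each rule maps a name to a predicate over one record (a dict).
RULES = {
    "series_id_present": lambda row: bool(row.get("series_id")),
    "value_is_numeric": lambda row: isinstance(row.get("value"), (int, float)),
    "date_parses": lambda row: _parses_iso(row.get("observed_at")),
}

def validate(rows):
    """Split rows into valid records and (record, failed_rules) rejects."""
    valid, rejected = [], []
    for row in rows:
        failed = [name for name, rule in RULES.items() if not rule(row)]
        if failed:
            rejected.append((row, failed))
        else:
            valid.append(row)
    return valid, rejected
```

Routing rejects to a quarantine table with the list of failed rules, rather than silently dropping them, is what makes the monitoring and alerting mentioned later in this posting possible.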
Data Transformation & Processing:
- Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies.
- Develop and maintain data processing logic for cleaning, enriching, and aggregating data.
- Ensure data consistency and accuracy throughout the data lifecycle.
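A clean-enrich-aggregate step of this kind can be sketched in plain Python as a toy stand-in for the PySpark DataFrame operations (`groupBy`/`agg`) such a pipeline would actually use; the field names are hypothetical:

```python
# Sketch of cleaning then aggregating economic observations.
# A stand-in for df.groupBy("series_id", "month").agg(avg("value")).
from collections import defaultdict

def clean(rows):
    """Drop rows with missing keys/values and normalise series IDs."""
    return [
        {**r, "series_id": r["series_id"].strip().upper()}
        for r in rows
        if r.get("series_id") and r.get("value") is not None
    ]

def aggregate_monthly_mean(rows):
    """Average 'value' per (series_id, YYYY-MM) bucket."""
    buckets = defaultdict(list)
    for r in rows:
        month = r["observed_at"][:7]  # "2024-01-05" -> "2024-01"
        buckets[(r["series_id"], month)].append(r["value"])
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}
```

Keeping cleaning and aggregation as separate, independently testable steps mirrors how such logic is usually structured as discrete pipeline stages, whatever the execution engine.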
Azure Databricks Platform:
- Work extensively with Azure Databricks, including Unity Catalog, Delta Lake, Spark SQL, and other relevant services.
- Implement best practices for Databricks development and deployment.
- Optimise Databricks workloads for performance and cost.
Data Integration:
- Integrate data from various sources, including relational databases, APIs, and streaming data sources.
- Implement data integration patterns and best practices.
- Work with API developers to ensure seamless data exchange.
Data Quality & Governance:
- Use Azure Purview hands-on for data quality and data governance.
- Implement data quality monitoring and alerting processes.
- Work with data governance teams to ensure compliance with data governance policies and standards.
- Implement data lineage tracking and metadata management processes.
Collaboration & Communication:
- Collaborate closely with data scientists, economists, and other technical teams to understand data requirements and translate them into technical solutions.
- Communicate technical concepts effectively to both technical and non-technical audiences.
- Participate in code reviews and knowledge sharing sessions.
Automation & DevOps:
- Implement automation for data pipeline deployments and other data engineering tasks.
- Work with DevOps teams to build and maintain CI/CD pipelines for deployments across environments.
- Promote and implement DevOps best practices.
Minimum Qualifications:
- 10+ years of experience in data engineering, with at least 3 years of hands-on experience with Azure Databricks.
- Strong proficiency in Python and Spark (PySpark) or Scala.
- Deep understanding of data warehousing principles, data modelling techniques, and data integration patterns.
- Extensive experience with Azure data services, including Azure Data Factory, Azure Blob Storage, and Azure SQL Database.
- Experience working with large datasets and complex data pipelines.
- Experience with data architecture design and data pipeline optimisation.
- Proven expertise with Databricks, including hands-on implementation experience and certifications.
- Experience with SQL and NoSQL databases.
- Experience with data quality and data governance processes.
- Experience with version control systems (e.g., Git).
- Experience with Agile development methodologies.
- Excellent communication, interpersonal, and problem-solving skills.
Preferred Qualifications:
- Experience with streaming data technologies (e.g., Kafka, Azure Event Hubs).
- Experience with data visualisation tools (e.g., Tableau, Power BI).
- Experience with DevOps tools and practices (e.g., Azure DevOps, Jenkins, Docker, Kubernetes).
- Experience working in a financial services or economic data environment.
- Azure certifications related to data engineering (e.g., Azure Data Engineer Associate).
Seniority Level:
Mid-Senior level
Employment Type:
Full-time
Job Function:
IT Services and IT Consulting
Senior Data Engineer employer: Mastek
Contact Detail:
Mastek Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Familiarise yourself with Azure Databricks and its features, especially Delta Lake and Spark SQL. Being able to discuss specific projects or experiences where you've used these technologies will show your expertise and make you stand out.
✨Tip Number 2
Network with professionals in the data engineering field, particularly those who work with Azure. Attend meetups or webinars focused on Azure technologies to gain insights and potentially connect with current employees at Mastek.
✨Tip Number 3
Prepare to discuss your experience with data quality and governance processes. Be ready to share examples of how you've implemented data quality checks and monitoring in previous roles, as this is a key responsibility for the position.
✨Tip Number 4
Brush up on your knowledge of CI/CD pipelines and DevOps practices, as these are essential for the role. Consider sharing any relevant experiences where you've automated data pipeline deployments or collaborated with DevOps teams.
We think you need these skills to ace the Senior Data Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly with Azure Databricks and big data technologies. Use specific examples of projects you've worked on that align with the job responsibilities.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your understanding of the role. Mention your experience with data pipelines, data quality, and any relevant certifications to make a strong case for your candidacy.
Highlight Technical Skills: Clearly list your technical skills related to the job description, such as proficiency in Python, Spark, and Azure services. Be specific about your experience with tools like Azure Data Factory and Databricks, as these are crucial for the role.
Showcase Problem-Solving Abilities: In your application, provide examples of how you've tackled complex data challenges in previous roles. This could include optimising data pipelines or implementing data quality checks, demonstrating your ability to deliver results.
How to prepare for a job interview at Mastek
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Azure Databricks, Spark, and data pipeline optimisation. Bring examples of past projects where you successfully implemented complex data transformations or improved data quality.
✨Understand the Business Context
Familiarise yourself with the financial services sector and how data engineering supports economic analysis. This will help you articulate how your work can directly impact the company's goals.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving abilities. Think about scenarios where you've had to troubleshoot data issues or optimise a data pipeline, and be ready to explain your thought process.
✨Communicate Clearly
Practice explaining technical concepts in simple terms, as you may need to communicate with non-technical stakeholders. Clear communication is key to demonstrating your ability to collaborate effectively.