Data Specialist

Full-Time | £36,000 – £60,000 / year (est.) | No home office possible

At a Glance

  • Tasks: Build and maintain data validation frameworks for Azure-based platforms.
  • Company: Join a forward-thinking tech company focused on data quality and reliability.
  • Benefits: Competitive salary, flexible working hours, and opportunities for professional growth.
  • Why this job: Make a real impact by ensuring data accuracy and reliability in innovative projects.
  • Qualifications: Experience with data validation, automated testing, and Azure tools.
  • Other info: Collaborative environment with strong career advancement potential.

The predicted salary is between £36,000 and £60,000 per year.

This is a hands-on DataOps / Data Quality Engineer role with a strong focus on building data validation frameworks and automated testing for Azure-based data platforms. The role also includes DataOps responsibilities, ensuring reliable, observable, and well-governed pipeline operations across Fabric Data Factory, Azure Data Factory, and Synapse environments. Additionally, the engineer will take on Data Reliability Engineering responsibilities, in the spirit of Site Reliability Engineering (SRE).

Key Responsibilities

  • Build, maintain, or leverage open-source data validation frameworks to ensure data accuracy, schema integrity, and quality across ingestion and transformation pipelines.
  • Test and validate data pipelines and PySpark notebooks developed by Data Engineers, ensuring they meet quality, reliability, and validation standards.
  • Define and standardise monitoring, logging, alerting, and KPIs/SLAs across the data platform to enable consistent measurement of data reliability.
  • Identify and create Azure Monitor alert rules and develop KQL queries to extract metrics and logs from Azure Monitor/Log Analytics for reliability tracking and alerting.
  • Write SQL queries and PowerShell (or another scripting language) to automate the execution of validation routines, verify pipeline outputs, and support end-to-end data quality workflows.
  • Collaborate with Data Engineering, Cloud, and Governance teams to embed standardised validation and reliability practices into their workflows.
  • Document validation rules, testing processes, operational guidelines, and data reliability best practices to ensure consistency across teams.
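
The validation-and-reconciliation responsibilities above can be sketched in a few lines of Python (used here in place of the SQL/PowerShell the ad names; the `sales` table, columns, and checks are purely illustrative, not from any real platform):

```python
import sqlite3

def reconcile(source: sqlite3.Connection, target: sqlite3.Connection, table: str) -> list:
    """Compare row counts and a simple column checksum between two systems."""
    failures = []
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "amount_sum": f"SELECT COALESCE(SUM(amount), 0) FROM {table}",
    }
    for check, sql in checks.items():
        src = source.execute(sql).fetchone()[0]
        tgt = target.execute(sql).fetchone()[0]
        if src != tgt:
            failures.append(f"{table}.{check}: source={src} target={tgt}")
    return failures

# Illustrative data: the same table loaded into two environments.
src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0), (2, 5.5)])
tgt.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10.0)])  # target is missing a row

print(reconcile(src, tgt, "sales"))
```

In a real pipeline the non-empty failure list would feed an Azure Monitor alert rather than a `print`.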

What We're Looking For

  • Strong background in data validation frameworks, automated testing, data verification logic, and quality enforcement.
  • Experience automating data validations, reconciliations, and alert generation.
  • Experience with Azure Monitor, including setting up alert rules and building dashboards using data queried from Log Analytics with KQL.
  • Experience with Fabric Data Factory, Azure Data Factory, Synapse pipelines, and PySpark notebooks.
  • Hands-on experience calling REST/OData APIs for validating data.
  • Experience writing SQL and scripts to programmatically perform data validation and reconciliation across systems.
  • Strong understanding of the Azure ecosystem, including identity, network security, storage, and authentication models.
  • Working experience with Azure DevOps and CI/CD.
  • Strong debugging, incident resolution, and system reliability skills aligned with SRE practice.
  • Ability to work independently while collaborating effectively across Data Engineering, Cloud, Analytics, and Governance teams.
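
The API-validation experience listed above might look like the following Python sketch. It checks an OData-style JSON payload for envelope shape and required fields; the payload is hard-coded here in place of a live HTTP call, and the field names are hypothetical:

```python
import json

EXPECTED_FIELDS = {"id", "status", "updated_at"}  # hypothetical record contract

def validate_odata_payload(raw: str) -> list:
    """Validate an OData-style response body: 'value' envelope, required fields, non-null id."""
    errors = []
    body = json.loads(raw)
    records = body.get("value")
    if not isinstance(records, list):
        return ["payload missing OData 'value' array"]
    for i, rec in enumerate(records):
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            errors.append(f"record {i}: missing fields {sorted(missing)}")
        if rec.get("id") is None:
            errors.append(f"record {i}: null id")
    return errors

# Canned response standing in for a real GET against an OData endpoint.
sample = json.dumps({"value": [
    {"id": 1, "status": "ok", "updated_at": "2024-01-01"},
    {"status": "ok", "updated_at": "2024-01-02"},  # id missing
]})
print(validate_odata_payload(sample))
```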

Beneficial Experience

  • Experience in the data space, with strong exposure to data testing, validations, and Data Reliability Engineering.
  • Experience defining and tracking data quality KPIs, operational KPIs, and SLAs to measure data reliability and performance.
  • Hands-on experience using Azure Monitor, Log Analytics, and writing KQL queries to collect monitoring data and define alert rules.
  • Experience writing SQL and PowerShell (or another scripting language) to automate data validation, reconciliation, and rule execution.
  • Exposure to data validation frameworks such as Great Expectations, Soda, or custom SQL/PySpark rule engines.
  • Experience validating pipelines and PySpark notebooks developed by data engineering teams across Fabric Data Factory, Azure Data Factory, and Synapse.
  • Experience defining and documenting validation rules, operational testing guidelines, and reliability processes for consistent team adoption.
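
The "custom SQL/PySpark rule engine" option mentioned above can be as simple as a declarative rule table applied to records. A minimal pure-Python sketch follows (the rule names and batch data are invented for illustration; Great Expectations and Soda offer richer versions of the same idea):

```python
# Each rule: (name, column, predicate) — loosely mirroring an "expectation".
RULES = [
    ("not_null",   "customer_id", lambda v: v is not None),
    ("positive",   "amount",      lambda v: v is not None and v > 0),
    ("valid_code", "country",     lambda v: v in {"GB", "IE", "FR"}),
]

def run_rules(rows: list) -> dict:
    """Return failure counts per rule across a batch of records."""
    failures = {name: 0 for name, _, _ in RULES}
    for row in rows:
        for name, column, predicate in RULES:
            if not predicate(row.get(column)):
                failures[name] += 1
    return failures

batch = [
    {"customer_id": 1,    "amount": 12.5, "country": "GB"},
    {"customer_id": None, "amount": -3.0, "country": "GB"},
    {"customer_id": 2,    "amount": 7.0,  "country": "US"},
]
print(run_rules(batch))
```

The same shape scales to PySpark by expressing each predicate as a column expression and counting failures per rule over the DataFrame.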

Data Specialist employer: Vector Resourcing

As a Data Specialist at our company, you will thrive in a dynamic and innovative work environment that prioritises collaboration and continuous learning. We offer competitive benefits, including professional development opportunities and a strong focus on employee well-being, all while working with cutting-edge Azure technologies in a supportive team culture. Join us to make a meaningful impact on data quality and reliability, ensuring that our data-driven decisions are built on a solid foundation.

Contact Detail:

Vector Resourcing Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Data Specialist role

✨Tip Number 1

Network like a pro! Reach out to folks in the industry, attend meetups, and connect with people on LinkedIn. You never know who might have the inside scoop on job openings or can refer you directly.

✨Tip Number 2

Show off your skills! Create a portfolio showcasing your data validation frameworks and automated testing projects. This will give potential employers a taste of what you can do and set you apart from the crowd.

✨Tip Number 3

Prepare for interviews by brushing up on your technical skills. Be ready to discuss your experience with Azure Monitor, KQL queries, and data pipelines. Practice common interview questions and even do mock interviews with friends.

✨Tip Number 4

Don’t forget to apply through our website! We love seeing applications come directly from candidates who are excited about joining us. Plus, it’s a great way to ensure your application gets the attention it deserves.

We think you need these skills to ace the Data Specialist role

Data Validation Frameworks
Automated Testing
Data Verification Logic
Azure Monitor
KQL Queries
SQL
PowerShell
PySpark
Fabric Data Factory
Azure Data Factory
Synapse Pipelines
REST/OData APIs
Azure DevOps
CI/CD
Data Reliability Engineering

Some tips for your application 🫡

Tailor Your CV: Make sure your CV speaks directly to the Data Specialist role. Highlight your experience with data validation frameworks and automated testing, and don’t forget to mention any Azure-related skills. We want to see how you fit into our world!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data quality and how your background aligns with our needs. We love seeing enthusiasm and a personal touch, so let your personality come through!

Showcase Relevant Projects: If you've worked on projects involving Azure Data Factory or data validation frameworks, make sure to include them in your application. We’re keen to see real examples of your work and how you’ve tackled challenges in the data space.

Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy – just follow the prompts and you’ll be set!

How to prepare for a job interview at Vector Resourcing

✨Know Your Data Validation Frameworks

Make sure you brush up on the data validation frameworks relevant to the role, like Great Expectations or Soda. Be ready to discuss how you've used these tools in past projects and how they can ensure data accuracy and quality.

✨Show Off Your Automation Skills

Prepare examples of how you've automated data validations and reconciliations using SQL or PowerShell. Highlight any experience with Azure Monitor and KQL queries, as this will demonstrate your ability to streamline processes and enhance data reliability.

✨Understand the Azure Ecosystem

Familiarise yourself with the Azure ecosystem, especially Fabric Data Factory, Azure Data Factory, and Synapse. Be prepared to explain how you've navigated these platforms in previous roles and how you can leverage them for effective data operations.

✨Collaboration is Key

Since this role involves working closely with Data Engineering, Cloud, and Governance teams, think of examples where you've successfully collaborated across departments. Emphasise your communication skills and how you can embed standardised practices into workflows.
