Data Migration Engineer

Full-Time | £36,000-£60,000 per year (est.) | No remote working possible
TXP

At a Glance

  • Tasks: Design and build data pipelines using Microsoft Fabric to transform raw data into valuable insights.
  • Company: Join TXP, a fast-growing tech company focused on innovation and collaboration.
  • Benefits: Enjoy 25 days annual leave, private medical insurance, and a supportive work environment.
  • Why this job: Work on exciting projects that make a real impact while developing your skills.
  • Qualifications: Experience in data pipeline development and proficiency with Microsoft Fabric and data transformation tools.
  • Other info: Be part of a dynamic team with excellent career growth opportunities and a commitment to community.

The predicted salary is between £36,000 and £60,000 per year.

We are TXP. We help businesses and organisations move forward, at pace and at scale. We believe in the transformative power of combining technology and people. By providing consulting expertise, development services and resourcing, we work closely with organisations to solve their most complex business problems. Our work transforms organisations – and we take that responsibility seriously. We focus on success, pursue excellence and take ownership of everything we do. But achieving that level of performance requires an inclusive and supportive working environment. We believe in the power of technology and people, and we help everyone here to succeed. At TXP, you can multiply your potential.

Role Purpose

The Data Engineer is a client-facing delivery role responsible for designing, building and maintaining data pipelines and platform infrastructure on Microsoft Fabric. Operating within a consulting model, the role is deployed across engagements of varying scope and duration, delivering production-grade data solutions that meet client requirements. The Data Engineer works within cross-functional delivery teams to transform raw data into reliable, governed and analytically ready assets.

Key Responsibilities:

  • Design and build end-to-end data pipelines using Microsoft Fabric Data Factory, dataflows and notebooks.
  • Implement ingestion patterns for batch and near-real-time data from diverse source systems including APIs, databases, flat files and event streams.
  • Develop transformation logic using PySpark, Spark SQL or T-SQL within Fabric lakehouses and warehouses.
  • Build and maintain medallion architecture (bronze, silver, gold) to structure data processing layers.
  • Configure and manage Microsoft Fabric workspaces, capacities and compute resources across client tenants.
  • Implement OneLake storage strategies, including shortcuts and mirroring, to enable unified data access.
  • Establish CI/CD pipelines for Fabric assets using Azure DevOps or GitHub Actions.
  • Manage deployment across development, test and production environments with appropriate governance controls.
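In production, the medallion pattern named above would be implemented as PySpark over Fabric lakehouse delta tables; as a purely illustrative sketch (the table contents and field names such as `order_id` are hypothetical, not from any real engagement), the bronze → silver → gold layering can be expressed in plain Python:

```python
# Minimal sketch of medallion layering (bronze -> silver -> gold).
# Plain dicts stand in for delta-table rows; in Fabric this logic
# would run as PySpark transformations inside a notebook.

def to_silver(bronze_rows):
    """Clean raw bronze rows into conformed silver rows."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None:  # validation: drop incomplete records
            continue
        silver.append({
            "order_id": row["order_id"],
            "amount": float(row.get("amount") or 0),  # normalise types
            "region": (row.get("region") or "unknown").upper(),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate silver rows into an analytics-ready gold summary."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "amount": "19.5", "region": "uk"},
    {"order_id": None, "amount": "5.0", "region": "uk"},  # fails validation
    {"order_id": 2, "amount": "30.5", "region": "uk"},
]
print(to_gold(to_silver(bronze)))  # {'UK': 50.0}
```

The point of the layering is that each stage has one job: silver enforces validity and types, gold serves analytics, and neither mutates the raw bronze data.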

Data Quality and Governance

  • Implement data quality checks, validation rules and monitoring within pipelines.
  • Apply data cataloguing, lineage tracking and metadata management practices using Purview integration.
  • Design and enforce access control models, row-level security and sensitivity labelling.
  • Document data models, pipeline logic and operational procedures to consulting-grade standards.
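As a hedged illustration of the first bullet above, an in-pipeline quality check might map named validation rules to the rows that violate them (the rule set and field names are invented for the example, not taken from any TXP engagement):

```python
# Illustrative in-pipeline data-quality check: each rule is a predicate
# that flags a violating row; the check reports failures per rule so a
# pipeline step can alert or halt before promoting data downstream.

RULES = {
    "order_id must not be null": lambda r: r.get("order_id") is None,
    "amount must be non-negative": lambda r: (r.get("amount") or 0) < 0,
}

def run_quality_checks(rows):
    """Return a mapping of rule name -> list of rows failing that rule."""
    failures = {}
    for name, violates in RULES.items():
        bad = [r for r in rows if violates(r)]
        if bad:
            failures[name] = bad
    return failures

rows = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": -5.0},  # violates the non-negative rule
]
print(run_quality_checks(rows))
```

Reporting failures per rule, rather than a single pass/fail flag, is what makes the monitoring bullet workable: each violated rule can feed its own alert or dashboard tile.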

Client Delivery

  • Translate client requirements into technical designs and implementation plans.
  • Operate within consulting delivery frameworks, managing scope, timelines and stakeholder expectations.
  • Contribute to estimation, solution architecture and proposal development for data workstreams.
  • Present technical approaches and progress updates to client stakeholders at varying levels of seniority.
  • Conduct knowledge transfer sessions and produce handover documentation for client teams.

Collaboration and Standards

  • Work within multidisciplinary delivery teams alongside analytics engineers, data scientists and business analysts.
  • Contribute to internal capability development through reusable accelerators, templates and reference architectures.
  • Participate in code reviews, design reviews and retrospectives.
  • Stay current with Microsoft Fabric platform updates, features and best practices.

Required Skills and Experience:

  • Demonstrable experience building data pipelines in production environments.
  • Proficiency with Microsoft Fabric, including Data Factory, lakehouses, warehouses, notebooks and semantic models.
  • Strong skills in PySpark, Spark SQL and T-SQL for data transformation.
  • Experience with medallion architecture or equivalent layered data processing patterns.
  • Working knowledge of OneLake, delta tables and Fabric capacity management.
  • Proficiency with version control (Git) and CI/CD practices for data platform assets.
  • Understanding of data governance principles, including cataloguing, lineage and access control.
  • Experience working in a consulting, professional services or client-facing delivery environment.
  • Strong written and verbal communication skills, with the ability to explain technical concepts to non-technical audiences.
Desirable Skills and Experience:

  • Experience with Azure Data Lake Storage, Azure Synapse Analytics or Azure Data Factory prior to Fabric migration.
  • Familiarity with Microsoft Purview for data governance and compliance.
  • Exposure to Power BI semantic models and report development within Fabric.
  • Experience with event-driven architectures using Azure Event Hubs or Kafka.
  • Knowledge of infrastructure-as-code tools such as Terraform or Bicep for Azure resource provisioning.
  • DP-600 (Fabric Analytics Engineer) certification or equivalent Microsoft certifications.

Qualifications

  • Relevant Microsoft certifications are advantageous.

Benefits

  • 25 days annual leave (plus bank holidays).
  • An additional day of paid leave for your birthday (or Christmas Eve).
  • Salary sacrifice, matched employer-contributed pension (4%).
  • Life assurance (3x).
  • Access to an Employee Assistance Programme (EAP).
  • Private medical insurance through our partner Aviva.
  • Cycle to Work scheme.
  • Access to an independent financial advisor.
  • 2 x social value days per year to give back to local communities.

Grow with us: Work on exciting new projects. If you want to avoid getting stuck with the mundane, you’re in the right place. We work in many sectors with fantastic clients, so you’ll always be working on something exciting and challenging. We recognise that you might have a career path planned out and might need some support to move forward. We’re here to help you make the most of your time with us, through challenging work, opportunities to grow, and learning and development opportunities.

Be part of the TXP growth journey. We are a high-growth, fast-paced environment. We currently have 200+ employees and work with clients across the UK. Joining TXP means you’ll be part of that.

Data Migration Engineer employer: TXP

At TXP, we pride ourselves on being an exceptional employer that fosters a collaborative and inclusive work culture, empowering our employees to thrive in their roles as Data Migration Engineers. With a strong focus on professional development, we offer numerous growth opportunities through exciting projects across various sectors, alongside competitive benefits such as additional leave for birthdays, private medical insurance, and a supportive environment that values your contributions. Join us in our fast-paced journey of transformation, where your potential can truly multiply.

Contact Detail:

TXP Recruiting Team

StudySmarter Expert Advice 🤫

We think this is how you could land the Data Migration Engineer role

✨Tip Number 1

Network like a pro! Reach out to people in your industry on LinkedIn or at events. A friendly chat can lead to opportunities that aren’t even advertised yet.

✨Tip Number 2

Show off your skills! Create a portfolio or GitHub repository showcasing your data pipelines and projects. This gives potential employers a taste of what you can do.

✨Tip Number 3

Prepare for interviews by practising common questions and scenarios related to data engineering. We recommend doing mock interviews with friends or using online platforms.

✨Tip Number 4

Don’t forget to apply through our website! It’s the best way to ensure your application gets noticed, and we love seeing candidates who are proactive.

We think you need these skills to ace the Data Migration Engineer role

Data Pipeline Development
Microsoft Fabric
Data Factory
PySpark
Spark SQL
T-SQL
Medallion Architecture
OneLake
CI/CD Pipelines
Data Governance
Azure Data Lake Storage
Azure Synapse Analytics
Microsoft Purview
Event-Driven Architectures
Infrastructure-as-Code

Some tips for your application 🫡

Tailor Your CV: Make sure your CV reflects the skills and experiences that match the Data Migration Engineer role. Highlight your experience with Microsoft Fabric, data pipelines, and any relevant projects you've worked on. We want to see how you can contribute to our team!

Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your background aligns with TXP's mission. Don't forget to mention specific projects or achievements that showcase your expertise.

Showcase Your Technical Skills: In your application, be sure to highlight your proficiency in PySpark, Spark SQL, and T-SQL. Mention any experience with CI/CD practices and version control, as these are key for the role. We love seeing candidates who can demonstrate their technical prowess!

Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s super easy, and you'll be able to keep track of your application status. Plus, we love seeing applications come through our own platform!

How to prepare for a job interview at TXP

✨Know Your Tech Inside Out

Make sure you’re well-versed in Microsoft Fabric and the tools mentioned in the job description. Brush up on your skills with Data Factory, PySpark, and T-SQL. Being able to discuss your experience with these technologies confidently will show that you're ready to hit the ground running.

✨Prepare for Client-Facing Scenarios

Since this role involves client interaction, think about how you can translate technical jargon into layman's terms. Prepare examples of how you've successfully communicated complex ideas to non-technical stakeholders in the past. This will demonstrate your ability to bridge the gap between tech and business.

✨Showcase Your Problem-Solving Skills

Be ready to discuss specific challenges you've faced in previous roles, particularly around data migration and pipeline building. Use the STAR method (Situation, Task, Action, Result) to structure your answers, highlighting how you approached problems and what the outcomes were.

✨Ask Insightful Questions

At the end of the interview, don’t shy away from asking questions. Inquire about TXP's approach to data governance or how they foster collaboration within teams. This shows your genuine interest in the company and helps you gauge if it’s the right fit for you.
