At a Glance
- Tasks: Design and manage data pipelines using Microsoft Fabric for impactful analytics and AI solutions.
- Company: Join RELX, a global leader in information-based analytics with a vibrant culture.
- Benefits: Enjoy generous holidays, health perks, study assistance, and a competitive pension scheme.
- Why this job: Make a real difference by enhancing data practices and supporting innovative projects.
- Qualifications: Experience in cloud data platforms, Python, and strong collaboration skills required.
- Other info: Flexible work-life balance with extensive learning and development opportunities.
The predicted salary is between £43,200 and £72,000 per year.
About the role: The Senior Data Engineer will maintain and enhance a wide range of existing Fabric assets including pipelines, dataflows, notebooks, Lakehouse tables, and Warehouse objects, while proactively identifying opportunities to introduce new, more scalable, and more efficient data engineering practices. This role is central to enabling high-quality, reliable data products that support analytics, automation, and AI-driven solutions across the organisation.
The ideal candidate will be experienced in integrating diverse internal and external data sources, collaborating with stakeholders, and ensuring the availability of trusted, high-performance datasets for downstream consumers.
Responsibilities:
- Design, build, and manage ingestion pipelines in Microsoft Fabric for structured and unstructured data, aligned to the Bronze and Silver layers of the Medallion Architecture.
- Maintain and enhance existing Fabric data assets while identifying opportunities to introduce improved patterns, automation, and modern engineering practices.
- Integrate data from internal and external APIs, SFTP endpoints, on-premise databases, and cloud data platforms.
- Use Python extensively within Fabric notebooks for data processing, API integration, automation, and advanced transformation logic.
- Develop and maintain Fabric Dataflows, Pipelines, Lakehouse tables, and Warehouse objects to support enterprise analytics and downstream applications.
- Collaborate with Data Analysts, Power Platform Developers, and data modelling teams to ensure well-prepared Silver datasets reliably support the Gold layer used for reporting and AI or automation scenarios.
- Work closely with internal stakeholders to gather data requirements and translate them into scalable, maintainable data pipelines.
- Partner with external organisations to facilitate secure and reliable data transfers and integrations.
- Implement scalable, repeatable processes for ingestion, schema management, validation, and metadata documentation.
- Ensure all data pipelines meet governance, compliance, performance, and security standards.
- Continually evaluate and adopt emerging Fabric features including Lakehouse optimisations, Warehouse capabilities, and orchestration improvements to support ongoing advancement of the data ecosystem.
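The responsibilities above centre on landing raw data in the Bronze layer and refining it into trusted Silver datasets. As a rough illustration of that pattern, here is a minimal, self-contained Python sketch. In Microsoft Fabric this logic would typically live in a notebook writing to Lakehouse Delta tables; here plain Python lists and dicts stand in for those tables, and all names (`land_bronze`, `refine_silver`, the record fields) are hypothetical examples, not part of any Fabric API.

```python
from datetime import datetime, timezone

def land_bronze(raw_records):
    """Bronze: store source records as-is, stamped with ingestion metadata."""
    ingested_at = datetime.now(timezone.utc).isoformat()
    return [{**rec, "_ingested_at": ingested_at} for rec in raw_records]

def refine_silver(bronze_rows):
    """Silver: enforce a schema, drop invalid rows, deduplicate on the key."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("customer_id")
        if key is None or key in seen:
            continue  # reject rows missing the business key, or repeating it
        seen.add(key)
        silver.append({
            "customer_id": key,
            "email": (row.get("email") or "").strip().lower(),
        })
    return silver

raw = [
    {"customer_id": 1, "email": " Alice@Example.COM "},
    {"customer_id": 1, "email": "dupe@example.com"},   # duplicate key
    {"email": "orphan@example.com"},                   # missing key
]
silver = refine_silver(land_bronze(raw))
# Only the first record survives, with its email normalised
```

In a real pipeline the validation, deduplication, and normalisation rules would be driven by schema and metadata definitions rather than hard-coded, but the Bronze-preserves/Silver-refines split is the same.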
Requirements:
- Proven experience as a Senior Data Engineer or similar role working with modern cloud-based data platforms.
- A high level of proficiency with Microsoft Fabric tooling including Pipelines, Dataflows, Lakehouse, Warehouse, and notebooks.
- Experience working within a Medallion Architecture with clear responsibility for engineering Bronze and Silver layers that support reliable downstream consumption.
- Advanced Python skills for ETL or ELT processes, API integration, data manipulation, and data quality enforcement.
- SQL proficiency with experience in schema design, performance tuning, and analytical data modelling.
- Proven experience integrating complex datasets from APIs, SFTP transfers, on-premise databases, and cloud data stores.
- Ability to collaborate closely with business and technical teams to gather requirements and translate them into professional-grade data pipelines.
- Excellent communication and stakeholder engagement skills across technical and non-technical audiences.
- Experience designing resilient pipelines with monitoring, logging, error handling, and incident response considerations.
- Familiarity with AI Foundry, Copilot, or similar AI-assisted engineering tools.
- Experience implementing data governance, cataloguing, lineage, and security best practices.
- Knowledge of DevOps practices for data engineering including CI/CD, version control, and environment management.
- Relevant Microsoft certifications such as Fabric or Azure Data Engineer.
We promote a healthy work/life balance across the organisation and offer an appealing working environment for our people. With numerous wellbeing initiatives, shared parental leave, study assistance and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals.
Working for you:
We know that your wellbeing and happiness are key to a long and successful career. These are some of the benefits we are delighted to offer:
- Generous holiday allowance with the option to buy additional days
- Health screening, eye care vouchers and private medical benefits
- Wellbeing programmes
- Life assurance
- Access to a competitive contributory pension scheme
- Save As You Earn share option scheme
- Travel season ticket loan
- Electric Vehicle Scheme
- Optional Dental Insurance
- Maternity, paternity and shared parental leave
- Employee Assistance Programme
- Access to emergency care for both the elderly and children
- RECARES days, giving you time to support the charities and causes that matter to you
- Access to employee resource groups with dedicated time to volunteer
- Access to extensive learning and development resources
- Access to employee discounts scheme via Perks at Work
RELX is a global provider of information-based analytics and decision tools for professional and business customers. RELX serves customers in more than 180 countries and has offices in about 40 countries. It employs more than 36,000 people, over 40% of whom are in North America. The company is headquartered in London. Its market capitalisation is about £60bn ($80bn), making it one of the 10 largest listed companies in the UK. The company is listed on the London Stock Exchange, Euronext and NYSE. It operates in four market segments, developing information-based analytics and decision tools for professional and business customers in the Risk, Scientific, Technical & Medical, Legal, and Exhibitions sectors.
Senior Data Engineer in London employer: RELX
Contact Detail:
RELX Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role in London
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, attend meetups, and engage in online forums. You never know who might have the inside scoop on job openings or can refer you directly.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving Microsoft Fabric and Python. This will give potential employers a taste of what you can do and set you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on common data engineering scenarios. Be ready to discuss how you've tackled challenges in previous roles, particularly around pipeline design and data integration. Practice makes perfect!
✨Tip Number 4
Don't forget to apply through our website! It’s the best way to ensure your application gets noticed. Plus, we love seeing candidates who are proactive about their job search.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Microsoft Fabric, Python, and data integration. We want to see how your skills align with our needs!
Showcase Your Projects: Include specific projects where you've designed and managed data pipelines or worked with complex datasets. This gives us a clear picture of your hands-on experience and problem-solving abilities.
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points for key achievements and responsibilities. We appreciate straightforward communication that gets right to the heart of your experience.
Apply Through Our Website: Don’t forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy to do!
How to prepare for a job interview at RELX
✨Know Your Tools Inside Out
Make sure you’re well-versed in Microsoft Fabric, especially Pipelines, Dataflows, and Lakehouse tables. Brush up on your Python skills too, as you'll need to demonstrate your ability to use it for data processing and API integration during the interview.
✨Showcase Your Problem-Solving Skills
Prepare to discuss specific examples where you've identified inefficiencies in data engineering practices and how you improved them. This role is all about enhancing existing assets, so be ready to share your innovative ideas and solutions.
✨Collaborate Like a Pro
Since this position involves working closely with various stakeholders, practice articulating how you gather requirements and translate them into actionable data pipelines. Highlight any past experiences where collaboration led to successful outcomes.
✨Stay Current with Trends
Familiarise yourself with the latest features in data engineering, particularly those related to AI and automation. Being able to discuss emerging trends and how they can be applied to improve data processes will set you apart from other candidates.