At a Glance
- Tasks: Build and optimise data pipelines and models in a dynamic data ecosystem.
- Company: Join Ceox Services Ltd, a forward-thinking IT consultancy.
- Benefits: Remote work, competitive pay, and opportunities for professional growth.
- Why this job: Make a real impact by shaping modern data platforms with cutting-edge tech.
- Qualifications: Strong Python, SQL, and Azure Databricks experience required.
- Other info: Ideal for those who thrive in collaborative and agile environments.
The predicted salary is between £36,000 and £60,000 per year.
Ceox Services are seeking a Senior Data Engineer to support delivery of key data components within our evolving data ecosystem. This role will work closely with architects, analysts, engineers, and wider delivery teams to implement pipeline builds, data models, and integration patterns aligned to the organisation’s new Target Operating Model for Data. You will contribute to the development of scalable dataflows, optimise ingestion and transformation activities, and ensure solutions meet technical standards, performance expectations, and security controls. This is a hands-on engineering role, ideal for someone who enjoys building, shaping, and improving modern data platforms.
Key Responsibilities:
- Build and maintain physical data models, ETL pipelines and code in cloud data platforms.
- Support ingestion activity and onboarding of new data sources.
- Assist in the design, development and deployment of Azure platform services (Fabric, Synapse, ADLS).
- Work with Databricks, Delta Lake, Unity Catalog and Delta Sharing for dataflows and collaboration.
- Construct curated, raw and refined data layers; catalogue assets appropriately.
- Validate solutions against functional and non-functional requirements.
- Deliver data sets, transformations and performance-optimised data products.
- Improve processes, engineering patterns, and reusable tooling.
- Monitor and measure pipeline performance; support incident resolution.
- Ensure documentation meets acceptance standards and is approved centrally.
- Actively engage in Agile ceremonies and governance.
Mandatory Requirements:
- Strong experience with Python, PySpark & SQL for data engineering.
- Hands-on experience with Azure Databricks.
- Strong knowledge of Fabric, Synapse, ADF & ADLS for ETL pipelines.
- Experience with Delta Lake, Parquet, Unity Catalog & Microsoft Purview.
- Familiarity with event-driven data ingestion (Event Grid / pub/sub).
- Understanding of SOLID principles, async programming, and the Mediator and Factory patterns.
- Experience delivering unit and integration testing in Databricks.
- Knowledge of secure ETL design with Entra ID / SCIM integration.
- Understanding of Azure best practices, APIM, and platform governance.
- Ability to build and serve Power BI models via Databricks data sources.
Desirable:
- Prior experience working within a UK Public Sector environment.
Soft Skills:
- Strong stakeholder communication and cross-team collaboration.
- Analytical and solution-focused mindset.
- Able to work independently, take ownership and drive progress.
- Commitment to clean, scalable, well-documented engineering.
- Adaptable, proactive, and comfortable working in dynamic delivery environments.
Contract Details:
- Contract Type: Freelance / Contract
- Location: Remote (UK-based candidates preferred)
- Start Date: ASAP
- Clearance: Candidates must be eligible to work with UK Government departments.
Seniority Level: Mid-Senior level
Employment Type: Contract
Job Function: Information Technology
Industries: IT Services and IT Consulting
Senior Data Engineer employer: Ceox Services Ltd
Contact Details:
Ceox Services Ltd Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who might know someone at Ceox Services. A friendly chat can sometimes lead to a referral, which is a golden ticket in the job hunt.
✨Tip Number 2
Prepare for the interview by brushing up on your technical skills. Make sure you can confidently discuss Python, PySpark, and Azure Databricks. We want you to shine when they ask about your experience with ETL pipelines and data models!
✨Tip Number 3
Showcase your problem-solving skills during interviews. Be ready to share examples of how you've optimised data flows or improved processes in past roles. This will demonstrate your analytical mindset and commitment to clean engineering.
✨Tip Number 4
Don’t forget to apply through our website! It’s the best way to ensure your application gets seen by the right people. Plus, we love seeing candidates who take that extra step to connect with us directly.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Senior Data Engineer role. Highlight your experience with Python, Azure Databricks, and any relevant ETL pipelines. We want to see how your skills align with what we're looking for!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how you can contribute to our evolving data ecosystem. Keep it concise but impactful!
Showcase Your Projects: If you've worked on any cool data projects, make sure to mention them! Whether it's building data models or optimising data flows, we love to see real-world examples of your work that demonstrate your skills.
Apply Through Our Website: Don't forget to apply through our website! It’s the best way for us to receive your application and ensures you’re considered for the role. We can't wait to see what you bring to the table!
How to prepare for a job interview at Ceox Services Ltd
✨Know Your Tech Stack
Make sure you’re well-versed in the technologies mentioned in the job description, like Python, PySpark, and Azure Databricks. Brush up on your knowledge of ETL pipelines and data models, as you'll likely be asked to discuss your hands-on experience with these tools.
✨Showcase Your Problem-Solving Skills
Prepare to share specific examples of how you've tackled challenges in previous roles. Think about times when you optimised data ingestion or improved pipeline performance. This will demonstrate your analytical mindset and solution-focused approach.
✨Understand Agile Methodologies
Since the role involves engaging in Agile ceremonies, it’s crucial to understand Agile principles. Be ready to discuss how you've worked in Agile environments before and how you’ve contributed to team collaboration and project delivery.
✨Communicate Clearly
Strong stakeholder communication is key for this role. Practice explaining complex technical concepts in simple terms, as you may need to collaborate with non-technical team members. Clear communication can set you apart from other candidates.