At a Glance
- Tasks: Build and maintain scalable data pipelines and ensure data quality across platforms.
- Company: Join a dynamic team supporting the Royal Navy's Maritime Domain Awareness.
- Benefits: Competitive salary, career development, and opportunities to work on impactful projects.
- Other info: Work in a collaborative environment with opportunities for growth and learning.
- Why this job: Make a difference by delivering reliable data solutions for critical operations.
- Qualifications: Strong Python and SQL skills, with experience in building data pipelines.
The predicted salary is between £45,000 and £55,000 per year.
We are seeking a Data Services Engineer to join a Capability Engineering team delivering data platforms and pipelines that underpin operational and analytical services supporting Royal Navy Maritime Domain Awareness outputs. This is a hands-on engineering role focused on building reliable, scalable and secure data solutions. You will work across the full data lifecycle, from ingestion through to delivery, ensuring data is accessible, trusted and usable. The role requires an active SC clearance to start, with the requirement that you are able to obtain DV clearance.
What You Will Do
- Build and maintain scalable data pipelines (ETL/ELT) across multiple systems.
- Integrate data from diverse sources including databases, APIs and enterprise systems.
- Ensure data quality, integrity and availability across platforms.
- Monitor and optimise data pipelines for performance and reliability.
- Troubleshoot and resolve complex data and platform issues.
- Contribute to automation and CI/CD of data services.
- Collaborate with stakeholders to understand and deliver data requirements.
- Produce technical documentation, runbooks and operational artefacts.
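To give a flavour of the pipeline work described above, here is a minimal, hypothetical ETL sketch in Python. The schema and data (vessel sightings) are made up for illustration and are not the employer's actual systems; a real pipeline would pull from the databases, APIs and enterprise systems mentioned in the listing.

```python
import sqlite3

def extract():
    # Stand-in for ingestion from a real source system.
    return [
        {"vessel": "Alpha", "lat": "50.80", "lon": "-1.09"},
        {"vessel": "Bravo", "lat": "bad", "lon": "-1.10"},  # malformed row
    ]

def transform(rows):
    # Data-quality step: keep only rows whose coordinates parse as floats.
    clean = []
    for row in rows:
        try:
            clean.append((row["vessel"], float(row["lat"]), float(row["lon"])))
        except (KeyError, ValueError):
            continue  # in production, quarantine and alert instead of dropping silently
    return clean

def load(rows, conn):
    # Load the validated rows into the target store.
    conn.execute("CREATE TABLE IF NOT EXISTS sightings (vessel TEXT, lat REAL, lon REAL)")
    conn.executemany("INSERT INTO sightings VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM sightings").fetchone()[0])  # 1 valid row loaded
```

The same extract/transform/load shape scales up with an orchestrator (one of the "orchestration tools" the desirable criteria mention) scheduling and monitoring each stage.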
What We Are Looking For
Essential
- Experience building and maintaining data pipelines.
- Strong Python and SQL skills.
- Experience working with structured data and databases.
- Understanding of data modelling and transformation techniques.
- Experience with Linux environments.
- Strong problem-solving and analytical skills.
- Ability to work independently within defined standards.
Desirable
- Experience with orchestration tools or data platforms.
- Exposure to DevOps practices and automation.
- Experience in secure or regulated environments.
- Familiarity with service management tools (e.g. Jira, Confluence).
Data Services Engineer in Portsmouth employer: Carbon60 Project Services
Contact Detail:
Carbon60 Project Services Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Services Engineer role in Portsmouth
✨Tip Number 1
Network like a pro! Reach out to people in the industry, especially those already working at companies you're interested in. A friendly chat can open doors and give you insights that job descriptions just can't.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your data pipelines and projects. This gives potential employers a tangible look at what you can do, making you stand out from the crowd.
✨Tip Number 3
Prepare for interviews by practising common technical questions related to data services. Brush up on your Python and SQL skills, and be ready to discuss how you've tackled data challenges in the past.
✨Tip Number 4
Don't forget to apply through our website! We love seeing candidates who are genuinely interested in joining us. Plus, it makes it easier for us to keep track of your application and get back to you quickly.
We think you need these skills to ace the Data Services Engineer role in Portsmouth
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Services Engineer role. Highlight your experience with data pipelines, Python, and SQL. We want to see how your skills match what we're looking for!
Showcase Your Projects: Include any relevant projects or experiences that demonstrate your ability to build and maintain scalable data solutions. We love seeing real-world applications of your skills, so don’t hold back!
Be Clear and Concise: When writing your application, keep it clear and to the point. Use bullet points where possible to make it easy for us to read through your qualifications and experiences.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy!
How to prepare for a job interview at Carbon60 Project Services
✨Know Your Data Pipelines
Make sure you can talk confidently about your experience building and maintaining data pipelines. Be ready to discuss specific projects where you've used ETL/ELT processes, and how you ensured data quality and integrity.
✨Show Off Your Python and SQL Skills
Brush up on your Python and SQL knowledge before the interview. Prepare to answer technical questions or even solve coding challenges on the spot. It’s a great way to demonstrate your hands-on skills in a practical context.
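A practice exercise of the kind interviewers often set combines both skills: run an aggregate SQL query and check the result from Python. The schema and data here are invented purely for practice.

```python
import sqlite3

# Build a tiny in-memory database to practise against.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("a", 1.0), ("a", 3.0), ("b", 2.0)],
)

# Classic interview pattern: GROUP BY with an aggregate function.
rows = conn.execute(
    "SELECT sensor, AVG(value) FROM readings GROUP BY sensor ORDER BY sensor"
).fetchall()
print(rows)  # [('a', 2.0), ('b', 2.0)]
```

Being able to explain each clause (why the GROUP BY, what AVG does per group, why the ORDER BY makes the output deterministic) matters as much as getting the answer.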
✨Understand the Full Data Lifecycle
Familiarise yourself with the entire data lifecycle from ingestion to delivery. Be prepared to discuss how you’ve integrated data from various sources and how you ensure it remains accessible and usable for stakeholders.
✨Be Ready to Troubleshoot
Expect questions about troubleshooting complex data issues. Think of examples where you’ve resolved problems in data pipelines or platforms, and be ready to explain your thought process and the steps you took to find a solution.