At a Glance
- Tasks: Design and implement secure, scalable data integration solutions and improve ETL/data pipelines.
- Company: Join a collaborative team focused on high-impact data platforms.
- Benefits: Competitive daily rate, hybrid working, and opportunities for mentorship.
- Other info: Active UK SC Clearance is mandatory; occasional on-site attendance is required.
- Why this job: Make a real impact in a secure environment while enhancing your skills.
- Qualifications: Strong experience with Talend, Unix/Linux, and Oracle SQL required.
We are looking for an experienced Data Engineer with strong Talend and Unix expertise to join our team and help deliver secure, high‑performance database and ETL solutions within a large, regulated environment. This role suits someone who enjoys building robust data pipelines, working closely with stakeholders, and supporting critical data services end‑to‑end.
What You’ll Do:
- Design and implement secure, scalable, and performant data integration solutions
- Build, operate, and continuously improve ETL/data pipelines (ingestion, transformation, curation)
- Ensure operational excellence through monitoring, alerting, SLAs, and incident management
- Collaborate with product teams and client stakeholders to refine requirements and align solutions with non‑functional requirements (performance, cost, security)
- Support incident resolution and ensure service continuity
- Produce high‑level and low‑level designs
- Share knowledge and mentor colleagues
- Actively contribute to Agile ceremonies and cross‑functional delivery teams
Essential Skills & Experience:
- Strong hands‑on experience with Talend
- Solid Unix/Linux experience (scripting, automation, troubleshooting)
- Oracle SQL
- Experience working in Agile environments
- Tools: Jira & Confluence
- Experience supporting production data pipelines
Desirable Skills (Nice to Have):
- Oracle PL/SQL
- AWS
- GitLab
- Artifactory
- Vault
- JobScheduler (SOS Berlin)
- Test Automation Frameworks
- Data Modelling
- Denodo
- Development lifecycle frameworks (e.g. D4D)
Working Arrangement:
Hybrid role with occasional on‑site attendance for workshops (typically 2 days per month) in Telford or Worthing.
Security Clearance (Mandatory):
Active UK SC Clearance is required and must be transferable to HMRC. Unfortunately, candidates without active SC clearance cannot be considered.
Why Join Us?
- Work on high‑impact, mission‑critical data platforms
- Collaborative environment with strong engineering standards
- Opportunity to influence design decisions and mentor others
- Stable, long‑term programme within a secure environment
Data Engineer - Telford / Worthing (£525/d Inside IR35, contract until end of December). Employer: Tenth Revolution Group
Contact Detail:
Tenth Revolution Group Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer - Telford / Worthing role (£525/d Inside IR35, contract until end of December)
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with Talend and Unix. A friendly chat can lead to insider info about job openings or even referrals.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your best data pipelines and ETL solutions. This is your chance to demonstrate your hands-on experience and problem-solving abilities to potential employers.
✨Tip Number 3
Prepare for interviews by brushing up on Agile methodologies and incident management. Be ready to discuss how you've collaborated with stakeholders and ensured operational excellence in your previous roles.
✨Tip Number 4
Don't forget to apply through our website! We love seeing candidates who are genuinely interested in joining our team. Plus, it makes it easier for us to keep track of your application and get back to you quickly.
We think you need these skills to ace the Data Engineer - Telford / Worthing role (£525/d Inside IR35, contract until end of December)
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Talend and Unix, as these are key skills we're looking for. Use specific examples from your past roles to show how you've built robust data pipelines and supported critical data services.
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Explain why you're passionate about data engineering and how your skills align with our needs. Don't forget to mention your experience in Agile environments and any relevant tools like Jira or Confluence.
Showcase Your Problem-Solving Skills: In your application, highlight instances where you've resolved incidents or improved operational excellence. We love candidates who can demonstrate their ability to troubleshoot and ensure service continuity in a regulated environment.
Apply Through Our Website: We encourage you to apply directly through our website for the best chance of getting noticed. It’s super easy, and you'll be able to showcase your application in the best light!
How to prepare for a job interview at Tenth Revolution Group
✨Know Your Tech Inside Out
Make sure you brush up on your Talend and Unix skills before the interview. Be ready to discuss specific projects where you've built data pipelines or solved complex problems using these tools. The more detailed examples you can provide, the better!
✨Understand the Role and Environment
Familiarise yourself with the specifics of working in a regulated environment. Be prepared to talk about how you ensure operational excellence and manage SLAs. Showing that you understand the importance of security and performance in data engineering will set you apart.
✨Show Your Collaborative Spirit
This role involves working closely with stakeholders and product teams. Think of examples where you've successfully collaborated in an Agile setting. Highlight your experience in refining requirements and aligning solutions with non-functional needs to demonstrate your teamwork skills.
✨Prepare for Scenario-Based Questions
Expect questions that assess your problem-solving abilities, especially around incident resolution and service continuity. Prepare to walk through how you would handle specific scenarios related to data pipeline issues or performance challenges, showcasing your analytical thinking.