At a Glance
- Tasks: Design and implement data solutions for AI-powered analytics on network security data.
- Company: Join a forward-thinking tech company focused on data and AI innovation.
- Benefits: Fully remote work, competitive contracts, and access to a professional network.
- Other info: Collaborate with experts in a dynamic environment with exciting growth opportunities.
- Why this job: Make a real impact by transforming complex data into actionable insights.
- Qualifications: Strong experience with DynamoDB, ETL pipelines, and AWS services.
The predicted salary is between £60,000 and £80,000 per year.
Are you a Senior Data Engineer with strong AWS and DynamoDB experience who enjoys building data platforms from the ground up? This role focuses on building an AI-powered analytics layer on top of a network security data platform, helping turn complex operational data into clear, actionable insights. You’ll be working with large volumes of network and security log data from industrial environments, designing the core data infrastructure and enabling visibility through BI dashboards used by stakeholders.
Your Role as a Senior Data Engineer:
- You will take ownership of designing and implementing the data layer that underpins security analytics use cases. This includes structuring high-volume event data, building reliable pipelines, and enabling downstream analytics.
- This is a hands-on role, working closely with an AI/Analytics specialist and domain experts to ensure the data platform is scalable, reliable, and aligned with real-world security requirements.
What You’ll Do:
- Design and implement a DynamoDB database for high-volume, time-series network and security data.
- Define table structures, partition and sort keys, and indexing strategies for performance and scalability.
- Optimise cost and performance using DynamoDB features (capacity modes, TTL, streams, indexing strategies).
- Build robust ETL pipelines to ingest data from Excel, CSVs, and API sources.
- Clean, validate, and normalise complex, multi-source security data.
- Automate pipeline orchestration, scheduling, and error handling.
- Connect the data layer to Amazon QuickSight and build interactive dashboards.
- Design datasets, calculated fields, and visualisations to surface actionable insights.
- Collaborate with domain experts and stakeholders to translate requirements into data solutions.
- Produce clear technical documentation (schemas, pipelines, data definitions, runbooks).
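As an illustration of the first few tasks above, here is a minimal sketch of what a DynamoDB table specification for high-volume, time-series security events might look like. All names here (the table, key attributes, and index) are hypothetical, not taken from the actual platform; in a real deployment the returned spec would be passed to boto3's `dynamodb.create_table(**spec)`.

```python
def security_events_table_spec(table_name: str = "SecurityEvents") -> dict:
    """Build an illustrative create_table request for time-series security events.

    Partition key: device_id (spreads writes across devices).
    Sort key: event_time (ISO-8601 string, enables time-range queries per device).
    A GSI on event_type supports queries like "all alerts in a time window".
    All attribute and index names are hypothetical examples.
    """
    return {
        "TableName": table_name,
        "AttributeDefinitions": [
            {"AttributeName": "device_id", "AttributeType": "S"},
            {"AttributeName": "event_time", "AttributeType": "S"},
            {"AttributeName": "event_type", "AttributeType": "S"},
        ],
        "KeySchema": [
            {"AttributeName": "device_id", "KeyType": "HASH"},
            {"AttributeName": "event_time", "KeyType": "RANGE"},
        ],
        "GlobalSecondaryIndexes": [
            {
                "IndexName": "event_type-event_time-index",
                "KeySchema": [
                    {"AttributeName": "event_type", "KeyType": "HASH"},
                    {"AttributeName": "event_time", "KeyType": "RANGE"},
                ],
                "Projection": {"ProjectionType": "ALL"},
            }
        ],
        # On-demand capacity avoids provisioning guesswork for spiky log volumes;
        # provisioned mode with auto-scaling is the alternative to cost-compare.
        "BillingMode": "PAY_PER_REQUEST",
    }
```

The partition/sort key split shown here is one common pattern for time-series data; the role would involve weighing it against alternatives (e.g. sharded partition keys for very hot devices) and layering on TTL and streams as described above.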
What You’ll Bring:
- Strong hands-on experience with DynamoDB (data modelling, GSIs, performance tuning).
- Proven experience building ETL/ELT pipelines in Python (pandas, boto3, etc.).
- Solid AWS experience (S3, Lambda, IAM, CloudWatch and related services).
- Experience with Amazon QuickSight for dashboarding and BI delivery.
- Strong understanding of NoSQL data modelling and high-throughput data systems.
- Experience working with messy, multi-format data and implementing data quality checks.
- Strong Python skills with clean coding practices and version control (Git).
- Ability to communicate clearly and work with both technical and non-technical stakeholders.
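To make the "messy, multi-format data and data quality checks" requirement concrete, here is a small stdlib-only sketch of the kind of validation step an ingest pipeline might run on CSV security data. The field names (`device_id`, `event_time`, `severity`) and severity vocabulary are made-up examples, not the platform's actual schema.

```python
import csv
import io
from datetime import datetime

# Hypothetical schema for illustration only.
REQUIRED_FIELDS = ("device_id", "event_time", "severity")
VALID_SEVERITIES = {"low", "medium", "high", "critical"}


def validate_rows(csv_text: str):
    """Split CSV rows into (clean, rejected) lists.

    Clean rows are normalised in place; rejected rows are returned as
    (row, problems) pairs so failures can be logged and quarantined.
    """
    clean, rejected = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        # Reject rows missing any required field outright.
        problems = [f for f in REQUIRED_FIELDS if not (row.get(f) or "").strip()]
        if not problems:
            # Normalise severity and validate against the known vocabulary.
            row["severity"] = row["severity"].strip().lower()
            if row["severity"] not in VALID_SEVERITIES:
                problems.append("severity: unknown value")
            # Validate the timestamp parses as ISO-8601.
            try:
                datetime.fromisoformat(row["event_time"].strip())
            except ValueError:
                problems.append("event_time: not ISO-8601")
        if problems:
            rejected.append((row, problems))
        else:
            clean.append(row)
    return clean, rejected
```

In practice the same checks would typically be expressed over pandas DataFrames (or a validation library) and wired into the pipeline's error-handling and quarantine path.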
Nice to Have:
- Experience with AWS Glue and Athena.
- Exposure to graph databases (Neo4j, Amazon Neptune).
- Experience with CI/CD for data pipelines.
- Background in cybersecurity, network data, or similar domains.
- Understanding of event-based data structures (process mining or similar).
- Experience using AI-assisted development tools.
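The "event-based data structures" item above refers to the process-mining style of modelling, where each event carries a case identifier, an activity name, and a timestamp, and a case's trace is its time-ordered activity sequence. A minimal sketch, with made-up activity names:

```python
from collections import defaultdict


def traces_from_events(events):
    """Group (case_id, activity, timestamp) events into ordered traces.

    Returns {case_id: [activity, ...]} with activities sorted by timestamp.
    Timestamps are ISO-8601 strings, so lexicographic order is time order.
    """
    by_case = defaultdict(list)
    for case_id, activity, ts in events:
        by_case[case_id].append((ts, activity))
    return {
        case_id: [activity for _, activity in sorted(items)]
        for case_id, items in by_case.items()
    }
```

For example, security incidents could be treated as cases and alert-handling steps as activities, which is the bridge between raw event logs and process-mining analysis.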
Benefits:
- Fully or 90% remote working, with B2B/freelance contracts.
- Opportunity to work on high-impact data and AI programmes for large-scale enterprise clients.
- Access to a network of experienced data, AI, and engineering professionals.
Employer: Remobi (Data Engineer, England)
Contact Detail:
Remobi Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer role in England
✨Tip Number 1
Network like a pro! Reach out to your connections in the data engineering field, especially those who work with AWS and DynamoDB. A friendly chat can lead to insider info about job openings that aren't even advertised yet.
✨Tip Number 2
Show off your skills! Create a portfolio showcasing your projects, especially those involving ETL pipelines and data visualisation. This gives potential employers a taste of what you can do and sets you apart from the crowd.
✨Tip Number 3
Prepare for interviews by brushing up on your technical knowledge. Be ready to discuss your experience with DynamoDB, AWS services, and how you've tackled messy data. Practice common interview questions to boost your confidence.
✨Tip Number 4
Don't forget to apply through our website! We make it easy for you to find roles that match your skills and interests. Plus, it shows you're serious about joining our team and helps us get to know you better.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with AWS, DynamoDB, and building data platforms. We want to see how your skills align with the role, so don’t be shy about showcasing relevant projects or achievements!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you’re passionate about data engineering and how your background makes you a perfect fit for our team. Keep it engaging and personal – we love to see your personality!
Showcase Your Technical Skills: When detailing your experience, focus on specific technologies and methodologies you've used, especially around ETL pipelines and data modelling. We’re keen to see your hands-on experience, so don’t hold back on the details!
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it’s super easy – just follow the prompts and submit your materials!
How to prepare for a job interview at Remobi
✨Know Your Tech Inside Out
Make sure you’re well-versed in AWS, DynamoDB, and Python. Brush up on your knowledge of data modelling, ETL pipelines, and how to optimise performance. Being able to discuss specific projects where you've implemented these technologies will show your hands-on experience.
✨Showcase Your Problem-Solving Skills
Prepare to discuss how you've tackled messy, multi-format data in the past. Think of examples where you cleaned, validated, and normalised data. This will demonstrate your ability to handle real-world challenges and your understanding of data quality checks.
✨Communicate Clearly
Since you'll be working with both technical and non-technical stakeholders, practice explaining complex concepts in simple terms. Be ready to share how you’ve collaborated with domain experts to translate requirements into actionable data solutions.
✨Prepare Questions for Them
Interviews are a two-way street! Prepare insightful questions about their current data platform, the team dynamics, or future projects. This shows your genuine interest in the role and helps you assess if it’s the right fit for you.