At a Glance
- Tasks: Own and scale data infrastructure for cutting-edge AI processing of financial documents.
- Company: Exciting early-stage start-up with massive growth potential in the financial sector.
- Benefits: Competitive salary up to £120k, profit share, and remote work flexibility.
- Other info: Visa sponsorship available; thrive in a dynamic, fast-paced environment.
- Why this job: Join a small team making a real impact with AI in finance and own your projects.
- Qualifications: 3-6 years' experience with data pipelines and Python, and a passion for data quality.
The predicted salary is between £100,000 and £120,000 per year.
Salary: up to ~£120k + profit share
Location: Old Street (4-5 office days/week)
This early-stage start-up is processing hundreds of thousands of unstructured financial documents into clean, structured datasets for some of the world's largest financial institutions - producing the output of 50 with a team of 5.
Forecasting £1.5m revenue within their first 12 months, they have immense potential - not based on hype or inflated valuations, but rather achieving mega productivity through intelligent application of AI agents.
Cash in your account on a regular basis - not a promise of a huge payout IF the company succeeds and sells.
You will own data infrastructure end-to-end - ingestion, transformation, storage, delivery. Your job is to scale it: more documents, more asset classes, more clients. You'll make architecture decisions, push the boundaries of agentic tooling and ship data as a product to institutional buyers who rely on it for decisions that move millions.
Requirements:
- Roughly 3-6 years' experience building performant data pipelines and ML systems in production at a company where data is the core product, not a support function
- Genuinely AI-native experience: Python, Postgres, async, queues - fluent and battle-tested
- Experience in a small team (2-30 people) where you owned the whole function without a safety net
- Data quality obsessive - inconsistencies bother you until they're fixed, not flagged and forgotten
Bonus points for:
- Experience at a financial data provider (Bloomberg, Refinitiv, Preqin, FactSet etc.)
- Web scraping and document parsing at scale
Visa sponsorship is available if needed (though you must already be living in the UK).
Data Engineer (Python, AWS), Remote in City of London - employer: Wave Group
Join an innovative early-stage start-up as a Founding Data Engineer, playing a pivotal role in transforming unstructured financial documents into structured datasets for leading financial institutions. The role offers a competitive salary of up to £120k plus profit share and a dynamic small-team environment in Old Street - not just a job, but a chance to shape the future of data processing. You'll join a culture that values your contributions, fosters growth, and rewards your efforts with tangible benefits, including regular profit-share payouts and ownership of key architectural decisions.
StudySmarter Expert Advice🤫
We think this is how you could land the Data Engineer (Python, AWS) Remote role in City of London
✨Tip Number 1
Network like a pro! Reach out to people in the industry, especially those who work at companies you're interested in. A friendly chat can open doors and give you insights that job descriptions just can't.
✨Tip Number 2
Show off your skills! Create a portfolio or GitHub repository showcasing your data pipelines and ML systems. When you apply through our website, include links to your work so potential employers can see what you're capable of.
✨Tip Number 3
Prepare for interviews by practising common questions and scenarios related to data engineering. Think about how you'd tackle scaling data infrastructure and be ready to discuss your past experiences in detail.
✨Tip Number 4
Follow up after interviews! A quick thank-you email can leave a lasting impression. It shows you're genuinely interested in the role and helps keep you on their radar as they make their decision.
We think you need these skills to ace the Data Engineer (Python, AWS) Remote role in City of London
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Engineer role. Highlight your experience with Python, AWS, and any relevant projects that showcase your skills in building data pipelines and ML systems. We want to see how you can contribute to our mission!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're passionate about data engineering and how your experience aligns with our goals. Don't forget to mention your obsession with data quality - we love that!
Showcase Your AI Experience: Since we're all about intelligent application of AI agents, make sure to highlight any AI-native experience you have. Whether it's Python, async programming, or working with queues, let us know how you've pushed boundaries in your previous roles.
Apply Through Our Website: We encourage you to apply through our website for a smoother process. It helps us keep track of applications and ensures you don't miss out on any important updates. Plus, we love seeing candidates who take that extra step!
How to prepare for a job interview at Wave Group
✨Know Your Data Inside Out
Make sure you’re well-versed in the data processing techniques relevant to the role. Brush up on your experience with Python, Postgres, and any async programming you've done. Be ready to discuss specific projects where you’ve built data pipelines or ML systems, as this will show your hands-on expertise.
✨Showcase Your Problem-Solving Skills
Prepare to share examples of how you've tackled data quality issues in the past. Since they’re looking for someone who’s obsessive about data quality, think of instances where you identified inconsistencies and how you resolved them. This will demonstrate your proactive approach and attention to detail.
✨Understand Their Business Model
Research the company’s focus on processing unstructured financial documents and their use of AI agents. Being able to articulate how your skills can contribute to their goal of scaling data infrastructure will set you apart. Show that you understand the importance of delivering clean, structured datasets to institutional clients.
✨Be Ready for Technical Challenges
Expect technical questions or even a coding challenge during the interview. Practice common data engineering problems and be prepared to explain your thought process. This will not only showcase your technical skills but also your ability to communicate complex ideas clearly.