At a Glance
- Tasks: Join our team to analyse data and build impactful AI solutions.
- Company: Amach, a fast-growing tech company based in Dublin.
- Benefits: Flexible working, competitive salary, and career advancement opportunities.
- Why this job: Be part of innovative projects that shape the future of data analytics.
- Qualifications: Experience in data analysis, SQL, Python, and data visualisation tools.
- Other info: Dynamic hybrid work environment with a focus on diversity and inclusion.
The predicted salary is between £36,000 and £60,000 per year.
Amach is an industry-leading, technology-driven company headquartered in Dublin with remote teams across the UK and Europe. Our blended teams of local and nearshore talent are optimised to deliver high-quality, collaborative solutions. Established in 2013, we specialise in cloud migration and development, and in digital transformation spanning agile software development, DevOps, automation, data and machine learning.
We are hiring a senior, hands-on Data Analyst / Data Engineer to support the delivery of AI-enabled, decision-support solutions within a large, complex operational environment, with designs that scale across multiple operating companies. This role sits across advanced analytics and data engineering, with work flexing depending on the delivery phase and workstream. You will operate as a senior individual contributor, embedded within a cross-functional product squad alongside Data Scientists, Visualisation Engineers, and change teams. The role is highly hands-on and delivery-focused, suited to someone who enjoys deep problem-solving, data exploration and building production-ready analytical assets.
Please note this role operates in a hybrid model with candidates expected to be able and willing to work from our customer’s London office three times per week.
Requirements:
- Strong experience in data analysis, analytics engineering or data engineering within a product or delivery-focused environment.
- Advanced skills in SQL and data processing using Python (e.g. Pandas).
- Hands-on experience developing and optimising data pipelines for analytics and reporting use cases.
- Experience working with data visualisation tools such as Power BI, Tableau, or similar.
- Proven ability to understand, assess and modernise legacy datasets and pipelines.
- Strong understanding of data modelling and API integration.
- Experience developing, testing and deploying production data solutions (not just PoCs).
- Familiarity with cloud platforms (AWS preferred) and working knowledge of DevOps concepts (CI/CD, version control).
- Comfortable working independently and communicating with non-technical stakeholders.
- Strong stakeholder engagement and solution-oriented mindset.
- Ability to deliver high-impact outcomes under tight timelines.
- Experience working in advisory or consultancy-style delivery settings.
Responsibilities:
- Discover, connect to, and analyse data from a wide range of sources, including relational databases and flat files (CSV, YAML, XLS etc.).
- Identify, investigate and remediate data quality, completeness, and consistency issues.
- Challenge data provenance, assumptions and definitions within legacy datasets to ensure they are fit for modern analytics and AI use cases.
- Translate business questions into clear analytical approaches, KPIs, metrics and data narratives.
- Support the definition of KPIs and analytical logic that underpin dashboards and operational reporting.
- Design, develop, and optimise data pipelines for ingestion, transformation, and storage.
- Ensure data pipelines are production-ready, reliable, scalable, and maintainable beyond proof-of-concept.
- Implement best practices for data quality, integrity, security, performance and scalability in cloud environments.
- Support multi-OpCo deployment by designing modular, interoperable data architectures and pipelines.
- Collaborate with Data Scientists to prepare, validate, and structure datasets that support advanced analytics and AI-driven solutions.
- Support the integration of analytics and AI outputs into live operational workflows, ensuring outputs are actionable and adopted.
Desirable:
- Willingness to travel internationally during later stages to support group-wide deployment.
- Familiarity with airline or logistics data domains.
- Ability to implement standards and frameworks for scalable data solutions across multiple operating companies.
Benefits:
- An opportunity to join a fast-growing company.
- Options for career advancement.
- Learning and development opportunities.
- Flexible working environment.
- Competitive salaries based on experience.
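To illustrate the kind of hands-on work described above (investigating data quality, completeness and consistency in legacy flat-file extracts), here is a minimal sketch in Python with Pandas. The column names and sample data are hypothetical, invented purely for illustration; they are not taken from any Amach or customer dataset.

```python
import io

import pandas as pd

# Hypothetical sample standing in for a legacy flat-file extract (CSV).
raw_csv = io.StringIO(
    "flight_id,origin,dep_delay_min\n"
    "AB123,DUB,15\n"
    "AB124,,5\n"
    "AB123,DUB,15\n"
    "AB125,LHR,\n"
)

def profile_quality(df: pd.DataFrame) -> dict:
    """Return simple completeness and consistency metrics for a dataset."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),      # exact duplicate records
        "missing_by_column": df.isna().sum().to_dict(),    # nulls per column
    }

df = pd.read_csv(raw_csv)
report = profile_quality(df)
print(report)
```

In practice a check like this would be the first step before remediating issues and promoting a legacy dataset into a production pipeline.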
Amach is an equal opportunity employer and makes employment decisions on the basis of merit. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Data Analyst / Data Engineer employer: Amach
Contact Details:
Amach Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Analyst / Data Engineer role
✨Tip Number 1
Network like a pro! Reach out to your connections on LinkedIn or attend industry meetups. You never know who might have the inside scoop on job openings or can put in a good word for you.
✨Tip Number 2
Prepare for interviews by practising common questions and showcasing your problem-solving skills. Use real-life examples from your experience with data analysis or engineering to demonstrate your expertise.
✨Tip Number 3
Don’t just apply blindly! Tailor your approach for each role. Research the company, understand their projects, and align your skills with what they need. This shows you're genuinely interested and not just sending out generic applications.
✨Tip Number 4
Keep an eye on our website for the latest job openings. We’re always on the lookout for talented individuals, and applying directly through us can give you a better chance of standing out!
Some tips for your application 🫡
Tailor Your CV: Make sure your CV is tailored to the Data Analyst / Data Engineer role. Highlight your experience with SQL, Python, and data pipelines, as these are key skills we're looking for. A personalised CV shows us you’re genuinely interested in the position!
Craft a Compelling Cover Letter: Your cover letter is your chance to shine! Use it to explain why you're the perfect fit for our team. Mention specific projects or experiences that relate to the responsibilities listed in the job description. We love seeing your passion and personality come through!
Showcase Your Problem-Solving Skills: In your application, don’t just list your skills—show us how you've used them to solve real-world problems. Whether it's optimising data pipelines or improving data quality, we want to see examples of your hands-on experience and impact.
Apply Through Our Website: We encourage you to apply directly through our website. It’s the best way for us to receive your application and ensures you’re considered for the role. Plus, it gives you a chance to explore more about our company culture and values!
How to prepare for a job interview at Amach
✨Know Your Data Tools
Make sure you brush up on your SQL and Python skills, especially with libraries like Pandas. Be ready to discuss how you've used these tools in past projects, as they'll want to see your hands-on experience in action.
✨Understand the Business Context
Before the interview, take some time to research Amach and their focus on cloud migration and digital transformation. Being able to translate business questions into analytical approaches will show that you understand their needs and can contribute effectively.
✨Prepare for Problem-Solving Questions
Expect to face scenarios where you'll need to demonstrate your problem-solving skills. Think of examples from your previous roles where you tackled data quality issues or optimised data pipelines, and be ready to walk them through your thought process.
✨Showcase Your Collaboration Skills
Since this role involves working closely with cross-functional teams, be prepared to discuss how you've collaborated with Data Scientists or other stakeholders in the past. Highlight your ability to communicate complex data concepts to non-technical audiences.