At a Glance
- Tasks: Join our Data Solutions team to create innovative data architectures and solve technical challenges.
- Company: EXL is a global leader in analytics and digital solutions, transforming businesses with data-driven insights.
- Benefits: Enjoy remote work flexibility, competitive salary, bonuses, private healthcare, and continuous learning opportunities.
- Why this job: Be part of a dynamic team influencing major business decisions while developing your skills in a fast-paced environment.
- Qualifications: 5+ years in Data Engineering with expertise in SQL, Python, and AWS ETL services required.
- Other info: This role is fully remote within the UK or Ireland, promoting a balanced and collaborative work culture.
The predicted salary is between £36,000 and £60,000 per year.
EXL is a global analytics and digital solutions company that partners with clients to improve business outcomes and unlock growth. We bring together domain expertise, robust data, powerful analytics, cloud and AI to create agile, scalable solutions and execute complex operations for the world’s leading corporations. EXL was founded on the core values of innovation, collaboration, excellence, integrity and respect, creating value from data to ensure faster decision-making and transform operating models. Key industries include Insurance, Healthcare, Banking and Financial Services, Media and Retail, among others. Headquartered in New York, our team is over 40,000 strong, with more than 50 offices spanning six continents.
Location: Fully Remote within the United Kingdom or Ireland
Employment Type: Permanent
Summary of the role: EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, innovative analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision-making and embed analytics more deeply into their business processes. EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions.
We are looking for a go-getter who can define:
- Data strategies to meet the demands of business requirements
- Technical requirements for the organization's data needs, gathered from diverse teams
- Data architecture, models and go-forward data solutions for teams across the organization
As part of your duties, you will be responsible for:
- You will be part of the Data Solutions team for a major Insurance client.
- You will work with different stakeholders as the SME (subject matter expert) for data engineering; a typical workday will involve working with stakeholders in an individual contributor capacity.
- Engage in technical problem solving across multiple technologies; often develop new solutions and recommend technologies that can be leveraged to create data solutions.
- Develop, construct, test, and maintain data architectures for the data platform, databases and analytical/reporting systems.
- Partner with other technology platform teams to leverage innovative and new technology for delivering solutions that best fit internal data needs for various analytical solutions.
- Write and manage code independently, and guide other team members on its development and maintenance.
Qualifications and experience we consider to be essential for the role:
- 5+ years of experience in Data Engineering: SQL, DWH (Redshift or Snowflake), Python (PySpark), Spark and associated data engineering jobs.
- Experience with AWS ETL pipeline services: Lambda, S3, EMR/Glue, Redshift (or Snowflake) and Step Functions (preferred).
- Experience building and supporting cloud-based ETL (Extract, Transform, Load) data pipelines; see the illustrative sketch after this list.
- Working experience with RESTful API frameworks (Flask/FastAPI) and messaging queue services (Kafka) is good to have.
- Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision.
- Experience working in an agile environment, across the development life cycle, and with diverse stakeholders (such as IT, business and project managers).
- General Insurance domain experience is preferred.
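For illustration only (not part of the application requirements), the sketch below shows the kind of cloud-based PySpark ETL job the list above refers to: read raw data from S3, apply a simple transformation, and write partitioned Parquet back to S3 for loading into Redshift or Snowflake. The bucket names, paths and column names are hypothetical, and a real pipeline would typically be orchestrated with Glue or Step Functions as noted above.
```python
# Minimal ETL sketch: extract raw CSV from S3, transform with PySpark,
# load partitioned Parquet to a curated zone. All names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_daily_etl").getOrCreate()

# Extract: raw CSV landed by an upstream process (hypothetical path).
raw = spark.read.option("header", True).csv("s3://example-raw-bucket/policies/2024-06-01/")

# Transform: basic cleansing and a derived load-date column.
clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("premium", F.col("premium").cast("double"))
       .withColumn("load_date", F.current_date())
       .filter(F.col("premium").isNotNull())
)

# Load: write partitioned Parquet for a warehouse COPY/ingest step.
clean.write.mode("overwrite").partitionBy("load_date").parquet(
    "s3://example-curated-bucket/policies/"
)
```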
Data Management Skillsets:
- Ability to develop and enhance data models and identify ETL optimization opportunities.
- Exposure to ETL tools will be helpful in this work.
- Strong grasp of advanced SQL functionalities (joins, nested queries, procedures, PL/SQL); an illustrative query follows this list.
- Strong grasp of Python libraries and concepts, including PySpark, NumPy, Pandas, functions and constructors.
- Strong ability to translate functional specifications/requirements into technical requirements.
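For illustration only, the snippet below shows the sort of "advanced SQL" mentioned above (a join against a nested aggregate query), expressed through Spark SQL so it runs alongside the earlier PySpark sketch. The table and column names are hypothetical.
```python
# Illustrative join plus nested (sub)query via Spark SQL; names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql_example").getOrCreate()

# Hypothetical in-memory tables standing in for warehouse tables.
spark.createDataFrame(
    [(1, "motor"), (2, "home")], ["policy_id", "product"]
).createOrReplaceTempView("policies")
spark.createDataFrame(
    [(1, 120.0), (1, 80.0), (2, 50.0)], ["policy_id", "claim_amount"]
).createOrReplaceTempView("claims")

# Join each policy to a nested aggregate of its claims.
result = spark.sql("""
    SELECT p.policy_id,
           p.product,
           c.total_claims
    FROM policies p
    JOIN (
        SELECT policy_id, SUM(claim_amount) AS total_claims
        FROM claims
        GROUP BY policy_id
    ) c
      ON p.policy_id = c.policy_id
    ORDER BY c.total_claims DESC
""")
result.show()
```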
Skills and Personal attributes we would like to have:
- Bachelor’s/Master's degree in economics, mathematics, actuarial sciences, computer science/engineering, operations research or related analytics areas; candidates with BA/BS degrees in the same fields from top tier academic institutions are also welcome to apply.
- Strong, in-depth understanding of data engineering fundamentals.
- Exposure to designing ETL data pipelines and performing data analysis.
- Exposure to end-to-end data lifecycle management.
- Superior analytical and problem-solving skills.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges.
As part of a leading global analytics and digital solutions company, you can look forward to:
- A competitive salary with a generous bonus, private healthcare, critical illness life assurance at 4 x your annual salary, income protection insurance, and a rewarding pension.
- We are committed to providing our employees with the tools and resources they need to succeed and excel in their careers.
- We offer a wide range of professional and personal development opportunities.
- We also support a range of learning initiatives that allow our employees to build on their existing skills and knowledge.
- From online courses to seminars and workshops, our employees have the opportunity to enhance their skills and stay up to date with the latest trends and technologies.
As an Equal Opportunity Employer, EXL is committed to diversity. Our company does not discriminate based on race, religion, colour, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, age, or disability status.
To be considered for this role, you must already be eligible to work in the United Kingdom or Ireland.
Data Engineer (AWS) employer: EXL Service
Contact Details:
EXL Service Recruiting Team
accessibility@talentify.io
StudySmarter Expert Advice 🤫
We think this is how you could land the Data Engineer (AWS) role
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as AWS ETL services like Lambda and Glue. Having hands-on experience or projects showcasing these skills can set you apart during discussions.
✨Tip Number 2
Network with current or former employees of EXL or similar companies on platforms like LinkedIn. Engaging in conversations about their experiences can provide valuable insights and potentially lead to referrals.
✨Tip Number 3
Prepare to discuss your problem-solving approach in detail. Since the role requires independent work and technical problem-solving, be ready to share examples of challenges you've faced and how you overcame them.
✨Tip Number 4
Stay updated on industry trends, especially in data engineering and analytics. Being knowledgeable about the latest tools and methodologies will demonstrate your commitment to continuous learning and innovation.
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in Data Engineering, particularly with SQL, AWS ETL services, and Python. Use keywords from the job description to demonstrate that you meet the qualifications.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role at EXL and explain how your skills align with their needs. Mention specific projects or experiences that showcase your ability to develop data architectures and work with diverse stakeholders.
Showcase Technical Skills: Clearly outline your technical skills related to data engineering, such as your experience with Redshift or Snowflake, and your proficiency in building cloud-based ETL pipelines. Provide examples of how you've used these skills in previous roles.
Highlight Problem-Solving Abilities: EXL values candidates who can work independently and solve problems. Include examples in your application that demonstrate your analytical skills and your ability to deliver client-ready solutions with minimal supervision.
How to prepare for a job interview at EXL Service
✨Showcase Your Technical Skills
Make sure to highlight your experience with SQL, AWS ETL services, and Python during the interview. Be prepared to discuss specific projects where you've successfully implemented these technologies, as this will demonstrate your hands-on expertise.
✨Understand the Business Context
Familiarise yourself with the industries EXL operates in, such as Insurance and Healthcare. Being able to relate your technical skills to real-world business problems will show that you understand the bigger picture and can contribute effectively.
✨Prepare for Problem-Solving Questions
Expect to face technical problem-solving scenarios during the interview. Practice explaining your thought process clearly and concisely, as this will showcase your analytical skills and ability to work independently on complex issues.
✨Demonstrate Communication Skills
Since you'll be working with diverse stakeholders, it's crucial to exhibit strong verbal and written communication skills. Prepare examples of how you've successfully collaborated with different teams in the past, as this will highlight your ability to convey technical information effectively.