At a Glance
- Tasks: Design and develop data pipelines, ensuring quality and efficiency for impactful insights.
- Company: ICIS optimises global resources, providing transparency in commodities markets for strategic decision-making.
- Benefits: Enjoy a supportive culture focused on innovation, career development, and flexible work options.
- Why this job: Join a dynamic team where your work directly impacts customers and drives AI innovation.
- Qualifications: Experience in Data Engineering with strong skills in Python, SQL, and cloud platforms required.
- Other info: Mentorship opportunities available; collaborative environment with a focus on continuous improvement.
The predicted salary is between £43,200 and £72,000 per year.
About the Business
At ICIS, our mission is to optimize the world’s resources. We help companies make strategic, sustainable decisions by bringing transparency to markets across the world. We create a comprehensive view of commodities markets, providing companies with the data and intelligence to successfully navigate global value chains every day. Our customers benefit from instant access to price assessments, reports and forecasts, a dedicated news channel, and supply and demand data. You can learn more about ICIS at the link provided.
About the Team
You’ll be joining a collaborative, high-performing team with deep technical and domain expertise. We work closely with Data Analysts and Data Scientists across a range of business areas, turning complex requirements into scalable, reliable data solutions. Our team plays a central role in ingesting and managing many of the organisation’s key datasets. The data pipelines we build and maintain serve as the backbone for the insights delivered to our customers. Open communication, knowledge sharing, and a strong sense of ownership are core to how we work. With a strong focus on delivering the right data at the right time, this is a great opportunity to be part of a team where your work directly contributes to meaningful outcomes for our customers.
About the Role
We have an outstanding opportunity available for a senior data engineer within our data operations team. This role will collaborate with stakeholders across business units to design, develop, and maintain data pipelines, ensuring the data quality that provides customers with the ‘Right Data at the Right Time’.
This is an exciting opportunity to be part of a strategic transformation focused on data and AI innovation within a dynamic market-leading global business. We have a supportive culture with a keen focus on innovation, technical excellence, career development and mutual support.
Responsibilities
- Data Pipeline Development: Design, develop, and optimize robust data pipelines and ETL processes to ensure efficient data flow and integration
- Data Infrastructure Management: Manage and enhance our data infrastructure to support performance, scalability, and long-term reliability
- Advanced Analytics Support: Build and maintain data models, data marts, and data lakehouse architectures to support data science initiatives, advanced analytics and reporting
- Data Quality Assurance: Implement data quality checks to maintain accuracy and consistency across all data sources (a brief illustrative sketch follows this list)
- Technology Exploration: Explore and implement advanced data technologies and tools
- Drive Continuous Improvement: Identify opportunities to streamline processes and improve the efficiency of our data pipelines
- Mentor and Support Team Members: Provide guidance and mentorship to junior engineers and support the team in tackling technical challenges together
- Stakeholder Collaboration: Work closely with cross-functional teams to understand data requirements and deliver solutions that meet business needs; collaborate with Analysts, Data Scientists, product owners, and business stakeholders to deliver high-impact, AI-driven solutions.
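To make the Data Quality Assurance responsibility above more concrete, here is a minimal, hypothetical sketch of the kind of check a pipeline stage might run before loading data downstream. The record fields, thresholds, and sample data are illustrative assumptions, not details of ICIS's actual pipelines.

```python
# Hypothetical example: a simple data quality gate for a pipeline stage.
# The record fields and sample data below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class QualityResult:
    check: str
    passed: bool
    detail: str

def run_quality_checks(rows: list[dict]) -> list[QualityResult]:
    """Run basic completeness, validity, and uniqueness checks before loading downstream."""
    results = []

    # Completeness: every record should carry a non-empty assessment ID.
    missing_ids = sum(1 for r in rows if not r.get("assessment_id"))
    results.append(QualityResult(
        check="non_null_assessment_id",
        passed=missing_ids == 0,
        detail=f"{missing_ids} rows missing assessment_id",
    ))

    # Validity: prices should be positive numbers.
    bad_prices = sum(
        1 for r in rows
        if not isinstance(r.get("price"), (int, float)) or r["price"] <= 0
    )
    results.append(QualityResult(
        check="positive_price",
        passed=bad_prices == 0,
        detail=f"{bad_prices} rows with non-positive or non-numeric price",
    ))

    # Uniqueness: no duplicate (assessment_id, date) pairs.
    keys = [(r.get("assessment_id"), r.get("date")) for r in rows]
    duplicates = len(keys) - len(set(keys))
    results.append(QualityResult(
        check="unique_assessment_per_date",
        passed=duplicates == 0,
        detail=f"{duplicates} duplicate keys",
    ))
    return results

if __name__ == "__main__":
    sample = [
        {"assessment_id": "A1", "date": "2024-01-02", "price": 101.5},
        {"assessment_id": "A1", "date": "2024-01-02", "price": 101.5},  # duplicate key
        {"assessment_id": None, "date": "2024-01-03", "price": -3.0},   # missing id, bad price
    ]
    for result in run_quality_checks(sample):
        print(f"{result.check}: {'PASS' if result.passed else 'FAIL'} ({result.detail})")
```

In practice a failing check would typically block the load or raise an alert, so downstream consumers only see data that meets the agreed standard.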
Requirements
- Considerable experience in Data Engineering with strong focus on Data Management and Data Quality
- Bachelor’s Degree (Engineering/Computer Science preferred but not required); or equivalent experience required
- Deep proficiency in Python, SQL, Cloud Platforms (AWS, GCP, Azure), Data Warehousing (Snowflake), Orchestration (Airflow, Rundeck), and Streaming (Kafka); an illustrative orchestration sketch follows this list
- Continuous engagement with Data Science and Analytics colleagues to understand requirements for our data assets and empower them with the best possible data to create high-value analytical services
- Ownership of assigned data products, including data model design, end-to-end data pipeline delivery, data product quality monitoring, requirements analysis and issue resolution
- Enthusiastic attitude to explore and implement advanced data technologies and tools
- Working with other tech teams to define data requirements for external data products (e.g., APIs and Data Marketplace offerings)
- Create a data-to-value framework that enables data value tracking from ingestion to customer value realisation
- A team player who works collaboratively and possesses excellent communication skills, with the ability to communicate technical details in business terminology
- Demonstrated success in managing multiple deliverables concurrently and prioritising effectively
- Detail-orientated, with strong problem-solving skills, innovative thinking, and self-motivation to learn and explore new applications
- Capable of providing coaching and support to transfer technical and data knowledge, fostering a collaborative team environment
- Contribute to continuous improvement initiatives and process enhancements.
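As a small illustration of the orchestration tooling named in the requirements above (Airflow), the sketch below wires a daily extract-validate-load flow into a DAG. It assumes a recent Airflow 2.x installation; the DAG ID, task names, and schedule are hypothetical and chosen for the example only.

```python
# Hypothetical sketch of an Airflow DAG for a daily extract-validate-load pipeline.
# DAG ID, task names, and schedule are illustrative assumptions only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw commodity price records from a source system.
    print("extracting raw records")

def validate():
    # Placeholder: run data quality checks (completeness, validity, uniqueness).
    print("running quality checks")

def load():
    # Placeholder: load validated records into the warehouse (e.g. Snowflake).
    print("loading into warehouse")

with DAG(
    dag_id="example_daily_prices_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The validation task gates the load: extract -> validate -> load.
    extract_task >> validate_task >> load_task
```

Keeping the validation task between extract and load mirrors the quality-gate pattern sketched earlier, so bad data never reaches the warehouse.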
Learn more about the LexisNexis Risk team and how we work
We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our Applicant Request Support Form or by contacting 1-855-833-5120.
Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams here
Please read our Candidate Privacy Policy.
We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race, color, creed, religion, sex, national origin, citizenship status, disability status, protected veteran status, age, marital status, sexual orientation, gender identity, genetic information, or any other characteristic protected by law.
USA Job Seekers:
EEO Know Your Rights.
Senior Data Engineer employer: RELX Group plc
Contact Detail:
RELX Group plc Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Senior Data Engineer role
✨Tip Number 1
Familiarise yourself with the specific data technologies mentioned in the job description, such as Python, SQL, and cloud platforms like AWS or GCP. Having hands-on experience or projects showcasing these skills can set you apart during discussions.
✨Tip Number 2
Network with current or former employees of ICIS to gain insights into their team culture and expectations. This can help you tailor your approach and demonstrate your understanding of their collaborative environment during interviews.
✨Tip Number 3
Prepare to discuss your experience with data quality assurance and pipeline development. Be ready to share specific examples of how you've implemented data quality checks or optimised data flows in previous roles.
✨Tip Number 4
Showcase your problem-solving skills by preparing a case study or example where you identified a data-related challenge and successfully implemented a solution. This will highlight your innovative thinking and ability to drive continuous improvement.
We think you need these skills to ace the Senior Data Engineer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights relevant experience in data engineering, particularly focusing on data management and quality. Use specific examples of projects where you've designed and developed data pipelines or worked with cloud platforms like AWS, GCP, or Azure.
Craft a Compelling Cover Letter: In your cover letter, express your enthusiasm for the role and the company’s mission. Discuss how your skills align with their needs, especially in areas like data pipeline development and collaboration with cross-functional teams.
Showcase Technical Skills: Clearly list your technical skills, such as proficiency in Python, SQL, and experience with data warehousing tools like Snowflake. Mention any relevant technologies you’ve explored or implemented, as this will demonstrate your proactive approach to learning.
Highlight Team Collaboration: Emphasise your ability to work collaboratively within a team. Provide examples of how you've mentored junior engineers or collaborated with data scientists to meet business needs, showcasing your communication skills and teamwork.
How to prepare for a job interview at RELX Group plc
✨Showcase Your Technical Skills
Be prepared to discuss your experience with Python, SQL, and cloud platforms like AWS or GCP. Highlight specific projects where you designed and optimised data pipelines, as this will demonstrate your technical expertise relevant to the role.
✨Emphasise Data Quality Assurance
Since data quality is crucial for this position, be ready to explain how you've implemented data quality checks in previous roles. Share examples of how you ensured accuracy and consistency across data sources.
✨Demonstrate Collaboration Skills
This role requires working closely with cross-functional teams. Prepare to discuss instances where you've collaborated with data analysts, scientists, or other stakeholders to deliver impactful solutions. Communication is key!
✨Express Your Enthusiasm for Innovation
ICIS values a culture of innovation, so convey your eagerness to explore and implement advanced data technologies. Share any experiences where you've driven continuous improvement or adopted new tools to enhance data processes.