At a Glance
- Tasks: Design and implement data pipelines for our SaaS platform, optimising datasets for analysis.
- Company: Join Ookla, a leader in connectivity intelligence, known for Speedtest and Downdetector.
- Benefits: Enjoy a flexible work environment that values individuality, fun, and talent.
- Why this job: Be part of a passionate team solving real-world connectivity challenges with innovative solutions.
- Qualifications: Experience with cloud data pipelines, Python or Java, SQL, and data modelling is essential.
- Other info: Background in telecom is a plus, but not required; embrace a fast-paced start-up culture.
The predicted salary is between £43,200 and £72,000 per year.
Ookla is a global leader in connectivity intelligence, offering unparalleled network insights through the combined expertise of Speedtest, Downdetector, RootMetrics, and Ekahau. Ookla’s complementary datasets combine crowdsourced and controlled, public and private collection methods, QoS and QoE metrics, and more to unlock correlations and actionable insights — helping organizations optimize networks, enhance digital experiences, and create better connected experiences for end-users.
Our team is a group of people brought together through passion and inspired by possibility. We are looking for team members who love solving problems, are motivated by challenges, and enjoy turning clever ideas into exceptional products. When you work for us, you are using Ookla data and insights to advance our mission of better connectivity for all.
We are committed to providing you a flexible work environment where individuality, fun, and
talent are all valued equally. If you consider yourself innovative, adept at collaboration, and you care deeply about the work you do, we want to talk!
Ookla is looking for a Data Engineer to help us scale our SaaS platform and applications to support more customers and more data. Day to day, you will work on data pipelines that transform our raw data, delivered through APIs, into enhanced datasets ready for use by data analysts and data scientists.
Expectations for Success
Participate in the design and help drive the implementation of our data platform
Design, implement, and operate streaming and batch pipelines that scale
Partner with both engineers and data analysts to build reliable datasets that can be trusted, understood, and used by the rest of the company
Help manage our SingleStore database, optimizing performance for queries and tables
Requirements:
Experience working with data pipelines in at least one cloud, preferably AWS
Hands-on experience with Python or Java
Comfortable in SQL
Experience creating and optimising data models for data analysts, used in BI tools such as Tableau
Embrace a fast-paced start-up environment
Prior professional experience building streaming and batch pipelines
Familiarity with data orchestration (Apache Airflow)
Experience with Docker/Kubernetes
Know your way around data warehouse solutions such as BigQuery and/or Amazon Redshift
Passionate about your work and at home in a fast-paced, international working environment
Background or experience in the telecom industry is a plus but not a requirement
Data Engineer II employer: Ziff Davis, LLC
Contact Detail:
Ziff Davis, LLC Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land Data Engineer II
✨Tip Number 1
Familiarise yourself with the specific technologies mentioned in the job description, such as AWS, Python, SQL, and Apache Airflow. Having hands-on experience or projects that showcase your skills in these areas can set you apart from other candidates.
✨Tip Number 2
Network with current or former employees of Ookla on platforms like LinkedIn. Engaging in conversations about their experiences can provide valuable insights into the company culture and expectations, which you can leverage during interviews.
✨Tip Number 3
Prepare to discuss real-world examples of how you've built and optimised data pipelines in previous roles. Be ready to explain your thought process and the impact of your work, as this will demonstrate your problem-solving abilities and technical expertise.
✨Tip Number 4
Show your passion for connectivity and data engineering by staying updated on industry trends and innovations. Mentioning recent developments or challenges in the telecom sector during your interview can highlight your enthusiasm and commitment to the field.
We think you need these skills to ace Data Engineer II
Some tips for your application 🫡
Understand the Role: Before applying, make sure you fully understand the responsibilities and requirements of the Data Engineer II position at Ookla. Familiarise yourself with their products and services, especially how they utilise data to enhance connectivity.
Tailor Your CV: Customise your CV to highlight relevant data engineering experience, particularly with data pipelines, cloud services like AWS, and programming languages such as Python or Java. Emphasise any experience with SQL and BI tools like Tableau.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for data engineering and your problem-solving skills. Mention specific projects or experiences that align with the expectations outlined in the job description, demonstrating how you can contribute to Ookla's mission.
Proofread and Submit: Before submitting your application, carefully proofread all documents for spelling and grammatical errors. Ensure that your application is complete and accurately reflects your qualifications. Submit your application through the StudySmarter website for consideration.
How to prepare for a job interview at Ziff Davis, LLC
✨Showcase Your Technical Skills
Be prepared to discuss your experience with data pipelines, especially in cloud environments like AWS. Highlight your proficiency in Python or Java and your comfort with SQL, as these are crucial for the role.
✨Demonstrate Problem-Solving Abilities
Ookla values individuals who love solving problems. Prepare examples of challenges you've faced in previous roles and how you overcame them, particularly in building streaming and batch pipelines.
✨Familiarise Yourself with Their Tools
Research the tools mentioned in the job description, such as Apache Airflow for data orchestration and data warehouse solutions like BigQuery or Amazon Redshift. Being knowledgeable about these will show your genuine interest in the position.
✨Emphasise Collaboration and Innovation
Since the role involves working closely with engineers and data analysts, be ready to discuss your collaborative experiences. Share instances where your innovative ideas contributed to successful projects or improved processes.