At a Glance
- Tasks: Develop and maintain Hadoop applications using Scala, ensuring data security and performance.
- Company: Join Belhati, a leading UK consulting firm specialising in innovative tech solutions.
- Benefits: Enjoy flexible work options, competitive pay, and opportunities for professional growth.
- Why this job: Be part of a dynamic team driving digital transformation with cutting-edge technologies.
- Qualifications: 4+ years in Hadoop and Scala development, strong communication skills, and agile experience required.
- Other info: This role may require relocation; background checks will be conducted.
The predicted salary is between £43,200 and £72,000 per year.
Belhati is a competitive consulting company headquartered in the United Kingdom, providing cutting-edge technology solutions that enable businesses to thrive in the digital era. We specialise in Cloud, DevOps, Data Science, Blockchain, IoT, and Metaverse technologies, and our team of experts is dedicated to delivering innovative solutions that drive business growth and transformation.
Our experts have years of experience in these technologies. They are highly skilled and use the latest tools and technologies to design, develop, and implement solutions that transform businesses and drive innovation.
What will your job look like?
- 4+ years of relevant experience in Hadoop with Scala development.
- It is mandatory that candidates have handled more than two projects in the above framework using Scala.
- 4+ years of relevant experience handling end-to-end Big Data technology.
- Meeting with the development team to assess the company’s big data infrastructure.
- Designing and coding Hadoop applications to analyze data collections.
- Creating data processing frameworks.
- Extracting data and isolating data clusters.
- Testing scripts and analyzing results.
- Troubleshooting application bugs.
- Maintaining the security of company data.
- Training staff on application use
- Good project management and communication skills.
- Designing, creating, and maintaining Scala-based applications
- Participating in all architectural development tasks related to the application.
- Writing code in accordance with the app requirements
- Performing software analysis
- Working as a member of a software development team to ensure that the program meets standards
- Application testing and debugging
- Making suggestions for enhancements to application procedures and infrastructure.
- Collaborating with cross-functional team
- 12+ years of hands-on experience in a variety of platform and data development roles
- 5+ years of experience in big data technology, ranging from platform architecture to data management, data architecture and application architecture
- High proficiency working with the Hadoop platform, including Spark/Scala, Kafka, Spark SQL, HBase, Impala, Hive and HDFS in multi-tenant environments
- Solid base in data technologies such as warehousing, ETL, MDM, DQ and BI/analytical tools, with extensive experience in metadata management and data quality processes and tools
- Experience in full lifecycle architecture guidance
- Advanced analytical thinking and problem-solving skills
- Advanced knowledge of application, data and infrastructure architecture disciplines
- Understanding of architecture and design across all systems
- Demonstrated and strong experience in a software engineering role, including the design, development and operation of distributed, fault-tolerant applications with attention to security, scalability, performance, availability and optimization
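To give a flavour of the day-to-day work described above (designing Scala code to analyse data collections), here is a small illustrative sketch. It is not part of the posting and uses plain Scala collections; on the job, the same transformation would typically run on a Spark Dataset or RDD, whose combinators (`map`, `flatMap`, `groupBy`) mirror these. The log-level counting task is a hypothetical example.

```scala
// Illustrative sketch: count occurrences of each log level in a batch of
// log lines, the kind of aggregation a Hadoop/Scala developer writes often.
object LogLevelCount {
  def countLevels(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+").headOption)    // take the first token (the level)
      .groupBy(identity)                       // bucket identical levels together
      .map { case (level, hits) => level -> hits.size }

  def main(args: Array[String]): Unit = {
    val batch = Seq(
      "ERROR disk full on node-3",
      "INFO checkpoint complete",
      "ERROR partition lost"
    )
    val counts = countLevels(batch)
    println(counts("ERROR"))  // 2
    println(counts("INFO"))   // 1
  }
}
```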
Requirements
- 4+ years of hands-on experience in designing, building and supporting Hadoop Applications using Spark, Scala, Sqoop and Hive.
- Strong knowledge of working with large data sets and high-capacity big data processing platforms.
- Strong experience in Unix and shell scripting.
- Experience using source code and version control systems such as Bitbucket and Git.
- Experience working in an agile environment.
- Strong verbal and written communication skills; strong relationship, collaboration and organisational skills; and the ability to work as part of a diverse, geographically distributed, matrix-based project team.
Apply Here
Hadoop Scala Developer employer: Belhati
Contact Detail:
Belhati Recruiting Team
StudySmarter Expert Advice 🤫
We think this is how you could land the Hadoop Scala Developer role
✨Tip Number 1
Make sure to brush up on your Hadoop and Scala skills. Since the role requires extensive experience in these technologies, being able to discuss specific projects where you've used them will set you apart during interviews.
✨Tip Number 2
Familiarise yourself with the latest trends and tools in big data technology. Being knowledgeable about platforms like Spark, Kafka, and Hive will not only help you in interviews but also show your passion for the field.
✨Tip Number 3
Prepare to discuss your project management and communication skills. This role involves collaboration with cross-functional teams, so having examples ready of how you've successfully worked in a team environment will be beneficial.
✨Tip Number 4
Network with professionals in the big data community. Engaging with others in the field can provide insights into the company culture at Belhati and may even lead to referrals, increasing your chances of landing the job.
We think you need these skills to ace the Hadoop Scala Developer role
Some tips for your application 🫡
Tailor Your CV: Make sure your CV highlights your experience with Hadoop and Scala development. Include specific projects you've worked on, especially those that demonstrate your ability to handle end-to-end Big Data technology.
Craft a Compelling Cover Letter: Write a cover letter that showcases your passion for technology and your relevant experience. Mention your familiarity with the tools listed in the job description, such as Spark, Kafka, and Hive, and how you can contribute to Belhati's innovative solutions.
Showcase Your Projects: In your application, provide details about at least two significant projects you've completed using Hadoop and Scala. Explain your role, the challenges faced, and the outcomes achieved to demonstrate your hands-on experience.
Highlight Soft Skills: Belhati values strong communication and project management skills. Make sure to include examples of how you've successfully collaborated with teams and communicated complex technical concepts to non-technical stakeholders.
How to prepare for a job interview at Belhati
✨Showcase Your Project Experience
Make sure to highlight your experience with Hadoop and Scala, especially the two projects you've handled. Be prepared to discuss the challenges you faced and how you overcame them, as this will demonstrate your problem-solving skills.
✨Understand the Company’s Tech Stack
Familiarise yourself with Belhati's focus on Cloud, DevOps, and Big Data technologies. Knowing their specific tools like Spark, Kafka, and Hive will show that you're genuinely interested in the role and can hit the ground running.
✨Prepare for Technical Questions
Expect technical questions related to Hadoop applications, data processing frameworks, and troubleshooting. Brush up on your knowledge of Unix and Shell scripting, as well as version control systems like Git, to impress your interviewers.
✨Demonstrate Strong Communication Skills
Since the role requires collaboration with cross-functional teams, practice articulating your thoughts clearly. Be ready to discuss how you've effectively communicated complex technical concepts to non-technical stakeholders in the past.