Job Description
Key Responsibilities
- Design and implement real-time data pipelines using tools such as Apache Kafka, Apache Flink, or Spark Structured Streaming (see the pipeline sketch after this list).
- Develop and maintain event schemas using Avro, Protobuf, or JSON Schema.
- Collaborate with backend teams to integrate event-driven microservices.
- Ensure data quality, lineage, and observability across streaming systems.
- Optimize performance and scalability of streaming applications.
- Implement CI/CD pipelines for data infrastructure.
- Monitor and troubleshoot production data flows and streaming jobs.
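
To give candidates a concrete picture of the pipeline work, here is a minimal consume-transform-produce sketch in Python. It assumes the confluent-kafka client and uses hypothetical topic names (`raw-events`, `enriched-events`), broker address, and group id; it is an illustration of the pattern, not our production code.

```python
# Minimal consume-transform-produce loop using the confluent-kafka client.
# Broker address, group id, and topic names are illustrative placeholders.
import json

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumed local broker
    "group.id": "enrichment-demo",          # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["raw-events"])  # hypothetical input topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consume error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        event["processed"] = True  # stand-in for real enrichment logic
        producer.produce("enriched-events", json.dumps(event).encode("utf-8"))
        producer.poll(0)  # serve delivery callbacks
finally:
    producer.flush()
    consumer.close()
```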
Required Skills & Qualifications
- 3+ years of experience in data engineering or backend development.
- Strong programming skills in Python, Java, or Scala.
- Hands-on experience with Kafka, Kinesis, or similar messaging systems.
- Familiarity with stream processing frameworks like Flink, Kafka Streams, or Spark Structured Streaming.
- Solid understanding of event-driven design patterns such as event sourcing and CQRS (see the event-sourcing sketch after this list).
- Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools.
- Knowledge of data modeling, schema evolution, and serialization formats (a schema-evolution sketch also follows the list).
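
To make the event-sourcing expectation concrete, here is a minimal, dependency-free sketch: state is never mutated directly, it is rebuilt by replaying an append-only log of immutable events. The account aggregate and event names are hypothetical.

```python
# Minimal event-sourcing sketch: current state is derived by replaying
# an append-only log of immutable events. All names are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class Deposited:
    amount: int


@dataclass(frozen=True)
class Withdrawn:
    amount: int


def apply(balance: int, event) -> int:
    """Fold a single event into the current state."""
    if isinstance(event, Deposited):
        return balance + event.amount
    if isinstance(event, Withdrawn):
        return balance - event.amount
    return balance


def replay(events) -> int:
    """Rebuild state from the full event history."""
    balance = 0
    for event in events:
        balance = apply(balance, event)
    return balance


log = [Deposited(100), Withdrawn(30), Deposited(5)]  # the append-only log
assert replay(log) == 75
```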
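
And a brief sketch of schema evolution, assuming the fastavro library: a record written under a v1 Avro schema is decoded with a v2 reader schema that adds a defaulted field, the backward-compatible case Avro's schema resolution is designed for. The schemas and field names are hypothetical.

```python
# Schema-evolution sketch using fastavro (assumed available): data written
# under schema v1 is read with a v2 reader schema that adds a new field
# with a default. Field names are illustrative.
import io

from fastavro import schemaless_reader, schemaless_writer

SCHEMA_V1 = {
    "type": "record",
    "name": "UserEvent",
    "fields": [{"name": "user_id", "type": "string"}],
}

# v2 adds a field with a default, a backward-compatible change.
SCHEMA_V2 = {
    "type": "record",
    "name": "UserEvent",
    "fields": [
        {"name": "user_id", "type": "string"},
        {"name": "source", "type": "string", "default": "unknown"},
    ],
}

buf = io.BytesIO()
schemaless_writer(buf, SCHEMA_V1, {"user_id": "42"})
buf.seek(0)

# Schema resolution fills the missing field from the reader-schema default.
decoded = schemaless_reader(buf, SCHEMA_V1, SCHEMA_V2)
assert decoded == {"user_id": "42", "source": "unknown"}
```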
Contact Details:
Response Informatics Recruiting Team