Date Posted

March 10, 2025

Location

United States

Job Type

Corporate Contract

Qualification

Master's Degree

Role

Technical

Remote?

Yes

Tax Terms

C2C, W-2, 1099

Duration

12 Months

Job Description

Job Title: Senior Flink + Temporal Developer

Company Overview:

Pulivarthi Group is a premier global provider of staffing and IT technology solutions, renowned for delivering exceptional services tailored to each client's unique needs. With a steadfast commitment to excellence, we merge expertise with innovation, ensuring cost-effective solutions of the highest quality. Our diverse client base spans healthcare, finance, government, and beyond, reflecting our adaptability and proficiency across industries. Operating in the United States, Canada, and Mexico, we pride ourselves on aligning with clients' cultures, deploying top-tier talent, and utilizing cutting-edge technologies. Pulivarthi Group stands as a beacon of reliability, efficiency, and innovation in the realm of staffing solutions.

Job Overview/Summary:

We are seeking a Senior Flink + Temporal Developer to design, develop, and optimize real-time data processing solutions using Apache Flink, Apache Kafka, and Temporal. This role involves building scalable, fault-tolerant data pipelines and event-driven architectures while ensuring high performance, reliability, and observability. The ideal candidate will have strong expertise in real-time data streaming, workflow orchestration, and distributed systems.
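To give candidates a feel for the kind of work described above, here is a minimal, illustrative sketch of the keyed-state pattern that underlies Flink stream processing. This is plain Java with no Flink dependency; in real Flink code the same idea would be expressed with `DataStream#keyBy` and a `KeyedProcessFunction` holding `ValueState`. The class and event names are hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch only: mimics Flink's per-key state pattern in plain
// Java. Real Flink jobs would use keyBy() and managed ValueState so the
// state is checkpointed and fault-tolerant; none of that is shown here.
public class KeyedCountSketch {
    private final Map<String, Long> state = new HashMap<>(); // per-key state

    // Process one event: update and return the running count for its key.
    public long process(String key) {
        long next = state.getOrDefault(key, 0L) + 1;
        state.put(key, next);
        return next;
    }

    public static void main(String[] args) {
        KeyedCountSketch op = new KeyedCountSketch();
        // Hypothetical event stream keyed by user id.
        for (String event : List.of("user-a", "user-b", "user-a")) {
            System.out.println(event + " -> " + op.process(event));
        }
    }
}
```

The design point the role centers on: because state is partitioned by key, Flink can distribute it across parallel operator instances and checkpoint it for fault tolerance, which is what makes jobs like this scalable.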

Responsibilities:

  • Design, develop, and maintain data pipelines that handle large volumes of streaming data using Apache Kafka and Apache Flink.
  • Implement real-time and batch data processing solutions with Apache Flink or Apache Spark.
  • Build and maintain RESTful APIs using Spring Boot to support data integration across systems.
  • Design, implement, and manage Temporal workflows that orchestrate complex data pipelines and processes with fault tolerance, scalability, and reliability.
  • Optimize Flink jobs for performance, reliability, and scalability.
  • Develop event-driven architectures that integrate Flink and Temporal for resilient data processing.
  • Work with NoSQL databases to ensure optimal performance and scalability.
  • Collaborate with cross-functional teams to deliver high-quality data solutions for business needs.
  • Monitor and improve system performance, latency, and throughput.
  • Apply best practices for fault tolerance, high availability, and observability.
  • Troubleshoot and optimize existing data pipelines and workflows to improve efficiency and reliability.
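The workflow-orchestration responsibilities above can be illustrated with a small, self-contained sketch of the retry-until-success pattern that Temporal makes durable. This is plain Java with no Temporal dependency and hypothetical names; real Temporal code would use `@WorkflowMethod`, an activity stub from `Workflow.newActivityStub`, and `RetryOptions` from the Temporal Java SDK, with retries and timers persisted by the Temporal server rather than held in memory.

```java
import java.util.concurrent.Callable;

// Illustrative sketch only: shows the retry orchestration idea that
// Temporal provides durably. Here retries live in process memory; in
// Temporal they survive worker crashes because workflow state is replayed.
public class RetryWorkflowSketch {
    // Run an activity, retrying up to maxAttempts times on failure.
    public static <T> T runWithRetry(Callable<T> activity, int maxAttempts) throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return activity.call();
            } catch (Exception e) {
                last = e; // a real workflow would also apply backoff here
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // Hypothetical activity that fails twice, then succeeds.
        String result = runWithRetry(() -> {
            if (++calls[0] < 3) throw new RuntimeException("transient failure");
            return "done after " + calls[0] + " attempts";
        }, 5);
        System.out.println(result);
    }
}
```

The reason teams reach for Temporal rather than hand-rolled loops like this one is exactly the gap the comment notes: Temporal persists every step, so a pipeline orchestration can resume mid-workflow after a crash.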

Primary Skills:

  • 5+ years of experience in data engineering or a related field.
  • Strong proficiency in Apache Kafka for large-scale data streaming.
  • Deep expertise in Apache Flink, including Flink SQL, the DataStream API, and state management.
  • Experience with real-time and batch data processing using Apache Flink or Apache Spark.
  • Proven experience with Temporal or similar workflow orchestration frameworks (e.g., Cadence).
  • Proficiency in Java or Scala (Python experience is a plus).
  • Strong understanding of distributed systems and event-driven architecture.
  • Knowledge of SQL/NoSQL databases and their integration with Flink.
  • Familiarity with monitoring tools like Prometheus, Grafana, or OpenTelemetry.
  • Experience building RESTful APIs with Spring Boot.
  • Hands-on experience with NoSQL databases (e.g., MongoDB, Cassandra).

Secondary Skills (Good to Have):

  • Experience with cloud environments (AWS, GCP, or Azure).
  • Knowledge of containerization (Docker, Kubernetes).
  • Experience with performance tuning in large-scale streaming architectures.

Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • Strong problem-solving, analytical, and troubleshooting skills.
  • Excellent communication and collaboration abilities.

Benefits/Perks:

  • Competitive salary and comprehensive benefits package.
  • Opportunity to work with cutting-edge real-time data processing technologies.
  • Collaborative work environment with career growth opportunities.
  • Exposure to large-scale, high-performance distributed systems.