REMOTE - Kafka Developer

Salary undisclosed

100% remote

36-40 hours per week, with a required 8 hours off billing per month (government standard)

12-month contract with multiple extension years

Key Responsibilities

  • Design and develop Kafka-based event-driven architectures to support BID Program applications and microservices.
  • Implement Kafka topics, producers, consumers, brokers, and stream processing applications for real-time data streaming.
  • Optimize Kafka cluster performance, scalability, and security, ensuring compliance with VA requirements.
  • Integrate Kafka with Salesforce, RESTful APIs, cloud environments (AWS/Azure/Google Cloud Platform), and backend databases as needed.
  • Develop and maintain Kafka Connectors for seamless data ingestion and extraction.
  • Implement error handling, message replay mechanisms, and monitoring tools to ensure reliability.
  • Work in an Agile development environment with Scrum teams to deliver high-quality software solutions.
  • Ensure Kafka solutions adhere to VA security, compliance, and governance standards.
  • Support troubleshooting, debugging, and performance tuning of Kafka applications.

Required Qualifications

  • 10+ years of experience in software development, with at least 3 years working with Kafka in production environments.
  • Hands-on experience with Apache Kafka, Kafka Streams, and Kafka Connect.
  • Strong knowledge of distributed systems, event-driven architecture, and messaging systems.
  • Experience integrating Kafka with cloud platforms (AWS, Azure, or Google Cloud Platform) and working with Kubernetes/Docker.
  • Proficiency in Java, Scala, or Python for developing Kafka-based applications.
  • Experience with SQL and NoSQL databases (PostgreSQL, MongoDB, etc.).
  • Knowledge of data serialization formats (Avro, JSON, Protobuf).
  • Familiarity with monitoring and logging tools (Prometheus, Grafana, Splunk, ELK Stack).
  • Experience with CI/CD pipelines and DevSecOps best practices.
  • Strong understanding of security protocols for Kafka authentication and authorization.
  • U.S. Citizenship is required due to federal contract regulations.
  • BS degree in science, math, or engineering.
