Kafka Specialist – Cluster Management, Data Streaming, and Performance Optimization

Upwork · United States · Posted 3 days ago · Contractor
Job Title: Kafka Specialist – Cluster Management, Data Streaming, and Performance Optimization

Work Schedule:
- Days: Monday to Friday
- Hours: 3 hours/day
- Timings: 8 PM to 12 AM IST

Location: Remote via Zoom (screen share and remote control required)
Salary: ₹30,000 per month

Job Description:
We are seeking an experienced Kafka Specialist to manage, optimize, and secure Apache Kafka clusters and build real-time data streaming solutions. The ideal candidate will have a strong understanding of distributed systems and event-driven architectures, as well as the ability to integrate Kafka with other systems and ensure peak performance.

Key Responsibilities:
- Kafka Cluster Management: Design, deploy, and manage Apache Kafka clusters, ensuring high availability, scalability, and fault tolerance.
- Data Streaming Architecture: Develop and maintain real-time data streaming solutions using Kafka, Kafka Streams, and related technologies.
- Performance Optimization: Monitor and optimize Kafka clusters, including tuning brokers, topics, partitions, and configurations for maximum performance.
- Security and Compliance: Implement and manage Kafka security measures such as encryption, authentication, and authorization to ensure data integrity and compliance with industry standards.
- Integration: Collaborate with application developers, data engineers, and DevOps teams to integrate Kafka with other systems and services.
- Monitoring and Alerts: Use tools such as Prometheus, Grafana, and Kafka Manager to set up monitoring, logging, and alerting for Kafka clusters.
- Troubleshooting and Support: Diagnose and resolve issues related to Kafka performance, connectivity, and data processing promptly.
- Documentation: Create and maintain detailed documentation for Kafka configurations, processes, and best practices.
- Innovation and Improvement: Stay up-to-date with Kafka advancements, proposing improvements and innovative solutions as appropriate.

Requirements:
- Proven expertise in managing distributed systems and event-driven architectures.
- Hands-on experience with Kafka Streams, KSQL, and other tools within the Kafka ecosystem.
- Strong skills in designing and maintaining real-time data streaming solutions.
- Proficiency in monitoring tools such as Prometheus, Grafana, and Kafka Manager.
- Knowledge of Kafka security configurations, including encryption, authentication, and authorization.
- Experience working with cloud platforms (AWS, Azure, Google Cloud) is a plus.
- Strong troubleshooting skills and ability to optimize Kafka performance.

Note: Automated or copy-pasted applications will not be considered. We are not working with agencies for this role. Only direct applicants will be reviewed.
