Salary: Up to 32 LPA
Location: Bangalore
Industry: Information Technology
Job Description
Roles & Responsibilities -
• Expertise with software development tools and source code management: understanding and managing issues and code changes, and grouping them into stable deployment releases.
• Participating in data pipeline maintenance and design/upgrade activities.
• Developing and implementing monitoring for various cluster components.
• Configuring deployment templates and tuning open-source components into enterprise-ready production tooling.
• Building and maintaining CI/CD pipelines.
• Working closely with the development and infrastructure teams to analyze and design solutions.
Skillset requirements –
• Extensive experience operating and tuning Kafka, specifically as a message queue for big data processing.
• Strong understanding & working experience with:
- Data storage platforms – Elasticsearch, Carbon, ClickHouse, HDFS
- Apache Spark – internals & streaming
- Developing & using Ansible Roles
• Excellent scripting and programming skills – Bash, Python, Rust
• Experience with CI/CD (Continuous Integration and Deployment) solutions such as Jenkins.
• Solid understanding of data collection tools such as Flume and Filebeat.