Roles and Responsibilities:
- Understanding of Elasticsearch architecture, indexing, and querying.
- Experience with cluster management, performance tuning, and scaling.
- Proficiency in configuring Logstash pipelines and understanding various input, filter, and output plugins.
- Experience with data transformation and processing.
- Ability to create visualizations and dashboards using Grafana.
- Familiarity with Grafana/Kibana's query language and features for data analysis.
- Experience with data ingestion and ETL processes.
- Understanding of JSON and other data formats.
- Proficiency in languages like Python, Java, or similar for automation and custom scripts.
- Experience with Java, Spring Boot, JPA, MySQL/PostgreSQL databases, and frontend (Angular/ReactJS) technologies.
- Proficiency with Linux, Windows, cloud, and container-based applications.
- Skills in monitoring the ELK stack and troubleshooting issues.
- Ability to analyze logs and data for insights.
- Strong troubleshooting skills for identifying and resolving issues.
- Ability to explain technical concepts to non-technical stakeholders.
- Experience working in cross-functional teams.
- Relevant experience in system administration, DevOps, or data engineering.
- Familiarity with cloud and containerized services (AWS, PCF, OpenShift etc.) can be beneficial.
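As context for the Logstash and Elasticsearch responsibilities above, a minimal pipeline sketch is shown below. All specifics (log path, grok pattern, index name, host) are illustrative assumptions, not requirements of the role.

```
# Illustrative Logstash pipeline: input, filter, and output stages.
# Paths, patterns, and index names below are hypothetical examples.
input {
  file {
    path => "/var/log/app/*.log"      # assumed application log location
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse "2024-01-01T00:00:00Z INFO message..." style lines
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
  date {
    match => [ "timestamp", "ISO8601" ]   # use the parsed time as the event timestamp
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]    # assumed local Elasticsearch endpoint
    index => "app-logs-%{+YYYY.MM.dd}"    # daily index pattern
  }
}
```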
Qualifications:
- Minimum bachelor's degree or equivalent
- Minimum 5 years of ELK stack experience, including Grafana dashboards, in the banking industry.
- Keen knowledge of or experience in Waterfall and/or Agile methodologies (ITIL, Scrum, etc. would be an advantage).
- Ability to articulate and clearly communicate complex problems and solutions in a simple and logical manner.
- Well-developed analytical skills and the ability to provide clarity to complex issues and synthesize large amounts of information.
- Deadline-sensitive and able to work independently and under pressure.
- Certifications in Elasticsearch or related technologies will be an added advantage.