
TechOps SRE
Gurugram
**About Us**
- We empower enterprises globally through intelligent, creative, and insightful services for data integration, data analytics and data visualization.- Hoonartek is a leader in enterprise transformation, data engineering and an acknowledged world-class Ab Initio delivery partner.- Using centuries of cumulative experience, research and leadership, we help our clients eliminate the complexities & risk of legacy modernization and safely deliver big data hubs, operational data integration, business intelligence, risk & compliance solutions and traditional data warehouses & marts.- At Hoonartek, we work to ensure that our customers, partners and employees all benefit from our unstinting commitment to delivery, quality and value. Hoonartek is increasingly the choice for customers seeking a trusted partner of vision, value and integrity**How We Work?**
Define, Design and Deliver (D3) is our in-house delivery philosophy. It’s culled from agile and rapid methodologies and focused on ‘just enough design’. We embrace this philosophy in everything we do, leading to numerous client success stories and indeed to our own success.
- We embrace change, empowering and trusting our people and building long and valuable relationships with our employees, our customers and our partners. We work flexibly, even adopting traditional/waterfall methods where circumstances demand it. At Hoonartek, the focus is always on delivery and value.

**L1 Skillset**
- 1-2 years of experience running Technical Operations on any ETL tool (preferably Spark)
- Hands-on experience with Linux and a mainstream database (Oracle, SQL)
- Should be able to run basic queries to fetch data
- Basic knowledge of Hive/Hadoop/HDFS
- Understanding of at least one scripting language (e.g. shell, Perl)
- Good knowledge of Spark or Airflow
- Should be able to monitor, re-run and kill jobs in YARN and Airflow (see the sketch after this list)
- Working knowledge of Logstash, Elasticsearch, Kibana and Grafana
- Exposure to OCP (Red Hat OpenShift Container Platform) would be an added advantage
- Should have worked on an ETL platform in a TechOps role
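For illustration, here is a minimal Python sketch of the day-to-day job monitoring this role involves, wrapping the standard YARN and Airflow 2.x command-line tools via `subprocess`. The application ID and DAG name are hypothetical placeholders, and the script assumes the `yarn` and `airflow` CLIs are on the PATH.

```python
import subprocess


def list_running_yarn_apps() -> str:
    """List YARN applications currently in the RUNNING state."""
    result = subprocess.run(
        ["yarn", "application", "-list", "-appStates", "RUNNING"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


def kill_yarn_app(app_id: str) -> None:
    """Kill a stuck YARN application by its application ID."""
    subprocess.run(["yarn", "application", "-kill", app_id], check=True)


def rerun_airflow_dag(dag_id: str) -> None:
    """Trigger a fresh run of an Airflow DAG (Airflow 2.x CLI)."""
    subprocess.run(["airflow", "dags", "trigger", dag_id], check=True)


if __name__ == "__main__":
    print(list_running_yarn_apps())
    # Hypothetical identifiers, for illustration only:
    kill_yarn_app("application_1700000000000_0042")
    rerun_airflow_dag("daily_etl_load")
```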
**L2 Skillset**
- 3-4 years of experience running Technical Operations on Spark and Airflow (preferably on Red Hat OCP and in the telecom domain)
- Strong knowledge of Linux/Unix commands
- Strong, proven hands-on experience with Spark, Airflow and YARN
- Good SQL/Exadata query-writing skills (DDL, DML, joins, subqueries, views, SELECT statements)
- Hands-on experience with data and analytics
- Basic knowledge of Jenkins for running and monitoring pipelines during deployments
- Should have worked on Logstash/Elasticsearch and be able to modify queries
- Good Python/shell scripting skills
- Able to quickly gain an understanding of existing processes, jobs, KPIs and reports
- Strong analytical, debugging and problem-solving skills
- Should have automated operational processes
- Should be able to query/extract data from Hive/HDFS (see the sketch below)
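As a sketch of the Hive querying skill above, the Python example below uses PyHive, one common Hive client library. The host, database, table and column names are hypothetical placeholders; a real deployment would also need the appropriate authentication settings.

```python
from pyhive import hive  # pip install 'pyhive[hive]'


def fetch_failed_jobs(host: str = "hive.example.internal", limit: int = 10):
    """Run a simple Hive query and return the resulting rows.

    Host, database and table names here are hypothetical placeholders.
    """
    conn = hive.Connection(host=host, port=10000, database="ops")
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT job_name, status, run_date "
            "FROM job_audit WHERE status = 'FAILED' "
            f"ORDER BY run_date DESC LIMIT {limit}"
        )
        return cursor.fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for row in fetch_failed_jobs():
        print(row)
```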