BigData and Hadoop Ecosystems
2 weeks ago
Job Description
Teamware Solutions is seeking a skilled professional for the BigData and Hadoop Ecosystems Engineer role. This position is central to designing, building, and maintaining scalable big data solutions. You will work with relevant technologies, ensure smooth data operations, and contribute to business objectives through expert analysis, development, implementation, and troubleshooting within the BigData and Hadoop Ecosystems domain.

Roles and Responsibilities:
- Big Data Platform Management: Install, configure, and maintain components of the Hadoop ecosystem (e.g., HDFS, YARN, Hive, Spark, Kafka, HBase) to ensure optimal performance, scalability, and high availability.
- Data Pipeline Development: Design, develop, and implement robust, efficient data pipelines for ingesting, processing, and transforming large datasets using tools such as Apache Spark, Hive, or Kafka.
- Performance Tuning: Monitor the performance of big data clusters and applications; identify bottlenecks and implement optimization strategies for Spark jobs, Hive queries, and other big data processes.
- Data Lake/Warehouse Design: Contribute to the design and implementation of data lake and data warehouse solutions built on Hadoop-based technologies.
- ETL/ELT Processes: Develop and manage complex ETL/ELT processes to integrate data from various sources into the big data ecosystem.
- Troubleshooting: Perform in-depth troubleshooting, debugging, and resolution of complex issues within the Hadoop ecosystem, including cluster stability, data processing failures, and performance degradation.
- Security & Governance: Implement and maintain security best practices for big data platforms, including access control, encryption, and data governance policies.
- Automation: Develop scripts and automation routines for cluster management, deployment, monitoring, and routine operational tasks within the big data environment.
- Collaboration: Work closely with data scientists, data analysts, application developers, and infrastructure teams to support data-driven initiatives.

Preferred Candidate Profile:
- Hadoop Ecosystem Expertise: Strong hands-on experience with the core components of the Hadoop ecosystem (HDFS, YARN) and related technologies such as Apache Spark, Hive, Kafka, HBase, or Presto.
- Programming/Scripting: Proficiency in programming languages commonly used in big data, such as Python, Scala, or Java, plus strong scripting skills for automation.
- SQL Proficiency: Excellent SQL skills for data manipulation and querying in big data environments (e.g., HiveQL, Spark SQL).
- Cloud Big Data (Plus): Familiarity with cloud-based big data services (e.g., AWS EMR, Azure HDInsight, Google Cloud Dataproc) is a plus.
- Distributed Systems: Understanding of distributed computing principles and the challenges of managing large-scale data systems.
- Problem-Solving: Excellent analytical and problem-solving skills with a methodical approach to complex big data challenges.
- Communication: Strong verbal and written communication skills to articulate technical concepts and collaborate effectively with diverse teams.
- Education: Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related technical field.
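The distributed-computing model behind the Hadoop skills this profile asks for is MapReduce: a map phase over independent input splits, a shuffle that groups intermediate pairs by key, and a reduce phase that aggregates each group. A minimal single-machine sketch of the word-count pattern (an illustration of the model, not what a cluster framework actually runs):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document: str):
    # Map: emit a (word, 1) pair for every word in one input split.
    return [(word.lower(), 1) for word in document.split()]

def shuffle_phase(pairs):
    # Shuffle: group intermediate pairs by key, as the framework
    # does when routing mapper output to reducers.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the list of values for each key.
    return {key: sum(values) for key, values in groups.items()}

# Each "split" stands in for an HDFS block processed by one mapper.
splits = ["big data big pipelines", "data pipelines at scale"]
pairs = chain.from_iterable(map_phase(s) for s in splits)
counts = reduce_phase(shuffle_phase(pairs))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2, 'at': 1, 'scale': 1}
```

On a real cluster the splits live on different nodes and the shuffle moves data over the network, which is why minimizing shuffled data (combiners, partitioning) dominates the performance-tuning work described above.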
-
Bigdata Hadoop Platform
3 weeks ago
New Delhi, India Tata Consultancy Services Full time
Greetings from TCS! TCS is hiring for Bigdata Hadoop Platform - Cloudera.
Job Description
Role: Bigdata Hadoop Platform - Cloudera
Desired Experience Range: 4 - 10 years
Location: Hyderabad
Required Technical Skill Set: Big Data, Hadoop Administration, Cloudera CDH Administration, Cloudera Manager Admin Console, Sqoop, Hive, Impala, YARN, HDFS, CDSW, Kerberos, Sentry,...
-
Bigdata & Hadoop
3 weeks ago
New Delhi, India Tata Consultancy Services Full time
Role: Bigdata & Hadoop
Location: Hyderabad
Required Technical Skill Set: Hadoop, Python, PySpark, Hive
Must-Have:
· Hands-on experience with Hadoop, Python, PySpark, Hive, and Big Data ecosystem tools.
· Ability to develop and tune queries and work on performance enhancement.
· Solid understanding of object-oriented programming and HDFS concepts.
· The...
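The "tune queries" requirement in Hive/PySpark work frequently comes down to partition pruning: filtering on the partition column so the engine scans only the matching directories instead of the whole table. A framework-free Python sketch of the idea (the table name, paths, and `dt` column are hypothetical, chosen only to mirror Hive's `key=value` directory layout):

```python
# Hypothetical layout of a Hive table partitioned by `dt`:
# one directory per partition value.
partitions = [
    "/warehouse/sales/dt=2024-01-01",
    "/warehouse/sales/dt=2024-01-02",
    "/warehouse/sales/dt=2024-02-01",
]

def prune(partitions, column, prefix):
    # Partition pruning: keep only directories whose embedded
    # key=value matches the predicate, so the files in the other
    # partitions are never read at all.
    needle = f"{column}={prefix}"
    return [p for p in partitions if p.rsplit("/", 1)[-1].startswith(needle)]

# Conceptually equivalent to: ... WHERE dt LIKE '2024-01%'
selected = prune(partitions, "dt", "2024-01")
print(selected)  # only the two January partition directories
```

The practical tuning rule this illustrates: put the predicate directly on the raw partition column (not on an expression wrapping it), or the engine cannot prune and falls back to a full scan.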
-
Bigdata + Cloudera Engineer
4 weeks ago
New Delhi, India Fractal Full time
About the Company: Fractal is a leading AI and analytics company that helps businesses leverage data to drive growth and innovation. Our mission is to empower organizations with advanced analytics and AI solutions, fostering a culture of collaboration and continuous improvement.
About the Role: We are seeking highly skilled Big Data Engineers/Senior Engineers...
-
Platform Engineer – Java/BigData/Kubernetes
Delhi, India Syntasa Full time
Job Description
Syntasa is seeking a high-caliber and dedicated Platform Engineer to join its India subsidiary, Syntasa Technologies India Private Limited. This position offers an exciting opportunity to contribute to the growth of a world-class development and support center. The India operations will serve as...
-
Hadoop Admin
3 weeks ago
New Delhi, India Tata Consultancy Services Full time
Must-Have:
4 to 7 years of working experience as an administrator/developer in the Cloudera Hadoop distribution ecosystem, namely CDP Data Science (Data Warehouse (DW), Data Engineering (DE), Machine Learning (ML), HDFS, Ozone, Iceberg, YARN, Impala, Spark, Java, Oozie, Kerberos/Active Directory/LDAP, etc.) in the capacity of a system administrator/platform...
-
Hadoop Developer
3 weeks ago
New Delhi, India Emergys Full time
Work Location: Pune (Work-from-office)
Notice Period: Immediate to 30 Days
Experience: 2+ Years
Must Have Skills:
- Strong Hadoop Administration – Hands-on experience managing and supporting Hadoop clusters (Cloudera CDP/CDH, Hortonworks HDP).
- Linux/Unix System Administration – Expertise in RedHat, CentOS, Ubuntu, with deep knowledge of OS internals.
- Cluster...