Opening for Big Data, Hive, PySpark, SQL, Unix

2 weeks ago


Gurugram, Haryana, India Cognizant Full time

Experience
- 5 to 12 yrs

Notice period
- 30 days

Work Location
- Gurgaon

Job description

- 5+ years of working experience in Big Data (Hive and PySpark)
- 3+ years of experience in Unix shell scripting
- Strong SQL programming skills
- Knowledge of or experience with Python is good to have
- Excellent communication skills for effective stakeholder management
- Strong analytical skills
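
For context on the stack named above (Hive, PySpark, SQL), here is a minimal sketch of a PySpark job that reads a Hive table and expresses the same aggregation in Spark SQL; the database, table, and column names (sales_db.orders, order_date, amount) are hypothetical placeholders, not taken from the posting.

```python
# Illustrative only: a minimal PySpark job against a Hive table.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hive-pyspark-example")
    .enableHiveSupport()  # needed to read tables from the Hive metastore
    .getOrCreate()
)

# Read a Hive table and aggregate daily totals with the DataFrame API.
orders = spark.table("sales_db.orders")
daily_totals = (
    orders
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

# The same aggregation expressed as Spark SQL.
daily_totals_sql = spark.sql(
    "SELECT order_date, SUM(amount) AS total_amount "
    "FROM sales_db.orders GROUP BY order_date"
)

daily_totals.show()
spark.stop()
```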



  • Gurugram, India Virtusa Full time

    Java Big Data Developer - CREQ. Description: Responsibilities include, but are not limited to: develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality. Primary focus is spent writing code and API specs, conducting code reviews and testing in ongoing sprints, or doing proof of...


  • Gurugram, India Impetus Full time

    Qualification: Need to hire Module Leads with proficiency in data engineering technologies and languages. Role: 5-9 years of experience implementing high-end software products. Provides technical leadership in the Big Data space (Hadoop stack: Spark, M/R, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies. Must have:...


  • GCP Data Engineer

    2 days ago


    Bengaluru, Gurugram, India IntraEdge Technology Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Bachelor's/Master's degree in Computer Science, Management of Information Systems, or equivalent. 2+ years of experience in GCP (BigQuery, Dataproc, Dataflow). 4 or more years of relevant software engineering experience (Big Data: Python, SQL, Hadoop, Hive, Spark) in a data-focused role. Strong experience in Big Data, Python, SQL, Spark and cloud exp...


  • Bengaluru, Gurugram, Pune, India Tredence Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Experience level: 3-9 years. Location: Bangalore/Chennai/Pune/Kolkata/Gurugram. Role & responsibilities: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 3+ yrs. of total IT experience and experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management...


  • Gurugram, India ClicFlyer Full time

    Roles and responsibilities: Proficiency in building highly scalable ETL and streaming-based data pipelines using Google Cloud Platform (GCP) services and products like BigQuery and Cloud Dataflow. Proficiency in large-scale data platforms and data processing systems such as Google BigQuery, Amazon Redshift, Azure Data Lake. Excellent Python, PySpark and SQL...