Bigdata and Hadoop Engineer

15 hours ago


Bengaluru | Chennai | Hyderabad, India | Artech | Full time | ₹ 1,04,000 - ₹ 1,30,878 per year

Job Title: Developer

Experience: 4-6 Years

Work Location: Chennai, TN || Bangalore, KA || Hyderabad, TS

Skills Required: Digital: Bigdata and Hadoop Ecosystems; Digital: PySpark

Job Description:

Work as a developer with Bigdata, Hadoop, or data warehousing tools and cloud computing.

Work on Hadoop, Hive SQLs, Spark, and Bigdata ecosystem tools.

Experience in working with teams in a complex organization involving multiple reporting lines.

The candidate should have strong functional and technical knowledge to deliver what is required and should be well acquainted with banking terminology.

The candidate should have strong DevOps and Agile development framework knowledge.

Create Scala/Spark jobs for data transformation and aggregation.

Experience with stream-processing systems such as Storm, Spark Streaming, and Flink.

Essential Skills:

Working experience with Hadoop, Hive SQLs, Spark, and Bigdata ecosystem tools.

Should be able to tune queries and work on performance enhancement.

The candidate will be responsible for delivering code, setting up the environment and connectivity, and deploying the code to production after testing.

Occasionally, the candidate may be responsible as the primary contact and/or driver for small to medium-sized projects.

Preferable to have good technical knowledge of cloud computing and AWS or Azure cloud services.

Strong conceptual and creative problem-solving skills, the ability to work with considerable ambiguity, and the ability to learn new and complex concepts quickly.

Solid understanding of object-oriented programming and HDFS concepts.
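As an illustration of the "data transformation and aggregation" jobs mentioned above, here is a minimal pure-Python sketch of the group-and-sum logic that a Scala/Spark job would typically distribute across a cluster (for example via `reduceByKey`, or a `GROUP BY ... SUM` in Hive SQL). The record layout and names are illustrative assumptions, not part of the posting:

```python
from collections import defaultdict

# Illustrative records: (customer_id, transaction_amount) pairs --
# the kind of key/value input a Spark aggregation job might process.
records = [
    ("C1", 100.0),
    ("C2", 250.0),
    ("C1", 50.0),
    ("C3", 75.0),
    ("C2", 25.0),
]

def aggregate_by_key(pairs):
    """Sum values per key -- the same logic Spark's reduceByKey
    (or Hive's GROUP BY ... SUM) executes in parallel per partition."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

print(aggregate_by_key(records))  # → {'C1': 150.0, 'C2': 275.0, 'C3': 75.0}
```

In Spark the same per-key combine step runs first within each partition and then across partitions, which is why associative, commutative aggregations like this scale well on Bigdata clusters.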



  • Chennai, India Jobs for Humanity Full time

    Company Description Jobs for Humanity is dedicated to building an inclusive and just employment ecosystem. Therefore, we have dedicated this job posting to individuals coming from the following communities: Refugee, Neurodivergent, Single Parent, Blind or Low Vision, Deaf or Hard of Hearing, Black, Hispanic, Asian, Military Veterans, the Elderly, the LGBTQ,...


  • Chennai, India Quillsoft IT Services Private Limited Full time

    We are looking for KNIME machine learning expertise along with Hadoop/Bigdata admin skills, for a part-time/temporary job for a client based in Dubai. Fully remote. Should have experience with open source. After installation, a 6-month support job (paid on an hourly basis). **Job Types**: Part-time, Contractual / Temporary, Freelance Contract length: 1...


  • Whitefield, Bengaluru, Karnataka, India People Prime Worldwide Full time

    **About company** Our company is a global technology consulting and digital solutions company that enables enterprises to reimagine business models and accelerate innovation through digital technologies. Powered by more than 84,000 entrepreneurial professionals across more than 30 countries, it caters to over 700 clients. With its extensive domain and...


  • Bengaluru, Karnataka, India beBeeDataProfessional Full time ₹ 1,50,00,000 - ₹ 2,50,00,000

    As a high-achieving Data Professional, you will utilize cutting-edge GCP managed services to drive business growth. This role involves leveraging expertise in Bigdata, Spark, PySpark & GCP Pub/Sub to design and implement scalable data solutions. Key Responsibilities: Expertise in GCP Managed Services: Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery,...


  • Chennai, India Capgemini Full time

    Build a data lake using big data skills and use the data for enabling analytics and building predictive models. - Exposure to Hadoop tools (Hive, HBase, HDFS, etc.), Shell Scripting/Linux commands - Nice to have: CI/CD (like Jenkins, Rundeck, etc.) - Experience with Hadoop Ecosystem architecture and components - Strong technical and architectural knowledge on...



  • Chennai, Gurugram, Pune, India Xoriant Full time US$ 1,20,000 - US$ 2,00,000 per year

    Bigdata Developer. Domain Skills: 1. KYC / AML expertise. Technical Skills: Data Pipeline Development (scalable ETL/ELT pipelines), Big Data Infrastructure (Spark, Flink, Hadoop, Kafka), Programming (Python or Scala), Data Processing Frameworks (Spark, Hadoop, or Flink, Iceberg), Cloud Data Platforms (AWS, Azure, or GCP), SQL and database technologies (e.g., Oracle,...


  • Bengaluru, Karnataka, India beBeeDataEngineer Full time ₹ 15,00,000 - ₹ 20,20,000

    Senior Data Engineer. The ideal candidate will possess a strong background in data engineering, with expertise in Bigdata technologies such as Hadoop, Sqoop, Hive, and Spark. A key responsibility of the successful candidate will be to design, develop, and deploy scalable data processing systems utilizing GCP managed services like Dataproc, Dataflow, Pub/Sub,...

  • Bigdata Administrator

    24 hours ago


    Hyderabad, Telangana, India TalentNest Solutions Full time

    **Key skills**: - Deep understanding of Linux, networking and security fundamentals. - Experience working with AWS cloud platform and infrastructure. - Experience working with infrastructure as code with Terraform or Ansible tools. - Experience managing large BigData clusters in production (at least one of -- Cloudera, Hortonworks, EMR). - Excellent...

  • Bigdata Engineer

    3 days ago


    Bengaluru, Chennai, Gurugram, India Impetus Technologies Full time US$ 1,50,000 - US$ 2,00,000 per year

    Skills: Bigdata, PySpark, Hive, Spark Optimization. Good to have: GCP.