Current jobs related to Lead GCP Data Engineer - Pune - EXL

  • Data Engineer

    4 months ago


    Pune, India Intriant HR Services Full time

Job Description: We are seeking a skilled and experienced GCP (Google Cloud Platform) Data Engineering Specialist to join our team. The ideal candidate should have 3 years of relevant experience with expertise in BigQuery, Dataflow, Spark, and Pub/Sub. As a Data Engineering Specialist you will be responsible for designing, developing, and maintaining data pipelines, data...


  • Pune, India Virtusa Full time

GDT WPB DF GCP Lead Data Engineer - CREQ193216. Mandatory skills: Technical Lead with Cloud GCP experience, 8 to 12 years. Experience working within an agile, multidisciplinary DevOps team. Hadoop knowledge, NiFi/Kafka experience. Expert in Python, Dataflow, Pub/Sub, BigQuery. Expert in SQL. Must have good...

  • Java with GCP

    1 week ago


    Pune, India NTT DATA Services Full time

Req ID: 292565 NTT DATA Services strives to hire exceptional, innovative, and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java with GCP engineer to join our team in Pune, Maharashtra (IN-MH), India (IN). Experience: 3 to 5 years. You are a...

  • GCP Data Engineer

    4 months ago


    Pune, India CloudHire Full time

CloudHire, a remote employee provider that sources globally, is looking for a skilled GCP Data Engineer to join its team in the Information Technology and Services industry. As a GCP Data Engineer, you will be responsible for designing, developing, and implementing data architectures on Google Cloud Platform (GCP). CloudHire offers vetted, time-flexible,...

  • GCP Data Engineer

    3 weeks ago


Pune, India Krishna Global Services Pvt. Ltd. Full time

Position: GCP Data Engineer. Location: Pune, India. Experience: 4-10 years. What You’ll Do: Design, develop, and maintain robust data pipelines using GCP services like Dataflow, Pub/Sub, Cloud Functions, and Cloud Composer. Implement ETL processes to ingest data from various sources into GCP data warehouses such as BigQuery. Ensure data quality, reliability, and...

  • GCP Data Engineer

    2 weeks ago


Pune, India RiDiK (a Subsidiary of CLPS. Nasdaq: CLPS) Full time

Hello, greetings from RiDiK. We have an opening for the position of GCP Data Engineer for our client. Location: Pune, Maharashtra, India. Experience: 5+ years of experience in data engineering, with a focus on cloud-based solutions. Extensive experience with Google Cloud Platform (GCP) and its data services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and...

  • GCP Data Engineer

    2 weeks ago


    Pune, India Atyeti Inc Full time

Role: Data Engineer. Exp: 5+ years. Locations: Pune, Mumbai, Hyderabad, Chennai, and Bangalore. Job Description: Required - Python, GCP cloud experience (GCP, BigQuery, Airflow, Dataproc or similar services). Desired: FastAPI, Python UI framework packages, Apigee (as an app developer), Airflow (as an app developer), Cloud Run, Glue, DBT. Prior experience...


  • Pune, India Virtusa Full time

Technical Lead with Cloud GCP - CREQ193209. Mandatory skills: Technical Lead with Cloud GCP experience, 8 to 12 years. Experience working within an agile, multidisciplinary DevOps team. Hadoop knowledge, NiFi/Kafka experience. Expert in Python, Dataflow, Pub/Sub, BigQuery. Expert in SQL. Must have good experience/knowledge of GCP components like GCS,...


Lead GCP Data Engineer

4 months ago


Pune, India EXL Full time
As a Lead GCP Data Engineer, you will be responsible for data migration, transformation, and modernization on Google Cloud Platform (GCP), integrating native GCP services and other third-party data solutions. You will collaborate with cloud partners and our cross-functional teams to enable sector-specific use cases. Your responsibilities will involve technology adoption across AI/ML, Generative AI, advanced analytics, and serverless, service-driven ecosystems for data-led, AI-enabled digital transformations.
Expert proficiency in operationalizing data pipelines, warehouses, data lakes, analytics platforms, and activation services on GCP is crucial. We are looking for professionals with solid experience in the design, delivery, and implementation of GCP infrastructure.
Work Location – Pune (Hybrid)
Experience - 8 to 10 Years
Must have skills:
8+ years of proven experience as a Lead in Big Data Engineering.
Mandatory technical proficiency - hands-on experience with Python, PySpark, the Google Cloud full stack, SQL, DevOps, Databricks, DBT, and Terraform.
Experience with Google Cloud services such as streaming and batch processing, Cloud Storage, Cloud Dataflow, Dataproc, BigQuery, and Bigtable.
Proven real-time exposure to and use of contemporary data mining, cloud computing, and data management ecosystems such as Google Cloud, Hadoop, HDFS, and Spark.
Proficient in data modelling that can represent complex data structures while ensuring accuracy, consistency, and efficiency; data warehousing; and ETL processes.
Ability to perform system analysis and assessment of existing systems design and operating methodologies, leveraging in-depth knowledge of big data technologies and ecosystems.
Experience with GitHub, Anaplan, Looker, and Power BI.
Excellent problem-solving skills and the ability to address complex technical challenges.
Strong communication and leadership skills.
Good to have:
Experience with Apigee and Apollo GraphQL.
Experience with quality assurance, Agile, documentation, and presentation.
Experience with serverless data warehousing concepts.
Knowledge of Snowflake, machine learning (ML), Gen AI (LLMs), and marketing activation.
Role & Responsibilities:
Accountable for overall technical implementation and management of the respective engineering POD.
Liaise with Foundry on solution design, the Agile Project Manager on planning and estimation, engineers on task assignment, and business stakeholders on prioritization.
Streamline workflows and orchestrate data pipelines.
Automate deployments and facilitate MTQ/MTP through CI/CD and DevOps.
Conduct code reviews, automate regression tests, and facilitate data reloads.
Participate in requirements gathering and architectural discussions.
Responsible for the overall quality of engineering builds and architecture.
Technical Leadership: Lead the creation of technical designs/specifications and provide technical leadership and guidance to development teams, promoting best practices. Provide expertise in Master Data Management, Reference Data Management, Data Quality, Metadata Management, and Data Governance in general.
Hands-on Programming: Develop and maintain software components using Python, PySpark, and GCP services to process and analyze large datasets efficiently. Build data pipelines and perform data transformations.
Enforce consistent standards and quality through end-to-end review of all artifacts (incl. BRD, HLD, LLD, data models, STTMs, UI/UX, NFRs, code, tests, PBI architecture design).
POC and pilot new reusable frameworks, tools, and technologies.
Evaluate the newest technologies for optimization opportunities and future enhancement needs, such as self-serve and ad hoc reporting. Implement the necessary infrastructure for optimal and efficient ETL from disparate data sources.
Performance Optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.
Security and Compliance: Ensure that data solutions adhere to security and compliance standards, implementing necessary controls and encryption mechanisms.
Scalability: Architect scalable and highly available data solutions, considering both batch and real-time processing.
Documentation: Create and maintain comprehensive technical documentation to support the development and maintenance of data solutions.