GCP Data Engineer

18 hours ago


Gurgaon district, India Impetus Full time

Job Title: GCP Data Engineer
Location: Gurgaon
Experience: 3-8 years

About the Role:
- Design, build, and maintain large-scale data pipelines on BigQuery and other Google Cloud Platform (GCP) services.
- Use Python and PySpark/Spark to transform, clean, aggregate, and prepare data for analytics/ML.
- Orchestrate workflows using Cloud Composer (Airflow) to schedule, monitor, and operationalise jobs.
- Optimize query performance, partitioning, clustering, and cost in BigQuery.
- Work with structured, semi-structured, and unstructured data, integrating multiple data sources.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into data solutions.
- Implement data governance, quality checks, pipeline monitoring, version control, and CI/CD practices.

Required Skills / Qualifications:
- Strong hands-on experience with GCP services (BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, Cloud Composer), for example designing pipelines, data ingestion, and transformations.
- Proficiency in Python (scripting, ETL, automation) and PySpark (or Spark) for large-scale data processing.
- Excellent SQL and BigQuery SQL skills, including query optimization and partitioning/clustering design.
- Experience with workflow orchestration tools: Cloud Composer (Airflow) or equivalent scheduling tools.
- Experience building and managing ELT/ETL/data-warehouse solutions at scale (data modelling, schemas, star/snowflake, analytics).
- Good understanding of cloud-native architecture, cost optimisation, data security, monitoring, and ideally DevOps/CI/CD.
- (Preferred) Certifications such as Google Cloud Professional Data Engineer, or hands-on large-scale projects on GCP.
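To make two of the responsibilities above concrete (partitioning/clustering design and row-level quality checks), here is a minimal, dependency-free sketch. The table name, columns, and expiration are hypothetical; in a real pipeline the DDL would be submitted through the BigQuery client library and the check wired into a Cloud Composer (Airflow) task.

```python
from datetime import datetime, timezone

# Hypothetical DDL illustrating BigQuery partitioning/clustering:
# partitioning on DATE(event_ts) limits scanned bytes (and so query cost),
# clustering on user_id/event_name co-locates rows that common filters hit.
# Table and column names are illustrative only.
EVENTS_DDL = """\
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts   TIMESTAMP,
  user_id    STRING,
  event_name STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id, event_name
OPTIONS (partition_expiration_days = 90);
"""

def failed_rows(rows, required=("event_ts", "user_id")):
    """Minimal row-level quality check: return the indices of rows where
    any required field is missing or null."""
    return [i for i, row in enumerate(rows)
            if any(row.get(field) is None for field in required)]

events = [
    {"event_ts": datetime.now(timezone.utc), "user_id": "u1"},
    {"event_ts": None, "user_id": "u2"},       # fails: null timestamp
    {"event_ts": datetime.now(timezone.utc)},  # fails: missing user_id
]
print(failed_rows(events))  # → [1, 2]
```

In practice a check like this would run as a pre-load validation step, with failing rows routed to a dead-letter table rather than silently dropped.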


  • GCP Data Engineer

    1 day ago


    Gurgaon, India Impetus Full time

    Location: Gurgaon and Bangalore. Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities. Job description for Big Data or Cloud Engineer: we are looking for candidates with hands-on experience in PySpark on GCP. Qualifications: 3-10 years of IT experience is preferred. Able to effectively use GCP...

  • Data Engineer

    6 days ago


    Hyderabad district, India People Prime Worldwide Full time

    Job Title: Senior Data Engineer - GCP + Python. Location: Hyderabad. Years of Experience: 5+ years. About the Company: our client is a trusted global innovator of IT and business services, present in 50+ countries. They specialize in digital & IT modernization, consulting, managed services, and industry-specific solutions. With a commitment to long-term success,...

  • GCP Data Engineer

    2 weeks ago


    Gurgaon, Haryana, India Impetus Full time

    Qualifications: 3-11 years of IT experience is preferred. Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services). Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions. Strong experience in Big Data technologies: Hadoop, Sqoop, Hive and Spark...


  • Lead Data Engineer – GCP

    Bangalore district, India Impetus Full time

    Job Title: Lead Data Engineer – GCP (BigQuery • Composer • Python • PySpark). Location: Bengaluru. Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities. About the Role: you will lead the design, build and operation of large-scale data platforms on the Google Cloud Platform. You will manage a team of data...

  • Lead GCP Data Engineer

    18 hours ago


    Gurgaon, India Impetus Full time ₹ 8,00,000 - ₹ 24,00,000 per year

    Job Title: Lead Data Engineer – GCP (BigQuery • Composer • Python • PySpark). Location: Gurgaon. Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities. About the Role: you will lead the design, build and operation of large-scale data platforms on the Google Cloud Platform. You will manage a team of data...



  • Senior Data Engineer

    20 hours ago


    Gurgaon district, India Pacific Data Integrators Full time

    Role: Senior Data Engineer. Location: Remote. Job Type: Full-time. Shift: open to working the EST shift (5 PM to 2 AM IST). Key Responsibilities: lead the design, development, and implementation of complex data integration solutions using Informatica Intelligent Data Management Cloud (IDMC). Develop, document, unit test, and maintain high-quality ETL applications...

  • GCP Cloud Engineer

    7 days ago


    Bangalore district, India Impetus Full time

    Job description for Big Data or Cloud Engineer. Position Summary: we are looking for candidates with hands-on experience in Big Data on GCP. Qualifications: 4-7 years of IT experience is preferred. Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services). Good...