GCP Data Engineer
18 hours ago
Job Title: GCP Data Engineer
Location: Gurgaon
Experience: 3-8 years

About the Role:
- Design, build, and maintain large-scale data pipelines on BigQuery and other Google Cloud Platform (GCP) services.
- Use Python and PySpark/Spark to transform, clean, aggregate, and prepare data for analytics/ML.
- Orchestrate workflows using Cloud Composer (Airflow) to schedule, monitor, and operationalise jobs.
- Optimize query performance, partitioning, clustering, and cost in BigQuery.
- Work with structured, semi-structured, and unstructured data, integrating multiple data sources.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into data solutions.
- Implement data governance, quality checks, pipeline monitoring, version control, and CI/CD practices.

Required Skills / Qualifications:
- Strong hands-on experience with GCP services: BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, and Cloud Composer, e.g. designing pipelines, data ingestion, and transformations. (Several roles explicitly list BigQuery + Composer + Dataproc or Dataflow.)
- Proficiency in Python (scripting, ETL, automation) and PySpark (or Spark) for large-scale data processing.
- Excellent SQL and BigQuery SQL skills, including query optimization and partitioning/clustering design.
- Experience with workflow orchestration tools: Cloud Composer (Airflow) or equivalent scheduling tools.
- Experience building and managing ELT/ETL/data-warehouse solutions at scale (data modelling, schemas, star/snowflake, analytics).
- Good understanding of cloud-native architecture, cost optimisation, data security, monitoring, and ideally DevOps/CI/CD.
- (Preferred) Certifications such as Google Cloud Professional Data Engineer, or hands-on large-scale projects in GCP.
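The "partitioning/clustering design" this posting asks for can be sketched in a few lines. Below is a minimal, hypothetical illustration of generating a BigQuery DDL statement that partitions by a DATE column and clusters by a key to cut scan costs; the table and column names are assumptions for illustration, not from the posting.

```python
# Hypothetical sketch: building a BigQuery CREATE TABLE statement with
# time partitioning and clustering. Table/column names are illustrative.

def build_partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Return a CREATE TABLE statement that partitions by a DATE column
    and clusters by the given columns."""
    col_defs = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n  {col_defs}\n)\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = build_partitioned_table_ddl(
    "analytics.events",                     # assumed dataset.table
    [("event_date", "DATE"),
     ("user_id", "STRING"),
     ("payload", "JSON")],
    "event_date",                           # partition column
    ["user_id"],                            # cluster column(s)
)
print(ddl)
```

Queries that filter on `event_date` then only scan the matching partitions, which is the main cost lever the posting's "cost in BigQuery" bullet refers to.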
-
GCP Data Engineer
1 day ago
Gurgaon, India | Impetus | Full time
Location: Gurgaon and Bangalore
Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities

Job Description for Big Data or Cloud Engineer:
We are looking for candidates with hands-on experience in PySpark with GCP cloud.

Qualifications:
- 3-10 years of IT experience preferred.
- Able to effectively use GCP...
-
GCP Data Engineer
2 weeks ago
Gurgaon, Haryana, India | Impetus | Full time

Qualifications:
- 3-11 years of IT experience preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have: knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions.
- Strong experience in Big Data technologies: Hadoop, Sqoop, Hive, and Spark...
-
Lead GCP Data Engineer
18 hours ago
Gurgaon, India | Impetus | Full time
Job Title: Lead Data Engineer – GCP (BigQuery • Composer • Python • PySpark)
Location: Gurgaon
Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities

About the Role:
You will lead the design, build, and operation of large-scale data platforms on the Google Cloud Platform. You will manage a team of data...
-
Lead GCP Data Engineer
7 days ago
Gurgaon, Haryana, India | Impetus | Full time | ₹ 8,00,000 - ₹ 24,00,000 per year
Job Title: Lead Data Engineer – GCP (BigQuery • Composer • Python • PySpark)
Location: Gurgaon
Experience: 8+ years (data engineering / analytics engineering), with previous lead responsibilities

About the Role:
You will lead the design, build, and operation of large-scale data platforms on the Google Cloud Platform. You will manage a team of data engineers,...
-
GCP Big Data Engineer
4 days ago
Gurgaon, India | Acme Services Private Limited | Full time

Job Description - Required Technical Skills:
- Design and Development: Design and implement robust, scalable, and cost-efficient data pipelines (ETL/ELT) using GCP services.
- BigQuery Expertise: Develop, optimize, and maintain complex SQL queries and data models in Google BigQuery for large-scale analytical workloads.
- Data Architecture: Manage and optimize data...
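As a hedged illustration of the transform step an ETL/ELT pipeline like the one described here typically performs before loading into BigQuery, the sketch below cleans and aggregates a batch of records; the field names and rules are assumptions for illustration, not the employer's actual pipeline.

```python
# Illustrative sketch: a minimal clean-and-aggregate transform of the kind
# an ETL/ELT pipeline runs before loading. Field names are assumptions.
from collections import defaultdict

def clean_and_aggregate(rows):
    """Drop rows with a missing user_id, then sum revenue per user."""
    totals = defaultdict(float)
    for row in rows:
        user = row.get("user_id")
        if user is None:
            continue  # data-quality rule: skip unattributable rows
        totals[user] += float(row.get("revenue", 0))
    return dict(totals)

raw = [
    {"user_id": "u1", "revenue": 10.0},
    {"user_id": None, "revenue": 5.0},   # dropped by the quality rule
    {"user_id": "u1", "revenue": 2.5},
]
print(clean_and_aggregate(raw))  # → {'u1': 12.5}
```

In practice the same shape of logic would run inside PySpark or Dataflow rather than plain Python, but the transform itself is identical.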
-
Senior Data Engineer
1 day ago
Gurgaon, India | Pacific Data Integrators | Full time
Role: Senior Data Engineer
Location: Remote
Job Type: Full-time
Shift time: Open to work in EST shift (5 PM to 2 AM IST)

Key Responsibilities:
- Lead the design, development, and implementation of complex data integration solutions using Informatica Intelligent Data Management Cloud (IDMC).
- Develop, document, unit test, and maintain high-quality ETL applications...
-
Senior Data Engineer
6 days ago
Gurgaon, Haryana, India | Pacific Data Integrators | Full time | ₹ 12,00,000 - ₹ 36,00,000 per year
Role: Senior Data Engineer
Location: Remote
Job Type: Full-time
Shift time: Open to work in EST shift (5 PM to 2 AM IST)

Key Responsibilities:
- Lead the design, development, and implementation of complex data integration solutions using Informatica Intelligent Data Management Cloud (IDMC).
- Develop, document, unit test, and maintain high-quality ETL applications that...
-
Data Engineer, GCP
3 weeks ago
Gurgaon, Haryana, India | Crescendo Global | Full time
Data Engineer (GCP, SQL, Python) - 5 Years - Gurgaon

Summary: We are looking for highly skilled Data Engineers to design, build, and optimize scalable data pipelines on Google Cloud Platform (GCP). The role requires deep expertise in BigQuery, SQL, Python, and GCP-native tools, ensuring seamless high-volume ingestion, strong data quality checks, and efficient...
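The "strong data quality checks" this posting mentions might look like the following minimal sketch: validating ingested records against an expected schema before loading. The schema, column names, and rules are illustrative assumptions, not taken from the posting.

```python
# Hypothetical sketch of pre-load data quality checks. The schema and
# required-column rules below are assumptions for illustration only.

EXPECTED_SCHEMA = {"event_id": str, "event_date": str, "amount": float}
REQUIRED = {"event_id", "event_date"}

def validate_record(record):
    """Return a list of quality violations for one record (empty = clean)."""
    errors = []
    for col in REQUIRED:
        if record.get(col) in (None, ""):
            errors.append(f"missing required column: {col}")
    for col, expected_type in EXPECTED_SCHEMA.items():
        value = record.get(col)
        if value is not None and not isinstance(value, expected_type):
            errors.append(f"{col}: expected {expected_type.__name__}")
    return errors

good = {"event_id": "e1", "event_date": "2024-01-01", "amount": 9.99}
bad = {"event_id": "", "event_date": "2024-01-01", "amount": "9.99"}
print(validate_record(good))  # → []
print(validate_record(bad))   # two violations: empty id, wrong amount type
```

In a real pipeline, clean records would continue to the load step while violations are routed to a dead-letter table for inspection.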