Consultant – Cloud Data Engineer – GCP

11 hours ago


Gurgaon, Haryana, India · Sirius AI · Full time · ₹8,00,000 - ₹25,00,000 per year

Key Responsibilities:

  • Design, develop, and manage scalable, secure, and efficient data pipelines on Google Cloud Platform (GCP) to process and transform large datasets from multiple sources.
  • Implement and optimize GCP data services such as BigQuery, Cloud Storage, Cloud SQL, Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
  • Architect and maintain ETL/ELT processes to ensure efficient data ingestion, transformation, and storage using Cloud Data Fusion, Dataflow (Apache Beam), or Cloud Composer (Apache Airflow).
  • Architect and implement data models on GCP to support efficient data storage and retrieval in BigQuery, Cloud Spanner, or Firestore.
  • Collaborate with data architects to design and implement data lakes, data warehouses, and data marts on GCP.
  • Build and maintain data integration workflows using tools like Cloud Composer (Apache Airflow), Cloud Functions, or Dataflow.
  • Utilize GCP DevOps tools for source control, build automation, release management, and infrastructure as code (IaC) using Terraform or Deployment Manager.
  • Implement CI/CD pipelines within Google Cloud Build to automate the build, test, and deployment of data pipelines and infrastructure changes, ensuring rapid and reliable delivery of data solutions.
  • Utilize big data technologies such as Dataproc (Apache Spark, Hadoop, etc.) to handle large volumes of data and perform complex analytics tasks.
  • Implement real-time data processing solutions using streaming technologies like Pub/Sub and Dataflow (Apache Beam) to enable timely insights and actions.
  • Implement data governance policies and security controls within GCP environments to protect sensitive data and ensure compliance with regulatory requirements such as GDPR, HIPAA, or PCI DSS.
  • Optimize data pipelines and processing workflows for performance, scalability, and cost-effectiveness on GCP, leveraging BigQuery optimizations, autoscaling, and cost-management techniques.
  • Monitor, troubleshoot, and optimize the performance of data pipelines and cloud-based systems to ensure high availability, low latency, and scalability.
  • Work closely with cross-functional teams to understand business requirements and translate them into scalable GCP data solutions.
  • Ensure data security and compliance with industry best practices, including IAM roles, VPC configurations, encryption, and access controls.
  • Conduct regular code reviews, provide feedback, and ensure adherence to best practices and standards.
  • Stay up to date with the latest GCP services and cloud technologies to drive innovation within the team.
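To make the pipeline responsibilities above concrete, here is a minimal, hypothetical sketch of the kind of per-message transform step a Dataflow (Apache Beam) pipeline would apply between a Pub/Sub source and a BigQuery sink. The message schema and field names are invented for illustration; only the Python standard library is used so the transform logic is shown on its own.

```python
import json
from datetime import datetime, timezone

def to_bq_row(message: bytes) -> dict:
    """Parse one Pub/Sub-style JSON message into a flat dict shaped
    like a BigQuery table row (schema invented for illustration)."""
    event = json.loads(message.decode("utf-8"))
    return {
        "user_id": str(event["user_id"]),
        "amount": float(event.get("amount", 0.0)),
        # Normalize the event time to UTC ISO-8601 for a TIMESTAMP column.
        "event_ts": datetime.fromtimestamp(
            event["ts"], tz=timezone.utc
        ).isoformat(),
    }

# In an actual Beam pipeline this function would run inside a
# beam.Map(to_bq_row) step; here it is called directly.
row = to_bq_row(b'{"user_id": 42, "amount": "19.5", "ts": 0}')
```

Keeping the transform a pure function like this makes it unit-testable outside the pipeline, which matters for the code-review and CI/CD practices listed above.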

Job Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 2-3 years of experience as a Cloud Data Engineer or similar role, with a strong focus on GCP cloud services.
  • Proficiency in designing and building scalable data architectures using GCP data services such as BigQuery, Cloud Storage, Dataflow, Cloud SQL, and Dataproc.
  • Strong experience with ETL/ELT frameworks and tools like Cloud Composer (Apache Airflow), Cloud Data Fusion, or Dataflow (Apache Beam).
  • Expertise in SQL, Python, or other programming languages used in data engineering.
  • Hands-on experience with data lakes, data warehouses, and data pipeline orchestration on GCP.
  • Familiarity with CI/CD pipelines and infrastructure-as-code (IaC) tools like Terraform, Deployment Manager, or Cloud Build.
  • Understanding of data governance, security, and compliance standards, including encryption, IAM policies, and role-based access control (RBAC).
  • Experience in data modeling, data normalization, and performance optimization for cloud-based data solutions.
  • GCP certifications such as Google Cloud Professional Data Engineer or Google Cloud Professional Cloud Architect.
  • Experience with Apache Spark or Databricks on GCP.
  • Knowledge of machine learning workflows and working with data science teams is a plus.
  • Familiarity with DevOps practices and tools such as Docker and Kubernetes.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
  • Strong problem-solving and troubleshooting skills, with a proactive approach to identifying and resolving issues.
  • Ability to adapt to a fast-paced, agile environment and manage multiple priorities effectively.
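The ETL/ELT and data-modeling skills listed above often come down to writing idempotent BigQuery SQL. As a hedged sketch (table and column names are invented), a small Python helper can generate a MERGE statement that upserts a staging table into a target table, so that retried Airflow tasks or backfills do not create duplicate rows:

```python
def build_merge_sql(target: str, staging: str, key: str, cols: list[str]) -> str:
    """Generate an idempotent BigQuery MERGE (upsert) statement.

    Re-running the load updates existing rows instead of duplicating
    them, keeping retries and backfills safe.
    """
    set_clause = ", ".join(f"{c} = S.{c}" for c in cols)
    col_list = ", ".join([key] + cols)
    val_list = ", ".join(f"S.{c}" for c in [key] + cols)
    return (
        f"MERGE `{target}` T\n"
        f"USING `{staging}` S\n"
        f"ON T.{key} = S.{key}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )

# Hypothetical project/dataset/table names for illustration only.
sql = build_merge_sql(
    "proj.dwh.dim_customer", "proj.stg.customer", "customer_id", ["name", "city"]
)
```

In practice the generated statement would be executed via the BigQuery client or a Composer task; generating it from one helper keeps the upsert pattern consistent across tables.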
