GCP Data Pipeline Engineer

3 days ago


New Delhi, India People Prime Worldwide Full time

About the Client

Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organisations accelerate their transition to a digital and sustainable world. It provides a variety of services, including consulting, technology, professional, and outsourcing services.

Job Details

• Location: Bangalore
• Mode of Work: Hybrid
• Notice Period: Immediate joiners
• Experience: 6–8 years
• Type of Hire: Contract to hire

About the Role

We are looking for a highly skilled Data Pipeline Engineer with strong experience in Google Cloud Platform (GCP) to design, develop, and optimize data integration pipelines that move and transform data, specifically from BigQuery to AlloyDB. The ideal candidate will have a deep understanding of data orchestration, automation, and ETL/ELT frameworks on GCP, and will be comfortable working in a dynamic environment where data models, performance requirements, and architecture evolve over time. The candidate should be able to understand the solution design and change it as requirements evolve. This role is hands-on and cross-functional, working closely with solution architects, data analysts, and application teams to ensure seamless, secure, and efficient data flow across GCP components.

Key Responsibilities

• Design and Develop Data Pipelines
  o Design and build robust, scalable, parameterized data pipelines that move data from BigQuery → Cloud Storage → AlloyDB.
  o Leverage Cloud Composer (Airflow), Cloud Functions, Eventarc, and Pub/Sub to orchestrate and automate data movement.
  o Implement a control schema to handle incremental and delta loads.
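The "control schema for incremental and delta loads" mentioned above typically means a small metadata table that records a high-water mark per source table, so each run extracts only rows changed since the last successful load. Below is a minimal stand-alone sketch of that pattern; all names (`pipeline_control`, `delta_window`, `commit_watermark`) are hypothetical, and sqlite3 stands in for the control table that would normally live in AlloyDB or BigQuery.

```python
# Sketch of a control-schema pattern for incremental (delta) loads.
# Hypothetical names throughout; sqlite3 stands in for the control
# table that would normally live in AlloyDB or BigQuery.
import sqlite3
from datetime import datetime, timezone

DDL = """
CREATE TABLE IF NOT EXISTS pipeline_control (
    table_name  TEXT PRIMARY KEY,
    last_loaded TEXT NOT NULL  -- high-water mark (ISO-8601 timestamp)
)
"""

def delta_window(conn, table_name, now=None):
    """Return (lower, upper) bounds for the next incremental extract."""
    now = now or datetime.now(timezone.utc).isoformat()
    row = conn.execute(
        "SELECT last_loaded FROM pipeline_control WHERE table_name = ?",
        (table_name,),
    ).fetchone()
    # First run for a table: fall back to a full load from the epoch.
    lower = row[0] if row else "1970-01-01T00:00:00+00:00"
    return lower, now

def commit_watermark(conn, table_name, upper):
    """Advance the high-water mark only after the load succeeds."""
    conn.execute(
        "INSERT INTO pipeline_control (table_name, last_loaded) VALUES (?, ?) "
        "ON CONFLICT(table_name) DO UPDATE SET last_loaded = excluded.last_loaded",
        (table_name, upper),
    )
    conn.commit()
```

In use, the BigQuery extract query would filter on the window (e.g. `WHERE updated_at > @lower AND updated_at <= @upper`), and the watermark is committed only after the AlloyDB import succeeds, so a failed run is safely retried over the same window.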
• Data Orchestration and Scheduling
  o Create and manage DAGs in Cloud Composer 2 to schedule and monitor data workflows.
  o Develop asynchronous or parallel execution strategies to optimize pipeline performance under GCP constraints (e.g., AlloyDB's single COPY process per import).
• Architecture & Design Enhancement
  o Collaborate with solution architects to review and refine pipeline architecture.
  o Update designs and refactor code in response to evolving data requirements, schema changes, or performance needs.
  o Ensure pipelines align with GCP best practices, security, and cost-optimization guidelines.
• Performance Optimization
  o Tune BigQuery queries and AlloyDB import strategies for large datasets (terabytes of data).
  o Implement partitioning, batching, and retry mechanisms for high throughput and reliability.
• Monitoring & Logging
  o Implement detailed logging, alerting, and monitoring using Cloud Logging and Cloud Monitoring (formerly Stackdriver).
  o Set up job-level and table-level audit trails for pipeline observability and troubleshooting.
• Security & Compliance
  o Use service accounts, VPC Service Controls, IAM roles, and CMEK encryption to ensure data security and governance compliance.
  o Adhere to enterprise security policies and guardrails for data movement across GCP projects.

Required Skills and Experience

• 5+ years of experience in data engineering or cloud data integration roles.
• Strong expertise in Google Cloud Platform (GCP), including:
  o BigQuery (SQL, views, partitioning)
  o AlloyDB / PostgreSQL (import/export, COPY operations)
  o Cloud Storage (buckets, lifecycle policies)
  o Cloud Composer (Airflow)
  o Cloud Functions, Pub/Sub, and Eventarc
• Proven experience building ETL/ELT data pipelines and automated workflows using GCP-native tools or Python-based orchestration.
• Proficiency in Python (Airflow operators, GCP SDK, REST API integration).
• Strong SQL and database schema design skills.
• Familiarity with asynchronous processing, retry handling, and GCP APIs for data import/export.
• Understanding of data quality, lineage, and audit frameworks.

Soft Skills

• Excellent analytical and problem-solving abilities.
• Strong documentation and communication skills.
• Ability to adapt to changing requirements and propose alternative designs.
• Collaborative mindset across cross-functional teams (architecture, security, operations).

Sample Deliverables

• End-to-end data movement pipeline from BigQuery to AlloyDB.
• Configurable Cloud Composer DAG to orchestrate import jobs per table.
• Python scripts or Cloud Functions using the GCS → AlloyDB import APIs.
• Design documents and sequence diagrams reflecting data flow, control tables, and failure handling.
• Monitoring dashboards and job logs in the GCP console.
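The "retry handling" and single-COPY points above combine naturally: tables can be prepared in parallel, but the COPY into AlloyDB is serialized and each import retried with exponential backoff. A minimal stand-alone sketch of that shape, with `run_import` as a hypothetical stand-in for the real GCS → AlloyDB import call:

```python
# Sketch: per-table import with bounded retries and a lock that
# serializes the COPY phase (AlloyDB runs one COPY process at a time).
# run_import is a hypothetical stand-in for a real GCS -> AlloyDB call.
import threading
import time

copy_lock = threading.Lock()  # only one COPY into AlloyDB at a time

def import_table(table, run_import, retries=3, base_delay=0.01):
    """Run one table import, retrying transient failures with backoff."""
    for attempt in range(retries):
        try:
            with copy_lock:  # serialize only the COPY phase
                return run_import(table)
        except RuntimeError:  # stand-in for a transient import error
            if attempt == retries - 1:
                raise  # out of retries: surface the failure to the DAG
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
```

In an Airflow DAG, `import_table` would be the body of a per-table task, with the extract-to-GCS step running concurrently across tables and only the COPY step funneled through the lock.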


