GCP Data Pipeline Engineer

7 days ago


Bangalore District, India · People Prime Worldwide · Full time

About Client
Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organisations accelerate their transition to a digital and sustainable world. They provide a variety of services, including consulting, technology, professional, and outsourcing services.

Job Details
  • Location: Bangalore
  • Mode of Work: Hybrid
  • Notice Period: Immediate joiners
  • Experience: 6-8 years
  • Type of Hire: Contract to Hire

Job Description

About the Role
We are looking for a highly skilled Data Pipeline Engineer with strong experience in Google Cloud Platform (GCP) to design, develop, and optimize data integration pipelines that move and transform data, specifically from BigQuery to AlloyDB. The ideal candidate will have a deep understanding of data orchestration, automation, and ETL/ELT frameworks on GCP, and will be comfortable working in a dynamic environment where data models, performance requirements, and architecture evolve over time. The candidate should be able to understand the solution design and adapt it as requirements change. This role is hands-on and cross-functional, working closely with solution architects, data analysts, and application teams to ensure seamless, secure, and efficient data flow across GCP components.

Key Responsibilities
  • Design and Develop Data Pipelines
    o Design and build robust, scalable, and parameterized data pipelines to move data from BigQuery → Cloud Storage → AlloyDB (a minimal orchestration sketch follows this list).
    o Leverage Cloud Composer (Airflow), Cloud Functions, EventArc, and Pub/Sub to orchestrate and automate data movement.
    o Implement a control schema to handle incremental and delta loads.
  • Data Orchestration and Scheduling
    o Create and manage DAGs in Cloud Composer 2 to schedule and monitor data workflows.
    o Develop asynchronous or parallel execution strategies to optimize pipeline performance under GCP constraints (e.g., AlloyDB's single COPY process).
  • Architecture & Design Enhancement
    o Collaborate with solution architects to review and refine pipeline architecture.
    o Make design updates and refactor code based on evolving data requirements, schema changes, or performance improvements.
    o Ensure pipelines align with GCP best practices, security, and cost-optimization guidelines.
  • Performance Optimization
    o Tune BigQuery queries and AlloyDB import strategies for large datasets (terabytes of data).
    o Implement partitioning, batching, and retry mechanisms for high throughput and reliability.
  • Monitoring & Logging
    o Implement detailed logging, alerting, and monitoring using Cloud Logging and Cloud Monitoring (formerly Stackdriver).
    o Set up job-level and table-level audit trails for pipeline observability and troubleshooting.
  • Security & Compliance
    o Use Service Accounts, VPC Service Controls, IAM roles, and CMEK encryption to ensure data security and governance compliance.
    o Adhere to enterprise security policies and guardrails for data movement across GCP projects.
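To make the BigQuery → Cloud Storage → AlloyDB flow concrete, here is a minimal Cloud Composer (Airflow 2) DAG sketch. It is not the client's actual pipeline: the project, dataset, bucket, and table names (analytics-prod, reporting, my-export-bucket, orders, customers) are placeholder assumptions, the AlloyDB load step is only a stub, and a production version would add the control-schema lookups, delta filtering, and security settings described above.

```python
"""Sketch of a per-table export DAG: BigQuery -> Cloud Storage -> AlloyDB.

All names (project, dataset, bucket, table list) are illustrative
placeholders, not values from the job description.
"""
import pendulum

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.bigquery_to_gcs import BigQueryToGCSOperator

# Hypothetical table list; in practice this would be read from a control table.
TABLES = ["orders", "customers"]

with DAG(
    dag_id="bq_to_alloydb_export",          # placeholder DAG name
    schedule_interval="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    default_args={"retries": 2},            # simple retry policy
) as dag:
    for table in TABLES:
        # 1) Export the BigQuery table (or a delta view of it) to sharded CSV in GCS.
        export = BigQueryToGCSOperator(
            task_id=f"export_{table}_to_gcs",
            source_project_dataset_table=f"analytics-prod.reporting.{table}",
            destination_cloud_storage_uris=[f"gs://my-export-bucket/exports/{table}/*.csv"],
            export_format="CSV",
            print_header=False,
        )

        # 2) Load the exported files into AlloyDB; stub only here (see the
        #    import sketch under "Sample Deliverables" for one implementation).
        load = PythonOperator(
            task_id=f"load_{table}_into_alloydb",
            python_callable=lambda t=table: print(f"import {t} into AlloyDB here"),
        )

        export >> load
```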
Required Skills and Experience
  • 5+ years of experience in data engineering or cloud data integration roles.
  • Strong expertise in Google Cloud Platform (GCP), including:
    o BigQuery (SQL, views, partitioning)
    o AlloyDB / PostgreSQL (import/export, COPY operations)
    o Cloud Storage (buckets, lifecycle policies)
    o Cloud Composer (Airflow)
    o Cloud Functions, Pub/Sub, and EventArc
  • Proven experience building ETL/ELT data pipelines and automated workflows using GCP-native tools or Python-based orchestration.
  • Proficiency in Python (Airflow operators, GCP SDK, REST API integration).
  • Strong SQL and database schema design skills.
  • Familiarity with asynchronous processing, retry handling, and GCP APIs for data import/export.
  • Understanding of data quality, lineage, and audit frameworks.

Soft Skills
  • Excellent analytical and problem-solving abilities.
  • Strong documentation and communication skills.
  • Ability to adapt to changing requirements and propose alternative designs.
  • Collaborative mindset across cross-functional teams (architecture, security, operations).

Sample Deliverables
  • End-to-end data movement pipeline from BigQuery to AlloyDB.
  • Configurable Cloud Composer DAG to orchestrate import jobs per table.
  • Python scripts or Cloud Functions using GCS → AlloyDB import APIs (a hedged import sketch follows this list).
  • Design documents and sequence diagrams reflecting data flow, control tables, and failure handling.
  • Monitoring dashboards and job logs in the GCP console.
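As one possible shape for the "Python scripts or Cloud Functions using GCS → AlloyDB import APIs" deliverable, the sketch below streams an exported CSV from Cloud Storage into an AlloyDB (PostgreSQL) table with COPY FROM STDIN via psycopg2. The bucket, object, table, and connection details are assumed placeholders; a real implementation would add the AlloyDB Auth Proxy or a language connector, secret management, and the control-table bookkeeping described earlier.

```python
"""Sketch: stream a CSV export from GCS into AlloyDB (PostgreSQL) via COPY.

Bucket, object, table, and connection details are illustrative placeholders.
"""
import io

import psycopg2
from google.cloud import storage


def import_gcs_csv_to_alloydb(
    bucket_name: str,
    object_name: str,
    target_table: str,
    dsn: str,
) -> int:
    """Download a CSV object and COPY it into the target table.

    Returns the row count reported by psycopg2 after the COPY.
    """
    # Download the exported CSV into memory (very large files would be
    # streamed in chunks or staged on local disk instead).
    blob = storage.Client().bucket(bucket_name).blob(object_name)
    data = io.BytesIO(blob.download_as_bytes())

    # COPY FROM STDIN is the fastest generic load path into PostgreSQL/AlloyDB.
    # target_table is assumed to come from a trusted control table, not user input.
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.copy_expert(
                f"COPY {target_table} FROM STDIN WITH (FORMAT csv)",
                data,
            )
            rowcount = cur.rowcount
        conn.commit()
    return rowcount


if __name__ == "__main__":
    # Example invocation with placeholder values.
    rows = import_gcs_csv_to_alloydb(
        bucket_name="my-export-bucket",
        object_name="exports/orders/000000000000.csv",
        target_table="reporting.orders",
        dsn="host=127.0.0.1 port=5432 dbname=appdb user=loader password=REDACTED",
    )
    print(f"Loaded {rows} rows")
```

One design note: because AlloyDB processes a single COPY per session, this function runs one COPY per connection; any parallelism would come from running multiple per-table tasks (as in the DAG sketch above) rather than from splitting a single table's load.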


