GCP Data Pipeline Engineer
3 days ago
About the Client
Our client is a French multinational information technology (IT) services and consulting company headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organisations accelerate their transition to a digital and sustainable world. Its services include consulting, technology, professional, and outsourcing services.

Job Details
• Location: Bangalore
• Mode of Work: Hybrid
• Notice Period: Immediate joiners
• Experience: 6-8 years
• Type of Hire: Contract to hire

JOB DESCRIPTION

About the Role
We are looking for a highly skilled Data Pipeline Engineer with strong experience in Google Cloud Platform (GCP) to design, develop, and optimize data integration pipelines that move and transform data, specifically from BigQuery to AlloyDB. The ideal candidate will have a deep understanding of data orchestration, automation, and ETL/ELT frameworks on GCP, and will be comfortable working in a dynamic environment where data models, performance requirements, and architecture evolve over time. The candidate should be able to understand the solution design and adapt it as requirements change. This role is hands-on and cross-functional, working closely with solution architects, data analysts, and application teams to ensure seamless, secure, and efficient data flow across GCP components.

Key Responsibilities
• Design and Develop Data Pipelines
  o Design and build robust, scalable, parameterized data pipelines to move data from BigQuery → Cloud Storage → AlloyDB.
  o Leverage Cloud Composer (Airflow), Cloud Functions, Eventarc, and Pub/Sub to orchestrate and automate data movement.
  o Implement a control schema to handle incremental and delta loads.
• Data Orchestration and Scheduling
  o Create and manage DAGs in Cloud Composer 2 to schedule and monitor data workflows.
  o Develop asynchronous or parallel execution strategies to optimize pipeline performance under GCP constraints (e.g., AlloyDB's single COPY process per import).
• Architecture & Design Enhancement
  o Collaborate with solution architects to review and refine pipeline architecture.
  o Update designs and refactor code in response to evolving data requirements, schema changes, or performance improvements.
  o Ensure pipelines align with GCP best practices, security, and cost-optimization guidelines.
• Performance Optimization
  o Tune BigQuery queries and AlloyDB import strategies for large (terabyte-scale) datasets.
  o Implement partitioning, batching, and retry mechanisms for high throughput and reliability.
• Monitoring & Logging
  o Implement detailed logging, alerting, and monitoring using Cloud Logging and Cloud Monitoring (formerly Stackdriver).
  o Set up job-level and table-level audit trails for pipeline observability and troubleshooting.
• Security & Compliance
  o Use service accounts, VPC Service Controls, IAM roles, and CMEK encryption to ensure data security and governance compliance.
  o Adhere to enterprise security policies and guardrails for data movement across GCP projects.

Required Skills and Experience
• 5+ years of experience in data engineering or cloud data integration roles.
• Strong expertise in Google Cloud Platform (GCP), including:
  o BigQuery (SQL, views, partitioning)
  o AlloyDB / PostgreSQL (import/export, COPY operations)
  o Cloud Storage (buckets, lifecycle policies)
  o Cloud Composer (Airflow)
  o Cloud Functions, Pub/Sub, and Eventarc
• Proven experience building ETL/ELT data pipelines and automated workflows using GCP-native tools or Python-based orchestration.
• Proficiency in Python (Airflow operators, GCP SDK, REST API integration).
• Strong SQL and database schema design skills.
• Familiarity with asynchronous processing, retry handling, and GCP APIs for data import/export.
• Understanding of data quality, lineage, and audit frameworks.

Soft Skills
• Excellent analytical and problem-solving abilities.
• Strong documentation and communication skills.
• Ability to adapt to changing requirements and propose alternative designs.
• Collaborative mindset with cross-functional teams (architecture, security, operations).

Sample Deliverables
• End-to-end data movement pipeline from BigQuery to AlloyDB.
• Configurable Cloud Composer DAG to orchestrate import jobs per table.
• Python scripts or Cloud Functions using the GCS → AlloyDB import APIs.
• Design documents and sequence diagrams covering data flow, control tables, and failure handling.
• Monitoring dashboards and job logs in the GCP console.
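The "control tables" deliverable above is what typically drives the incremental and delta loads named in the responsibilities. As a minimal, GCP-free sketch: a watermark stored per table in a control record can parameterize the BigQuery extract query. The `ControlRecord` shape and all names here are illustrative assumptions, not the client's actual schema:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ControlRecord:
    """One row of a hypothetical control table tracking load progress."""
    table_name: str
    watermark_column: str
    last_loaded_at: Optional[datetime]  # None means no prior load (full load)

def build_extract_query(rec: ControlRecord, project: str, dataset: str) -> str:
    """Build a full-load or delta-load BigQuery extract query from the watermark."""
    base = f"SELECT * FROM `{project}.{dataset}.{rec.table_name}`"
    if rec.last_loaded_at is None:
        return base  # first run: export everything
    ts = rec.last_loaded_at.strftime("%Y-%m-%d %H:%M:%S")
    return f"{base} WHERE {rec.watermark_column} > TIMESTAMP '{ts}'"
```

After each successful import into AlloyDB, the pipeline would advance `last_loaded_at` in the control table so the next run only picks up new or changed rows.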
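Likewise, the "retry handling" and "failure handling" items usually come down to a small wrapper around each import step. A hedged sketch, assuming the actual AlloyDB import call is passed in as `task` (the sleep function is injectable so the backoff can be tested without real waiting):

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run a callable, retrying on failure with exponential backoff.

    `task` stands in for one unit of pipeline work, e.g. a single
    GCS -> AlloyDB import call. Delays double on each failed attempt:
    base_delay, 2*base_delay, 4*base_delay, ...
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted: surface the error to the orchestrator
            sleep(base_delay * 2 ** (attempt - 1))
```

In a Composer DAG this logic is normally delegated to Airflow's own task `retries` / `retry_exponential_backoff` settings; a hand-rolled wrapper like this is only needed inside a single task, for example around each file in a batched import.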