GCP Data Pipeline Engineer

3 days ago


Kozhikode, India People Prime Worldwide Full time

About Client:
Our client is a French multinational information technology (IT) services and consulting company, headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organisations accelerate their transition to a digital and sustainable world. They provide a variety of services, including consulting, technology, professional, and outsourcing services.

Job Details:
Location: Bangalore
Mode of Work: Hybrid
Notice Period: Immediate Joiners
Experience: 6-8 yrs
Type of Hire: Contract to Hire

JOB DESCRIPTION:

About the Role
We are looking for a highly skilled Data Pipeline Engineer with strong experience in Google Cloud Platform (GCP) to design, develop, and optimize data integration pipelines that move and transform data, specifically from BigQuery to AlloyDB. The ideal candidate will have a deep understanding of data orchestration, automation, and ETL/ELT frameworks on GCP, and will be comfortable working in a dynamic environment where data models, performance requirements, and architecture evolve over time. The candidate should be able to understand the solution design and change it as requirements dictate. This role is hands-on and cross-functional, working closely with solution architects, data analysts, and application teams to ensure seamless, secure, and efficient data flow across GCP components.

Key Responsibilities
• Design and Develop Data Pipelines
  o Design and build robust, scalable, and parameterized data pipelines to move data from BigQuery → Cloud Storage → AlloyDB.
  o Leverage Cloud Composer (Airflow), Cloud Functions, EventArc, and Pub/Sub to orchestrate and automate data movement.
  o Implement a control schema to handle incremental and delta loads.
• Data Orchestration and Scheduling
  o Create and manage DAGs in Cloud Composer 2 to schedule and monitor data workflows.
  o Develop asynchronous or parallel execution strategies to optimize pipeline performance under GCP constraints (e.g., AlloyDB's single COPY process).
• Architecture & Design Enhancement
  o Collaborate with solution architects to review and refine pipeline architecture.
  o Make design updates and refactor code based on evolving data requirements, schema changes, or performance improvements.
  o Ensure pipelines align with GCP best practices, security, and cost optimization guidelines.
• Performance Optimization
  o Tune BigQuery queries and AlloyDB import strategies for large datasets (terabytes of data).
  o Implement partitioning, batching, and retry mechanisms for high throughput and reliability.
• Monitoring & Logging
  o Implement detailed logging, alerting, and monitoring using Cloud Logging and Cloud Monitoring (formerly Stackdriver).
  o Set up job-level and table-level audit trails for pipeline observability and troubleshooting.
• Security & Compliance
  o Use Service Accounts, VPC Service Controls, IAM roles, and CMEK encryption to ensure data security and governance compliance.
  o Adhere to enterprise security policies and guardrails for data movement across GCP projects.

Required Skills and Experience
• 5+ years of experience in data engineering or cloud data integration roles.
• Strong expertise in Google Cloud Platform (GCP), including:
  o BigQuery (SQL, views, partitioning)
  o AlloyDB / PostgreSQL (import/export, COPY operations)
  o Cloud Storage (buckets, lifecycle policies)
  o Cloud Composer (Airflow)
  o Cloud Functions, Pub/Sub, and EventArc
• Proven experience building ETL/ELT data pipelines and automated workflows using GCP-native tools or Python-based orchestration.
• Proficiency in Python (Airflow operators, GCP SDK, REST API integration).
• Strong SQL and database schema design skills.
• Familiarity with asynchronous processing, retry handling, and GCP APIs for data import/export.
• Understanding of data quality, lineage, and audit frameworks.

Soft Skills
• Excellent analytical and problem-solving abilities.
• Strong documentation and communication skills.
• Ability to adapt to changing requirements and propose alternative designs.
• Collaborative mindset when working with cross-functional teams (architecture, security, operations).

Sample Deliverables
• End-to-end data movement pipeline from BigQuery to AlloyDB.
• Configurable Cloud Composer DAG to orchestrate import jobs per table.
• Python scripts or Cloud Functions using GCS → AlloyDB import APIs.
• Design documents and sequence diagrams reflecting data flow, control tables, and failure handling.
• Monitoring dashboards and job logs in the GCP console.
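The incremental-load control schema this role calls for can be sketched in plain Python, with SQLite standing in for the real control table. This is a minimal illustration only: the table name, columns, and watermark strategy are assumptions for the sketch, not the client's actual design, and in production the control table would live in AlloyDB (or BigQuery) and feed the extract query inside a Composer DAG.

```python
import sqlite3

# Hypothetical control table: tracks, per source table, the high-water mark
# of the last successful BigQuery -> Cloud Storage -> AlloyDB load.
DDL = """
CREATE TABLE IF NOT EXISTS load_control (
    table_name     TEXT PRIMARY KEY,
    last_watermark TEXT NOT NULL,   -- e.g. max(updated_at) already loaded
    last_status    TEXT NOT NULL    -- SUCCESS / FAILED / RUNNING
)
"""

def delta_predicate(conn: sqlite3.Connection, table_name: str,
                    watermark_col: str = "updated_at") -> str:
    """Build the WHERE clause for the next incremental extract.

    No control row yet means a full load (always-true predicate);
    otherwise a delta load filters on rows newer than the stored watermark.
    """
    row = conn.execute(
        "SELECT last_watermark FROM load_control WHERE table_name = ?",
        (table_name,),
    ).fetchone()
    if row is None:
        return "1 = 1"  # first run: full load
    return f"{watermark_col} > '{row[0]}'"

def record_success(conn: sqlite3.Connection, table_name: str,
                   new_watermark: str) -> None:
    """Upsert the control row after a successful load."""
    conn.execute(
        """INSERT INTO load_control (table_name, last_watermark, last_status)
           VALUES (?, ?, 'SUCCESS')
           ON CONFLICT(table_name) DO UPDATE
           SET last_watermark = excluded.last_watermark,
               last_status    = 'SUCCESS'""",
        (table_name, new_watermark),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(DDL)
print(delta_predicate(conn, "orders"))       # first run -> full load: 1 = 1
record_success(conn, "orders", "2024-01-31T00:00:00")
print(delta_predicate(conn, "orders"))       # next run -> delta predicate only
```

In the actual pipeline, the returned predicate would parameterize the BigQuery extract (e.g. an `EXPORT DATA` statement writing to GCS), and `record_success` would run as the DAG's final task so a failed run re-extracts the same delta on retry.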


  • Lead Data Engineer

    2 weeks ago


    Kozhikode, India NexGen Tech Solutions Full time

    Job Title: Lead Data Engineer – Wealth Management. Location: [Noida/Gurgaon/Remote]. Experience: 10+ years. Must Have: Wealth Management Domain. Job Summary: We are seeking a highly skilled Lead Data Engineer with deep expertise in the Wealth Management domain to design, implement, and manage scalable data solutions. Experience in building end-to-end Data Lake and...

  • GCP Cloud Engineer

    6 days ago


    Kozhikode, India LTIMindtree Full time

    Role: GCP Cloud Engineer. Experience: 5 to 8 Years. Job Location: Hyderabad/Pune, Hybrid Mode. FTE with LTIMindtree. Notice Period: Do not apply if your notice period exceeds 30 days. We are considering only immediate joiners and candidates with a maximum 30-day notice period. Mandatory Skills: Compute Engine, Load Balancer, VPC, IAM, BigQuery, Cloud Composer. Thanks &...

  • Lead Engineer

    1 week ago


    Kozhikode, India QBurst Full time

    Description: We are seeking a highly skilled Lead Engineer, DevOps to join our team. The successful candidate will be responsible for designing, implementing, and maintaining our cloud infrastructure using DevOps practices. Responsibilities: Design and implement cloud infrastructure using DevOps practices. Lead a team of engineers to deliver high-quality...


  • Managed Services Intern

    Kozhikode, India Pipeline AI Full time

    Job Title: Managed Services Intern. Location: Remote, India. Type: Internship (6 months, full-time). Work Hours: Flexible, with strong overlap with US business hours. Our teams align their schedules around client projects and delivery priorities. About Pipeline: Pipeline is an AI platform that helps sales teams research, prioritize, and personalize outreach. We work with...



  • LEAD DATA ENGINEER

    7 days ago


    Kozhikode, India Prophecy Technologies Full time

    💼 We’re Hiring: LEAD DATA ENGINEER 📍 Location: Hyderabad (Hybrid) 🧑‍💻 Experience: 7+ Years ⏳ Notice Period: Immediate to 30 Days. Responsibilities: Design, build, and optimize scalable data pipelines on Azure Data Lake, Azure Databricks, and Azure Synapse. Develop ETL/ELT workflows using PySpark, Python, and Databricks notebooks. Implement and manage...


  • Senior data engineer

    2 weeks ago


    Kozhikode, India American Inference Full time

    About the Company: We are an AI and Data Consulting Startup transforming how businesses leverage technology through four core service lines. Consulting Services: AI Strategy, Automation, and Digital Transformation for enterprises. SaaS Platform Development: Building a business application suite similar to Odoo and Zoho that is AI-native and user-friendly....


  • Software Engineer - Middleware Release Pipelines

    Kozhikode, India DigiHelic Solutions Pvt. Ltd. Full time

    Software Engineer - Middleware Release Pipelines. Experience: 3+ Years. Location: Bangalore. Job Summary: Looking for a Software Engineer to join our Secure Flow Cloud Team in Bangalore, India. As a Software Engineer, you will contribute to feature development and bug fixes, and be responsible for designing, developing, and maintaining robust release pipelines that...


  • Kozhikode, India Whatjobs IN C2 Full time

    About Turing: Based in San Francisco, California, Turing is the world’s leading research accelerator for frontier AI labs and a trusted partner for global enterprises deploying advanced AI systems. Turing supports customers in two ways: first, by accelerating frontier research with high-quality data, advanced training pipelines, plus top AI researchers who...