GCP Lead Data Engineer

4 weeks ago


Erode, India People Prime Worldwide Full time

Lead Data Engineer – Python & GCP
Experience: 10+ Years

Job Overview:
We are seeking an experienced Lead Data Engineer with strong expertise in Python and Google Cloud Platform (GCP). The ideal candidate will lead the end-to-end data engineering lifecycle, from requirement gathering and solution design to development, deployment, and post-delivery support. This role involves designing scalable ETL/ELT pipelines, architecting cloud-native solutions, implementing data ingestion and transformation processes, and ensuring data quality across systems.

Experience Level:
10+ years of relevant IT experience in data engineering and backend development.

Key Responsibilities:
  • Design, develop, test, and maintain scalable ETL/ELT data pipelines using Python (illustrative sketches follow this posting).
  • Architect enterprise-grade data solutions using technologies such as Kafka, GKE, multi-cloud services, load balancers, Apigee, DBT, LLMs, and DLP tools.
  • Work extensively with GCP services, including:
      – Dataflow – real-time & batch processing
      – Cloud Functions – serverless compute
      – BigQuery – data warehousing & analytics
      – Cloud Composer – workflow orchestration (Airflow)
      – GCS – scalable storage
      – IAM – access control & security
      – Cloud Run – containerized workloads
  • Build APIs using Python FastAPI.
  • Work with Big Data and processing technologies: Apache Spark, Kafka, Airflow, MongoDB, Redis/Bigtable.
  • Perform data ingestion, transformation, cleansing, and validation to ensure high data quality.
  • Implement and enforce data quality checks, monitoring, and validation rules.
  • Collaborate with data scientists, analysts, and engineering teams to understand data needs and deliver solutions.
  • Use GitHub for version control and support CI/CD deployments.
  • Write complex SQL queries for relational databases such as SQL Server, Oracle, and PostgreSQL.
  • Document data pipeline designs, architecture diagrams, and operational procedures.

Required Skills:
  • 10+ years of hands-on experience with Python in data engineering or backend development.
  • Strong working knowledge of GCP services (Dataflow, BigQuery, Cloud Functions, Cloud Composer, GCS, Cloud Run).
  • Deep understanding of data pipeline architecture, ETL/ELT processes, and data integration patterns.
  • Experience with Apache Spark, Kafka, Airflow, FastAPI, and Redis/Bigtable.
  • Strong SQL skills with at least one enterprise RDBMS (SQL Server, Oracle, PostgreSQL).
  • Experience in on-prem to cloud data migrations.
  • Knowledge of GitHub and CI/CD best practices.

Good to Have:
  • Experience with Snowflake.
  • Hands-on knowledge of Databricks.
  • Familiarity with Azure Data Factory (ADF) or other Azure data tools.
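
For a sense of how the pipeline responsibilities above fit together, here is a minimal sketch of a Cloud Composer (Airflow 2.x) DAG that loads CSV files from GCS into a BigQuery staging table and then runs a transformation query. All project, bucket, dataset, and table names are hypothetical placeholders for illustration, not details from this posting.

```python
# Minimal Cloud Composer (Airflow 2.x) DAG sketch: GCS -> BigQuery load, then a SQL transform.
# Bucket, project, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_elt",            # hypothetical pipeline name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["elt", "gcp"],
) as dag:

    # Land raw CSV files from a GCS bucket into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="example-landing-bucket",            # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],    # partitioned by execution date
        destination_project_dataset_table="example-project.staging.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into a curated reporting table inside BigQuery.
    transform = BigQueryInsertJobOperator(
        task_id="build_sales_daily",
        configuration={
            "query": {
                "query": """
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM `example-project.staging.sales_raw`
                    GROUP BY order_date
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "curated",
                    "tableId": "sales_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```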
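
Likewise, the FastAPI responsibility could look roughly like the sketch below: a single read endpoint backed by a parameterized BigQuery query against the hypothetical curated table from the DAG above. Endpoint and table names are assumptions for illustration only.

```python
# Minimal FastAPI sketch: expose a curated BigQuery table through a read endpoint.
# Project, dataset, table, and endpoint names are hypothetical placeholders.
from fastapi import FastAPI, HTTPException
from google.cloud import bigquery

app = FastAPI(title="sales-data-api")   # hypothetical service name
bq_client = bigquery.Client()           # uses Application Default Credentials

@app.get("/sales/daily/{order_date}")
def get_daily_sales(order_date: str) -> dict:
    """Return the aggregated amount for one day from the curated table."""
    query = """
        SELECT order_date, total_amount
        FROM `example-project.curated.sales_daily`
        WHERE order_date = @order_date
    """
    # Parameterized query avoids SQL injection via the path parameter.
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("order_date", "DATE", order_date),
        ]
    )
    rows = list(bq_client.query(query, job_config=job_config).result())
    if not rows:
        raise HTTPException(status_code=404, detail="No data for that date")
    return {"order_date": order_date, "total_amount": float(rows[0]["total_amount"])}
```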



  • Erode, India beBeeDataEngineer Full time

    Job Overview: We are seeking a highly skilled Data Engineer to join our team. The ideal candidate will have experience with GCP – BigQuery, Cloud Storage, SQL, and Power BI. They will be responsible for migrating reporting data sources from Teradata to Cerebro (BigQuery), repointing and validating existing Power BI reports to new BigQuery data models,...


  • Erode, India beBeeDataEngineer Full time

    Data Engineering Role Overview: We are seeking a skilled Data Engineer to lead our data engineering projects on Google Cloud Platform (GCP). The ideal candidate will have hands-on experience with GCP services, particularly BigQuery and Dataproc. Key responsibilities: Design and develop scalable data architectures using GCP services. Collaborate with...

  • Senior Data Engineer

    3 weeks ago


    Erode, India Whatjobs IN C2 Full time

    Job Title: Senior Data Engineer. Employment Type: Full-Time. Location: Kochi / Trivandrum / Bangalore. Experience: Minimum 8 years. We are seeking a highly skilled Senior Data Engineer with strong expertise in Spark, Kafka, modern cloud platforms (AWS/GCP/Azure), ETL processes, SQL, and Databricks. The ideal candidate will have hands-on experience building...


  • Erode, India beBeeDataEngineer Full time

    Cloud Data Engineering Role: We are seeking a highly skilled professional with 5+ years of experience in designing, implementing, and maintaining large-scale data processing systems on Google Cloud Platform (GCP). Key responsibilities include: Designing and developing scalable data pipelines using GCP services like Dataflow and BigQuery. Developing and...


  • Erode, India beBeeDataEngineering Full time

    Job Title: Data Engineering Professional. Our organization seeks a highly skilled and experienced Data Engineering Professional to join its team. This role presents an exciting opportunity to contribute to the growth of our world-class development and support center. The ideal candidate will have strong experience in designing and developing scalable data...


  • Erode, India beBeeData Full time

    About the Role: This position entails utilizing Snowflake and SQL for data manipulation and engineering tasks. The ideal candidate should possess strong expertise in SQL and Snowflake architecture. Familiarity with one or more cloud platforms – preferably Azure/AWS/GCP – is also required. Key responsibilities include developing and managing data warehouses,...


  • Erode, India beBeeData Full time

    Job Title: Data Engineering Specialist. Our organization is seeking a highly skilled data engineering professional to join our team. This is an exciting opportunity for an experienced professional in data engineering to design, develop, and optimize large-scale data pipelines. The ideal candidate should have extensive experience with SQL, Ab Initio, Teradata,...


  • Erode, India Luxoft Full time

    Project Description: We are seeking a highly skilled Snowflake Data Engineer with 7 years of IT experience to design, build, and optimize scalable data pipelines and cloud-based solutions across AWS, Azure, and GCP. The ideal candidate will have strong expertise in Snowflake, ETL tools like DBT, Python, visualization tools like Tableau, and modern CI/CD...


  • Erode, India beBeeData Full time

    About the Role: As a Cloud Data Engineer, you will play a crucial part in designing, developing, and maintaining scalable ETL pipelines using cloud-native tools. Your expertise will be invaluable in architecting and implementing data lakes and data warehouses on cloud platforms. You will develop and optimize data ingestion, transformation, and loading processes...