GCP Data Engineer

1 day ago


Guindy, Chennai, Tamil Nadu, India | Prodapt Solutions | Full time | ₹8,00,000 - ₹24,00,000 per year

Responsibilities:

Design and implement complex ETL/ELT pipelines using PySpark and Airflow for large-scale data processing on GCP.

Lead data migration initiatives, including automating the movement of Teradata tables to BigQuery, ensuring data accuracy and consistency.
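
As an illustration of the orchestration involved, here is a minimal Airflow DAG sketch that submits a PySpark job to Dataproc and then runs an ELT load into BigQuery. It assumes a recent Airflow with the Google provider installed; the project, region, cluster, bucket, and table names are placeholders, not a prescribed setup.

```python
# Minimal sketch: PySpark transform on Dataproc, then an ELT step in
# BigQuery. All names (my-project, etl-cluster, gs://my-bucket/...)
# are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

PROJECT_ID = "my-project"  # placeholder
REGION = "us-central1"     # placeholder
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": "etl-cluster"},  # placeholder cluster
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = DataprocSubmitJobOperator(
        task_id="pyspark_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )

    load = BigQueryInsertJobOperator(
        task_id="load_curated_layer",
        configuration={
            "query": {
                "query": "SELECT * FROM staging.events",  # placeholder ELT query
                "destinationTable": {
                    "projectId": PROJECT_ID,
                    "datasetId": "curated",
                    "tableId": "events",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    transform >> load
```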

Develop robust frameworks to streamline batch and streaming data ingestion workflows, leveraging Kafka, Dataflow, and NiFi.
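
On the streaming side, a minimal Apache Beam sketch along these lines reads from Kafka and appends to BigQuery, and can run on Dataflow. The broker, topic, and table names are placeholders; note that ReadFromKafka is a cross-language transform that relies on Beam's Java expansion service, and a real pipeline would add parsing, windowing, and dead-lettering.

```python
# Streaming ingestion sketch: Kafka -> BigQuery via Beam/Dataflow.
import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # pass --runner=DataflowRunner etc. via argv

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromKafka" >> ReadFromKafka(
            consumer_config={"bootstrap.servers": "broker:9092"},  # placeholder
            topics=["events"],                                     # placeholder
        )
        # ReadFromKafka yields (key bytes, value bytes) pairs.
        | "DecodeValue" >> beam.Map(lambda kv: {"payload": kv[1].decode("utf-8")})
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:staging.raw_events",  # placeholder table
            schema="payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```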

Collaborate with data scientists to build ML-ready data layers and support analytics solutions.

Conduct proofs of concept (POCs) and document performance benchmarks for data throughput and velocity, ensuring optimized data workflows.
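
For instance, a tiny benchmarking harness like the following can back the throughput figure in a POC write-up; process_batch and the row counts here are hypothetical stand-ins for the stage under test.

```python
# Rough throughput benchmark: time a batch stage and derive rows/sec.
import time

def benchmark(process_batch, batch, runs=5):
    """Run process_batch over batch `runs` times and return rows/sec."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        process_batch(batch)
        timings.append(time.perf_counter() - start)
    best = min(timings)  # best-of-N reduces warm-up and GC noise
    return len(batch) / best

if __name__ == "__main__":
    rows = [{"id": i} for i in range(100_000)]  # hypothetical batch
    rate = benchmark(lambda b: [r["id"] * 2 for r in b], rows)
    print(f"~{rate:,.0f} rows/sec")
```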

Enhance CI/CD pipelines using Jenkins and GitLab for efficient deployment and monitoring of data solutions.

Collaborate in agile teams for product development and delivery.

Work independently to design data integrations and a data quality framework.

Requirements:

Strong proficiency in Python and SQL for data engineering tasks.

Strong understanding of distributed computing principles and experience with frameworks such as Hadoop and Apache Spark.

Advanced experience with GCP services, including BigQuery, Dataflow, Cloud Composer (Airflow), and Dataproc.

Expertise in data modeling, ETL/ELT pipeline development, and workflow orchestration using Airflow DAGs.

Hands-on experience with data migration from legacy systems (Teradata, Hive) to cloud platforms (BigQuery).
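
One possible shape of such a migration step, sketched with placeholder connection details and table names: pull a Teradata table into a DataFrame, load it into BigQuery, and verify row counts as a basic consistency check. High-volume migrations would typically stage through GCS instead of loading in-process.

```python
# Hedged sketch of a single Teradata -> BigQuery table move.
# Host, credentials, and table names are placeholders.
import pandas as pd
import teradatasql
from google.cloud import bigquery

SRC_TABLE = "sales.orders"                # placeholder Teradata table
DEST_TABLE = "my-project.staging.orders"  # placeholder BigQuery table

with teradatasql.connect(host="td-host", user="etl", password="...") as con:
    df = pd.read_sql(f"SELECT * FROM {SRC_TABLE}", con)

client = bigquery.Client()
job = client.load_table_from_dataframe(df, DEST_TABLE)
job.result()  # wait for the load job to finish

# Basic accuracy check: row counts must match on both sides.
dest_rows = client.get_table(DEST_TABLE).num_rows
assert dest_rows == len(df), f"row count mismatch: {dest_rows} != {len(df)}"
```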

Familiarity with streaming data ingestion tools like Kafka and NiFi.

Strong problem-solving skills and experience with performance optimization in large-scale data environments.

Proficiency in CI/CD tools (Jenkins, GitLab) and version control systems (Git).

GCP Professional Data Engineer certification.


