
GCP data engineer
3 days ago
Overview:
- Design and implement complex ETL/ELT pipelines using PySpark and Airflow for large-scale data processing on GCP.
- Lead data migration initiatives, including automating the movement of Teradata tables to BigQuery, ensuring data accuracy and consistency.
- Develop robust frameworks to streamline batch and streaming data ingestion workflows, leveraging Kafka, Dataflow, and NiFi.
- Collaborate with data scientists to build ML-ready data layers and support analytics solutions.
- Conduct proofs of concept (POCs) and document performance benchmarking for data throughput and velocity, ensuring optimized data workflows.
- Enhance CI/CD pipelines using Jenkins and GitLab for efficient deployment and monitoring of data solutions.
- Collaborate in agile teams for product development and delivery.
- Ability to work independently and design data integrations and a data quality framework.
Requirements:
- Strong proficiency in Python and SQL for data engineering tasks.
- Strong understanding of and experience with distributed computing principles and frameworks such as Hadoop and Apache Spark.
- Advanced experience with GCP services, including BigQuery, Dataflow, Cloud Composer (Airflow), and Dataproc.
- Expertise in data modeling, ETL/ELT pipeline development, and workflow orchestration using Airflow DAGs.
- Hands-on experience with data migration from legacy systems (Teradata, Hive) to cloud platforms (BigQuery).
- Familiarity with streaming data ingestion tools such as Kafka and NiFi.
- Strong problem-solving skills and experience with performance optimization in large-scale data environments.
- Proficiency in CI/CD tools (Jenkins, GitLab) and version control systems (Git).
- GCP Professional Data Engineer certification.
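The migration duties above hinge on "ensuring data accuracy and consistency" when Teradata tables land in BigQuery. A common way to do that is a row-count reconciliation pass after each copy. The sketch below is a minimal, self-contained illustration of that idea only: in a real pipeline the counts would come from queries against Teradata and BigQuery (and an Airflow task would run the check), but here they are plain dicts so the logic stands alone. All names are illustrative, not from the posting.

```python
# Post-migration reconciliation sketch: compare per-table row counts
# between a source system (e.g. Teradata) and a target (e.g. BigQuery).
# Counts are passed in as dicts; in practice they would be fetched with
# SELECT COUNT(*) queries against each system.

def reconcile_counts(source_counts: dict, target_counts: dict) -> list:
    """Return (table, source_rows, target_rows) for every table whose
    row count differs between source and target, or which is missing
    from the target entirely (target_rows is then None)."""
    mismatches = []
    for table, src_rows in sorted(source_counts.items()):
        tgt_rows = target_counts.get(table)  # None if never copied
        if tgt_rows != src_rows:
            mismatches.append((table, src_rows, tgt_rows))
    return mismatches

# Example: one table matches, one drifted, one never reached the target.
source = {"orders": 1_000, "customers": 250, "events": 90}
target = {"orders": 1_000, "customers": 249}
print(reconcile_counts(source, target))
# -> [('customers', 250, 249), ('events', 90, None)]
```

A production check would typically add checksum or column-level comparisons on top of raw counts, since equal row counts do not guarantee identical data.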
-
GCP data engineer
2 days ago
Guindy, Chennai, Tamil Nadu, India Prodapt Solutions Full time ₹ 8,00,000 - ₹ 24,00,000 per year Overview: Design and implement complex ETL/ELT pipelines using PySpark and Airflow for large-scale data processing on GCP. Lead data migration initiatives, including automating the movement of Teradata tables to BigQuery, ensuring data accuracy and consistency. Develop robust frameworks to streamline batch and streaming data ingestion workflows, leveraging...
-
Senior Technical Lead
1 week ago
Chennai - Guindy, India Jobted IN C2 Full time Overview: Prodapt is looking for a Senior Technical Lead in the Data Engineering stream with 11+ years of experience and hands-on expertise in GCP, BQ, data pipeline creation & ETL. Responsibilities: - Understanding the requirements - Performing in-depth analysis on data - Setting up secure data pipelines from ingestion to delivery - Individual...
-
(3 Days Left) Senior GCP Data Engineer
1 week ago
Chennai - Guindy, India Jobted IN C2 Full time Overview: Prodapt is looking for senior data engineers with 3+ years of experience and hands-on expertise in GCP, BQ, data pipeline creation & ETL. Responsibilities: - Understanding the requirements - Performing in-depth analysis on data - Setting up secure data pipelines from ingestion to delivery - Individual contributor Requirements: 3-5 years...
-
GCP Data Engineer
2 weeks ago
Chennai, Tamil Nadu, India Prodapt Solutions Full time ₹ 15,00,000 - ₹ 25,00,000 per year Job Summary: We are seeking a skilled GCP Data Engineer to design, develop, and manage large-scale data pipelines and cloud-based data solutions on Google Cloud Platform (GCP). The role requires expertise in data engineering, ETL/ELT processes, and GCP services to deliver scalable, reliable, and high-performance data platforms that support analytics and...
-
GCP Data Engineer
6 days ago
Chennai, India Tata Consultancy Services Full time ...ng, transformation, and performance tuning. - Experience with Apache Airflow for orchestration of data workflows. - Familiarity with CI/CD pipelines and version control (e.g., Git). - Strong problem-solving skills and ability to work in a fast-paced environment. Preferred Qualifications: - GCP Professional Data Engineer certification. - Experience with...
-
GCP Data Engineer
1 week ago
Chennai, Tamil Nadu, India Qode Full time ₹ 12,00,000 - ₹ 36,00,000 per year GCP Data Engineer Location: Chennai Workplace Type: Hybrid About the Role: We are seeking a highly skilled and experienced Data Engineer to join our growing data team. In this role, you will be responsible for designing, building, and maintaining our data infrastructure on Google Cloud Platform (GCP). You will work closely with data scientists, analysts, and...