
GCP Data Engineer
1 day ago
Overview:
Design and implement complex ETL/ELT pipelines using PySpark and Airflow for large-scale data processing on GCP.
Lead data migration initiatives, including automating the movement of Teradata tables to BigQuery, ensuring data accuracy and consistency.
Develop robust frameworks to streamline batch and streaming data ingestion workflows, leveraging Kafka, Dataflow, and NiFi.
Collaborate with data scientists to build ML-ready data layers and support analytics solutions.
Conduct proofs of concept (POCs) and document performance benchmarks for data throughput and velocity, ensuring optimized data workflows.
Enhance CI/CD pipelines using Jenkins and GitLab for efficient deployment and monitoring of data solutions.
Collaborate in agile teams for product development and delivery.
Work independently to design data integrations and a data quality framework.
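The data quality framework mentioned above can take many shapes; as an illustrative sketch only (the rule names, record shape, and thresholds below are hypothetical, not from this posting), a minimal rule-based checker for batch records might look like:

```python
# Minimal rule-based data quality checker (illustrative sketch only).
# Each rule returns True when a record passes; failures are tallied per rule.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class QualityReport:
    total: int = 0
    failures: dict = field(default_factory=dict)  # rule name -> failing row count

    @property
    def passed(self) -> bool:
        return not self.failures

def run_checks(records: list[dict], rules: dict[str, Callable[[dict], bool]]) -> QualityReport:
    """Apply each named rule to every record and tally failures."""
    report = QualityReport(total=len(records))
    for name, rule in rules.items():
        bad = sum(1 for r in records if not rule(r))
        if bad:
            report.failures[name] = bad
    return report

# Hypothetical rules for a migrated batch of rows
rules = {
    "id_not_null": lambda r: r.get("id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

records = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},
    {"id": 3, "amount": -2.0},
]

report = run_checks(records, rules)
print(report.failures)  # {'id_not_null': 1, 'amount_non_negative': 1}
```

In practice such checks would run inside the pipeline (e.g., as a validation task between ingestion and load) rather than as a standalone script.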
Requirements:
Strong proficiency in Python and SQL for data engineering tasks.
Strong understanding of distributed computing principles and experience with frameworks such as Hadoop and Apache Spark.
Advanced experience with GCP services, including BigQuery, Dataflow, Cloud Composer (Airflow), and Dataproc.
Expertise in data modeling, ETL/ELT pipeline development, and workflow orchestration using Airflow DAGs.
Hands-on experience with data migration from legacy systems (Teradata, Hive) to cloud platforms (BigQuery).
Familiarity with streaming data ingestion tools like Kafka and NiFi.
Strong problem-solving skills and experience with performance optimization in large-scale data environments.
Proficiency in CI/CD tools (Jenkins, GitLab) and version control systems (Git).
GCP Professional Data Engineer certification.
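For the Teradata-to-BigQuery migration experience listed above, one recurring task is translating legacy column types. A simplified mapping sketch follows (illustrative only: it covers a handful of common types and ignores precision/scale, defaults, and constraints, all of which a real migration must carry over):

```python
# Simplified Teradata -> BigQuery column type mapping (illustrative sketch;
# real DDL translation must also handle precision/scale and constraints).

TERADATA_TO_BQ = {
    "BYTEINT": "INT64",
    "SMALLINT": "INT64",
    "INTEGER": "INT64",
    "BIGINT": "INT64",
    "DECIMAL": "NUMERIC",
    "FLOAT": "FLOAT64",
    "CHAR": "STRING",
    "VARCHAR": "STRING",
    "DATE": "DATE",
    "TIMESTAMP": "TIMESTAMP",
}

def map_column(td_type: str) -> str:
    """Map a Teradata type name (ignoring length/precision) to a BigQuery type."""
    base = td_type.split("(")[0].strip().upper()
    try:
        return TERADATA_TO_BQ[base]
    except KeyError:
        raise ValueError(f"No mapping for Teradata type: {td_type}")

print(map_column("VARCHAR(255)"))   # STRING
print(map_column("DECIMAL(18,2)"))  # NUMERIC
```

A mapping table like this is typically the core of an automated schema-translation step that emits BigQuery DDL from Teradata catalog metadata.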
-
GCP Data Engineer
2 days ago
Chennai - Guindy, India Jobted IN C2 Full time. Overview: Design and implement complex ETL/ELT pipelines using PySpark and Airflow for large-scale data processing on GCP. Lead data migration initiatives, including automating the movement of Teradata tables to BigQuery, ensuring data accuracy and consistency. Develop robust frameworks to streamline batch and streaming data ingestion workflows, leveraging...
-
GCP Data Engineer
4 days ago
Chennai, Tamil Nadu, India Tata Consultancy Services Full time. …ng, transformation, and performance tuning. Experience with Apache Airflow for orchestration of data workflows. Familiarity with CI/CD pipelines and version control (e.g., Git). Strong problem-solving skills and ability to work in a fast-paced environment. Preferred Qualifications: GCP Professional Data Engineer certification. Experience with data modeling...
-
Data Engineer
4 days ago
Tamil Nadu, India iO Associates Full time. Our Client is a leading company in the tech industry, known for its innovation, growth opportunities, and excellent workplace culture. Role Summary: Our Client is seeking a Data Engineer (Python + GCP) to join their team in a contract position. This hire is crucial for supporting ongoing growth and innovation within the company. As a Data Engineer, you will...
-
Senior GCP Data Engineer
2 days ago
Tamil Nadu, India Getronics Full time. Greetings from Getronics! We have multiple opportunities for Senior GCP Data Engineers for our automotive client in Chennai. Position Description: The Data Analytics team is seeking a GCP Data Engineer to create, deliver, and support custom data products, as well as enhance/expand team capabilities. They will work on analysing and manipulating large...
-
Senior Technical Lead
1 week ago
Chennai - Guindy, India Jobted IN C2 Full time. Overview: Prodapt is looking for a Senior Technical Lead in the Data Engineering stream with 11+ years of experience and hands-on expertise in GCP, BQ, data pipeline creation & ETL. Responsibilities: - Understanding the requirements - Performing in-depth analysis on data - Setting up secure data pipelines from ingestion to delivery - Individual...
-
(3 Days Left) Senior GCP Data Engineer
1 week ago
Chennai - Guindy, India Jobted IN C2 Full time. Overview: Prodapt is looking for senior data engineers with 3+ years of experience and hands-on expertise in GCP, BQ, data pipeline creation & ETL. Responsibilities: - Understanding the requirements - Performing in-depth analysis on data - Setting up secure data pipelines from ingestion to delivery - Individual contributor Requirements: 3-5 years...
-
GCP Data Engineer
2 weeks ago
Chennai, Tamil Nadu, India Prodapt Solutions Full time ₹ 15,00,000 - ₹ 25,00,000 per year. Job Summary: We are seeking a skilled GCP Data Engineer to design, develop, and manage large-scale data pipelines and cloud-based data solutions on Google Cloud Platform (GCP). The role requires expertise in data engineering, ETL/ELT processes, and GCP services to deliver scalable, reliable, and high-performance data platforms that support analytics and...
-
GCP Data Engineer
6 days ago
Chennai, India Tata Consultancy Services Full time. - …ng, transformation, and performance tuning. - Experience with Apache Airflow for orchestration of data workflows. - Familiarity with CI/CD pipelines and version control (e.g., Git). - Strong problem-solving skills and ability to work in a fast-paced environment. Preferred Qualifications: - GCP Professional Data Engineer certification. - Experience with...
-
GCP Data Engineer
4 days ago
Chennai, India Tata Consultancy Services Full time. - …ng, transformation, and performance tuning. - Experience with Apache Airflow for orchestration of data workflows. - Familiarity with CI/CD pipelines and version control (e.g., Git). - Strong problem-solving skills and ability to work in a fast-paced environment. Preferred Qualifications: - GCP Professional Data Engineer certification. - Experience with data...