 
Lead Data Engineer - GCP
3 weeks ago
Job Overview:
We are looking for a skilled and motivated Lead Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for requirements gathering, designing and architecting the solution, and developing and maintaining robust and scalable ETL (Extract, Transform, Load) and ELT data pipelines. The role involves working directly with customers through requirements gathering and the discovery phase, designing and architecting the solution using various GCP services, implementing data transformations and data ingestion, ensuring data quality and consistency across systems, and providing post-delivery support.

Experience Level: 10 to 12 years of relevant IT experience

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Architect enterprise solutions with technologies such as Kafka, multi-cloud services, auto-scaling with GKE, load balancers, Apigee API proxy management, DBT, LLMs where needed in the solution, redaction of sensitive information, and DLP (Data Loss Prevention).
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (built on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Hands-on experience in the following areas:
  - API framework: Python FastAPI
  - Processing engine: Apache Spark
  - Messaging and streaming data processing: Kafka
  - Storage: MongoDB, Redis/Bigtable
  - Orchestration: Airflow
  - Deployments on GKE and Cloud Run
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 10 to 12 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding of and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems such as GitHub and knowledge of CI/CD practices.
- Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and GCP Composer DAGs.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
- Experience in data migrations from on-premise data sources to cloud platforms.

Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:
- Excellent problem-solving and analytical skills.
- Strong communication skills and ability to collaborate in a team environment.
- Education: Bachelor's degree in Computer Science or a related field, or equivalent experience.
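For illustration only, the sketch below shows the kind of Cloud Composer (Apache Airflow) pipeline this role describes: a daily Python-defined load from Google Cloud Storage into BigQuery. All project, bucket, dataset, and table names are hypothetical placeholders, not details taken from the posting.

# Illustrative sketch: a minimal Cloud Composer / Airflow DAG that loads a daily
# GCS extract into BigQuery. Names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-raw-data",                  # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.json"],  # files partitioned by run date
        destination_project_dataset_table="example-project.analytics.orders",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,                            # let BigQuery infer the schema
    )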
- 
Data Engineer - 3 weeks ago
Hyderabad, Telangana, India | Talescope | Full time
About the Role: We are looking for a highly experienced and hands-on GCP Data Engineer (AdTech) to lead our analytics team. This role is ideal for someone with a strong background in log-level data handling, cross-platform data engineering, and a solid command of modern BI tools. You'll play a key role in building scalable data pipelines, leading analytics...
- 
GCP Data Engineer - 2 weeks ago
Hyderabad, Telangana, India | Egen | Full time
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various...
- 
GCP Senior Data Engineer - 2 weeks ago
Hyderabad, Telangana, India | People Prime Worldwide | Full time
About Company: Our client is a leading Indian multinational IT services and consulting firm. It provides digital transformation, cloud computing, data analytics, enterprise application integration, infrastructure management, and application development services. The company caters to over 700 clients across industries such as banking and financial services,...
- 
GCP Data Architect - 2 weeks ago
Hyderabad, Telangana, India | Tata Consultancy Services | Full time
Job Title: GCP Data Architect
Experience: 7 to 12 years
Location: Pan India
Virtual Drive: 10am to 4pm
Date: 11th Oct 2025
Greetings from TCS!
Job Description:
Design and Implement Data Architectures: Architect and build scalable, end-to-end data solutions on GCP, encompassing data ingestion, transformation, storage, and consumption.
Develop Data Pipelines:...
- 
Lead Data Engineer - 2 weeks ago
Hyderabad, Telangana, India | People Prime Worldwide | Full time
Senior & Lead Data Engineer
Job Details:
Client: Egen
Role: Senior & Lead Data Engineer
Experience: 7-12 Years
Location: Hyderabad
Interview Mode: F2F interview on Monday
Notice Period Preference: Immediate to 30 Days
Mandatory Skills Matrix (Skill - Mandatory/Good to Have):
1. Python - Mandatory
2. GCP - Mandatory
3. RESTful APIs or FastAPI - Mandatory
4. GitHub...
- 
GCP Data Engineer - 2 days ago
Hyderabad, Telangana, India | People Prime Worldwide | Full time | ₹ 6,00,000 - ₹ 18,00,000 per year
Role: GCP Data Engineer
Experience: 10 to 15 Years
Work Mode: Hybrid
Location: Hyderabad
Job Overview: We are looking for a skilled and motivated Lead Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for requirements gathering, designing,...
- 
Lead Data Engineer with GCP - 1 week ago
Hyderabad, Telangana, India | Egen (Formerly SpringML) | Full time | ₹ 15,00,000 - ₹ 20,00,000 per year
Job Overview: We are looking for a skilled and motivated Lead Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for requirements gathering, designing, architecting the solution, developing, and maintaining robust and scalable ETL (Extract,...
- 
Lead Data Engineer - 2 weeks ago
Hyderabad, Telangana, India | People Prime Worldwide | Full time
We’re Hiring: Senior / Lead Data Engineer – Python & GCP | Hyderabad (Hybrid)
Are you someone who loves building scalable data pipelines, designing cloud-native solutions, and solving complex data challenges? Here’s your chance to join a high-performing team working on cutting-edge data engineering projects on Google Cloud Platform (GCP)!
Role: Senior /...
- 
GCP Data Engineer - 2 weeks ago
Hyderabad, Telangana, India | Talent21 | Full time | ₹ 12,50,000 - ₹ 25,00,000 per year
Role & Responsibilities: GCP Data Engineer with BigQuery, Dataflow, Cloud Composer, SQL, Python, and PySpark, along with 3+ years of strong GCP experience, having worked as a Module Lead / Tech Lead
Experience: 4 to 15 years
Locations: Pan India
Work Mode: Hybrid
Notice Period: Immediate to 30 Days, or currently serving notice period
- 
Lead Engineer GCP - 5 days ago
Hyderabad, Telangana, India | Syncarp | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year
Looking for a Lead Engineer / Technical Lead for a global IT service provider, based out of Chennai/Bangalore/Hyderabad/Pune locations.
Job Description:
Lead Engineer
Experience: 5-8 Years
Design, develop, and maintain simulation services and tools for ML feature, model, and rule evaluation.
Build and optimize data pipelines for point-in-time historical data access and...