
GCP Data Engineer
The GCP Data Engineer will be responsible for designing, developing, and maintaining data pipelines and data infrastructure on Google Cloud Platform (GCP). This role requires expertise in data engineering best practices, cloud architecture, and big data technologies. The ideal candidate will work closely with data scientists, analysts, and other stakeholders to ensure the availability, reliability, and efficiency of data systems, enabling data-driven decision-making across the organization.
Key Responsibilities
Data Pipeline Development
- Design, develop, and maintain scalable and efficient ETL/ELT pipelines on GCP.
- Implement data ingestion processes from various data sources (e.g., APIs, databases, file systems).
- Ensure data quality, integrity, and reliability throughout the data lifecycle.
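As a rough illustration of the kind of ETL pipeline described in the bullets above, the sketch below uses Apache Beam (runnable on Dataflow) to read CSV files from Cloud Storage, drop rows that fail basic quality checks, and load the result into BigQuery. The project, bucket, table, and schema are placeholders invented for the example.

```python
# Minimal batch ETL sketch with Apache Beam; all resource names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_and_validate(line):
    """Parse a CSV line and drop rows that fail basic quality checks."""
    fields = line.split(",")
    if len(fields) != 3:
        return  # skip malformed rows
    order_id, customer_id, amount = fields
    try:
        yield {"order_id": order_id, "customer_id": customer_id, "amount": float(amount)}
    except ValueError:
        return  # skip rows with non-numeric amounts


def run():
    options = PipelineOptions(
        runner="DataflowRunner",      # "DirectRunner" works for local testing
        project="my-project",         # placeholder project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv",
                                                    skip_header_lines=1)
            | "ParseAndValidate" >> beam.FlatMap(parse_and_validate)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.orders",   # placeholder destination table
                schema="order_id:STRING,customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```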
Cloud Architecture
- Design and implement data architecture on GCP using services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer.
- Optimize and manage data storage and retrieval processes to ensure high performance and cost efficiency.
- Ensure data infrastructure is secure, scalable, and aligned with industry best practices.
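One common cost and performance lever on BigQuery is table partitioning and clustering. The sketch below uses the google-cloud-bigquery Python client, with placeholder project, dataset, and column names, to create a day-partitioned, clustered table so that date-bounded queries scan (and bill for) fewer bytes.

```python
# Minimal sketch: create a partitioned, clustered BigQuery table.
# Project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("event_type", "STRING"),
    bigquery.SchemaField("amount", "FLOAT64"),
]

table = bigquery.Table("my-project.analytics.events", schema=schema)

# Partition by day on the event timestamp and cluster by common filter columns;
# this limits the bytes scanned by typical date-bounded queries.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="event_ts",
)
table.clustering_fields = ["customer_id", "event_type"]

table = client.create_table(table, exists_ok=True)
print(f"Created {table.full_table_id}")
```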
Big Data Processing
- Develop and manage large-scale data processing workflows using Apache Beam, Dataflow, and other big data technologies.
- Implement real-time data streaming solutions using Pub/Sub and Dataflow.
- Optimize data processing jobs for performance and cost.
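For the streaming side, a minimal Apache Beam sketch is shown below: it reads messages from a Pub/Sub subscription, applies one-minute fixed windows, and counts events per type. The subscription path and message format are assumptions for the example; a production job would typically write results to BigQuery or Bigtable rather than printing them.

```python
# Minimal streaming sketch: Pub/Sub -> fixed windows -> per-key counts.
# Subscription path and message schema are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window


def run():
    options = PipelineOptions(streaming=True)  # streaming mode for Pub/Sub input
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "KeyByType" >> beam.Map(lambda event: (event.get("event_type", "unknown"), 1))
            | "Window" >> beam.WindowInto(window.FixedWindows(60))  # 60-second windows
            | "CountPerType" >> beam.CombinePerKey(sum)
            | "Log" >> beam.Map(print)  # in practice, write to BigQuery or Bigtable
        )


if __name__ == "__main__":
    run()
```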
Collaboration and Communication
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
- Communicate technical concepts effectively to both technical and non-technical stakeholders.
- Participate in agile development processes, including sprint planning, stand-ups, and retrospectives.
Data Management and Governance
- Implement and maintain data governance practices, including data cataloging, metadata management, and data lineage.
- Ensure compliance with data security and privacy regulations.
- Monitor and manage data quality and consistency.
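Data quality monitoring can be as simple as a scheduled check that fails loudly when a threshold is breached. The sketch below queries BigQuery for null and duplicate rates on a key column; the table, column, and thresholds are placeholders, and the table is assumed to be non-empty.

```python
# Minimal data-quality check sketch; table name, column, and thresholds are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
SELECT
  COUNTIF(order_id IS NULL) / COUNT(*)            AS null_rate,
  1 - COUNT(DISTINCT order_id) / COUNT(order_id)  AS duplicate_rate
FROM `my-project.analytics.orders`
"""

row = next(iter(client.query(query).result()))

if row.null_rate > 0.01 or row.duplicate_rate > 0.0:
    # In a Cloud Composer DAG this would fail the task and trigger an alert.
    raise ValueError(
        f"Data quality check failed: null_rate={row.null_rate:.4f}, "
        f"duplicate_rate={row.duplicate_rate:.4f}"
    )
```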
Troubleshooting and Support
- Debug and resolve technical issues related to data pipelines and infrastructure.
- Provide support and maintenance for existing data solutions.
- Continuously monitor and improve data pipeline performance and reliability.
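Much of the reliability work comes down to orchestration guardrails. The sketch below shows a minimal Cloud Composer (Airflow) DAG with retries and failure alerting around a placeholder task; the DAG ID, schedule, and alert address are invented for the example.

```python
# Minimal Cloud Composer (Airflow 2.x) DAG sketch with retry and alerting guardrails.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "retries": 2,                              # re-run transient failures
    "retry_delay": timedelta(minutes=5),
    "email": ["data-oncall@example.com"],      # placeholder alert address
    "email_on_failure": True,
}

with DAG(
    dag_id="orders_pipeline_monitoring",
    default_args=default_args,
    schedule_interval="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Placeholder task standing in for a Dataflow launch or BigQuery load.
    run_pipeline = BashOperator(
        task_id="run_pipeline",
        bash_command="echo 'launch pipeline here'",
    )
```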
Qualifications
Education:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Experience:
- 4-12 years of experience in data engineering.
- Proven experience with GCP data services and tools.
Technical Skills:
- Proficiency in GCP services (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer).
- Strong programming skills in languages such as Python.
- Familiarity with big data technologies and frameworks (e.g., Apache Beam, Hadoop, Spark).
- Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus.
Key Competencies
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork skills.
- Ability to work in a fast-paced, dynamic environment.
- Self-motivated and able to work independently as well as part of a team.
- Continuous learning mindset and a passion for staying up-to-date with emerging technologies.