GCP Data Engineer
3 weeks ago
#offshorejobs #india #remotework #GCPDataEngineer
Seeking a GCP Certified Data Engineer to work remotely from India for our Fortune 5 healthcare client in the US. Remote work from India, 2nd-shift work hours. Must be an immediate joiner; notice periods longer than 15 days will not be considered.
Requirements:
5 years of proven hands-on experience with GCP data services.
Google Cloud Professional Data Engineer certification.
Strong understanding of distributed systems and data engineering best practices.
Excellent communication and documentation skills.
SAFe Agile methodology.
Experience working in an onshore/offshore support model and collaborating with offshore teams.
Experience working with GitHub, RTC, and automation tools.
Preferred: experience with AI programming, IVR technologies (Avaya, Cisco), chatbots, etc.
Required - minimum 5 years of proven hands-on experience in the following:
Design and implement robust data pipelines using Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Pub/Sub.
Develop and manage workflows using Cloud Composer (Apache Airflow) for efficient scheduling and orchestration.
Write clean, efficient, and scalable code in Python, leveraging advanced programming techniques.
Craft complex SQL queries in BigQuery, including window functions, CTEs, and performance-tuning strategies.
Build and maintain real-time data processing systems using Apache Kafka.
Model and manage NoSQL databases, particularly MongoDB, with a focus on scalable schema design.
Use shell scripting and perform Linux system administration tasks to support data infrastructure.
Conduct data profiling and implement validation techniques to ensure data quality and integrity.
Develop and maintain API integration scripts for seamless service automation and data exchange.
Troubleshoot and resolve data-related issues with strong analytical and problem-solving skills.
Create and maintain data flow diagrams to clearly communicate architecture and pipeline logic to stakeholders.
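The CTE and window-function SQL skills this listing asks for can be sketched with a minimal, hypothetical example. SQLite is used here only so the snippet runs with the standard library; BigQuery's SQL dialect is broadly similar for CTEs and window functions, though not identical. The `orders` table and its columns are invented for illustration.

```python
import sqlite3

# Hypothetical sample data standing in for a BigQuery table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 100.0), ('alice', 250.0), ('bob', 75.0);
""")

# A CTE filters large orders; a window function then ranks each
# customer's remaining orders by amount.
rows = conn.execute("""
    WITH big_orders AS (
        SELECT customer, amount FROM orders WHERE amount >= 100
    )
    SELECT customer,
           amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM big_orders
    ORDER BY customer, rnk
""").fetchall()
print(rows)  # [('alice', 250.0, 1), ('alice', 100.0, 2)]
```

The same shape of query (CTE feeding a `RANK() OVER (PARTITION BY ...)` window) is a common interview-level exercise for the BigQuery skills listed above.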
-
GCP Data Engineer @ Remote
2 weeks ago
Aurangabad, India Whatjobs IN C2 Full time
Position: GCP Data Engineer
Location: Remote
Duration: Full Time
Essential Skills: 6+ years of professional experience in data engineering or a similar role. Hands-on experience with Google Cloud Platform (GCP) services (BigQuery, Cloud Run, Scheduler). Proficiency in SQL for data manipulation and analysis. Experience with dbt. Familiarity with...
-
GCP Data Architect
2 weeks ago
Aurangabad, India HCLTech Full time
Job Title - GCP Data Architect
Experience - 15 to 20 years
Location - Bangalore/Chennai/Hyderabad/Noida
Key Responsibilities:
Architectural Leadership: Contribute to architectural decisions and support the Architecture Lead in defining and delivering technology strategy.
Roadmaps & Patterns: Help create and maintain architecture roadmaps, guardrails, and patterns,...
-
Data Engineer
1 day ago
Aurangabad, India SII Group USA Full time
Job Description: Data Engineering Specialist (PySpark / Databricks) – Noida, India
Company: SII Group
Location: Noida, India
Employment Type: Full-time
Role Type: Data Engineer / Senior Data Engineer (depending on experience)
About SII Group USA: SII Group USA is part of the global SII Group, a leading provider of engineering, consulting, and digital...
-
Data Architect
3 weeks ago
Aurangabad, India Whatjobs IN C2 Full time
Position Overview: The Data Architect is responsible for designing, implementing, and managing the data architecture of the organization. This includes ensuring that data is accurate, accessible, secure, and scalable across all systems. The role focuses on developing data models, database solutions, and integration frameworks that support business...
-
Data Engineer
2 weeks ago
Aurangabad, India NexionPro Services Full time
Job Title: Data Engineer
Experience: 5–10 Years
Location: Bangalore (ONLY)
Mandatory Skills: SQL, ETL, PySpark, Python
Key Responsibilities:
Design, develop, and maintain scalable ETL/ELT pipelines to ingest and transform large datasets from diverse sources.
Build and optimize PySpark-based data processing workflows for batch and real-time data use cases.
Write...
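The ETL/ELT pipeline work this listing describes can be sketched at its simplest with a standard-library stand-in: extract raw records, drop and cast malformed rows in a transform step, then load the cleaned rows into a target table. The CSV feed, column names, and in-memory SQLite target here are hypothetical; a production pipeline of the kind the role describes would use PySpark and a real warehouse instead.

```python
import csv
import io
import sqlite3

# Extract: a raw feed with one malformed row (missing amount).
RAW = "id,amount\n1,10.5\n2,\n3,99.0\n"

def transform(text):
    """Parse rows, drop records with missing amounts, cast types."""
    rows = csv.DictReader(io.StringIO(text))
    return [(int(r["id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load: write the cleaned rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clean (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO clean VALUES (?, ?)", transform(RAW))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM clean").fetchone()
print(total)  # (2, 109.5)
```

The row with a missing amount is filtered out during the transform, which is the basic data-quality gate that scales up to the validation steps a real ETL pipeline performs.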
-
Data Specialist
2 weeks ago
Aurangabad, India Whatjobs IN C2 Full time
About the Role: The Data Specialist evaluates learners’ technical proficiency in data engineering, analytics, and cloud data platforms. This role involves comprehensive assessment across SQL, data pipeline development, data modeling, and visualization capabilities, while providing actionable feedback to accelerate learner development. The specialist...
-
Data Engineer
2 hours ago
Aurangabad, India Staffingine LLC Full time
Data Engineer
India, Remote
Full Time / Contract
# of Positions - 3
Job Summary: The Data Engineer will be responsible for designing, developing, and optimizing scalable data pipelines and cloud-based data solutions. This role requires strong Python programming skills, expertise in ETL/ELT processes, and deep hands-on experience with AWS cloud services such as S3,...
-
Data Engineer
3 weeks ago
Aurangabad, India meanSquare.ai Full time
Data Engineer (Onsite | Offshore | PST Overlap Required)
We’re looking for an experienced and independent Data Engineer with 1-5 years of experience to help us design and build data systems that are clean, scalable, and reusable. If you enjoy working with modern data tools, solving real-world data problems, and collaborating with a supportive...