GCP Data Engineer
4 weeks ago
Location: Gurgaon and Bangalore
Experience: 4-7 years

Description: We are looking for a talented GCP Data Engineer with a minimum of 2 years' experience in GCP Cloud, who is passionate about technology, motivated for continuous learning, and views every client interaction as an opportunity to create an exceptional customer experience.

Qualifications (must have):
· BE/B.Tech/MCA/MS-IT/CS/B.Sc/BCA or any other degree in a related field
· Expertise and hands-on experience in Big Data, Hadoop, Hive, SQL, and Spark
· Expert knowledge of GCP Cloud

Job Description:
- The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles & Responsibilities:
- 4-7 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services).
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services such as Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services such as App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Act as a subject-matter expert or developer on GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.

Please share resume at
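As a rough illustration of the GCS/Dataproc/BigQuery work this posting describes, here is a minimal PySpark sketch of a job that reads raw files from GCS, applies a simple transformation, and writes the result to BigQuery via the spark-bigquery connector. The bucket, dataset, table, and column names are placeholders, not taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical job: read raw CSV events from GCS, aggregate, load into BigQuery.
spark = SparkSession.builder.appName("gcs-to-bigquery-example").getOrCreate()

# Bucket name and columns are placeholders for illustration only.
events = (
    spark.read.option("header", True)
    .csv("gs://example-raw-bucket/events/*.csv")
)

# Simple cleaning and aggregation: drop rows without a user_id,
# then count events per user per day.
daily_counts = (
    events.filter(F.col("user_id").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("user_id", "event_date")
    .count()
)

# Write to BigQuery through the spark-bigquery connector, which Dataproc
# clusters commonly provide; an intermediate GCS bucket stages the load.
(
    daily_counts.write.format("bigquery")
    .option("table", "example_dataset.daily_event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```

A script like this would typically be submitted with `gcloud dataproc jobs submit pyspark` against a cluster that has the BigQuery connector available.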
-
GCP Data Engineer
3 days ago
Bangalore, India Impetus Full time
Location: Gurgaon and Bangalore Experience: 4-7 years Description: We are looking for a talented GCP Data Engineer with a minimum of 2 years' experience in GCP Cloud, passionate about technology, motivated for continuous learning, and who views every client interaction as an opportunity to create an exceptional customer experience. Qualifications:...
-
GCP Data Engineer
4 days ago
Bangalore, India Impetus Full time
Job Title: GCP Data Engineer Experience: 4–7 Years Location: Bangalore / Gurgaon Employment Type: Full-Time About the Role: We are looking for an experienced GCP Data Engineer with a strong background in Big Data, PySpark, and Python, and hands-on experience with core Google Cloud Platform (GCP) services. The ideal candidate will be responsible for...
-
GCP Data Engineer
2 weeks ago
Bangalore, India Impetus Full time
About the Organization - Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders providing...
-
GCP Data Engineer
4 weeks ago
Bangalore, India RapidBrains Full time
Job Title: GCP Data Engineer Experience: 5+ Years Location: Chennai, Bangalore, Hyderabad Contract Type: Short Term Work Time: IST Work Mode: Hybrid (4 days onsite per week) Job Description: We are hiring GCP Data Engineers with experience levels ranging from 5 to 10+ years. The role is based in Chennai, Bangalore, and Hyderabad. Candidates with...
-
GCP Data Engineer
3 days ago
Bangalore, India Impetus Full time
Job Title: GCP Data Engineer Location: Bengaluru Experience: 3-8 years About the Role: Design, build, and maintain large-scale data pipelines on BigQuery and other Google Cloud Platform (GCP) services. Use Python and PySpark/Spark to transform, clean, aggregate, and prepare data for analytics/ML. Orchestrate workflows using Cloud Composer (Airflow) to...
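The orchestration described in this listing can be pictured with a minimal Cloud Composer (Airflow) DAG that runs a daily BigQuery transformation. The project, dataset, table, and DAG names below are illustrative placeholders, not details from the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Placeholder SQL: aggregate a raw table into a daily summary table.
TRANSFORM_SQL = """
CREATE OR REPLACE TABLE `example_project.analytics.daily_orders` AS
SELECT order_date, COUNT(*) AS order_count
FROM `example_project.raw.orders`
GROUP BY order_date
"""

with DAG(
    dag_id="daily_orders_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # run once per day
    catchup=False,
) as dag:
    # Submit the transformation to BigQuery as a standard-SQL query job.
    build_daily_orders = BigQueryInsertJobOperator(
        task_id="build_daily_orders",
        configuration={"query": {"query": TRANSFORM_SQL, "useLegacySql": False}},
    )
```

In Cloud Composer, a file like this placed in the environment's dags/ folder is picked up automatically; further tasks (e.g. a Dataflow or Dataproc step) would be chained with the usual `>>` dependency operator.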
-
GCP Data Engineer
4 days ago
Bangalore, India Impetus Full time
Job Title: GCP Data Engineer Location: Bengaluru Experience: 3-8 years About the Role: - Design, build, and maintain large-scale data pipelines on BigQuery and other Google Cloud Platform (GCP) services. - Use Python and PySpark/Spark to transform, clean, aggregate, and prepare data for analytics/ML. - Orchestrate workflows using Cloud Composer (Airflow) to...
-
GCP Data Engineer
1 week ago
Bangalore, India Impetus Full time
About the Organization - Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth. Founded in 1991, we are cloud and data engineering leaders...
-
GCP Data Engineer
4 days ago
Bangalore district, India Impetus Full time
Location: Gurgaon and Bangalore Experience: 4-7 years Description: We are looking for a talented GCP Data Engineer with a minimum of 2 years' experience in GCP Cloud, passionate about technology, motivated for continuous learning, and who views every client interaction as an opportunity to create an exceptional customer experience. Qualifications:...
-
GCP Data Engineer
2 weeks ago
Bangalore, India Brillio Full time
About the Company: Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into a competitive advantage through innovative digital adoption. Brillio, renowned for its world-class professionals, referred to as "Brillians", distinguishes itself through their...
-
GCP Data Engineer
4 weeks ago
Bangalore, India Brillio Full time
GCP Data Engineer Experience: 3 to 6 years overall Key Skills: Google BigQuery – Must have worked on migration and pipeline creation; SQL – Strong in both basic and advanced queries; Python scripting; Airflow / Cloud Composer; Cloud platform exposure – GCP (primary), AWS (secondary acceptable); Terraform – for Infrastructure as Code (good...
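To make the BigQuery + SQL + Python scripting combination in this listing concrete, here is a minimal sketch using the google-cloud-bigquery client to run a parameterized standard-SQL query. The project, dataset, table, and column names are hypothetical.

```python
from google.cloud import bigquery

# Project, dataset, and table names are placeholders for illustration only.
client = bigquery.Client(project="example-project")

SQL = """
SELECT user_id, COUNT(*) AS session_count
FROM `example-project.analytics.sessions`
WHERE session_date >= @start_date
GROUP BY user_id
ORDER BY session_count DESC
LIMIT 10
"""

# Parameterized query: pass the start date safely instead of string formatting.
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(SQL, job_config=job_config).result():
    print(row.user_id, row.session_count)
```

A snippet like this is the kind of building block an Airflow task or a one-off migration script would wrap; infrastructure such as the dataset itself would typically be declared separately, e.g. with Terraform.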