Current jobs related to Data Engineer - GCP, PySpark, ETL, Big Data/Warehousing, SQL, Python 6+ Yrs - Bengaluru, Chennai, Gurugram - BCforward
-
Data Engineer-GCP 6+ Yrs C2H
2 weeks ago
Bengaluru, Chennai, Gurugram, India BCforward Full time ₹ 1,04,000 - ₹ 1,30,878 per year. Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED. Contract To Hire (C2H) Role. Location: Delhi/Bengaluru/Gurgaon/Chennai - Hybrid. Payroll: BCforward. Work Mode: Hybrid. Domain: Banking & Finance. JD Skills: GCP; ETL - Big Data / Data Warehousing; SQL; Python; PySpark, Airflow. Experienced data engineer with hands-on experience on GCP offerings. Experienced in...
-
GCP + PySpark
1 week ago
Gurugram, Haryana, India Cognizant Full time. Role - GCP + PySpark. Exp - 5 to 12 yrs. Notice period - 60 days. Location - Gurgaon. **Job description**: 1. Strong knowledge of PySpark 2. Experience working in SQL and NoSQL 3. BigQuery and Bigtable knowledge 4. Working experience in GCP 5. Knowledge of Hadoop, distributed execution, big data concepts 6. Hive shell scripting
-
GCP Data Engineer
1 week ago
Bengaluru, Chennai, India Ltimindtree Full time ₹ 15,00,000 - ₹ 25,00,000 per year. GCP BigQuery + Python + PySpark + Big Data; GCP BigQuery + PL/SQL + Unix + Shell scripting; GCP BigQuery + SQL + ETL; GCP BigQuery + Python + SQL + Dataflow, Airflow; GCP Data Architect. Please fill the form
-
GCP Data Engineer
7 days ago
Bengaluru, India Trendsetter Consultant Services Full time. Role: GCP Data Engineer (C2H). Location: Bangalore (Manyata Tech Park). Work Mode: 35 days from... - Build and maintain scalable data pipelines - Work on ETL/ELT workflows from multiple data sources - Ensure data quality, integrity, and governance - Collaborate with analysts and business teams. Skills Required: - Python, PySpark, SQL, CI/CD - Kafka, Pub-Sub,...
-
Big Data Python
2 weeks ago
Chennai, Tamil Nadu, India RF workforce Solutions Full time. Skill: Big Data with Python & PySpark. Yrs of experience: 5+ yrs. Notice: preferably immediate. **Job Types**: Full-time, Temporary. Contract length: 6 months. Pay: ₹60,000.00 - ₹82,000.00 per month. Schedule: Monday to Friday. **Experience**: Big Data: 5 years (required); Python: 5 years (required); PySpark: 5 years (required). Work Location: In...
-
Python, PySpark
2 weeks ago
Bengaluru, India Ziniosedge Full time. Python, PySpark 5 Technology Lead (Data Engineer) BE - 8+ years of industry experience as Lead Developer - Experience in implementing ETL and ELT data pipelines with PySpark - Spark Structured API, Spark SQL & Spark performance tuning are highly preferred - Experience in building data pipelines on a data lake or Lakehouse (AWS Databricks) & handling...
-
Big Data + GCP Engineer
4 days ago
Chennai, Tamil Nadu, India Impetus Technologies Full time. Qualification: Must have: Big Data, GCP (BigQuery, Dataproc). We are looking for energetic, high-performing and highly skilled data engineers to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Global Campaign Tracking (GCT) team under the Enterprise Personalization Portfolio focused on...
-
Big Data GCP
2 weeks ago
Bengaluru, Karnataka, India Tata Consultancy Services Full time. Hands-on experience in ETL and DB testing, data migration, report validations - GCP BigQuery knowledge - PySpark/Python experience - Expert in writing/understanding complex SQL queries - Advanced knowledge of Big Data concepts and technologies (Hive, Impala, Pentaho, HBase, Hadoop, etc.) - Exposure to the Property and Auto insurance business domain,...
-
Lead Cloud Data Engineer
7 days ago
Gurugram, India upGrad Full time. About the Job: We're hiring a cloud data engineer (preferably Azure) to work on data pipelines and Spark. - Work with the Databricks platform using Spark for big data processing and analytics. - Write optimized and efficient code using PySpark, Spark SQL and Python. - Develop and maintain ETL processes using Databricks notebooks and workflows. - Implement and...
-
GCP Data Engineer
2 weeks ago
Bengaluru, Karnataka, India Infogain Pte Ltd Full time. Core Skills. Exp: 5-10 years. Must have - Python, Java, BigQuery, PySpark - Extensive experience with Google Cloud Platform (GCP) data services such as BigQuery, Cloud Storage, and Dataflow. - Expertise in ETL (Extract, Transform, Load) processes and data integration on GCP. - Strong SQL and database query optimization skills on GCP. - Experience with data...

Data Engineer - GCP, PySpark, ETL, Big Data/Warehousing, SQL, Python (6+ Yrs)
2 weeks ago
Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED.
Contract To Hire (C2H) Role
Location: Bengaluru/Gurgaon/Chennai
Payroll: BCforward
Work Mode: Hybrid
JD
Skills: GCP; PySpark; ETL - Big Data / Data Warehousing; SQL; Python; BigQuery; Airflow
Experienced data engineer with hands-on experience on GCP offerings. Experienced in BigQuery/BigTable/PySpark and has worked on prior data engineering projects leveraging GCP product offerings. Strong SQL background. Prior banking Big Data experience is a big plus.
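Not part of the posting, but for context, a minimal sketch of the kind of pipeline this JD describes: reading a BigQuery table with PySpark (for example on Dataproc, via the spark-bigquery connector) and writing an aggregated result back. The project, dataset, table, and bucket names below are hypothetical placeholders, not values from the job description.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Spark session; on Dataproc the spark-bigquery connector is available by default.
spark = SparkSession.builder.appName("bq-pyspark-sketch").getOrCreate()

# Read a source table from BigQuery (hypothetical table name).
txns = (
    spark.read.format("bigquery")
    .option("table", "my-project.banking.transactions")
    .load()
)

# Simple aggregation: daily transaction totals per account.
daily = (
    txns.groupBy("account_id", F.to_date("txn_ts").alias("txn_date"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Write the result back to BigQuery, staging through a GCS bucket (hypothetical names).
(
    daily.write.format("bigquery")
    .option("table", "my-project.banking.daily_totals")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```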
Please share your updated resume, PAN card soft copy, passport-size photo & UAN history.
Interested applicants can share their updated resume to
Note: Looking for immediate to 30-day joiners at most.
All the best