GCP + Bigdata
1 week ago
Bengaluru, India LTIMindtree Full time
Primary Skills: Hands-on experience is mandatory for GCP, Python/Java, Spark, SQL, and Big Data.
Experience: 5 to 12 years
Location: Gurgaon & Bangalore
• Must Have Qualifications:
o Master's in computer applications or equivalent, OR a Bachelor's degree in engineering, computer science, or equivalent.
o Deep understanding of Hadoop and Spark architecture and their working principles.
o Deep understanding of data warehousing concepts.
o Ability to design and develop optimized data pipelines for batch and real-time data processing.
o 5+ years of software development experience.
o 5+ years of experience in Python or Java.
o Hands-on experience writing and understanding complex SQL (Hive/PySpark DataFrames) and optimizing joins while processing huge amounts of data.
o 3+ years of hands-on experience working with MapReduce, Hive, and Spark (core, SQL, and PySpark).
o Hands-on experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Composer).
o 3+ years of experience in UNIX shell scripting.
o Experience in analysis, design, development, testing, and implementation of system applications.
o Ability to communicate effectively with internal and external business partners.
• Additional Good to Have Requirements:
o Understanding of distributed ecosystems.
o Experience designing and building solutions using Kafka streams or queues.
o Experience with NoSQL databases, i.e., HBase, Cassandra, Couchbase, or MongoDB.
o Experience with data visualization tools like Tableau, Sisense, Looker.
o Ability to learn and apply new programming concepts.
o Knowledge of the financial reporting ecosystem will be a plus.
o Experience leading teams of engineers and scrum teams.
-
SDET - Bigdata
1 week ago
Bengaluru, Karnataka, India Uptycs Full time Strong experience testing Hadoop-based big data components running in a microservices environment. Strong programming skills in Python are a must. Expertise in Kafka, ZooKeeper, Postgres, Redis, Trino/Impala, HTTP proxy, etc. Expertise in ETL, AWS, Kubernetes, databases, SQL, and Linux. Expertise in Go or Node.js is a plus. Excellent troubleshooting skills and a...
-
GCP Bigdata Cloud Architect
1 week ago
Bengaluru, Karnataka, India Tiger Analytics Full time Architect (Cloud) - Chennai/Bangalore/Hyderabad. As an Architect (Cloud), you will analyze, architect, design, and actively develop cloud data warehouses, data lakes, and other cloud-based data solutions in AWS. Analyze, architect, design, and actively develop cloud data warehouses, data lakes, and other cloud-based data solutions in GCP. AWS experts with strong...
-
GCP Data Engineer
1 week ago
Bengaluru, Karnataka, India Alp Consulting Limited Full time Requirement: Mandatory to have knowledge of Big Data architecture patterns and experience in delivery of Big Data and Hadoop ecosystems. Strong experience required in GCP. Must have done multiple large projects with GCP BigQuery and ETL. Experience working in GCP-based Big Data deployments (batch/real-time) leveraging components like GCP BigQuery, air...
-
GCP Technical Architect
4 weeks ago
Bengaluru, India Impetus Full time Job Description: • 10-15 years of experience in the implementation of high-end software products. • Provides technical leadership in the Big Data space (Hadoop stack: Spark, M/R, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies. • Must have: operating knowledge of cloud computing platforms (GCP, especially BigQuery,...
-
Senior GCP Big Data Engineer
1 week ago
Bengaluru, Karnataka, India Notus Full time Job Title: GCP (Google Cloud Platform) Specialist. Number of Positions: 6 (1 senior role, 5 lead roles). Shift: 12:00 PM to 9:00 PM or 1:00 PM to 10:00 PM. Location: TechM India locations. Notice Period: Early/Immediate/30 days (positions need to be filled by May). Budget: Senior Role: 4 to 5/6 years of experience, depending upon hike % and candidate...
-
GCP-Big-Query
1 week ago
Bengaluru, Karnataka, India Maneva Consulting Pvt. Ltd Full time Greetings from Maneva! Job Description: Job Title: GCP BigQuery. Location: Bangalore. Experience: 7-15 years. Job Requirements: 7+ years total experience, with min. 3 years' experience building pipelines using GCP BigQuery. Min. 2-3 years' experience working on large-scale data warehouses like Teradata. Experience building and optimizing data warehousing data pipelines (ELT and ETL)...