
GCP Data Engineer
11 hours ago
About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products to help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth.
Founded in 1991, we are leaders in cloud and data engineering, providing solutions to Fortune 100 enterprises. We are headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and more than 3,000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.
Roles & Responsibilities
6-10 years of experience implementing high-end software products.
Provides technical leadership in the Big Data space (Hadoop stack: Spark, MapReduce, HDFS, Hive, HBase, etc.) and contributes to open-source Big Data technologies.
Must have: working knowledge of cloud computing platforms (GCP, especially the BigQuery, Dataflow, Dataproc, Cloud Storage, VMs, Networking, Pub/Sub, Cloud Functions, and Cloud Composer services).
Should be familiar with columnar file formats, e.g., Parquet, ORC, etc.
Visualize and evangelize next-generation infrastructure in the cloud platform/Big Data space (batch, near real-time, and real-time technologies).
Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open-source technologies and software paradigms.
Develop and implement an overall organizational data strategy aligned with business processes, including data model design, database development standards, and the implementation and management of data warehouses and data analytics systems.
Expert-level proficiency in at least 4-5 GCP services.
Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
Strong understanding and experience in distributed computing frameworks, particularly
Experience working within a Linux computing environment and using command-line tools, including shell/Python scripting for automating common tasks.
-
GCP Data Engineer
11 hours ago
Bangalore Urban, India - Impetus - Full time. Job Description: Must have: Big Data, GCP (BigQuery, Dataproc). We are looking for energetic, high-performing and highly skilled data engineers to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Global Campaign Tracking (GCT) team under the Enterprise Personalization Portfolio, focused on delivering the next...
-
GCP Data Engineer
20 hours ago
Bangalore, India - Impetus - Full time. Job Description: We need GCP engineers for capacity building. The candidate should have extensive production experience (1-2 years) in GCP; other cloud experience would be a strong bonus. Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc. Exposure to enterprise application...
-
GCP Data Engineer
19 hours ago
Bangalore, India - Ascendion - Full time. Job Title: GCP Data Engineer (4-12 Years). Job Type: Full-Time. Work Mode: Hybrid. Locations: Bengaluru, Hyderabad, Chennai, Pune. Job Summary: We are looking for a talented GCP BigQuery Data Engineer with strong SQL skills and basic proficiency in Python to join our data engineering team. The ideal candidate should have hands-on experience...
-
GCP Data Engineer
1 hour ago
Bangalore, India - Tata Consultancy Services - Full time. Dear Candidates, greetings from TCS! TCS is looking for a GCP Data Engineer. Job Location: Chennai, Hyderabad, Bangalore, Pune, Gurgaon. Experience: 5 to 10 years. Required technical skills: GCP BigQuery, Python, SQL, ETL. Requirements: Proficiency in programming languages: Python, Java. Expertise in data processing frameworks: Apache Beam (Dataflow). Active...
-
GCP Data Engineer
20 hours ago
Bangalore, India - Impetus - Full time. We are hiring a Sr. GCP Data Engineer for the Bangalore & Gurgaon locations. Candidates should have good experience in Big Data (Spark, SQL, PySpark) and GCP (BigQuery, Dataflow, Airflow, Pub/Sub, Dataproc, GCS, Cloud Composer). If you have the above skills and can join us on 0-30 days' notice, kindly share your resume. Roles & Responsibilities: ...
-
Senior Data Engineer
11 hours ago
Bangalore Urban, India - Impetus - Full time. 4-7 years of IT experience preferred. Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS (at least 4 of these services). Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, Cloud Functions. Strong experience in Big Data technologies - Hadoop, Sqoop, Hive and Spark, including...
-
GCP cloud engineer
19 hours ago
Bangalore, India - Impetus - Full time. Job Description for Big Data or Cloud Engineer. Position Summary: We are looking for candidates with hands-on experience in Big Data on GCP. Qualifications: 4-7 years of IT experience preferred. Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS. At least 4 of these...
-
GCP Data Architect
20 hours ago
Bangalore, India - LTIMindtree - Full time. Roles and Responsibilities (SF to BQ program): Take the requirements from the business and create 1) technical solution design, 2) data model, 3) data quality assurance, 4) identify rationalization/automation scope, 5) work with Spoke to clarify the solution design and review the LLD, 6) identify solution technical dependencies, 7) access security design, 8)...