GCP Data Architect

1 month ago


India Huquo Full time

Role : GCP Data Architect

Description :


As a GCP Data Architect, you will be responsible for architecting enterprise data solutions for migration, transformation, or modernization on Google Cloud Platform (GCP), integrating native GCP services and other 3rd-party data solutions. You will collaborate with cloud partners and our cross-functional teams to enable sector-specific use cases. Your responsibilities will include driving technology adoption across AI/ML, Generative AI, advanced analytics, and serverless, service-driven ecosystems for data-led, AI-enabled digital transformations.

Expert proficiency in the principles of large-scale architecting, solutioning, and operationalization of data pipelines, warehouses, data lakes, analytics platforms, and activation services on GCP is crucial. We are looking for professionals with solid experience in the design, delivery, and implementation of GCP infrastructure and solutions, along with 3rd-party technologies, on Google Cloud.

Work Location : Hybrid - all EXL locations. Relocation will be required for selected outstation candidates.

Experience : 12+ Years

Skills :

- 12+ years of proven experience as an Architect in Big Data Engineering.

- Mandatory Technical Proficiency - Hands-on experience with Python, PySpark, the Google Cloud tech stack, SQL, and DevOps.

- Experience with Google Cloud services for streaming and batch processing, such as Cloud Storage, Cloud Dataflow, Dataproc, BigQuery, and Bigtable (a minimal pipeline sketch follows this list).

- Proven real-world exposure to and use of contemporary data mining, cloud computing, and data management ecosystems such as Google Cloud, Hadoop, HDFS, and Spark.

- Proficiency in data modelling capable of representing complex data structures while ensuring accuracy, consistency, and efficiency, as well as in data warehousing and ETL processes.

- Ability to perform system analysis and assessment of existing systems and operating methodologies leveraging in-depth knowledge of big data technologies and ecosystems.

- Excellent problem-solving skills and the ability to address complex technical challenges.

- Strong communication and leadership skills.
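
By way of illustration, below is a minimal Apache Beam pipeline of the kind of streaming/batch Dataflow-to-BigQuery work referenced above. The GCS path, BigQuery table, and field names are hypothetical placeholders, and the sketch assumes newline-delimited JSON input; it runs locally on the DirectRunner and on Dataflow via the --runner flag.

```python
# Minimal sketch: GCS (newline-delimited JSON) -> BigQuery with Apache Beam.
# The bucket, table, and field names below are illustrative assumptions.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> dict:
    """Turn one JSON line into a BigQuery-ready row (assumed schema)."""
    record = json.loads(line)
    return {"user_id": record.get("user_id"), "event_type": record.get("event_type")}


def run() -> None:
    # DirectRunner by default; pass --runner=DataflowRunner (plus project,
    # region, temp_location) to execute the same pipeline on Cloud Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/events/*.json")
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my_project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```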

Highly Desired :

- Experience with Anaplan, Looker, and Power BI.

- Experience with Apigee, Apollo GraphQL.

- Experience with serverless data warehousing concepts.

- Additional programming/scripting languages: JavaScript, Java.

- Knowledge of Snowflake, Generative AI (LLMs), Marketing activation.

Role & Responsibilities :

- Solution Design: Participate in requirements gathering and architectural discussions. Define technical requirements and create solution designs that align with business goals and objectives.

- Technical Leadership: Lead the creation of technical designs/specifications and provide technical leadership and guidance to development teams, promoting best practices. Provide expertise in Master Data Management, Reference Data Management, Data Quality, Metadata Management, and data governance in general.

- Programming Hands-on: Develop and maintain software components using Python, PySpark, and GCP services to process and analyze large datasets efficiently (an illustrative sketch follows this list).


- Data Pipelines: Build data pipelines and perform data transformations. Evaluate emerging technologies for optimization opportunities and future enhancement needs such as self-serve and ad hoc reporting. Implement the infrastructure necessary for optimal and efficient ETL from disparate data sources.

- Performance Optimization: Identify and address performance bottlenecks, ensuring the system meets required throughput and latency targets.

- Security and Compliance: Ensure that data solutions adhere to security and compliance standards, implementing necessary controls and encryption mechanisms.

- Scalability: Architect scalable and highly available data solutions, considering both batch and real-time processing.

- Documentation: Create and maintain comprehensive technical documentation to support the development and maintenance of data solutions.
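
As a concrete illustration of the hands-on programming and pipeline responsibilities above, below is a minimal PySpark batch-ETL sketch reading from Cloud Storage and writing to BigQuery. The paths, table, column names, and the availability of the spark-bigquery connector on the Dataproc cluster are assumptions for illustration, not specifics of this role.

```python
# Minimal PySpark sketch: Cloud Storage (Parquet) -> transform -> BigQuery.
# Assumes a Dataproc cluster with the spark-bigquery connector; all names
# (bucket, table, columns) are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw orders landed in Cloud Storage.
orders = spark.read.parquet("gs://my-bucket/raw/orders/")

# Basic cleansing and transformation: de-duplicate and derive columns.
cleaned = (
    orders.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount_usd", F.round(F.col("amount") * F.col("fx_rate"), 2))
)

# Write the curated table to BigQuery via the spark-bigquery connector.
(
    cleaned.write.format("bigquery")
    .option("table", "my_project.analytics.orders_clean")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("append")
    .save()
)
```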

Education :

Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience.

(ref:hirist.tech)