GCP Data Engineer

2 weeks ago


Bengaluru, Karnataka, India | CustomerLabs ❤️ 1P Data OPs | Full time

Position Overview:

"Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it

the present." - Master Oogway

Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development.

"Sometimes the hardest choices require the strongest wills." - Thanos (but we

promise, our data decisions are much easier )

In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics.

What You'll Do:


• Design and implement enterprise-grade data pipelines for marketing data ingestion and processing
• Build and optimize data warehouses and data lakes to support digital marketing analytics
• Ensure data quality, security, and compliance across all marketing data systems
• Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking
• Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations
• Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions

Why This Role Matters:

"I can do this all day." - Captain America (and you'll want to, because this role is that

rewarding)

You'll be the backbone of the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work translates directly into smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the heavy lifting on the data side.

"Sometimes you gotta run before you can walk." - Iron Man (and sometimes you gotta

build the data pipeline before you can analyze the data )

Our Philosophy:

We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time.

"Your story may not have such a happy beginning, but that doesn't make you who

you are. It is the rest of your story, who you choose to be." - Soothsayer

What Makes You Special:

We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives.

"Inner peace... Inner peace... Inner peace..." - Po (because we know data engineering

can be challenging, but we've got your back )

Key Responsibilities:

Data Pipeline Development


• Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes
• Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)
• Implement data transformation and cleaning processes to ensure data quality and consistency
• Optimize data pipeline performance and reliability

Data Infrastructure Management

• Design and implement data warehouse architectures
• Manage and optimize database systems (SQL and NoSQL)
• Implement data lake solutions and data governance frameworks
• Ensure data security, privacy, and compliance with regulatory requirements

Data Modeling and Architecture

• Design and implement data models for analytics and reporting
• Create and maintain data dictionaries and documentation
• Develop data schemas and database structures
• Implement data versioning and lineage tracking

Collaboration and Support

• Work closely with Data Scientists, Analysts, and Business stakeholders
• Provide technical support for data-related issues and queries

Monitoring and Maintenance

• Implement monitoring and alerting systems for data pipelines
• Perform regular maintenance and optimization of data systems
• Troubleshoot and resolve data pipeline issues
• Conduct performance tuning and capacity planning
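
For a flavor of what the pipeline work above looks like in practice, here is a minimal sketch of a daily batch load from Cloud Storage into BigQuery, written as an Apache Airflow DAG (assuming Airflow 2.4+ with the Google provider package installed). Every bucket, project, dataset, and table name below is a hypothetical placeholder, not a reference to our actual infrastructure.

# Minimal sketch: daily Cloud Storage -> BigQuery batch load.
# Assumes Airflow 2.4+ with apache-airflow-providers-google installed.
# All bucket/project/dataset/table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="marketing_events_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # one batch load per day
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_ad_events_to_bigquery",
        bucket="example-marketing-raw",                  # hypothetical bucket
        source_objects=["ad_events/{{ ds }}/*.json"],    # files partitioned by run date
        destination_project_dataset_table="example-project.marketing.ad_events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
        autodetect=True,    # let BigQuery infer the schema for this sketch
    )

In a production pipeline you would typically pin an explicit schema, add data-quality checks, and wire in the monitoring and alerting described above; this sketch only shows the shape of the ingestion step.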

Required Qualifications

Experience


• 2+ years of experience in data engineering or related roles
• Proven experience with ETL/ELT pipeline development
• Experience with a cloud data platform (GCP)
• Experience with big data technologies (Spark, Kafka)

Technical Skills


• Programming Languages: Python, SQL, Golang (preferred)
• Databases: PostgreSQL, MySQL, Redis
• Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, dbt, Dataform
• Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.)
• Data Warehousing: Google BigQuery
• Version Control: Git, GitHub
• Containerization: Docker
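
To make the stack above a bit more concrete, here is a small illustrative query against BigQuery using the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical placeholders used only for this example.

# Small illustration of the BigQuery Python client (google-cloud-bigquery).
# The project/dataset/table below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

sql = """
SELECT
  campaign_id,
  COUNT(*) AS events,
  COUNTIF(event_type = 'conversion') AS conversions
FROM `example-project.marketing.ad_events`
WHERE DATE(event_time) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY campaign_id
ORDER BY conversions DESC
LIMIT 10
"""

for row in client.query(sql).result():  # run the query and iterate over the result rows
    print(row.campaign_id, row.events, row.conversions)

In practice this kind of aggregation would usually live in a dbt or Dataform model rather than ad-hoc Python, but the client shown here is handy for debugging pipelines and validating data quality.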

Soft Skills


• Strong problem-solving and analytical thinking
• Excellent communication and collaboration skills
• Ability to work independently and in team environments
• Strong attention to detail and data quality
• Continuous learning mindset

Preferred Qualifications

Additional Experience


• Experience with real-time data processing and streaming
• Knowledge of machine learning pipelines and MLOps
• Experience with data governance and data catalog tools
• Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)
• Experience using AI-powered tools such as Cursor, Claude, Copilot, or ChatGPT to accelerate coding, automate tasks, or assist in system design - we believe in running with the machine, not against it

Interview Process

1. Initial Screening: Phone/video call with HR

2. Technical Interview: Deep dive into data engineering concepts

3. Final Interview: Discussion with senior leadership

Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications.

Our Team Culture

"We are Groot." - We work together, we grow together, we succeed together.

We believe in:


• Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible
• Team Over Individual - Like the Avengers, we're stronger together than apart
• Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving
• Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing)

Growth Journey

"There is no charge for awesomeness... or attractiveness." -

Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior:


• Level 1: Master the basics of our data infrastructure
• Level 2: Build and optimize data pipelines
• Level 3: Lead complex data projects and mentor others
• Level 4: Become a data engineering legend (with your own theme music)

What We Promise

"I am Iron Man." - We promise you'll feel like a superhero every day


• Work that matters - Every pipeline you build helps real marketers succeed
• Growth opportunities - Learn new technologies and advance your career
• Supportive team - We've got your back, just like the Avengers
• Work-life balance - Because even superheroes need rest

