GCP Data Engineer

3 weeks ago


Kota, Rajasthan, India · CustomerLabs 1P Data Ops · Full time

Position Overview:

"Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present." - Master Oogway

Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development.

"Sometimes the hardest choices require the strongest wills." - Thanos (but we promise, our data decisions are much easier)

In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics.

What You'll Do:


• Design and implement enterprise-grade data pipelines for marketing data ingestion and processing


• Build and optimize data warehouses and data lakes to support digital marketing analytics


• Ensure data quality, security, and compliance across all marketing data systems


• Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking


• Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations


• Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions

Why This Role Matters:

"I can do this all day." - Captain America (and you'll want to, because this role is that rewarding)

You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the heavy data lifting.

"Sometimes you gotta run before you can walk." - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data)

Our Philosophy:

We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time.

"Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be." - Soothsayer

What Makes You Special:

We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives.

"Inner peace... Inner peace... Inner peace..." - Po (because we know data engineering can be challenging, but we've got your back)

Key Responsibilities:

Data Pipeline Development


• Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes


• Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)


• Implement data transformation and cleaning processes to ensure data quality and consistency


• Optimize data pipeline performance and reliability

Data Infrastructure Management


• Design and implement data warehouse architectures


• Manage and optimize database systems (SQL and NoSQL)


• Implement data lake solutions and data governance frameworks


• Ensure data security, privacy, and compliance with regulatory requirements

Data Modeling and Architecture


• Design and implement data models for analytics and reporting


• Create and maintain data dictionaries and documentation


• Develop data schemas and database structures


• Implement data versioning and lineage tracking

Collaboration and Support


• Work closely with Data Scientists, Analysts, and Business stakeholders


• Provide technical support for data-related issues and queries

Monitoring and Maintenance


• Implement monitoring and alerting systems for data pipelines


• Perform regular maintenance and optimization of data systems


• Troubleshoot and resolve data pipeline issues


• Conduct performance tuning and capacity planning
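
The pipeline responsibilities above can be pictured in miniature: ingest raw records, apply transformation and quality checks, and pass only clean rows downstream. The sketch below is illustrative only; the field names (`campaign_id`, `clicks`) and quality rules are assumptions for the example, not CustomerLabs' actual schema.

```python
# Minimal, illustrative sketch of one pipeline stage: ingest raw
# marketing events, clean them, and enforce simple quality rules.
# Schema and rules here are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Event:
    campaign_id: str
    clicks: int


def clean(raw_rows):
    """Transform raw dicts into validated Event records, dropping
    rows that fail basic quality checks (missing id, negative clicks)."""
    events = []
    for row in raw_rows:
        campaign_id = (row.get("campaign_id") or "").strip()
        clicks = row.get("clicks", 0)
        if not campaign_id or not isinstance(clicks, int) or clicks < 0:
            continue  # drop (or quarantine) records that fail quality checks
        events.append(Event(campaign_id, clicks))
    return events


raw = [
    {"campaign_id": "cmp-1", "clicks": 10},
    {"campaign_id": "", "clicks": 5},        # missing id -> dropped
    {"campaign_id": "cmp-2", "clicks": -3},  # negative clicks -> dropped
]
print(len(clean(raw)))  # 1 valid record survives
```

In production this logic would typically live inside a Dataflow transform or an Airflow task rather than a plain function, but the shape (validate, drop or quarantine, emit) stays the same.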

Required Qualifications

Experience


• 2+ years of experience in data engineering or related roles


• Proven experience with ETL/ELT pipeline development


• Experience with a cloud data platform (GCP)


• Experience with big data technologies (Spark, Kafka)

Technical Skills


• Programming Languages: Python, SQL, Golang (preferred)


• Databases: PostgreSQL, MySQL, Redis


• Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform


• Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.)


• Data Warehousing: Google BigQuery


• Version Control: Git, GitHub


• Containerization: Docker

Soft Skills


• Strong problem-solving and analytical thinking


• Excellent communication and collaboration skills


• Ability to work independently and in team environments


• Strong attention to detail and data quality


• Continuous learning mindset

Preferred Qualifications

Additional Experience


• Experience with real-time data processing and streaming


• Knowledge of machine learning pipelines and MLOps


• Experience with data governance and data catalog tools


• Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)


• Experience using AI-powered tools such as Cursor, Claude, Copilot, or ChatGPT to accelerate coding, automate tasks, or assist in system design. We believe in running with the machine, not against it.

Interview Process

1. Initial Screening: Phone/video call with HR

2. Technical Interview: Deep dive into data engineering concepts

3. Final Interview: Discussion with senior leadership

Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications.

Our Team Culture

"We are Groot." - We work together, we grow together, we succeed together.

We believe in:


• Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible


• Team Over Individual - Like the Avengers, we're stronger together than apart


• Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving


• Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing)

Growth Journey

"There is no charge for awesomeness... or attractiveness." - Po

Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior:


• Level 1: Master the basics of our data infrastructure


• Level 2: Build and optimize data pipelines


• Level 3: Lead complex data projects and mentor others


• Level 4: Become a data engineering legend (with your own theme music)

What We Promise

"I am Iron Man." - We promise you'll feel like a superhero every day


• Work that matters - Every pipeline you build helps real marketers succeed


• Growth opportunities - Learn new technologies and advance your career


• Supportive team - We've got your back, just like the Avengers


• Work-life balance - Because even superheroes need rest

