
GCP Data Engineer
Position Overview:
"Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it
the present." - Master Oogway
Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in
transforming raw marketing data into actionable insights that power our digital
marketing platform. As a key member of our data infrastructure team, you will design,
develop, and maintain robust data pipelines, data warehouses, and analytics platforms
that serve as the backbone of our digital marketing product development.
"Sometimes the hardest choices require the strongest wills." - Thanos (but we
promise, our data decisions are much easier)
In this role, you will collaborate with cross-functional teams including Data Scientists,
Product Managers, and Marketing Technology specialists to ensure seamless data flow
from various marketing channels, ad platforms, and customer touchpoints to our
analytics dashboards and reporting systems. You'll be responsible for building scalable,
reliable, and efficient data solutions that can handle high-volume marketing data
processing and real-time campaign analytics.
What You'll Do:
• Design and implement enterprise-grade data pipelines for marketing data
ingestion and processing
• Build and optimize data warehouses and data lakes to support digital marketing
analytics
• Ensure data quality, security, and compliance across all marketing data systems
• Create data models and schemas that support marketing attribution, customer
journey analysis, and campaign performance tracking (a minimal attribution sketch
follows this list)
• Develop monitoring and alerting systems to maintain data pipeline reliability for
critical marketing operations
• Collaborate with product teams to understand digital marketing requirements
and translate them into technical solutions
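To make the attribution bullet above a little more concrete, here is a minimal, purely illustrative sketch of a last-touch attribution query run from Python with the official BigQuery client. The project, dataset, table, and column names are hypothetical, and a real attribution model would be far richer; this only shows the general shape of the work.

```python
from google.cloud import bigquery

# Hypothetical table and column names, used only for illustration.
LAST_TOUCH_SQL = """
WITH ranked_touches AS (
  SELECT
    conversion_id,
    channel,
    touch_ts,
    ROW_NUMBER() OVER (PARTITION BY conversion_id ORDER BY touch_ts DESC) AS rn
  FROM `my-project.marketing.touchpoints`
)
SELECT channel, COUNT(*) AS conversions
FROM ranked_touches
WHERE rn = 1  -- keep only the last touch before each conversion
GROUP BY channel
ORDER BY conversions DESC
"""

def last_touch_attribution() -> None:
    """Run the attribution query and print conversions per channel."""
    client = bigquery.Client()  # picks up default GCP credentials
    for row in client.query(LAST_TOUCH_SQL).result():
        print(f"{row.channel}: {row.conversions} conversions")

if __name__ == "__main__":
    last_touch_attribution()
```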
Why This Role Matters:
"I can do this all day." - Captain America (and you'll want to, because this role is that
rewarding)
You'll be the backbone of the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work translates directly into smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the heavy lifting of complex data.
"Sometimes you gotta run before you can walk." - Iron Man (and sometimes you gotta
build the data pipeline before you can analyze the data)
Our Philosophy:
We believe in the power of data to transform lives, just like the Dragon Warrior
transformed the Valley of Peace. Every line of code you write, every pipeline you build,
and every insight you enable has the potential to change how marketers work and
succeed. We're not just building data systems - we're building the future of digital
marketing, one insight at a time.
"Your story may not have such a happy beginning, but that doesn't make you who
you are. It is the rest of your story, who you choose to be." - Soothsayer
What Makes You Special:
We're looking for someone who embodies the spirit of both Captain America's
unwavering dedication and Iron Man's innovative genius. You'll need the patience to
build robust systems (like Cap's shield) and the creativity to solve complex problems
(like Tony's suit). Most importantly, you'll have the heart to make a real difference in
marketers' lives.
"Inner peace... Inner peace... Inner peace..." - Po (because we know data engineering
can be challenging, but we've got your back)
Key Responsibilities:
Data Pipeline Development
• Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes
• Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)
• Implement data transformation and cleaning processes to ensure data quality and consistency
• Optimize data pipeline performance and reliability
Data Infrastructure Management
• Design and implement data warehouse architectures
• Manage and optimize database systems (SQL and NoSQL)
• Implement data lake solutions and data governance frameworks
• Ensure data security, privacy, and compliance with regulatory requirements
Data Modeling and Architecture
• Design and implement data models for analytics and reporting
• Create and maintain data dictionaries and documentation
• Develop data schemas and database structures
• Implement data versioning and lineage tracking
Collaboration and Support
• Work closely with Data Scientists, Analysts, and Business stakeholders
• Provide technical support for data-related issues and queries
Monitoring and Maintenance
• Implement monitoring and alerting systems for data pipelines (see the sketch after this list)
• Perform regular maintenance and optimization of data systems
• Troubleshoot and resolve data pipeline issues
• Conduct performance tuning and capacity planning
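As a rough, hypothetical illustration of the monitoring and alerting bullet above (not CustomerLabs' actual pipeline), an Airflow DAG might attach a failure callback to a scheduled BigQuery rollup job so that on-call engineers hear about breakages quickly. Every identifier below (DAG id, project, dataset, query) is made up for the example.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

def notify_failure(context):
    # Placeholder alert hook: in practice this might page on-call, post to
    # Slack, or raise a Cloud Monitoring incident. Kept as a log line here.
    print(f"Pipeline task failed: {context['task_instance'].task_id}")

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,
}

with DAG(
    dag_id="daily_campaign_rollup",   # hypothetical pipeline name
    schedule_interval="0 2 * * *",    # run daily at 02:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
) as dag:
    BigQueryInsertJobOperator(
        task_id="rollup_campaign_performance",
        configuration={
            "query": {
                # Hypothetical rollup; real transformations would live in DBT/Dataform.
                "query": """
                    SELECT campaign_id, DATE(event_ts) AS day, COUNT(*) AS events
                    FROM `my-project.marketing.events`
                    GROUP BY campaign_id, day
                """,
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "marketing",
                    "tableId": "campaign_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```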
Required Qualifications
Experience
• 2+ years of experience in data engineering or related roles
• Proven experience with ETL/ELT pipeline development
• Experience with a cloud data platform (GCP)
• Experience with big data technologies (Spark, Kafka)
Technical Skills
• Programming Languages: Python, SQL, Golang (preferred)
• Databases: PostgreSQL, MySQL, Redis
• Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform (a minimal pipeline sketch follows this list)
• Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud
Storage, Pub/Sub, App Engine, Compute Engine, etc.)
• Data Warehousing: Google BigQuery
• Version Control: Git, GitHub
• Containerization: Docker
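To ground the stack above, here is a minimal, hypothetical sketch of a streaming ingestion pipeline written with Apache Beam in Python (the kind of job that would run on Dataflow): it reads JSON marketing events from a Pub/Sub topic and appends them to a BigQuery table. The topic, project, table, and field names are assumptions for illustration only.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub payload into a flat row matching the BigQuery schema."""
    event = json.loads(message.decode("utf-8"))
    return {
        "event_name": event.get("event_name"),
        "user_id": event.get("user_id"),
        "source": event.get("source"),
        "event_ts": event.get("event_ts"),
    }

def run() -> None:
    options = PipelineOptions(streaming=True)  # add Dataflow options to run on GCP
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/marketing-events"
            )
            | "ParseJSON" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:marketing.events",
                schema="event_name:STRING,user_id:STRING,source:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```

Running the same code locally with the DirectRunner versus on Dataflow is mostly a matter of pipeline options, which is part of why Beam is a common choice for this kind of ingestion work.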
Soft Skills
• Strong problem-solving and analytical thinking
• Excellent communication and collaboration skills
• Ability to work independently and in team environments
• Strong attention to detail and data quality
• Continuous learning mindset
Preferred Qualifications
Additional Experience
• Experience with real-time data processing and streaming
• Knowledge of machine learning pipelines and MLOps
• Experience with data governance and data catalog tools
• Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)
• Experience using AI-powered tools such as Cursor, Claude, Copilot, or ChatGPT to accelerate coding, automate tasks, or assist in system design. We believe in running with the machine, not against it.
Interview Process
1. Initial Screening: Phone/video call with HR
2. Technical Interview: Deep dive into data engineering concepts
3. Final Interview: Discussion with senior leadership
Note: This job description is intended to provide a general overview of the position and
may be modified based on organizational needs and candidate qualifications.
Our Team Culture
"We are Groot." - We work together, we grow together, we succeed together.
We believe in:
• Innovation First - Like Iron Man, we're always pushing the boundaries of
what's possible
• Team Over Individual - Like the Avengers, we're stronger together than apart
• Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving
• Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing)
Growth Journey
"There is no charge for awesomeness... or attractiveness." -
Your journey with us will be like Po's transformation from noodle maker to Dragon
Warrior:
• Level 1: Master the basics of our data infrastructure
• Level 2: Build and optimize data pipelines
• Level 3: Lead complex data projects and mentor others
• Level 4: Become a data engineering legend (with your own theme music)
What We Promise
"I am Iron Man." - We promise you'll feel like a superhero every day
• Work that matters - Every pipeline you build helps real marketers succeed
• Growth opportunities - Learn new technologies and advance your career
• Supportive team - We've got your back, just like the Avengers
• Work-life balance - Because even superheroes need rest