Senior Data Engineer

Posted 4 days ago

Gurgaon, Haryana, India · WPP · Full time · ₹ 12,00,000 - ₹ 36,00,000 per year

WPP is the creative transformation company. We use the power of creativity to build better futures for our people, planet, clients, and communities.
Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore.
WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500.
Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow.
Why we're hiring:
We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. In this critical role, you will be instrumental in designing, building, and optimizing our scalable data lakehouse platform using Databricks. You will be a key player in developing robust data pipelines that ingest data from various sources, including Google Analytics 4 (GA4), and transform it into reliable, analysis-ready datasets within the Databricks environment.

This role requires deep expertise in Databricks, Apache Spark, Python (PySpark), and SQL. You will be responsible for the entire data lifecycle within the lakehouse, from ingestion and transformation to governance and optimization, ensuring data quality and performance. You should be adept at analyzing performance bottlenecks in Spark jobs, providing enhancement recommendations, and collaborating effectively with both technical and non-technical stakeholders.

What you'll be doing:

  • Design, build, and deploy robust ETL/ELT pipelines within the Databricks Lakehouse Platform using PySpark and Spark SQL.
  • Implement and manage the Medallion Architecture (Bronze, Silver, Gold layers) using Delta Lake to ensure data quality and progressive data refinement (a minimal sketch follows this list).
  • Leverage Databricks Auto Loader for efficient, scalable, and incremental ingestion of data from sources like GA4 into the Bronze layer.
  • Develop, schedule, and monitor complex, multi-task data workflows using Databricks Workflows.
  • Optimize Spark jobs and Delta Lake tables (using techniques like OPTIMIZE, Z-ORDER, and partitioning) for high performance and cost efficiency.
  • Implement data governance, security, and discovery using Unity Catalog, including managing access controls and data lineage.
  • Write complex, customized SQL queries to manipulate data and support ad-hoc analytical requests from business teams.
  • Develop strategies for data ingestion from multiple sources, using various techniques including streaming, API consumption, and replication.
  • Document data engineering processes, data models, and technical specifications for the Databricks platform.
  • Conform to agile development practices, including version control (Git), continuous integration/delivery (CI/CD), and test-driven development.
  • Provide production support for data pipelines, actively monitoring and resolving issues to ensure the continuous flow of critical data.
  • Collaborate with analytics and business teams to understand data requirements and deliver well-modelled, performant datasets in the Gold layer.

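To make the responsibilities above concrete, here is a minimal, illustrative PySpark sketch of the flow this list describes: Auto Loader ingestion into a Bronze table, a MERGE-based refinement into Silver, and an OPTIMIZE/Z-ORDER pass. It assumes a Databricks notebook, where spark is predefined; the storage path, table names, file format, and the event_id key are assumptions for illustration, not the actual platform configuration.

    # Minimal sketch only: paths, table names, and the GA4 feed format are
    # assumptions, not the team's actual configuration. Assumes a Databricks
    # notebook, where spark (the SparkSession) is already defined.
    from pyspark.sql import functions as F
    from delta.tables import DeltaTable

    RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/ga4/events/"   # hypothetical
    BRONZE_TABLE = "main.bronze.ga4_events"                                 # hypothetical
    SILVER_TABLE = "main.silver.ga4_events"                                 # hypothetical

    # 1) Incremental ingestion into the Bronze layer with Auto Loader (cloudFiles).
    bronze_stream = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/Volumes/main/bronze/_schemas/ga4_events")
        .load(RAW_PATH)
        .withColumn("_ingested_at", F.current_timestamp())
    )

    query = (
        bronze_stream.writeStream
        .option("checkpointLocation", "/Volumes/main/bronze/_checkpoints/ga4_events")
        .trigger(availableNow=True)      # process new files as an incremental batch
        .toTable(BRONZE_TABLE)
    )
    query.awaitTermination()

    # 2) Refine into the Silver layer: deduplicate and upsert via a Delta MERGE
    #    (assumes SILVER_TABLE already exists and the feed carries an event_id key).
    updates = spark.read.table(BRONZE_TABLE).dropDuplicates(["event_id"])
    (
        DeltaTable.forName(spark, SILVER_TABLE).alias("t")
        .merge(updates.alias("s"), "t.event_id = s.event_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

    # 3) Keep the Silver table fast to query and cost-efficient.
    spark.sql(f"OPTIMIZE {SILVER_TABLE} ZORDER BY (event_date, event_name)")

In practice each step would typically run as a separate task in a Databricks Workflow, but the sequence above is enough to show how the layers relate.
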
What you'll need:

  • Education: A Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or a related technical field is preferred.
  • Experience: 8+ years of relevant experience in data engineering, with a significant focus on building data pipelines on distributed systems.
  • Engineer's Core Skills:

    • Databricks Lakehouse Platform Expertise:

      • Apache Spark: Deep, hands-on experience with Spark architecture, writing and optimizing complex PySpark and Spark SQL jobs. Proven ability to use the Spark UI to diagnose and resolve performance bottlenecks.
      • Delta Lake: Mastery of Delta Lake for building reliable data pipelines. Proficient with ACID transactions, time travel, schema evolution, and DML operations (MERGE, UPDATE, DELETE); a short illustration follows this group.
      • Data Ingestion: Experience with modern ingestion tools, particularly Databricks Auto Loader and COPY INTO for scalable file processing.
      • Unity Catalog: Strong understanding of data governance concepts and practical experience implementing security, lineage, and discovery using Unity Catalog.

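As a hedged illustration of the Delta Lake and Unity Catalog features named above (time travel, DML, governance), the snippet below could be run from a Databricks notebook where spark is predefined; the table, catalog, and group names are hypothetical.

    # Illustrative only: the table, catalog, and group names are hypothetical.
    # Delta Lake time travel: inspect the transaction log, then query an earlier version.
    spark.sql("DESCRIBE HISTORY main.silver.ga4_events").show(truncate=False)
    v3 = spark.sql("SELECT * FROM main.silver.ga4_events VERSION AS OF 3")

    # Delta DML: correct records in place, with full ACID guarantees.
    spark.sql("""
        UPDATE main.silver.ga4_events
        SET event_name = 'page_view'
        WHERE event_name = 'pageview'
    """)

    # Unity Catalog governance: grant read access to an analyst group, then review grants.
    spark.sql("GRANT SELECT ON TABLE main.silver.ga4_events TO `analysts`")
    spark.sql("SHOW GRANTS ON TABLE main.silver.ga4_events").show(truncate=False)
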
    • Core Engineering & Cloud Skills:

      • Programming: 5+ years of strong, hands-on experience in Python, with an emphasis on PySpark for large-scale data transformation.
      • SQL: 6+ years of advanced SQL experience, including complex joins, window functions, and CTEs; a brief example follows this group.
      • Cloud Platforms: 5+ years working with a major cloud provider (Azure, AWS, or GCP), including expertise in cloud storage (ADLS Gen2, S3), security (IAM), and networking.
      • Data Modeling: Experience designing star schemas and applying data warehouse methodologies to build analytical models (Gold layer).
      • CI/CD & DevOps: Hands-on experience with version control (Git) and CI/CD pipelines (e.g., GitHub Actions, Azure DevOps) for automating the deployment of Databricks assets.

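And for the style of SQL referenced above, a small example run through PySpark: a CTE combined with a window function to pick each user's most recent event. The table and column names are hypothetical.

    # Illustrative only: main.gold.ga4_sessions and its columns are hypothetical.
    latest_events = spark.sql("""
        WITH ranked AS (                            -- CTE
            SELECT
                user_pseudo_id,
                event_name,
                event_timestamp,
                ROW_NUMBER() OVER (                 -- window function
                    PARTITION BY user_pseudo_id
                    ORDER BY event_timestamp DESC
                ) AS rn
            FROM main.gold.ga4_sessions
        )
        SELECT user_pseudo_id, event_name, event_timestamp
        FROM ranked
        WHERE rn = 1                                -- most recent event per user
    """)
    latest_events.show(5, truncate=False)
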
  • Tools & Technologies:

    • Primary Data Platform: Databricks
    • Cloud Platforms: Azure (preferred), GCP, AWS
    • Data Warehouses (Integration): Snowflake, Google BigQuery
    • Orchestration/Transformation: Databricks Workflows, dbt (data build tool)
    • Version Control: Git/GitHub or similar repositories
    • Infrastructure as Code (Bonus): Terraform
    • BI Tools (Bonus): Looker or Power BI

  • You're good at:

    • Working independently and proactively solving complex technical problems.
    • Collaborating positively within a team and partnering with remote members in different time zones.
    • Communicating complex technical concepts clearly to non-technical audiences.
    • Thriving in a fast-paced, service-oriented environment.
    • Working within an Agile methodology.

Who you are:

You're open:
We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are open-minded: to new ideas, new partnerships, new ways of working.

You're optimistic:
We believe in the power of creativity, technology and talent to create brighter futures for our people, our clients and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected.

You're extraordinary:
We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we deliver the extraordinary every day.

What we'll give you:

Passionate, inspired people – We aim to create a culture in which people can do extraordinary work.

Scale and opportunity – We offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry.

Challenging and stimulating work – Unique work and the opportunity to join a group of creative problem solvers. Are you up for the challenge?

We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we've adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process.
WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers.
Please read our Privacy Notice for more information on how we process the information you provide.

