Current jobs related to Staff Data Platform Engineer - Pune - Zendesk
-
Staff Data Platform Engineer(Dev Ops)
2 days ago
Pune, Maharashtra, India | Zendesk | Full time | ₹ 1,50,00,000 - ₹ 2,50,00,000 per year. Job Description: Staff Platform Engineer (DevOps). Our Enterprise Data & Analytics (EDA) is looking for an experienced Staff Data Platform Engineer to join our growing Platform engineering team. You'll work in a collaborative Agile environment using the latest engineering best practices with involvement in all aspects of the software development lifecycle. As a...
-
Staff Software Engineer
2 weeks ago
Pune, India | Addepar | Full time. Job Description: Who We Are: Addepar is a global technology and data company that helps investment professionals provide the most informed, precise guidance for their clients. Hundreds of thousands of users have entrusted Addepar to empower smarter investment decisions and better advice over the last decade. With client presence in more than 50 countries,...
-
Staff Data Engineer
5 days ago
Pune, Maharashtra, India | Pattern | Full time | ₹ 12,00,000 - ₹ 24,00,000 per year. Role Overview: As a Staff Data Engineer, you'll be a technical leader and systems thinker within Pattern's Data Engineering team—designing and scaling our canonical data model to deliver trusted, high-quality data. You'll transform complex, raw data with a set of efficient pipelines into tables that are easy to understand and efficient to power analytics,...
-
Staff Data Engineer
5 days ago
Pune, Maharashtra, India | Pattern | Full time | ₹ 8,00,000 - ₹ 24,00,000 per year. Role Overview: As a Staff Data Engineer, you'll be a technical leader and systems thinker within Pattern's Data Engineering team—designing and scaling our canonical data model to deliver trusted, high-quality data. You'll transform complex, raw data with a set of efficient pipelines into tables that are easy to understand and efficient to power analytics,...
-
Data Engineer
7 days ago
Pune, Maharashtra, India | Data Axle | Full time | ₹ 15,00,000 - ₹ 28,00,000 per year. About Data Axle: Data Axle Inc. has been an industry leader in data, marketing solutions, sales and research for 50 years in the US. Data Axle has set up a strategic global centre of excellence in Pune. This centre delivers mission-critical data services to its global customers, powered by its proprietary cloud-based technology platform and leveraging proprietary...
-
Staff AI Platform Engineer
2 weeks ago
Pune, India | Zscaler | Full time. Job Description: About Zscaler: Serving thousands of enterprise customers around the world, including 45% of Fortune 500 companies, Zscaler (NASDAQ: ZS) was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users. As the operator of the world's largest security cloud, Zscaler accelerates...
-
Data Operations Engineer
2 weeks ago
Pune, Maharashtra, India | Mars Data Insights | Full time | ₹ 15,00,000 - ₹ 25,00,000 per year. Title: Data Operations Engineer & RUN Support. Skills: Data Operations Engineering, data manipulation, Python, Talend, GCP, BigQuery, Dataiku, ITSM/ticketing tools, Helix, Jira, task management, data pipelines, RUN Service, data infrastructure, data quality. Job Location: Pune. Job Type: Full time/Hybrid. Work Experience: 5+ years. We are seeking a highly...
-
Senior Data Platform Engineer
2 weeks ago
Pune, Maharashtra, India | Zywave, Inc. | Full time | ₹ 6,00,000 - ₹ 18,00,000 per year. Job Title: Senior Data Platform Engineer. Location: Pune, India. Work Mode: Work From Office (WFO), 5 days a week. Shift Timing: 12:00 PM – 9:00 PM IST. About Zywave: Zywave is a leading provider of InsurTech solutions, empowering insurance brokers and agencies with innovative software tools to grow and manage their business. We are building a modern data platform...
-
Druva - Staff Software Engineer
22 hours ago
Pune, India | Druva Data Solutions Private Limited | Full time. About Us: Druva is the leading provider of data security solutions, empowering customers to secure and recover their data from all threats. The Druva Data Security Cloud is a fully managed SaaS solution offering air-gapped and immutable data protection across cloud, on-premises, and edge environments. By centralizing data protection, Druva enhances...
-
Principal Engineer – Data Platforms
5 days ago
Pune, Maharashtra, India | Codvo | Full time | ₹ 20,00,000 - ₹ 25,00,000 per year. Principal Engineer – Data Platforms & MLOps (Databricks). Company Overview: At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we...
Staff Data Platform Engineer
4 weeks ago
Our Enterprise Data & Analytics (EDA) team is looking for an experienced Staff Data Platform Engineer to join our growing Platform engineering team. You'll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle.
As a Staff Data Platform Engineer, you will shape the strategy, architecture, and execution of Zendesk’s data platform that powers next‑generation reporting and analytics across the product portfolio. You’ll lead complex, multi-team initiatives; set technical standards for scalable, reliable data systems; and mentor senior engineers while partnering closely with Product, Security, and Analytics to deliver high-quality, compliant, and cost-efficient data products at scale.
Data is at the heart of Zendesk’s business—this is a high-impact, high-ownership role with broad technical and organizational influence.
What you get to do every single day
Lead architecture and roadmap
Define and evolve the end-to-end data platform architecture across ingestion, transformation, storage and governance.
Establish standardized data contracts, schemas, documentation, and tooling that improve consistency and reduce time-to-data for analytics and product teams (a minimal contract sketch follows this list).
Lead build-vs-buy evaluations and pilot new technologies to improve reliability, speed, or cost.
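For illustration only, a standardized data contract of the kind mentioned above could be expressed as a small, versioned schema definition checked against incoming records. This is a minimal Python sketch; the dataset, field names, and validate helper are hypothetical and not Zendesk tooling.

from dataclasses import dataclass

# Minimal sketch of a data contract: a versioned schema that producers and
# consumers agree on, checked before records enter the platform.
@dataclass(frozen=True)
class FieldSpec:
    name: str
    dtype: str            # e.g. "string", "int64", "timestamp"
    nullable: bool = False

@dataclass(frozen=True)
class DataContract:
    dataset: str
    version: int
    fields: tuple

    def validate(self, record: dict) -> list:
        """Return a list of contract violations for one record (empty list means it conforms)."""
        violations = []
        for spec in self.fields:
            if spec.name not in record:
                violations.append(f"missing field: {spec.name}")
            elif record[spec.name] is None and not spec.nullable:
                violations.append(f"null not allowed: {spec.name}")
        return violations

# Hypothetical contract shared between the platform and analytics teams.
ticket_events_v2 = DataContract(
    dataset="ticket_events",
    version=2,
    fields=(
        FieldSpec("ticket_id", "int64"),
        FieldSpec("event_type", "string"),
        FieldSpec("occurred_at", "timestamp"),
        FieldSpec("assignee_id", "int64", nullable=True),
    ),
)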
Deliver platform capabilities at scale
Design and deliver secure, highly available data services and pipelines that handle large-scale, mission-critical workloads.
Establish SLOs/SLIs for data pipelines, lineage, and serving layers; implement robust observability (metrics, tracing, alerting) and incident response (a freshness-check sketch follows this list).
Partner with Analytics, Product, and Security to translate business needs into robust, scalable data solutions and SLAs.
Tune query and pipeline performance and enforce FinOps guardrails to reduce Snowflake storage and compute spend without compromising reliability.
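As a rough sketch of the SLO/SLI and observability work described above, a pipeline freshness SLI could be computed and compared against its SLO before alerting. The table name and two-hour target below are assumptions; in practice the last-load timestamp would come from warehouse metadata and a breach would flow into the alerting stack.

from datetime import datetime, timedelta, timezone

# Sketch: a freshness SLI for a pipeline output table, compared against its SLO.
FRESHNESS_SLO = {"fact_ticket_metrics": timedelta(hours=2)}  # illustrative target

def freshness_sli(last_loaded_at: datetime) -> timedelta:
    """SLI: how far behind real time the table currently is (expects a tz-aware timestamp)."""
    return datetime.now(timezone.utc) - last_loaded_at

def meets_freshness_slo(table: str, last_loaded_at: datetime) -> bool:
    """Return True if the table meets its freshness SLO; otherwise report the breach."""
    lag = freshness_sli(last_loaded_at)
    ok = lag <= FRESHNESS_SLO[table]
    if not ok:
        # Hook point for metrics/alerting (e.g. emit a gauge and open an incident).
        print(f"SLO breach: {table} is {lag} behind (target {FRESHNESS_SLO[table]})")
    return ok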
Raise the engineering bar
Define standards for data modeling, testing (unit/integration/contract), CI/CD, IaC, and code quality; champion reproducibility and reliability.
Advance data quality and governance (DQ checks, metadata, lineage, PII handling, RBAC) with "privacy-by-design" practices (an example check is sketched after this list).
Conduct deep root cause analyses; drive systemic fixes that improve resilience and developer experience.
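In their simplest form, the data-quality checks mentioned above might look like the following Python sketch, run against a batch of records before it is published downstream; the column names and the specific check set are hypothetical.

# Sketch of simple, declarative data-quality checks applied to a batch of records.
def check_not_null(rows, column):
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls == 0, f"{nulls} null values in {column}"

def check_unique(rows, column):
    values = [r[column] for r in rows if r.get(column) is not None]
    dupes = len(values) - len(set(values))
    return dupes == 0, f"{dupes} duplicate values in {column}"

def run_dq_checks(rows):
    results = {
        "ticket_id not null": check_not_null(rows, "ticket_id"),
        "ticket_id unique": check_unique(rows, "ticket_id"),
    }
    # Any failure would normally block the publish step and surface owner/lineage info.
    return [(name, msg) for name, (ok, msg) in results.items() if not ok]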
What you bring to the role
8+ years of data engineering experience building, operating, and maintaining scalable data infrastructure (data pipelines and ETL processes in big data environments)
3+ years leading complex, cross-team initiatives at Senior/Staff level.
3+ years of experience with cloud columnar databases (Snowflake)
Proven experience as a CI/CD Engineer or DevOps Engineer, with a focus on data platforms and analytics (Terraform, Docker, Kubernetes, GitHub Actions)
Experience with at least one cloud platform (AWS, Google Cloud)
Proficiency in query authoring (SQL) and data processing (batch and streaming)
Intermediate experience with at least one of the following programming languages: Python, Go, Java, or Scala (we primarily use Python)
Experience with ETL schedulers such as Apache Airflow, AWS Glue, or similar frameworks (a minimal Airflow sketch follows this list)
Experience integrating with third-party SaaS APIs such as Salesforce, Zuora, etc.
Ability to ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Excellent collaboration and communication skills.
Ability to work closely with data scientists, analysts, and business stakeholders to translate business requirements into technical solutions. Strong documentation skills for pipeline design and data flow diagrams.
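To illustrate the ETL scheduler experience listed above, a minimal Apache Airflow DAG wiring an extract step into a transform step might look like this sketch; the DAG id, schedule, and task callables are placeholders, not an actual Zendesk pipeline.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system or third-party API.
    pass

def transform():
    # Placeholder: model the raw data (e.g. kick off dbt or a Snowflake job).
    pass

with DAG(
    dag_id="example_daily_pipeline",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ "schedule" argument
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task     # run transform only after extract succeeds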
Tech stack
ELT (Fivetran, Snowflake, dbt, Airflow)
Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions)
Monitoring and Observability (Datadog, Monte Carlo)
BI (Tableau, Looker)
Please note that Zendesk can only hire candidates who are physically located in, and plan to work from, Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.
Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration, while also giving you flexibility to work remotely for part of the week. This role requires attending our local office for part of the week; the specific in-office schedule will be determined by the hiring manager.