Senior Data Engineer
2 weeks ago
Hello, Truecaller is calling you from Bangalore, India. Ready to pick up?

Our goal is to make communication smarter, safer, and more efficient while building trust across the world. With our roots in Sweden and a global reach, we deliver smart services that create meaningful social impact. We are committed to protecting you from fraud, harassment, scam calls, and unwanted messages, so you can focus on the conversations that matter. Truecaller is among the top 20 most downloaded apps globally and is the world's #1 caller ID and spam-blocking service for Android and iOS, with extensive AI capabilities and more than 450 million active users per month. Founded in 2009, the company is listed on Nasdaq OMX Stockholm and categorized as a Large Cap. Our focus on innovation, operational excellence, sustainable growth, and collaboration has resulted in consistently high profitability and strong EBITDA margins. We are a team of 400 people from ~45 different nationalities, spread across our headquarters in Stockholm and offices in Bangalore, Mumbai, Gurgaon, and Tel Aviv, with high ambitions.

We in the Insights Team are responsible for SMS categorization, fraud detection, and other Smart SMS features within the Truecaller app. OTP and bank notifications and bill and travel reminder alerts are some examples of the Smart SMS features. The team has developed a patented offline text parser that powers all these features, and it is also exploring cutting-edge technologies like LLMs to enhance the Smart SMS features. The team's mission is to become the world's most loved and trusted SMS app, which is aligned with Truecaller's vision to make communication safe and efficient. Smart SMS is used by over 90M users every day.

As a Senior Data Engineer, you will play an important role in the development of data pipelines, frameworks, and models to support the understanding of our users and enable better product decisions.
You will contribute to empowering the product teams with a complete self-serve analytics platform by working on scalable and robust solutions while collaborating with data engineers, data scientists, and data analysts across the company.

What you bring in:
- 6+ years of experience as a Data Engineer.
- Hands-on experience with Airflow for managing workflows and building complex data pipelines in a production environment.
- Experience working with big data and ETL development.
- Strong proficiency in SQL and experience working with relational databases.
- Programming skills in PySpark, Spark with Scala, Apache Spark, Kafka, or Flink.
- Experience working with cloud computing services (e.g., GCP, AWS, Azure).
- Experience with data science workflows.
- Experience in data modeling and creating data lakes using GCP services like BigQuery and Cloud Storage.
- Expertise in containerization and orchestration using Docker and Kubernetes (GKE) for scaling applications and services on GCP.
- Ability to build data models and transformations using dbt, following software engineering best practices (modularity, testing).
- Version control experience with Git and familiarity with CI/CD pipelines (e.g., GitHub Actions).
- Strong understanding of data security, encryption, and GCP IAM roles to ensure privacy and compliance (especially in relation to GDPR and other regulations).
- Experience in ML model lifecycle management (model deployment, versioning, and retraining) using GCP tools like AI Platform, TensorFlow Extended (TFX), Kubeflow, or Vertex AI.
- Experience working with data analysts and data scientists to build systems in production.
- Excellent problem-solving and communication skills, both with peers and with experts from other areas.
- Self-motivation and a proven ability to take initiative to solve problems.

The impact you will create:
- Design, develop, and maintain scalable data pipelines to process and analyze large datasets in real-time and batch environments.
- Play a crucial role in the team and own ETL pipelines.
- Collaborate with data scientists, analysts, and stakeholders to gather data requirements, translate them into robust ETL solutions, and optimize data flows.
- Implement best practices for data ingestion, transformation, and data quality to ensure data consistency and accuracy.
- Develop, test, and deploy complex data models, and ensure the performance, reliability, and security of the infrastructure.
- Own the architecture and design of data pipelines and systems, ensuring they are aligned with business needs and capable of handling growing volumes of data.
- Make data-driven decisions, informed by past experience.
- Monitor data pipeline performance and troubleshoot issues related to data ingestion, processing, or extraction.
- Work with big data technologies to enable storage, processing, and analysis of massive datasets.
- Ensure compliance with data protection and privacy regulations, particularly in regions like the EU where GDPR compliance is essential.

It would be great if you also have:
- Familiarity with event-driven architecture and microservices using Cloud Pub/Sub, Cloud Run, or GKE to build highly scalable, resilient, and loosely coupled systems.
- Proficiency in backend programming languages like Go, Python, Java, or Scala, specifically for building highly scalable, low-latency data services and APIs.
- Hands-on experience designing and implementing RESTful APIs or gRPC services for seamless integration with data pipelines and external systems.
- Hands-on experience with GCP-native tools for advanced analytics, such as Looker, Data Studio, or BigQuery BI Engine, for building visualizations and reporting dashboards.
- Knowledge of real-time data processing and analytics using Apache Flink, Kafka Streams, or Druid for ultra-low-latency use cases.
- Experience with data observability tools such as Monte Carlo, Databand.ai, or OpenLineage, ensuring the integrity and quality of data across pipelines.
- Experience optimizing Cloud Storage and BigQuery partitioning and clustering strategies for large-scale datasets, ensuring cost-effectiveness and query performance.
- Domain knowledge in specific industries (e.g., telecom, calls, and message communication) where large-scale data pipelines and regulatory compliance are critical, allowing you to bring domain-specific expertise to complex challenges.

Life at Truecaller - Behind the code:

Sounds like your dream job? We will fill the position as soon as we find the right candidate, so please send your application as soon as possible. As part of the recruitment process, we will conduct a background check. This position is based in Bangalore, India. We only accept applications in English.

What we offer:
- A smart, talented and agile team: an international team where ~35 nationalities work together in several locations and time zones in a learning, sharing, and fun environment.
- A great compensation package: competitive salary, 30 days of paid vacation, flexible working hours, private health insurance, parental leave, telephone bill reimbursement, a Udemy membership to keep learning and improving, and a wellness allowance.
- Great tech tools: pick the computer and phone that you fancy the most within our budget ranges.
- Office life: we strongly believe in in-person collaboration and follow an office-first approach while offering some flexibility. Enjoy your days with great colleagues with loads of good stuff to learn from, daily lunch and breakfast, and a wide range of healthy snacks and beverages. In addition, every now and then check out the playroom for a fun break, or join our exciting parties and team activities such as Lab days, sports meetups, etc. There's something for everyone.
- Come as you are: Truecaller is diverse, equal, and inclusive. We need a wide variety of backgrounds, perspectives, beliefs, and experiences in order to keep building our great products.
No matter where you are based, which language you speak, your accent, race, religion, color, nationality, gender, sexual orientation, age, or marital status: all those things make you who you are, and that's why we would love to meet you.
-
Senior Data Engineer
2 days ago
bangalore district, India Sonata Software Full time
The Senior Software Engineer supports functions which require automation/systems development, including initial development and ongoing support of the suite of applications that would help manage loan, pricing, enterprise data pipeline, or other functions depending on the department. As the Application Developer, you must be proficient in Python, Data...
-
Senior Data Engineer
2 days ago
bangalore district, India USEReady Full time
Job Title: Senior Databricks Engineer
Experience Level: 5-8 years
Job Summary: As a Senior Databricks Engineer, you will be responsible for designing, developing, and optimizing our data architecture and pipelines on the Databricks Lakehouse Platform. You will leverage your deep expertise in Spark, Delta Lake, and cloud technologies to build scalable and...
-
Senior Data Engineer
2 days ago
bangalore district, India ValueLabs Full time
ValueLabs is a global technology consulting and services company, driven by innovation and excellence. With operations in over 30 countries and a client-first culture, we deliver cutting-edge digital solutions across industries. Our teams are empowered to solve complex challenges, build secure systems, and drive transformation at scale. Join us and be part...
-
Senior GCP Data Engineer
2 weeks ago
bangalore district, India Ascendion Full time
Job Title: Senior GCP Data Engineer (7-12 years)
Job Type: Full-time
Work Mode: Hybrid
Locations: Bengaluru, Hyderabad, Chennai, Pune
Job Summary: We are looking for a talented GCP BigQuery Data Engineer with strong SQL skills and basic proficiency in Python to join our data engineering team. The ideal candidate should have hands-on experience working...
-
Senior Data Scientist
9 hours ago
chennai district, India Crayon Data Full time
Role: Sr Data Scientist
Experience level: 5 to 7 years
Location: Chennai
Why Crayon? Why now? Crayon is transforming into an AI-first company, and every Crayon (that's what we call ourselves!) is undergoing a journey of upskilling and expanding their capabilities in the AI space. We're building an organization where AI is not a department; it's a way of...
-
Senior Data Engineer
4 days ago
bangalore, India AIQU Full time
We are hiring a Senior Data Engineer to join one of our major clients based out of KSA.
Job Details:
Role: Senior Data Engineer
Work Location: Remote
Employment Type: Contract – 12 months & extendable
Role Summary: The Senior Data Engineer plays a lead role in designing, building, optimizing, and governing data pipelines and architectures that power...
-
Senior Data Engineer
2 days ago
bangalore district, India Zeta Full time
About Zeta: Zeta is a Next-Gen Banking Tech company that empowers banks and fintechs to launch banking products for the future. It was co-founded by Ramki Gaddipati in 2015. Our flagship processing platform, Zeta Tachyon, is the industry's first modern, cloud-native, and fully API-enabled stack that brings together issuance, processing, lending, core...
-
Senior Data Engineer
2 weeks ago
bangalore district, India Guidewire Software Full time
Responsibilities:
Design and Development: Architect, design, and develop robust, scalable, and efficient data pipelines. Design and manage platform solutions to support data engineering needs and ensure seamless integration and performance. Write clean, efficient, and maintainable code.
Leadership and Collaboration: Lead and mentor a team of Data engineers,...
-
Senior Data Engineer
2 weeks ago
bangalore, India HireAlpha Full time
Senior Data Engineer (Informatica Platform Engineer / Administrator) – Bangalore
We're looking for a Senior Data Engineer with strong expertise in Informatica platform administration to join our Group IT Data Engineering team. This role is ideal for professionals who enjoy building and maintaining enterprise-grade data platforms that power large-scale...