Data Engineer

1 day ago


Bengaluru, Karnataka, India Wisemonk Full time

One of our US-based clients is transforming how people buy and sell secondhand apparel, and data is central to that mission. They are looking for a Data Engineer to join their Data Engineering team with a focus on analytics engineering and enabling scalable, self-service data across the organization. In this role, you'll design, build, and maintain scalable data infrastructure and robust pipelines to support a wide range of analytics use cases. You'll collaborate with cross-functional partners to develop trusted data models that power reporting, analytics, and automation. You'll also contribute to platform-level efforts around content governance and performance optimization. This is a foundational role for someone who is eager to deepen their cloud-native data engineering skills, explore the application of AI in modern data workflows, and drive business impact.

What you will do
● Design, implement, and maintain scalable data and analytics infrastructure to support diverse datasets, ensuring high availability, performance, and data integrity.
● Build and optimize robust ELT pipelines using dbt and Databricks to ingest and transform structured and semi-structured data across business domains.
● Develop accessible and trusted data models to enable self-service analytics and operational reporting.
● Collaborate with analysts, data scientists, and product managers to understand stakeholder needs and deliver actionable insights.
● Contribute to platform usability and governance through workspace organization, access management, and performance tuning.
● Participate in debugging, monitoring, and ensuring data quality across production pipelines.
● Stay curious about emerging technologies, with an emphasis on incorporating AI capabilities and automation into workflows.

Qualifications
● 4+ years in data/analytics engineering, with strong SQL and Python/Spark skills and hands-on experience with Databricks and dbt.
● Strong development expertise in Python, with experience building and deploying robust services while adhering to software engineering principles.
● Proficiency in SQL and data modeling for analytics use cases, with a strong understanding of data quality and semantic principles.
● Experience building distributed data pipelines using Spark, Databricks, or cloud-native data tools.
● Familiarity with AI/ML workflows and an eagerness to explore their application in production environments, particularly around feature engineering and AI experimentation.
● Proficiency in building data pipelines using Airflow and Apache Spark (RDD/DataFrames/Structured Streaming).
● Experience with distributed messaging systems such as Kafka, SQS, or RabbitMQ.
● Knowledge of big data lake architectures using S3, Parquet/Avro, and Delta Lake in Databricks.
● Strong hands-on experience with schema validation (e.g., Pydantic, Avro) and data QA practices (illustrated in the sketch after this listing).
● Comfort with Docker, Kubernetes/ECS, and production deployment workflows.
● Working knowledge of CI/CD, infrastructure-as-code, and monitoring tools such as Datadog.
● Familiarity with building REST/gRPC APIs and data-serving interfaces.
● Ability to write scalable, reusable, and testable code with a focus on performance, cost, and reliability.
● Understanding of CI/CD tooling, structured testing frameworks, and system-level monitoring.
● Excellent problem-solving and collaboration skills across engineering, product, and analytics teams.

What you'll own: ELT pipelines, analytics infrastructure, data models for self-service analytics, and platform governance, all with an AI-first mindset.
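As a purely illustrative, hedged sketch (not part of the posting), the snippet below shows the kind of Pydantic-based schema validation and data QA the qualifications refer to. The ListingEvent model, its fields, and the sample records are hypothetical, chosen only to fit the secondhand-apparel domain described above.

from datetime import datetime
from pydantic import BaseModel, ValidationError

# Hypothetical record schema; field names are illustrative, not taken from the posting.
class ListingEvent(BaseModel):
    listing_id: int
    seller_id: int
    price_usd: float
    created_at: datetime

def validate_record(raw: dict) -> ListingEvent | None:
    """Return a validated event, or None so the row can be routed to a quarantine table."""
    try:
        return ListingEvent(**raw)
    except ValidationError:
        return None

# Usage: one well-formed record and one with a non-integer listing_id.
good = validate_record({"listing_id": 1, "seller_id": 7, "price_usd": 19.5,
                        "created_at": "2024-05-01T12:00:00"})
bad = validate_record({"listing_id": "not-an-id", "seller_id": 7, "price_usd": 19.5,
                       "created_at": "2024-05-01T12:00:00"})
print(good is not None, bad is None)  # True True

In a production pipeline, records that fail validation would typically be written to a quarantine or dead-letter location rather than silently dropped, keeping data QA auditable.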


  • Data Entry Operator

    2 days ago


    Bengaluru, Karnataka, India Design Engineer Full time ₹ 2,64,000 per year

    We are seeking a dedicated and detail-oriented Data Entry Operator (DEO) to support R&D projects. The role involves accurate data entry, documentation, and maintenance of research records in secure systems. The DEO will assist scientists, engineers, and administrative staff by ensuring timely and error-free handling of project information. Key...

  • Data Engineer

    1 week ago


    Bengaluru, Karnataka, India NTT DATA Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer...

  • Data Engineer

    2 weeks ago


    Bengaluru, Karnataka, India NTT DATA Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Migrate ETL workflows from SAP BODS to AWS Glue/dbt/Talend. Develop and maintain scalable ETL pipelines in AWS. Write PySpark scripts for large-scale data processing. Optimize SQL queries and transformations for AWS PostgreSQL. Work with Cloud Engineers to ensure smooth deployment and performance tuning. Integrate data pipelines with existing Unix systems....
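    As a hedged illustration of the PySpark work this listing describes (the S3 path, JDBC URL, table name, and credentials below are hypothetical placeholders, not details from the posting), a minimal batch job that transforms data and loads it into PostgreSQL might look roughly like this:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Hypothetical source path; a real job would take this from configuration.
    orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Aggregate completed orders per day before loading into the warehouse.
    daily = (
        orders
        .filter(F.col("status") == "COMPLETED")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("order_count"))
    )

    # Write the result to a hypothetical AWS PostgreSQL reporting table over JDBC.
    (daily.write
          .format("jdbc")
          .option("url", "jdbc:postgresql://example-host:5432/analytics")
          .option("dbtable", "reporting.daily_orders")
          .option("user", "etl_user")
          .option("password", "change-me")
          .mode("overwrite")
          .save())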

  • Data Engineer

    1 week ago


    Bengaluru, Karnataka, India NTT DATA, Inc. Full time ₹ 1,04,000 - ₹ 1,30,878 per year

    Req ID: 321800. We are currently seeking a Data Engineer (Talend & PySpark) to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties: Key Responsibilities: Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical...

  • Data Engineer

    1 week ago


    Bengaluru, Karnataka, India NTT DATA, Inc. Full time ₹ 15,00,000 - ₹ 25,00,000 per year

    Key Responsibilities: Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from...


  • Technical Lead - Data Engineer

    Bengaluru, Karnataka, India Astar Data Full time ₹ 12,00,000 - ₹ 36,00,000 per year

    Kindly find the job description below. Job Title: Technical Lead - Data Engineer. Location: Bangalore. Years of Experience: 8+ years. Sigmoid works with a variety of clients, from start-ups to Fortune 500 companies. We are looking for a detail-oriented self-starter to assist our engineering and analytics teams in various roles as a Software...


  • Bengaluru, Karnataka, India NTT DATA Full time ₹ 12,00,000 - ₹ 36,00,000 per year

    Framework Design & Architecture: Architect a metadata-driven, Python/Spark-based framework for automated data validation across high-volume production datasets. Define DQ rule templates for completeness, integrity, conformity, accuracy, and timeliness. Establish data quality thresholds, escalation protocols, and exception workflows. Automation & Integration...
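    As a hedged sketch of the metadata-driven data-quality framework this listing describes (the rule dictionary, column names, and sample rows below are hypothetical, not taken from the posting), a completeness rule evaluated in PySpark might look roughly like this:

    from pyspark.sql import DataFrame, SparkSession, functions as F

    # Hypothetical rule metadata: column name -> maximum allowed null fraction.
    COMPLETENESS_RULES = {"customer_id": 0.0, "email": 0.05}

    def run_completeness_checks(df: DataFrame, rules: dict) -> list:
        """Evaluate per-column null-fraction thresholds and return one result per rule."""
        total = df.count()
        results = []
        for column, max_null_fraction in rules.items():
            nulls = df.filter(F.col(column).isNull()).count()
            fraction = nulls / total if total else 0.0
            results.append({
                "rule": f"completeness:{column}",
                "null_fraction": fraction,
                "passed": fraction <= max_null_fraction,
            })
        return results

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("dq_demo").getOrCreate()
        sample = spark.createDataFrame(
            [(1, "a@example.com"), (2, None), (3, "c@example.com"), (None, "d@example.com")],
            ["customer_id", "email"],
        )
        for outcome in run_completeness_checks(sample, COMPLETENESS_RULES):
            print(outcome)

    Failing rules would feed the escalation and exception workflows the listing mentions, for example by writing results to a DQ audit table and alerting when a rule's "passed" flag is false.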

  • Data Engineer

    2 days ago


    Bengaluru, Karnataka, India NTT DATA, Inc. Full time ₹ 10,00,000 - ₹ 25,00,000 per year

    Job Duties: Migrate ETL workflows from SAP BODS to AWS Glue/dbt/Talend. Develop and maintain scalable ETL pipelines in AWS. Write PySpark scripts for large-scale data processing. Optimize SQL queries and transformations for AWS PostgreSQL. Work with Cloud Engineers to ensure smooth deployment and performance tuning. Integrate data pipelines with existing Unix...

  • Data Engineer

    2 weeks ago


    Bengaluru, Karnataka, India AMISEQ Full time

    Data Engineer, Bangalore, KA (WFO). A minimum of 3 years of experience is required. Key Responsibilities: Develop and maintain ETL/ELT pipelines using modern data engineering tools and frameworks. Provide 24x7 on-call support for data pipeline health, performance, and SLA compliance. Document data processes, schemas, and best-practice SOPs. Implement data quality checks,...

  • Data Engineer

    3 days ago


    Bengaluru, Karnataka, India Acqueon Full time

    About the Job: We are building a Customer Data Platform (CDP) designed to unlock the full potential of customer experience (CX) across our products and services. This role offers the opportunity to design and scale a platform that unifies customer data from multiple sources, ensures data quality and governance, and provides a single source of truth for...