AWS Data Engineer

17 hours ago


Bengaluru, Karnataka, India · Wipro · Full time · ₹ 2,00,00,000 - ₹ 2,50,00,000 per year

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With nearly 245,000 employees and business partners across 65 countries, we deliver on the promise of helping our clients, colleagues, and communities thrive in an ever-changing world. Wipro is an Equal Employment Opportunity employer and makes all employment and employment-related decisions without regard to a person's race, sex, national origin, ancestry, disability, sexual orientation, or any other status protected by applicable law.

Role Purpose

The AWS Data Engineer plays a critical role within Wipro Technologies' Data Analytics & AI service line by architecting, developing, and maintaining scalable data pipelines and ETL processes using AWS Glue and Amazon S3. This position is essential in enabling data-driven decision-making by transforming raw data into actionable insights, ensuring data accessibility, consistency, and high quality across varied data sources. The role demands deep expertise in AWS big data services and a strong focus on innovation, efficiency, and collaboration to meet evolving business needs.

The ideal candidate will have extensive experience designing robust data architectures that integrate smoothly with other AWS services such as Lambda, EMR, Redshift, Kinesis, and Athena, ensuring optimal performance and cost-effectiveness. This role provides an excellent opportunity to contribute to a forward-thinking digital transformation journey while advancing technical and leadership capabilities.
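Pipelines like these typically land data in S3 under Hive-style partition prefixes, which Glue crawlers and Athena use to prune scans. As a minimal illustration (the bucket layout and key names below are hypothetical, not any actual schema), a parser for such prefixes might look like:

```python
# Parse Hive-style partition keys (e.g. year=2024/month=07) from an S3 object key.
# The example key layout is a hypothetical illustration only.

def parse_partitions(s3_key: str) -> dict:
    """Extract key=value partition segments from an S3 object key."""
    partitions = {}
    for segment in s3_key.split("/"):
        if "=" in segment:
            name, _, value = segment.partition("=")
            partitions[name] = value
    return partitions

key = "raw/sales/year=2024/month=07/day=15/part-0000.parquet"
print(parse_partitions(key))  # {'year': '2024', 'month': '07', 'day': '15'}
```

Partitioning on columns that queries filter by (dates, regions) is what lets Athena and Glue read only the relevant prefixes instead of the whole bucket.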

The AWS Data Engineer is also responsible for promoting best practices in data governance, quality assurance, security, and compliance within all data processes and fostering a culture of continuous learning and technological advancement within the team.

Required Skills and Qualifications:

  • AWS Services Expertise: Demonstrated mastery of AWS Glue, Amazon S3, AWS Lambda, Amazon EMR, Amazon Redshift, Amazon Kinesis, and Amazon Athena. Strong ability to design and leverage these services for optimized big data workflows.
  • Programming Languages: Advanced proficiency in Python and Scala, with strong Apache Spark skills for developing Glue ETL scripts, performing complex data transformations, and supporting scalable pipelines.
  • Data Warehousing and Data Lakes: Solid understanding of data lake architectures, data warehousing concepts, and best practices in organizing, storing, and retrieving large-scale structured and unstructured datasets.
  • SQL and Database Skills: Expertise in writing complex SQL queries, optimizing database performance, and designing resilient database schemas for data integration and analysis.
  • Analytical and Problem-Solving Skills: Proven ability to diagnose complex data issues, architect innovative solutions, and troubleshoot performance bottlenecks in high-volume data environments.
  • Knowledge of Data Governance and Security: Familiarity with data privacy regulations, AWS security best practices, and implementation of access controls, encryption, and compliance monitoring mechanisms.
  • Communication and Collaboration: Excellent skills in working effectively with interdisciplinary teams, including data engineers, data scientists, and business stakeholders to translate requirements into technical solutions.
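To make the SQL expectations above concrete, here is a small self-contained sketch using SQLite as a stand-in for a warehouse such as Redshift (the `orders` table and its columns are invented for illustration): it defines a schema, adds an index on the filter column, and runs an aggregation.

```python
import sqlite3

# In-memory SQLite stands in for a warehouse such as Redshift; the
# orders table and its columns are hypothetical illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        order_id   INTEGER PRIMARY KEY,
        customer   TEXT NOT NULL,
        amount     REAL NOT NULL,
        order_date TEXT NOT NULL
    )
""")
# An index on the filter column keeps date-range lookups from scanning the table.
conn.execute("CREATE INDEX idx_orders_date ON orders (order_date)")
conn.executemany(
    "INSERT INTO orders (customer, amount, order_date) VALUES (?, ?, ?)",
    [("acme", 120.0, "2024-07-01"),
     ("acme", 80.0, "2024-07-02"),
     ("globex", 50.0, "2024-07-01")],
)
# Aggregate spend per customer over a date range.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    WHERE order_date BETWEEN '2024-07-01' AND '2024-07-31'
    GROUP BY customer
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('acme', 200.0), ('globex', 50.0)]
```

The same filter-aggregate-group pattern carries over to Redshift or Athena SQL, where distribution keys and partitions play the role the index plays here.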

A Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is expected, and professional AWS certifications (e.g., AWS Certified Big Data – Specialty or AWS Certified Solutions Architect) are highly desirable.

Key Responsibilities:

  • Design, Develop, and Maintain Data Pipelines: Architect scalable, reliable, and efficient data pipelines utilizing AWS Glue and other AWS data services to support diverse analytics and business intelligence needs.
  • ETL Job Development and Optimization: Create and optimize ETL jobs written in Spark, Python, or Scala, ensuring data is accurately transformed, cleansed, and enriched for downstream processes.
  • Data Catalog Management: Implement and maintain AWS Glue Data Catalogs and crawlers to automate metadata discovery, classification, and management for seamless data accessibility.
  • Amazon S3 Data Storage Administration: Manage Amazon S3 buckets and object lifecycle policies for optimized data storage, adhering to organizational security policies and operational standards.
  • Data Quality Assurance: Integrate comprehensive validation tests and monitoring mechanisms within ETL workflows to guarantee data accuracy and integrity throughout processing stages.
  • Monitoring, Troubleshooting, and Optimization: Continuously monitor job performance and pipeline health, proactively addressing issues, tuning resources, and ensuring cost-efficient processing.
  • Stakeholder Engagement and Collaboration: Collaborate closely with cross-functional teams, including data scientists, analysts, and business users, to align technical implementations with strategic goals.
  • Documentation and Knowledge Sharing: Maintain thorough technical documentation and share best practices to elevate team expertise and drive process improvements.
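The ETL and data-quality responsibilities above can be sketched in plain Python. The field names and validation rules below are hypothetical; in a real Glue job this logic would run inside a Spark transform rather than over a Python list.

```python
# A minimal cleanse-and-validate step of the kind a Glue ETL job might apply.
# Field names and rules are hypothetical; production code would run in Spark.

def cleanse(record: dict):
    """Normalize a raw record, or return None if it fails validation."""
    if not record.get("id"):
        return None  # reject records missing a primary key
    try:
        amount = float(record.get("amount", ""))
    except ValueError:
        return None  # reject non-numeric amounts
    return {
        "id": str(record["id"]).strip(),
        "amount": round(amount, 2),
        "country": str(record.get("country", "unknown")).strip().lower(),
    }

raw = [
    {"id": " 42 ", "amount": "19.999", "country": " IN "},
    {"id": "", "amount": "5"},          # dropped: missing id
    {"id": "7", "amount": "oops"},      # dropped: bad amount
]
clean = [r for r in (cleanse(rec) for rec in raw) if r is not None]
print(clean)  # [{'id': '42', 'amount': 20.0, 'country': 'in'}]
```

Counting the rejected records alongside the accepted ones is the simplest form of the validation monitoring described above: a sudden spike in rejects signals an upstream schema or quality problem.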

Competencies and Personal Attributes:

  • Client Centricity: Focus on understanding and exceeding client expectations by delivering high-quality, tailored data solutions.
  • Passion for Results: Demonstrate strong motivation to achieve measurable outcomes and contribute actively to business growth and innovation.
  • Execution Excellence: Exhibit meticulous attention to detail and commitment to delivering dependable, scalable data systems on schedule.
  • Collaborative Working: Foster a supportive, diverse, and inclusive team environment encouraging open communication and collective success.
  • Learning Agility: Embrace continuous learning, adapt quickly to emerging technologies, and apply knowledge to solve complex challenges.
  • Problem Solving & Decision Making: Employ analytical thinking to identify root causes and devise effective, practical solutions.
  • Effective Communication: Clearly articulate technical concepts to both technical and non-technical stakeholders, ensuring mutual understanding and alignment.

These competencies align with Wipro's values and are essential for contributing to a dynamic and innovative organizational culture while driving technical excellence.

About Wipro Technologies and Opportunity:

Wipro Technologies is a global leader in technology services and consulting, committed to driving transformational change for clients through innovation and purposeful collaboration. With a dedicated focus on Data Analytics & AI under the Technology Services business unit, Wipro empowers organizations to harness the power of data to revolutionize their operations and strategic decision-making.

This role offers an inspiring career path within a company that values diversity and inclusion, fosters professional growth, and supports employees' ambitions through continual reinvention. Wipro encourages applications from individuals of all backgrounds, including those with disabilities, reflecting our commitment to equal opportunity.

Experience Required: 8-10 years of relevant experience with demonstrable success in designing and implementing AWS Glue-based data solutions.

Mandatory Skills: Proven expertise in AWS Glue is essential.
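The S3 lifecycle administration mentioned under the responsibilities can be sketched as follows. The dictionary below has the shape accepted by boto3's `put_bucket_lifecycle_configuration`; the rule ID, prefix, and day counts are all hypothetical placeholders.

```python
import json

# Example lifecycle configuration of the shape boto3's S3 client accepts in
# put_bucket_lifecycle_configuration; prefix and day counts are hypothetical.
lifecycle = {
    "Rules": [
        {
            "ID": "tier-then-expire-raw-data",
            "Filter": {"Prefix": "raw/"},
            "Status": "Enabled",
            # Move objects to cheaper storage classes over time, then delete them.
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
            "Expiration": {"Days": 365},
        }
    ]
}
print(json.dumps(lifecycle, indent=2))
```

In practice this would be applied with `s3.put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=lifecycle)`; the call is boto3's real API, while every value above is a placeholder to be replaced by the organization's retention policy.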

Join Wipro to work alongside talented professionals in a forward-thinking environment where your contributions will directly impact the evolution of industries and technology worldwide. Together, we build not just solutions but a sustainable future.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
