ETL / Data Engineer With Snowflake Experience

3 weeks ago


Bengaluru, Karnataka, India NTT DATA Full time

Req ID: 318488

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable and forward-thinking organization, apply now.

We are currently seeking an ETL / Data Engineer with Snowflake Experience to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties

Team Overview
The Controls Engineering, Measurement and Analytics (CEMA) department is responsible for Cyber Risk and Control assessment, management, monitoring and reporting capabilities across Technology, resulting in risk reduction and better oversight of the technology risk landscape of the firm. Our work is always client focused; our engineers are problem-solvers and innovators. We seek exceptional technologists to help deliver solutions on our user-facing applications, data stores, and reporting and metric platforms while being cloud-centric, leveraging multi-tier architectures and aligned with our DevOps and Agile strategies. We are in the process of modernizing our technology stack across multiple platforms with the goal of building scalable front-to-back assessment, measurement and monitoring systems using the latest cloud, web and data technologies. We are looking for someone with a systematic problem-solving approach coupled with a sense of ownership and drive. The successful candidate will be able to influence and collaborate globally. They should be a strong team player, have an entrepreneurial approach, push innovative ideas while appropriately considering risk, and adapt in a fast-paced, changing environment.

Role Summary
As an ETL Data Engineer you will be a member of the CEDAR C3 Data Warehouse team, with a focus on sourcing and storing data from various technology platforms across the firm into a centralized data platform used to build various reporting and analytics solutions for the Technology Risk functions within Morgan Stanley. In this role you will be primarily responsible for the development of data pipelines, database views and stored procedures, in addition to performing technical data analysis and monitoring and tuning queries and data loads. You will be working closely with data providers, data analysts, data developers and data analytics teams to facilitate the implementation of client-specific business requirements and requests.

Key Responsibilities
- Develop ETLs, stored procedures, triggers and views on our existing DB2-based Data Warehouse and on our new Snowflake-based Data Warehouse.
- Perform data profiling and technical analysis on source system data to ensure that source system data can be integrated and represented properly in our models.
- Monitor the performance of queries and data loads and perform tuning as necessary.
- Provide assistance and guidance during the QA/UAT phases to quickly confirm the validity of potential issues and to determine the root cause and best resolution of verified issues.

Minimum Skills Required
- Bachelor's degree in Computer Science, Software Engineering, Information Technology or a related field required.
- At least 5 years of experience in data development and solutions in highly complex data environments with large data volumes.
- At least 5 years of experience developing complex ETLs with Informatica PowerCenter.
- At least 5 years of SQL/PLSQL experience, with the ability to write ad-hoc and complex queries to perform data analysis.
- At least 5 years of experience developing complex stored procedures, triggers, MQTs and views on IBM DB2.
- Experience with performance tuning DB2 tables, queries and stored procedures.
- An understanding of E-R data models (conceptual, logical and physical).
- Strong understanding of advanced data warehouse concepts (Factless Fact Tables, Temporal/Bi-Temporal models, etc.).
- Experience with Python a plus.
- Experience with developing data transformations using DBT a plus.
- Experience with Snowflake a plus.
- Experience with Airflow a plus.
- Experience with using Spark/PySpark for data loading and complex transformations a plus.
- Strong analytical skills, including a thorough understanding of how to interpret customer business requirements and translate them into technical designs and solutions.
- Strong communication skills, both verbal and written.
- Capable of collaborating globally.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make our website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.

NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
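
For illustration of the kind of Snowflake load and monitoring work this role describes, a minimal sketch is shown below. It is not part of the posting: the warehouse, database and table names (STAGING.CONTROL_ASSESSMENT_STG, CORE.CONTROL_ASSESSMENT) are hypothetical, and credentials are assumed to come from environment variables.

```python
# Illustrative sketch only: merge a staging table into a Snowflake target,
# then check recent load performance. All object names are assumptions.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="RISK_DW",
    schema="CORE",
)

MERGE_SQL = """
MERGE INTO CORE.CONTROL_ASSESSMENT AS tgt
USING STAGING.CONTROL_ASSESSMENT_STG AS src
  ON tgt.ASSESSMENT_ID = src.ASSESSMENT_ID
WHEN MATCHED THEN UPDATE SET tgt.STATUS = src.STATUS, tgt.LAST_UPDATED = src.LAST_UPDATED
WHEN NOT MATCHED THEN INSERT (ASSESSMENT_ID, STATUS, LAST_UPDATED)
  VALUES (src.ASSESSMENT_ID, src.STATUS, src.LAST_UPDATED)
"""

# The QUERY_HISTORY table function helps spot slow loads that may need tuning.
MONITOR_SQL = """
SELECT query_id, start_time, total_elapsed_time, execution_status
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
WHERE query_text ILIKE 'MERGE INTO CORE.CONTROL_ASSESSMENT%'
ORDER BY start_time DESC
LIMIT 10
"""

try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)                 # upsert staged rows into the target table
    print("rows affected:", cur.rowcount)
    for row in cur.execute(MONITOR_SQL):
        print(row)                         # (query_id, start_time, elapsed ms, status)
finally:
    conn.close()
```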



  • Bengaluru, Karnataka, India DXC Technology Full time US$ 90,000 - US$ 120,000 per year

    Job Description: Snowflake Data Engineer (Snowflake, SQL, Python Automation). Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai. Job Details: We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and analytics solutions using Snowflake and Python. The ideal candidate will have...


  • Bengaluru, Karnataka, India Tredence Inc. Full time

    Job Location - Kolkata, Chennai, Pune, Bangalore, Gurugram. Experience - 4 to 15 yrs. Early joiners preferred. Primary Roles and Responsibilities: ● Developing Modern Data Warehouse solutions using Snowflake, Databricks and ADF. ● Ability to provide solutions that are forward-thinking in the data engineering and analytics space. ● Collaborate with DW/BI leads to...


  • Bengaluru, Karnataka, India Tietoevry Full time

    We are looking for a skilled Snowflake Developer with hands-on experience in Python, SQL, and Snowpark to join our data engineering team. You will be responsible for designing and building scalable data pipelines, developing Snowpark-based data applications, and enabling advanced analytics solutions on the Snowflake Data Cloud platform. Key Responsibilities -...
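
As a rough illustration of the Snowpark pipeline work described above, the following minimal sketch builds a small aggregation and writes it back to Snowflake; the connection parameters and table names (RAW.EVENTS, ANALYTICS.DAILY_EVENTS) are assumptions for the example.

```python
# Minimal Snowpark sketch; object names and credentials are illustrative assumptions.
import os
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count, lit, to_date

session = Session.builder.configs({
    "account": os.environ["SNOWFLAKE_ACCOUNT"],
    "user": os.environ["SNOWFLAKE_USER"],
    "password": os.environ["SNOWFLAKE_PASSWORD"],
    "warehouse": "ANALYTICS_WH",
    "database": "ANALYTICS_DB",
    "schema": "PUBLIC",
}).create()

# Read raw events, keep successful ones, and aggregate per day and event type.
daily = (
    session.table("RAW.EVENTS")
    .filter(col("STATUS") == lit("SUCCESS"))
    .with_column("EVENT_DATE", to_date(col("EVENT_TS")))
    .group_by("EVENT_DATE", "EVENT_TYPE")
    .agg(count(lit(1)).alias("EVENT_COUNT"))
)

# Persist the result as a table that downstream analytics tools can query.
daily.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_EVENTS")
session.close()
```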


  • Bengaluru, Karnataka, India NTT Data Full time

    Job Description: NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer with Snowflake, AWS Glue, Kafka, API to join our team in Bangalore, Karnataka (IN-KA), India (IN). - 6+...


  • Bengaluru, Karnataka, India EduRun Full time

    Key Responsibilities: - Design, develop, and maintain Snowflake data models, ELT pipelines, and data integration workflows. - Write complex SQL queries, stored procedures, and optimize query performance in Snowflake. - Collaborate with data analysts, data scientists, and engineering teams to support data requirements and analytics needs. - Develop and maintain...
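
One possible shape for the stored-procedure work mentioned above is a Python procedure registered through Snowpark, sketched below. The procedure and table names (PURGE_OLD_ROWS, CORE.EVENT_LOG) and the connection details are assumptions for illustration, not taken from the posting.

```python
# Illustrative sketch: register and call a session-scoped Snowflake stored
# procedure written in Python via Snowpark. All object names are assumptions.
import os
from snowflake.snowpark import Session
from snowflake.snowpark.types import IntegerType, StringType

def purge_old_rows(session: Session, table_name: str, keep_days: int) -> int:
    """Delete rows older than keep_days days and return how many were removed."""
    result = session.sql(
        f"DELETE FROM {table_name} "
        f"WHERE LOAD_TS < DATEADD(day, -{keep_days}, CURRENT_TIMESTAMP())"
    ).collect()
    return result[0][0]  # DELETE returns one row: the number of rows deleted

session = Session.builder.configs({
    "account": os.environ["SNOWFLAKE_ACCOUNT"],
    "user": os.environ["SNOWFLAKE_USER"],
    "password": os.environ["SNOWFLAKE_PASSWORD"],
    "warehouse": "ETL_WH",
    "database": "RISK_DW",
    "schema": "CORE",
}).create()

# Register the Python function as a stored procedure for this session.
session.sproc.register(
    purge_old_rows,
    return_type=IntegerType(),
    input_types=[StringType(), IntegerType()],
    name="PURGE_OLD_ROWS",
    replace=True,
    packages=["snowflake-snowpark-python"],
)

deleted = session.call("PURGE_OLD_ROWS", "CORE.EVENT_LOG", 90)
print("rows deleted:", deleted)
```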


  • Bengaluru, Karnataka, India beBeeSnowflake Full time ₹ 15,00,000 - ₹ 20,00,000

    Senior Snowflake Engineer - Job Description: We are seeking a proactive Senior Snowflake Engineer to lead the migration of data solutions from SAP HANA to Snowflake. You will be responsible for designing and implementing Snowflake-based data architectures, developing and optimizing complex SQL queries, and utilizing Snowpark and DBT for data pipeline automation....
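
One way Snowpark and DBT might be combined for the pipeline automation mentioned above is a dbt Python model running on Snowflake, sketched below. The file name, upstream model (stg_orders) and columns are assumptions for illustration only.

```python
# models/daily_revenue.py  (hypothetical dbt Python model in a dbt-snowflake project)
from snowflake.snowpark.functions import col, sum as sum_, to_date

def model(dbt, session):
    dbt.config(materialized="table")

    # dbt.ref() returns a Snowpark DataFrame for the upstream model on Snowflake.
    orders = dbt.ref("stg_orders")

    # Aggregate order amounts per day; dbt materializes the returned DataFrame.
    return (
        orders
        .with_column("ORDER_DATE", to_date(col("ORDER_TS")))
        .group_by("ORDER_DATE")
        .agg(sum_(col("AMOUNT")).alias("DAILY_REVENUE"))
    )
```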

  • Snowflake Architect

    3 weeks ago


    Bengaluru, Karnataka, India Nazztec Private Limited Full time

    Job Title: Snowflake Architect. Experience: 13-20 Years. Location: Bangalore, Pune, Chennai, Kolkata, Gurugram. Joining Timeline: Immediate to 15 Days. Key Responsibilities: - Architect, design, and develop scalable data solutions using Snowflake Cloud Data Platform. - Lead the design and implementation of robust data pipelines, ensuring scalability,...

  • ETL Quality Engineer

    2 weeks ago


    Bengaluru, Karnataka, India Manuh Technologies Full time

    Key Responsibilities: · Review and understand business requirements, data models, and ETL specifications. · Design, develop, and execute ETL test plans, test cases, and test scripts. · Perform data validation and data integrity checks from source to target systems (e.g., from databases, APIs, or flat files into data warehouses). · Identify and document...
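
A minimal sketch of the source-to-target validation described above might compare row counts and a numeric column total between the two systems. The function below assumes generic DB-API connections (e.g., pyodbc or snowflake-connector-python) and hypothetical table and column names.

```python
# Illustrative source-to-target reconciliation check; all names are assumptions.

def fetch_one(conn, sql):
    """Run a query on a DB-API connection and return the first result row."""
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()

def reconcile(source_conn, target_conn, source_table, target_table, amount_col):
    """Compare row counts and a column total; return a list of mismatch messages."""
    src_count, src_sum = fetch_one(
        source_conn, f"SELECT COUNT(*), SUM({amount_col}) FROM {source_table}"
    )
    tgt_count, tgt_sum = fetch_one(
        target_conn, f"SELECT COUNT(*), SUM({amount_col}) FROM {target_table}"
    )

    issues = []
    if src_count != tgt_count:
        issues.append(f"row count mismatch: source={src_count}, target={tgt_count}")
    if src_sum != tgt_sum:
        # Exact comparison; in practice a tolerance may be needed for floats.
        issues.append(f"{amount_col} total mismatch: source={src_sum}, target={tgt_sum}")
    return issues

# Example usage (connections omitted):
# problems = reconcile(src, tgt, "SALES.ORDERS", "DW.FACT_ORDERS", "ORDER_AMOUNT")
# for p in problems:
#     print("FAIL:", p)
```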

  • Snowflake Architect

    3 weeks ago


    Bengaluru, Karnataka, India BLJ Tech Geeks Full time

    About the Role: We are seeking an experienced Snowflake Architect with 12-15 years of experience specializing in data services, data architecture, and data platforms. The ideal candidate should have a strong background in designing and implementing scalable data solutions on Snowflake. The candidate should have hands-on knowledge working with Snowflake...