Snowflake Data Engineer
6 days ago
World Wide Technology (WWT) is a global technology integrator and supply chain solutions provider. WWT employs more than 10,000 people worldwide and operates more than 2 million square feet of state-of-the-art warehousing, distribution, and integration space strategically located around the globe. WWT has ranked on Glassdoor's Best Places to Work and on Fortune's 100 Best Companies to Work For list for the 13th consecutive year, and is number 9 on India's Great Place to Work for Mid-size Companies, 2025.
World Wide Technology Holding Co, LLC. (WWT) has an opportunity available for a Snowflake Data Engineer role with competitive pay depending on your experience and current salary. Please respond with an updated resume and the required details at the bottom of the email.
Role: Snowflake Data Engineer
Duration: Long-term contract, 24+ months with high possibility of extension
Work Time: 2:00 pm to 11:00 pm IST
Location: Bangalore, India
Mode of Work: WFO, India
KEY RESPONSIBILITIES:
- Design, implement, and manage scalable data solutions in the Snowflake environment for optimized data storage and processing.
- Migrate existing data domains/flows from relational data stores to a cloud data store (Snowflake).
- Identify and optimize new and existing data workflows.
- Identify and implement data integrity practices.
- Integrate data governance and data science tools with the Snowflake ecosystem per established practice.
- Support the development of data models and ETL processes to ensure high-quality data ingestion into the cloud data store.
- Collaborate with team members to design and implement effective data workflows and transformations.
- Assist in the maintenance and optimization of Snowflake environments to improve performance and reduce costs.
- Contribute to proofs of concept, documentation, and best practices for data management and governance within the Snowflake ecosystem.
- Participate in code reviews and provide constructive feedback to improve the quality of team deliverables.
- Design and develop data ingestion pipelines using Talend/Informatica, following industry best practices.
- Write efficient SQL and Python scripts for large-dataset analysis, and build end-to-end automation processes on a set schedule.
- Design and implement a data distribution layer using the Snowflake REST API.
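For illustration only (not part of the posting), the SQL/Python ingestion-and-automation responsibilities above might look roughly like the following minimal sketch. It assumes the official snowflake-connector-python package; the account credentials, warehouse, database, table, and stage names are all hypothetical placeholders.

```python
"""Minimal sketch of a scheduled Snowflake bulk-load script (illustrative only)."""


def build_copy_statement(table: str, stage: str, file_format: str = "csv_fmt") -> str:
    """Build a COPY INTO statement to bulk-load staged files into a table.

    `file_format` names a pre-created FILE FORMAT object (hypothetical here).
    """
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )


def run_ingestion() -> None:
    """Connect and execute the load.

    Requires real credentials, so this function is only a sketch; every
    connection parameter below is a placeholder, not a real account.
    """
    import snowflake.connector  # pip install snowflake-connector-python

    conn = snowflake.connector.connect(
        account="your_account",    # placeholder
        user="your_user",          # placeholder
        password="your_password",  # placeholder
        warehouse="LOAD_WH",       # hypothetical warehouse
        database="ANALYTICS",      # hypothetical database
        schema="RAW",              # hypothetical schema
    )
    try:
        with conn.cursor() as cur:
            cur.execute(build_copy_statement("RAW.SALES", "sales_stage"))
    finally:
        conn.close()


# Inspect the generated SQL without connecting anywhere:
print(build_copy_statement("RAW.SALES", "sales_stage"))
```

In practice a script like this would be invoked on a set schedule (cron, Airflow, or a Snowflake task) per the responsibility above; error handling and logging are omitted for brevity.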
SKILLS / QUALIFICATIONS:
- Bachelor's degree in Computer Science, Software Engineering, Information Technology, Management Information Systems, or a related field required (Master's degree preferred)
- 10 years' experience in data analysis and in developing and modeling data objects in a Snowflake data store
- Snowflake REST API experience
- Informatica ETL and/or Talend ETL experience
- Experience developing efficient SQL and PL/SQL queries and Python scripts
- Proven ability to work in distributed systems
- Proficiency in querying relational databases (such as DB2), with a focus on data transformations
- Excellent problem-solving skills and a team-oriented mindset
- Strong data modelling concepts and schema design for both relational and cloud data stores
- Strong verbal and written communication skills; capable of collaborating effectively across a variety of IT and business groups, regions, and roles
- Familiarity with data visualization tools such as Tableau and Power BI is a plus
- Experience collaborating with data scientists/experts to integrate machine learning models into Snowflake
- Data warehousing background is required
-
Snowflake Architect
2 weeks ago
Bengaluru, Karnataka, India NTT DATA Full time ₹ 1,20,000 - ₹ 2,40,000 per year. Req ID: 341051. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake Architect to join our team in Bangalore, Karnataka (IN-KA), India (IN). Snowflake Data Platform - Architect...
-
Snowflake Data Engineer
2 days ago
Bengaluru, Karnataka, India Vidpro Consultancy Services Full time ₹ 6,00,000 - ₹ 18,00,000 per year. Exp: Yrs. Work Mode: Hybrid. Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon. Primary Skills: Python, PySpark, Azure Data Factory, Snowflake, Snowpipe, SnowSQL, Snowsight, Snowpark, ETL, SQL, and architecture design. SnowPro certification is a plus. Primary Roles and Responsibilities: Developing Modern Data Warehouse solutions using Snowflake, Databricks and...
-
Snowflake Data Engineer
2 weeks ago
Bengaluru, Karnataka, India Ashra Technology Full time ₹ 10,00,000 - ₹ 20,00,000 per year. Role: Snowflake Data Engineer. Exp: 5+ years. Location: Remote. Description: Role - Snowflake Data Engineer. Must-have skills (keywords): Snowflake, ETL, SQL, Azure Cloud Services. Exp: 5+ years in data work. Job Location: Remote. Billing start date: October 10th. Shift timings: US Eastern, 8-5. Experience in ETL tools with strong SQL skills, minimum 5+...
-
Snowflake Data Engineer
1 week ago
Bengaluru, Karnataka, India Kasmo Cloud Solutions Full time ₹ 20,00,000 - ₹ 25,00,000 per year. About the opportunity: We are seeking a highly skilled and experienced Snowflake Developer with a strong background in SQL, Python, and a minimum of 3 years of hands-on experience with Snowflake. The ideal candidate will be Snowflake Certified, with a proven track record in data warehousing, data modelling, and implementing ETL/ELT pipelines using...
-
Snowflake Data Engineer
6 days ago
Bengaluru, Karnataka, India Techno Facts Solutions Full time ₹ 20,00,000 - ₹ 25,00,000 per year. A Senior Snowflake Developer with Cortex experience is responsible for designing, building, and optimizing data solutions on the Snowflake platform, with a specific focus on leveraging Snowflake Cortex for AI-driven features like RAG (Retrieval-Augmented Generation), LLM functions, and vector search. Key responsibilities include developing scalable ETL/ELT...
-
Snowflake Engineer
4 days ago
Bengaluru, Karnataka, India Futran Solutions Full time ₹ 12,00,000 - ₹ 36,00,000 per year. Designation: Snowflake Engineer. Location: Pune. Experience: 4 to 6 years. Job Description: We are seeking a highly skilled Senior Snowflake Engineer with hands-on experience in Snowflake Cortex, data engineering, and AI-driven solutions. The ideal candidate will be responsible for designing and building scalable data pipelines, optimizing data models, and developing...
-
Data Engineer
2 weeks ago
Bengaluru, Karnataka, India NTT DATA, Inc. Full time ₹ 15,00,000 - ₹ 28,00,000 per year. We are currently seeking a Data Engineer (HRIS) to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Duties / Key Responsibilities - Data Management: Develop data mapping specifications and transformation rules. Perform data cleansing, validation, and reconciliation activities. Create and execute data conversion scripts and processes. Document data...
-
AWS Snowflake Data Engineer
2 weeks ago
Bengaluru, Karnataka, India Contactx Resource Management Pvt. Ltd. Full time ₹ 10,00,000 - ₹ 25,00,000 per year. Design and implement data pipelines for seamless data ingestion. Utilize AWS technologies, including S3, Glue, and Lambda, for effective data management. Collaborate with cross-functional teams to enhance data processes using Python, PySpark, and SQL. Implement CI/CD deployments for streamlined development processes. Understand Data Warehouse...
-
Snowflake Data Engineer
3 weeks ago
Bengaluru, Karnataka, India Impactron Global Full time. Role Overview: We're seeking a Snowflake Data Engineer / Sr Data Engineer passionate about building scalable, cloud-native data solutions using Snowflake, dbt, and modern ETL frameworks. You will design, implement, and support data architectures and pipelines that support analytics, reporting, and AI-driven use cases for global clients. You'll work within a...
-
Snowflake Data Engineer
2 weeks ago
Bengaluru, Karnataka, India EduRun Group Full time ₹ 12,00,000 - ₹ 36,00,000 per year. Job Description / Key Responsibilities: - Design, build, and optimize scalable batch and real-time streaming data pipelines using Databricks (PySpark) and Snowflake. - Lead the architecture and implementation of application data stores using PostgreSQL, DynamoDB, and advanced SQL to support diverse application needs. - Develop efficient data...