Snowflake Developer

1 week ago


Gurugram, India Nexiva Full time

Title: Snowflake Developer
Location: 100% Remote
Duration: 1+ Year
Mode of Interview: Video; LinkedIn profile required; genuine candidates only

We need a senior (12+ years) Snowflake developer with strong SQL capabilities and hands-on experience in Alteryx. Candidates must have recent Snowflake development and Alteryx experience, long projects/good tenure, excellent communication skills, and a state-issued ID (not bills) showing they are local.

Job Description

Job Summary: Flagstar Bank is seeking a skilled Snowflake Developer who combines strong SQL capabilities with hands-on experience in Alteryx. In this role, you will design and implement Snowflake-based data models and pipelines, develop robust transformation logic, and optimize performance. You will collaborate with data engineers, analysts, and business stakeholders to support regulatory, risk, and customer-focused data needs in a banking environment, while ensuring data quality, governance, and security.
Key Responsibilities

• Snowflake development and data modeling:
  ◦ Design, implement, and maintain Snowflake schemas, tables, views, materialized views, and data sharing
  ◦ Develop robust ELT/ETL transformations using Snowflake SQL, UDFs, and semi-structured data handling
  ◦ Implement and optimize Snowflake features such as warehouses, clustering, time travel, zero-copy cloning, streams, tasks, and data sharing
• Alteryx integration and workflow optimization:
  ◦ Build and maintain Alteryx workflows that connect to Snowflake as a data source/target
  ◦ Optimize Alteryx Designer/Server workflows for performance, scalability, and reliability
  ◦ Collaborate with data engineers to design efficient data pipelines and transformations
• Data quality, governance, and security:
  ◦ Implement data quality checks, data lineage, and metadata management
  ◦ Apply RBAC, masking policies, encryption, and network policies as appropriate for banking data
  ◦ Maintain documentation and support regulatory data requirements
• Collaboration and delivery:
  ◦ Translate business requirements into technical designs, data mappings, and specs
  ◦ Perform code reviews, contribute to design best practices, and ensure changes meet regulatory and security standards
  ◦ Partner with analytics, BI, and governance teams to fulfill reporting and analytics needs
• Testing, deployment, and automation:
  ◦ Write unit/integration tests for SQL and data pipelines; participate in CI/CD for data assets
  ◦ Automate routine maintenance and deployment tasks with Python, SQL, or shell scripting
• Incident response and support:
  ◦ Troubleshoot production issues, perform root-cause analysis, and implement durable fixes
  ◦ Create runbooks and knowledge base articles for common scenarios
• Documentation:
  ◦ Maintain data dictionaries, lineage, and design documentation for data assets

Required Qualifications

• Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent work experience)
• 6+ years of Snowflake development experience (designing schemas, pipelines, and transformations)
• Strong SQL skills with the ability to write complex queries and optimize performance
• Practical experience building and maintaining Alteryx workflows that interact with Snowflake
• Thorough understanding of Snowflake concepts: warehouses, clustering, time travel, zero-copy cloning, data sharing, streams/tasks, and security features
• Experience with data governance, data quality, and metadata management
• Familiarity with cloud environments (Snowflake on AWS/Azure/GCP) and cloud storage integrations
• Scripting experience (Python, Bash, or similar) for task automation
• Excellent problem-solving, communication, and collaboration skills
• Ability to work effectively in a remote or distributed team and align with US banking/regulatory workflows
• SnowPro Core Certification or equivalent
• Alteryx Designer Core/Advanced Certification; experience with Alteryx Server administration
• Banking/financial services domain experience and knowledge of regulatory requirements
• Experience with data lineage tools (e.g., Collibra, Alation) and data governance frameworks
• Familiarity with orchestration tools (e.g., Apache Airflow) and ETL/ELT best practices
• Knowledge of BI/analytics tools (Tableau, Power BI) and data visualization needs

Work Environment and Benefits

• Remote-friendly role based in the United States; occasional travel or on-site attendance may be required for onboarding or team meetings
• Competitive salary commensurate with experience
• Comprehensive benefits package (health, dental, vision, retirement plans, paid time off, and more)
• Opportunity to work with a stable, well-established bank that embraces modern data platforms
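The posting above names several concrete Snowflake mechanisms (zero-copy cloning, streams, tasks, time travel). As a rough illustration of how those pieces fit together in Snowflake SQL, here is a minimal sketch; all object names (raw_orders, curated_orders, transform_wh) are invented for the example and are not taken from any of these postings:

```sql
-- Zero-copy clone: instant, metadata-only copy of a table for dev/testing.
CREATE TABLE raw_orders_dev CLONE raw_orders;

-- Stream: tracks row-level changes (CDC) on the source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- Task: scheduled ELT step that merges captured changes downstream,
-- running only when the stream actually has new data.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  MERGE INTO curated_orders c
  USING orders_stream s ON c.order_id = s.order_id
  WHEN MATCHED THEN UPDATE SET c.amount = s.amount
  WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (s.order_id, s.amount);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK merge_orders_task RESUME;

-- Time travel: query the table as it stood one hour ago.
SELECT * FROM curated_orders AT(OFFSET => -3600);
```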


  • Snowflake Developer

    4 hours ago


    Gurugram, India Netlink Full time

    Design, develop, and optimize Snowflake-based data pipelines, data models, and ETL workflows to support business intelligence and reporting needs. Write complex SQL queries for data extraction, transformation, and analysis, ensuring data is accurately processed and made available for reporting. Work with Snowflake’s data warehousing architecture to manage...

  • Snowflake Developer

    3 weeks ago


    Gurugram, India Mobile Programming LLC Full time

    Job Title: Snowflake Developer. Location: Pune, Chandigarh, Bangalore, Chennai, Panchkula, Mumbai, Gurugram. Notice Period: Immediate. Job Description: We are urgently seeking a skilled Snowflake Developer to join our team. The ideal candidate must have a very clear understanding of Snowflake Architecture and at least 3+ years of...

  • Snowflake Architect

    1 week ago


    Gurugram, India Live Connections Full time

    Role: Data Architect / Snowflake Architect. Location: Gurgaon. Experience: 7+ Years (Relevant). CTC: Up to 35 LPA. Working Days: 5 Days a Week. Job Summary: A seasoned Data Architect with 7+ years of experience in Data Warehousing, Business Intelligence, and Enterprise Data Architecture. The ideal candidate will possess deep hands-on expertise in Snowflake, Azure...


  • Gurugram, India Genpact Full time

    Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster,...


  • Bengaluru, Gurugram, India Compunnel Full time

    Job Title: Snowflake & DBT Data Engineer. Location: Bangalore / Gurugram. Budget: 13-14 LPA. Work Mode: Hybrid. About the Role: We are seeking an experienced Snowflake & DBT Data Engineer to join our data engineering team. The ideal candidate will be responsible for building, optimizing, and maintaining scalable data pipelines and data warehouse solutions. You will...


  • Bengaluru, Gurugram, India SP Software Full time ₹ 8,00,000 - ₹ 12,00,000 per year

    Role & responsibilities: The incumbent will be responsible for, but not limited to, the following key deliverables: • Design, develop, and maintain scalable data pipelines and ETL processes using AWS services and Snowflake. • Implement data models, data integration, and data migration solutions. • Snowflake DBWB creations and migrations. Follow up with the...

  • Talend Developer

    4 hours ago


    Gurugram, India Virtusa Full time

    Talend Developer - CREQ Description Talend - Designing, developing, and documenting existing Talend ETL processes, technical architecture, data pipelines, and performance scaling using tools to integrate Talend data and ensure data quality in a big data environment. AWS / Snowflake - Design, develop, and maintain data models using SQL and Snowflake / AWS...


  • Gurugram, India Insight Global Full time

    Title: Tableau/Power BI Developer Location: Remote Required Skills: - Overall 5+ years of experience in data visualization. - Tableau: Advanced dashboard design, calculated fields, LOD expressions, data blending, and performance tuning. - Extensive Power BI experience: DAX, Power Query (M), data modelling, and row-level security. - SQL (Snowflake preferred):...


  • ETL Developer

    4 weeks ago


    Gurugram, India Terra Technology Circle Consulting Private Limited Full time

    We are seeking an experienced ETL Developer with strong hands-on expertise in Spark, SQL, Databricks, and Snowflake, and prior experience in the pharmaceutical/life sciences domain. The ideal candidate will design, develop, and maintain scalable ETL pipelines. Key responsibilities: Design and build end-to-end pipelines for data processing. Work with Databricks and Spark to...