
Snowflake DBT
6 days ago
Snowflake Development & Optimization:
Write complex SQL queries against Snowflake, optimizing and troubleshooting them for improved performance.
Develop ETL/ELT scripts in Python, Unix shell, and other scripting languages to extract, load, and transform data between systems.
Utilize Snowflake utilities such as SnowSQL, SnowPipe, Streams, Tasks, and Stored Procedures to automate data pipeline management and enhance data workflows.
Leverage Time Travel for data recovery and audit purposes, ensuring data integrity across historical states.
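The utilities above are often combined into a single change-capture pattern. A minimal sketch is shown below; all object names (raw_orders, orders_stream, transform_wh, etc.) are hypothetical placeholders, not part of this role's actual environment.

```sql
-- Capture row-level changes on a source table with a Stream.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders;

-- A Task that runs on a schedule and loads captured changes downstream,
-- firing only when the Stream actually has new data.
CREATE OR REPLACE TASK merge_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
AS
  INSERT INTO orders_clean
  SELECT order_id, customer_id, amount
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK merge_orders_task RESUME;

-- Time Travel for audit/recovery: query the table as it existed an hour ago.
SELECT * FROM orders_clean AT (OFFSET => -60 * 60);
```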
ETL Pipeline Development:
Design, implement, and maintain ETL pipelines using DBT (Data Build Tool) for data transformation and automation tasks.
Work with various ETL tools, primarily DBT, and integrate them with Snowflake and other systems as required.
Optimize and ensure the scalability of data pipelines to handle large volumes of data.
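A dbt transformation of the kind described above is typically expressed as an incremental model. The sketch below assumes a hypothetical `stg_orders` model and a `raw.orders` source defined elsewhere in the project; it illustrates the pattern, not a specific pipeline.

```sql
-- models/staging/stg_orders.sql (hypothetical model and source names)
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    order_id,
    customer_id,
    CAST(order_ts AS TIMESTAMP_NTZ) AS order_ts,
    amount
FROM {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded.
  WHERE order_ts > (SELECT MAX(order_ts) FROM {{ this }})
{% endif %}
```

Materializing as `incremental` rather than `table` is one common way to keep large-volume pipelines scalable, since each run touches only new rows.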
Data Modeling & Architecture:
Develop data models, including conceptual, logical, and physical models, entity-relationship diagrams (ERDs), star schemas, and third normal form (3NF) to support analytical requirements.
Perform source-to-target mapping to ensure data accuracy and completeness in data transformation and integration.
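As an illustration of the star-schema modeling mentioned above, a fact table keyed to dimension tables might look like the following DDL sketch (all table and column names are hypothetical):

```sql
-- Two dimensions and one fact table: a minimal star schema.
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name VARCHAR,
    region        VARCHAR
);

CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date DATE,
    year      SMALLINT,
    month     SMALLINT
);

CREATE TABLE fact_orders (
    order_id     INTEGER,
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key     INTEGER REFERENCES dim_date (date_key),
    amount       NUMBER(12, 2)
);
```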
Cloud Integration & Automation:
Automate infrastructure deployments using CloudFormation and ensure system scalability and resiliency.
Collaboration & Problem Solving:
Work effectively within a global team environment, collaborating with other data engineers, analysts, and stakeholders.
Provide solutions to complex problems using creative and analytical approaches to continuously improve processes.
Ensure that data solutions adhere to performance, security, and compliance standards.
Documentation & Communication:
Write clear and concise technical documentation for developed pipelines, queries, data models, and processes.
Communicate complex technical information to both technical and non-technical stakeholders.
Maintain detailed records of data integration tasks, ETL processes, and data quality metrics.
Snowflake:
Expertise in Snowflake SQL, including advanced query writing and optimization.
Hands-on experience with Snowflake tools and utilities such as SnowSQL, SnowPipe, Streams, Tasks, Time Travel, and Metadata Manager.
Experience with Snowflake Data Sharing and Stored Procedures.
ETL & Data Transformation:
Strong experience with ETL/ELT processes, including the development and optimization of DBT scripts for data transformation.
Proficiency in advanced SQL and Python scripting for automating data tasks and pipeline management.
Cloud & AWS:
Experience with AWS services, including IAM, S3, ECS, CloudFormation, and CloudWatch for cloud-based data integration and automation.
Data Modeling:
Proficiency in data modeling techniques, including the creation of conceptual, logical, and physical data models.
Experience with entity-relationship diagrams (ERDs), star schemas, and third normal form (3NF).
Agile Methodologies:
Familiar with Agile delivery processes and practices for iterative and collaborative development.
Experience working in Agile environments, including participating in sprint planning, backlog grooming, and daily standups.
**About Virtusa**
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects and opportunities, and the chance to work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.