Snowflake, dbt, Python, Airflow

7 hours ago


Kochi, India UST Full time

Role Proficiency:
Provide expertise on data analysis techniques using software tools. Under supervision, streamline business processes.

Outcomes:

- Design and manage the reporting environment, which includes data sources, security, and metadata.
- Provide technical expertise on data storage structures, data mining, and data cleansing.
- Support the data warehouse in identifying and revising reporting requirements.
- Support initiatives for data integrity and normalization.
- Assess, test, and implement new or upgraded software. Assist with strategic decisions on new systems. Generate reports from single or multiple systems.
- Troubleshoot the reporting database environment and associated reports.
- Identify and recommend new ways to streamline business processes.
- Illustrate data graphically and translate complex findings into written text.
- Surface findings that help clients make better decisions. Solicit feedback from clients and build solutions based on that feedback.
- Train end users on new reports and dashboards.

Set FAST goals and provide feedback on the FAST goals of reportees.

Measures of Outcomes:

- Quality: number of review comments on code written.
- Data consistency and data quality.
- Illustrates data graphically; translates complex findings into written text.
- Number of results located to help clients make informed decisions.
- Number of business processes changed due to vital analysis.
- Number of Business Intelligence dashboards developed
- Number of productivity standards defined for project

Number of mandatory trainings completed

Outputs Expected:
Determine Specific Data needs:

- Work with departmental managers to outline the specific data needs for each business method analysis project

Critical business insights:

- Mines the business’s database in search of critical business insights; communicates findings to relevant departments.

Code:

- Creates efficient and reusable SQL code meant for the improvement, manipulation, and analysis of data.
- Creates efficient and reusable code. Follows coding best practices.
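As a minimal sketch of what reusable, parameterized SQL might look like in this role's Python stack (sqlite3 stands in for the warehouse engine here; the `sales` table, its columns, and the figures are hypothetical):

```python
import sqlite3

# sqlite3 stands in for the warehouse engine (e.g. Snowflake); the sales
# table and its rows are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 250.0), ("south", 75.0)],
)

# One query definition reused from many call sites, with bound parameters
# rather than string concatenation (a common coding best practice).
TOTAL_BY_REGION = "SELECT COALESCE(SUM(amount), 0) FROM sales WHERE region = ?"

def total_sales(connection, region):
    """Return the total sales amount for a region."""
    (total,) = connection.execute(TOTAL_BY_REGION, (region,)).fetchone()
    return total

print(total_sales(conn, "north"))  # 350.0
print(total_sales(conn, "east"))   # 0 (no rows for this region)
```

Defining the query once and binding parameters at each call site is one way the "efficient and reusable" expectation tends to be met in practice.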

Create/Validate Data Models:

- Builds statistical models; diagnoses, validates, and improves the performance of these models over time.

Predictive analytics:

- Seeks to determine likely outcomes by detecting tendencies in descriptive and diagnostic analysis

Prescriptive analytics:

- Attempts to identify what business action to take

Code Versioning:

- Organize and manage changes and revisions to code using a version control tool, for example Git or Bitbucket.

Create Reports:

- Create reports depicting the trends and behaviours from analyzed data

Document:

- Create documentation for work performed. Additionally, perform peer reviews of documentation of others' work.

Manage knowledge:

- Consume and contribute to project-related documents, SharePoint libraries, and client universities.

Status Reporting:

- Report status of tasks assigned

Comply with project-related reporting standards and processes.

Skill Examples:

- Analytical Skills: Ability to work with large amounts of data: facts, figures, and number crunching.
- Communication Skills: Communicate effectively with a diverse population at various organization levels with the right level of detail.
- Critical Thinking: Data analysts must review numbers, trends, and data to draw original conclusions from the findings.
- Presentation Skills: Facilitates reports and oral presentations to senior colleagues.
- Strong meeting facilitation skills as well as presentation skills.
- Attention to Detail: Vigilant in the analysis to determine accurate conclusions.
- Mathematical Skills to estimate numerical data.
- Work in a team environment

Proactively ask for and offer help

Knowledge Examples:

- Database languages such as SQL
- Programming language such as R or Python
- Analytical tools and languages such as SAS & Mahout.
- Proficiency in MATLAB.
- Data visualization software such as Tableau or Qlik.
- Proficient in mathematics and calculations.
- Proficiency with spreadsheet tools such as Microsoft Excel or Google Sheets
- DBMS
- Operating Systems and software platforms

Knowledge of the customer domain and sub-domain where the problem is solved.

Additional Comments:

- Experience in Python, specifically in the area of data engineering in a commercial setting (e.g. hands-on coding experience with pandas (or Dask, Vaex, etc.) and Airflow).
- Good understanding of ETL/ELT patterns, idempotency, and other data engineering best practices.
- Extensive experience with data modelling (3rd normal form, star schemas, wide/tall projections).
- Experience dealing with metadata and best practices for cataloguing datasets in Snowflake and other warehouses.
- Excellent SQL knowledge, including an understanding of how to write optimised SQL code, good general knowledge of different SQL engines, and what considerations they bring when optimising.
- Experience with integrating data warehouses/data pipelines with data governance tools like Collibra.
- Familiar
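The idempotency expectation mentioned above can be sketched with a stdlib-only example: write the load step as an upsert, so re-running it after a failure does not duplicate rows. Here sqlite3 stands in for Snowflake (which would use `MERGE`), and the `customers` table and its rows are hypothetical:

```python
import sqlite3

# sqlite3 stands in for Snowflake here; a real pipeline would issue a
# MERGE against the warehouse. Table and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

def load_batch(connection, rows):
    """Idempotent load step: an upsert keyed on id makes re-runs safe."""
    connection.executemany(
        "INSERT INTO customers (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        rows,
    )

batch = [(1, "Asha"), (2, "Binu")]
load_batch(conn, batch)
load_batch(conn, batch)  # re-run, e.g. an Airflow task retry: no duplicates

(count,) = conn.execute("SELECT COUNT(*) FROM customers").fetchone()
print(count)  # 2
```

Because each run converges to the same end state, an orchestrator such as Airflow can safely retry the task without a separate cleanup step.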


