Analitica Global
3 weeks ago
Description:

Key Responsibilities:
- Lead the design and implementation of modern data platforms across Azure, AWS, and Snowflake.
- Translate business requirements into robust technical solutions covering ingestion, transformation, integration, warehousing, and validation.
- Architect, build, and maintain data pipelines for analytics, reporting, and machine learning use cases.
- Develop and maintain ETL processes to move data from multiple sources into cloud data lakes and warehouses.
- Design and implement data models, lineage, and metadata management to ensure consistency and traceability.
- Optimize pipelines and workflows for performance, scalability, and cost efficiency.
- Enforce data quality, security, and governance standards across all environments.
- Support migration of legacy/on-premises ETL solutions to cloud-native platforms.
- Develop and tune SQL queries, database objects, and distributed processing workflows.
- Drive adoption of CI/CD, test automation, and DevOps practices in data engineering.
- Collaborate with architects, analysts, and data scientists to deliver end-to-end data solutions.
- Provide technical leadership, mentorship, and training to junior engineers.
- Produce and maintain comprehensive technical documentation.

Requirements & Skills:
- Strong experience designing and developing ETL/data pipelines on Azure, AWS, and Snowflake.
- Proficiency in SQL, Python, and distributed processing (e.g., Spark, Databricks, EMR).
- Hands-on expertise with:
  - Azure: Data Factory, Synapse, Databricks, Azure SQL
  - AWS: Glue, Redshift, S3, Lambda, EMR
  - Snowflake: data warehousing, performance optimization, security features
- Solid understanding of data modeling, lineage, metadata management, and governance.
- Experience with CI/CD, infrastructure-as-code, and automation frameworks.
- Strong problem-solving and communication skills with the ability to work across teams.

Desired Profile:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
- 6-10 years of progressive data engineering experience, with at least 5 years on cloud-based data platforms.
- Strong expertise in data modelling, database design, and warehousing concepts.
- Proficiency in Python (including Pandas, API integrations, and automation).
- Familiarity with varied data formats and sources (CSV, Parquet, JSON, APIs, relational and NoSQL databases).
- Exposure to modern orchestration and workflow tools, with a strong understanding of CI/CD practices.
- Experience with Databricks and Microsoft Fabric is a plus.
- Excellent analytical, problem-solving, and communication skills.
- Ability to evaluate new technologies and adopt them where appropriate.

(ref:hirist.tech)
-
Analitica Global
2 weeks ago
Bengaluru, India | Analitica | Full time

Role: .Net Architect

Responsibilities:
- Build the detailed design.
- Create the detailed design document.
- Communicate with customer group members and stakeholders to agree on the design.
- Share the detailed design with other developers.
- Research technical information and apply it to our product.

Requirements:
- 12 to 15 years of experience in...
-
Analitica Global
3 weeks ago
Bengaluru, India | Analitica | Full time

Description:

Key Responsibilities:
- Develop and maintain data pipelines for ingestion, transformation, and integration across multiple platforms.
- Work with structured and semi-structured data (CSV, JSON, Parquet, APIs, databases).
- Support the design and development of data models, ETL workflows, and data validation.
- Assist in building and maintaining...