
Data Engineer
3 weeks ago
Job Title: Data Engineer (Snowflake + dbt)
Location: Hyderabad, India
Job Type: Full-time
Job Description:
We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.
Key Responsibilities:
- Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
- Build ingestion pipelines from various sources including relational databases, APIs, cloud storage, and flat files into Snowflake.
- Implement data modelling and transformation logic to support layered architecture (e.g., staging, intermediate, and mart layers or medallion architecture) to enable reliable and reusable data assets.
- Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
- Apply dbt best practices: modular SQL development, testing, documentation, and version control.
- Optimize dbt/Snowflake performance through clustering keys, query profiling, materialization strategies, micro-partition pruning, and efficient SQL design.
- Apply CI/CD and Git-based workflows for version-controlled deployments.
- Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.
- Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
- Write well-documented, maintainable code using Git for version control and CI/CD processes.
- Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
- Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
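To make the layered dbt architecture above concrete, here is a minimal sketch of a staging-layer model. The source, model, and column names are hypothetical; a real project would define the `raw_erp` source in a sources YAML file and build mart models on top of this via `ref()`:

```sql
-- models/staging/stg_orders.sql (hypothetical model; source names are assumptions)
-- Staging layer: select from the declared raw source, rename and cast columns.
-- Downstream intermediate/mart models reference this with {{ ref('stg_orders') }}.
with source as (
    select * from {{ source('raw_erp', 'orders') }}
)
select
    order_id::number          as order_id,
    customer_id::number       as customer_id,
    order_ts::timestamp_ntz   as ordered_at,
    amount::number(12, 2)     as order_amount
from source
```

Keeping staging models limited to renames, casts, and light cleanup is what makes the downstream layers modular and testable (e.g., `unique` and `not_null` tests on `order_id` declared in the model's YAML).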
Required Qualifications:
- 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt.
- Experience building and deploying dbt models in a production environment.
- Expert-level SQL and a strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
- Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc.
- Experience with Git, CI/CD, and deployment workflows in a team setting.
- Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.
Core Competencies:
Data Engineering and ELT Development:
- Building robust and modular data pipelines using dbt.
- Writing efficient SQL for data transformation and performance tuning in Snowflake.
- Managing environments, sources, and deployment pipelines in dbt.
Cloud Data Platform Expertise:
- Strong proficiency with Snowflake: warehouse sizing, query profiling, data loading, and performance optimization.
- Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.
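Ingestion from cloud storage via external stages typically looks like the following sketch. The bucket path, integration name, and target table are hypothetical, and the storage integration would need to be created and authorized separately:

```sql
-- Hypothetical external stage over S3 used to land raw files into Snowflake.
create stage if not exists raw_stage
  url = 's3://example-bucket/raw/orders/'
  storage_integration = s3_int;  -- integration name is an assumption

-- Bulk-load staged CSV files into a raw table for dbt to pick up.
copy into raw.orders
  from @raw_stage
  file_format = (type = csv skip_header = 1);
```

The same pattern applies to Azure Data Lake or GCS stages; only the URL scheme and integration type change.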
Technical Toolset:
- Languages & Frameworks:
- Python: for data transformation, notebook development, and automation.
- SQL: Strong grasp of SQL for querying and performance tuning.
Best Practices and Standards:
- Knowledge of modern data architecture concepts including layered architecture (e.g., staging → intermediate → marts, Medallion architecture).
- Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
Security & Governance:
- Access and Permissions:
- Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling.
- Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.
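A Snowflake role hierarchy of the kind described here can be sketched as follows. The role, database, and schema names are hypothetical:

```sql
-- Hypothetical RBAC setup: a read-only analyst role, inherited by a developer role.
create role if not exists analyst_ro;
grant usage on database analytics to role analyst_ro;
grant usage on schema analytics.marts to role analyst_ro;
grant select on all tables in schema analytics.marts to role analyst_ro;

-- Role hierarchy: developers inherit the analyst role's read access.
create role if not exists dbt_developer;
grant role analyst_ro to role dbt_developer;
```

Granting privileges to roles rather than users, and composing roles into hierarchies, is the standard Snowflake pattern for secure, auditable access control.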
Deployment & Monitoring:
- DevOps and Automation:
- Version control using Git, experience with CI/CD practices in a data context.
- Monitoring and logging of pipeline executions, alerting on failures.
Soft Skills:
- Communication & Collaboration:
- Ability to present solutions and handle client demos/discussions.
- Work closely with onshore and offshore teams of analysts, data scientists, and architects.
- Ability to document pipelines and transformations clearly.
- Basic Agile/Scrum familiarity: working in sprints and logging tasks.
- Comfort with ambiguity, competing priorities, and fast-changing client environments.
Nice to Have:
- Experience in client-facing roles or consulting engagements.
- Exposure to AI/ML data pipelines, feature stores.
- Exposure to MLflow for basic ML model tracking.
- Experience or exposure with data quality tooling.
Education:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
Why Join Us?
- Opportunity to work on diverse and challenging projects in a consulting environment.
- Collaborative work culture that values innovation and curiosity.
- Access to cutting-edge technologies and a focus on professional development.
- Competitive compensation and benefits package.
- Be part of a dynamic team delivering impactful data solutions.
About Us:
Logic Pursuits provides companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and efficient processes, Logic Pursuits is a game-changer in any operations strategy.