Current jobs related to Carnera Technologies - Bengaluru, Karnataka - CARNERA TECHNOLOGIES PRIVATE LIMITED
Carnera Technologies
2 weeks ago
Bengaluru, Karnataka, India | CARNERA TECHNOLOGIES PRIVATE LIMITED | Full time | ₹ 9,00,000 - ₹ 12,00,000 per year
Job Overview : We are seeking a highly skilled SAP Fieldglass Configuration Consultant with a minimum of 5 years of hands-on experience in implementing, configuring, and supporting SAP Fieldglass solutions. This role will be pivotal in ensuring the seamless configuration and optimization of the SAP Fieldglass platform for managing contingent workforce and...
Carnera Technologies
6 days ago
Bengaluru, India | CARNERA TECHNOLOGIES PRIVATE LIMITED | Full time
Senior Java : 5 to 8 yrs of overall experience
Skills (Must-have) : Java, Spring, Web Development, MySQL, AWS
Secondary Skills (Good-to-have) : N/A
Location : Bengaluru (Hybrid)
Shift (IST Hours) : General
- Build or enhance service functionalities, test suites, utilities, and documentation to improve service scalability, availability, observability, development...

Carnera Technologies
2 weeks ago
About the Job : We are seeking a highly skilled and motivated Senior Data Engineer to join our growing data team. The ideal candidate will possess deep expertise in Snowflake, AWS, and Python, with a strong focus on building and maintaining scalable and efficient data pipelines. This role requires a hands-on engineer who can design, develop, and optimize data solutions to support our business needs.
You will be responsible for leveraging your expertise in Snowflake to build and maintain our data warehouse, ensuring data quality and performance.
Responsibilities :
Snowflake Data Warehouse Development :
- Design, develop, and maintain data pipelines and ETL processes using Snowflake.
- Write complex SQL queries for data extraction, transformation, and loading (ETL) within Snowflake.
- Implement and optimize Snowflake utilities such as SnowSQL, SnowPipe, Snowpark, Tasks, Streams, Time travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
- Develop and maintain data models within Snowflake, adhering to best practices for performance and scalability.
- Ensure data quality and integrity within the Snowflake data warehouse.
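To illustrate the Streams and Tasks pattern referenced above, here is a minimal sketch that renders the DDL for an incremental load as a Python string. All object names (RAW_ORDERS, ORDERS_STREAM, ETL_WH) are invented for the example; a real deployment would parameterize and execute these via a Snowflake connection.

```python
# Hypothetical sketch: render Snowflake Stream + Task DDL for an
# incremental load. A stream captures inserts on the source table, and a
# scheduled task merges pending changes into the target whenever the
# stream has data. Object names are illustrative only.

def incremental_load_ddl(source: str, stream: str, target: str,
                         warehouse: str, schedule: str = "5 MINUTE") -> str:
    """Return the DDL for a stream-plus-task incremental load pipeline."""
    return f"""
CREATE OR REPLACE STREAM {stream} ON TABLE {source};

CREATE OR REPLACE TASK load_{target}
  WAREHOUSE = {warehouse}
  SCHEDULE = '{schedule}'
WHEN SYSTEM$STREAM_HAS_DATA('{stream}')
AS
  INSERT INTO {target}
  SELECT * FROM {stream} WHERE METADATA$ACTION = 'INSERT';
""".strip()

print(incremental_load_ddl("RAW_ORDERS", "ORDERS_STREAM", "ORDERS", "ETL_WH"))
```

The `WHEN SYSTEM$STREAM_HAS_DATA(...)` guard keeps the task from consuming warehouse credits when there is nothing to load.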
AWS Cloud Infrastructure :
- Utilize AWS services (e.g., S3, EC2, Lambda) to support data ingestion, processing, and storage.
- Design and implement cloud-based data solutions that are scalable, reliable, and cost-effective.
- Work with AWS data services to integrate with Snowflake.
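One small, concrete piece of the S3-to-Snowflake integration above is the key layout: date-partitioned keys keep COPY INTO / Snowpipe loads incremental. The sketch below builds a Hive-style partitioned key; the prefix and table names are invented for the example.

```python
# Hypothetical sketch of an S3 key layout for date-partitioned ingestion.
# Prefix/table/file names are placeholders, not a specific bucket's schema.
from datetime import date

def s3_partition_key(prefix: str, table: str, d: date, filename: str) -> str:
    """Build a Hive-style partitioned key, e.g.
    raw/orders/year=2024/month=01/day=15/part-0.json.gz"""
    return (f"{prefix}/{table}/year={d.year:04d}/"
            f"month={d.month:02d}/day={d.day:02d}/{filename}")

print(s3_partition_key("raw", "orders", date(2024, 1, 15), "part-0.json.gz"))
# → raw/orders/year=2024/month=01/day=15/part-0.json.gz
```

Loads can then target a single day's prefix instead of rescanning the whole bucket.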
Python Development :
- Develop Python scripts for data extraction, transformation, and loading.
- Automate data pipeline processes using Python and related libraries.
- Build and maintain data APIs and microservices using Python.
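As a minimal, self-contained sketch of the kind of Python transform step described above: normalize raw records and enforce a basic quality rule before loading. The field names are invented; a real pipeline would read from S3 or Snowflake rather than an in-memory list.

```python
# Hypothetical sketch of a transform step: drop rows that fail a quality
# check, trim and normalize strings, coerce amounts to float. Field names
# are illustrative only.

def transform(records):
    """Return cleaned rows, rejecting any record without an id."""
    out = []
    for r in records:
        if not r.get("id"):
            continue  # quality rule: id is mandatory
        out.append({
            "id": r["id"],
            "customer": r.get("customer", "").strip().upper(),
            "amount": float(r.get("amount", 0)),
        })
    return out

raw = [{"id": 1, "customer": " acme ", "amount": "19.99"},
       {"id": None, "customer": "bad row"}]
print(transform(raw))
# → [{'id': 1, 'customer': 'ACME', 'amount': 19.99}]
```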
Data Pipeline Development and Optimization :
- Design, develop, document, test, and debug new and existing software systems related to data pipelines.
- Implement best practices for data warehousing, ETL, and data modeling.
- Monitor and optimize data pipeline performance, ensuring data availability and reliability.
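A reliability pattern commonly used in the pipeline work described above is retrying a flaky step with exponential backoff. The sketch below is a generic helper, not the API of any specific orchestration framework; the step function and delays are placeholders.

```python
# Hedged sketch: retry a flaky pipeline step with exponential backoff.
# `step` and the delay values are placeholders for illustration.
import time

def with_retries(step, attempts=3, base_delay=0.0):
    """Run `step` up to `attempts` times, doubling the delay between tries."""
    for i in range(attempts):
        try:
            return step()
        except Exception:
            if i == attempts - 1:
                raise  # retries exhausted: surface the failure
            time.sleep(base_delay * (2 ** i))

calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient warehouse error")
    return "loaded"

print(with_retries(flaky_load))  # → loaded
```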
Collaboration and Communication :
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders.
- Gather and analyze system requirements to design and implement effective data solutions.
- Effectively communicate technical concepts and solutions to both technical and non-technical audiences.
- Work within an agile development environment.
Tools and Technologies :
- Utilize Matillion or DBT for data transformation and orchestration (Good to have).
- Familiarity with Airflow for workflow management (Good to have).
- Familiarity with data visualization tools like Tableau or Power BI (Good to have).
- Utilize Snowpark for advanced data processing within Snowflake (Good to have).
Qualifications :
Experience :
- 5+ years of overall experience in data engineering.
- 3+ years of hands-on experience with Snowflake, AWS, and Python.
- The most recent project must have been on Snowflake.
- 2+ years of experience in designing, developing, documenting, testing, and debugging software systems.
- Experience in Data warehousing : OLTP, OLAP, Dimensions, Facts, and Data modeling.
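To make the facts/dimensions distinction above concrete, here is a toy sketch that decomposes flat OLTP-style sales rows into a deduplicated customer dimension and a fact table keyed by a surrogate key. All table and field names are invented for the example.

```python
# Illustrative star-schema split: flat rows → customer dimension + fact
# rows carrying a surrogate key and the measure. Names are placeholders.

def build_star(rows):
    """Return (dim_customer, fact_sales) from flat sales rows."""
    dim, fact, keys = {}, [], {}
    for r in rows:
        cust = r["customer"]
        if cust not in keys:  # assign a surrogate key on first sight
            keys[cust] = len(keys) + 1
            dim[keys[cust]] = {"customer_key": keys[cust], "name": cust}
        fact.append({"customer_key": keys[cust], "amount": r["amount"]})
    return list(dim.values()), fact

dims, facts = build_star([
    {"customer": "Acme", "amount": 100},
    {"customer": "Acme", "amount": 250},
    {"customer": "Globex", "amount": 75},
])
print(len(dims), len(facts))  # → 2 3
```

The repeated customer collapses into one dimension row, while every sale remains a fact row — the core of dimensional modeling.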
Technical Skills :
- Strong proficiency in Snowflake SQL and data warehousing concepts.
- Expertise in Python programming for data processing and automation.
- Hands-on experience with AWS services, particularly S3, EC2, and Lambda.
- In-depth understanding of ETL concepts and data modeling principles.
- Experience with Snowflake utilities (SnowSQL, SnowPipe, Snowpark, Tasks, Streams, Time travel, Optimizer, Metadata Manager, data sharing, and stored procedures).
- Good working knowledge of Matillion or DBT (Good to have).
- Familiarity with Airflow (Good to have).
- Familiarity with data visualization tools (Tableau/Power BI) (Good to have).
- Experience with Snowpark (Good to have).
Soft Skills :
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Strong attention to detail and a commitment to quality.
Location : Bangalore/Pune (Hybrid)