InfoObjects Software

1 week ago


Jaipur, India · InfoObjects · Full time

Key Responsibilities:
- Design, develop, and manage data pipelines using Databricks (Spark, Delta Lake).
- Optimize large-scale data processing workflows for performance and reliability.
- Collaborate with Data Scientists, Analysts, and Business Stakeholders to gather requirements and deliver actionable insights.
- Maintain and enforce data quality and integrity across multiple data sources and systems.
- Work with cloud data platforms such as Azure, AWS, or GCP.
- Implement data governance and lineage tracking using tools like Unity Catalog, Great Expectations, or similar.
- Monitor, debug, and troubleshoot data pipelines and jobs.

Required Qualifications:
- 7+ years of professional experience in data engineering or similar roles.
- Strong experience with Databricks, Apache Spark, and Delta Lake.
- Proficient in SQL, Python, and distributed data processing concepts.
- Experience working with cloud platforms (Azure/AWS/GCP) and cloud-native tools.
- Hands-on experience with ETL/ELT processes, data warehousing, and the modern data stack.
- Familiarity with CI/CD practices and version control tools (e.g., Git).
- Strong problem-solving skills and the ability to work independently or in a team environment.

(ref:hirist.tech)
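For context, the kind of Databricks pipeline work described above might look like the minimal PySpark + Delta Lake sketch below. It is only a sketch under assumptions not stated in the listing: the landing path, output path, column names, and the `order_id` quality rule are all hypothetical.

```python
# Minimal sketch of a Databricks-style batch pipeline (PySpark + Delta Lake).
# Paths, column names, and the quality rule are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingest raw JSON from a hypothetical landing zone.
raw = spark.read.json("/mnt/landing/orders/")

# Basic data-quality step: keep only rows with a primary key, then deduplicate.
clean = (raw
         .filter(F.col("order_id").isNotNull())
         .dropDuplicates(["order_id"]))

# Persist as a Delta table for downstream analysts.
(clean.write
      .format("delta")
      .mode("overwrite")
      .save("/mnt/curated/orders"))
```

In practice a pipeline like this would typically run as a scheduled Databricks job, with monitoring and governance (e.g., Unity Catalog lineage) layered on top, as the responsibilities above describe.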


  • Machine Learning

    1 week ago


    Jaipur, India · Infoobjects · Full time

    **Role Category**: Programming & Design
    **Role**: Machine Learning - Artificial Intelligence
    **Job Location**: Jaipur

    We are looking for a talented Software Engineer who can thrive in a fast-paced, agile environment. You’ll implement machine learning algorithms, drive data science analysis for generating insights and actions for decision-making, and build...


  • Jaipur, India · InfoObjects · Full time

    Key Responsibilities:
    - Design, develop, and maintain applications using Node.js, Golang, Java, or Python, along with JavaScript and TypeScript.
    - Build and integrate REST APIs, microservices, and GenAI-based APIs (e.g., OpenAI, Hugging Face, LangChain).
    - Work with SQL (PostgreSQL, MySQL) and NoSQL (MongoDB) databases.
    - Develop and optimize applications using...
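The backend listing above mentions integrating GenAI-based APIs; a minimal Python sketch of such a call is shown below. The endpoint, model name, prompt, and environment variable are assumptions for illustration, not details from the listing.

```python
# Minimal sketch of calling an OpenAI-style chat-completions API over HTTP.
# Model name, prompt, and environment variable are illustrative assumptions.
import os
import requests

api_key = os.environ["OPENAI_API_KEY"]  # assumed to be configured

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Summarize this support ticket: ..."}],
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```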

