
ODI Developer
Job Title : Data Engineer with ODI - Manufacturing Domain | Azure | Python | Databricks
Job Description :
We are seeking a highly skilled Data Engineer with ODI expertise and strong experience in the Manufacturing domain to join our dynamic data engineering team. The ideal candidate will have hands-on experience with Oracle Data Integrator (ODI), designing and implementing robust ETL solutions that support manufacturing business processes and integrate with Azure Cloud Services, Python-based scripting, and Databricks environments.
This role will focus on developing and maintaining data integration solutions that enable the organization to collect, transform, and manage large volumes of manufacturing data for analytics, reporting, and operational decision-making.
Key Responsibilities :
- Design, develop, and maintain ETL workflows and data pipelines using Oracle Data Integrator (ODI) to support Manufacturing domain data integration requirements.
- Build and optimize data ingestion solutions from multiple manufacturing data sources (ERP systems, MES, SCADA, IoT devices) into the Azure Data Lake.
- Develop and maintain data transformation and enrichment logic using ODI, Python scripts, and Databricks notebooks.
- Collaborate closely with business analysts and stakeholders to understand business processes and data requirements in the Manufacturing domain (Inventory Management, Production Planning, Quality Control, Supply Chain).
- Work with Azure Data Factory (ADF), Databricks, and Azure Synapse Analytics to manage cloud-based data workflows and analytics processes.
- Develop and manage Python scripts for data cleansing, automation, and processing tasks.
- Optimize ODI interfaces and mappings for high performance and efficient data processing.
- Monitor and troubleshoot ETL jobs, proactively identifying and resolving data discrepancies or system errors.
- Implement automated scheduling, monitoring, and error handling for data workflows.
- Ensure data quality, data consistency, and security compliance across all integration layers.
- Maintain detailed technical and functional documentation of ETL solutions, data flows, and architecture.
- Perform code reviews, unit testing, and assist in UAT processes.
- Collaborate with DevOps teams to implement CI/CD pipelines for automated deployment of ODI jobs and related artifacts.
Required Skills & Experience :
- 3 years of hands-on experience in Data Engineering, including Oracle Data Integrator (ODI) development and administration.
- Strong understanding of Manufacturing domain processes: ERP (Oracle Manufacturing, SAP PP/MM), MES, SCADA integrations.
- Solid experience with Azure Cloud Services: Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse Analytics.
- Proficient in Python scripting for data manipulation, automation, and integration tasks.
- Experience in building and maintaining Databricks notebooks for data transformations and analytics workflows.
- Good understanding of Data Modeling, Data Warehousing concepts, and Big Data technologies.
- Experience in developing and managing batch and real-time data pipelines.
- Strong knowledge of SQL, PL/SQL, and database concepts (Oracle, Azure SQL Database).
- Hands-on experience integrating on-premises systems with cloud platforms using connectors and APIs.
- Familiarity with Source Control Systems (Git, Azure DevOps) and implementing CI/CD pipelines for data solutions.
- Experience with data quality frameworks and monitoring solutions.
- Good communication skills and ability to work in a collaborative, cross-functional team environment.
Preferred Qualifications :
- Azure Certification (e.g., Azure Data Engineer Associate).
- Experience with Apache Spark within Databricks.
- Exposure to Machine Learning workflows in Databricks.
- Experience in working with Agile methodologies and JIRA for project management.