Data Engineer
REQUIREMENTS
- Data Integration: Informatica PowerCenter, Kafka (Shared Event Streaming Platform).
- Database: Oracle Database (PL/SQL, Performance Tuning), Oracle APEX.
- Scripting & Backend: Java (Spring Boot), Shell Scripting, Python (for data processing/automation).
- Scheduling: Automic (Broadcom/CA Automic).
- Protocols: SFTP, IBM MQ, REST APIs.
FUNCTIONS
- ETL Migration & Management: Manage and migrate critical data pipelines currently running on Informatica PowerCenter and Kafka. Ensure seamless data flow between Salesforce (OneCRM), IBM MQ, and local Oracle databases.
- Oracle Database Development: Optimize database performance, manage PL/SQL logic, and support the Oracle APEX front-end application. Handle database upgrades (e.g., moving to Oracle 19c/21c) and address desupported features.
- Legacy Modernization: Reverse-engineer and containerize legacy monolithic ETL scripts running on VMs. Implement the “Strangler Fig” pattern to gradually route traffic from legacy systems to new Kubernetes-based implementations.
- Job Scheduling: Manage and upgrade Automic job-scheduling workflows.
- Data Integrity & Compliance: Ensure all data handling complies with GDPR, particularly regarding personal data stored in the Oracle database.
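The “Strangler Fig” migration mentioned above can be pictured as a thin routing façade that shifts a growing share of traffic from the legacy VM pipeline to the new Kubernetes-based service. The sketch below is illustrative only, assuming hypothetical handlers (handle_legacy, handle_new) and a percentage-based rollout; it is not part of the project’s actual codebase.

```python
import random

# Hypothetical stand-ins for the legacy VM pipeline and the new
# Kubernetes-based implementation (assumptions, not real project code).
def handle_legacy(request):
    return f"legacy:{request}"

def handle_new(request):
    return f"new:{request}"

def route(request, rollout_percent, rng=random.random):
    """Strangler Fig façade: send rollout_percent of traffic to the
    new service, the remainder to the legacy system."""
    if rng() * 100 < rollout_percent:
        return handle_new(request)
    return handle_legacy(request)

# At 0% everything stays on legacy; at 100% everything is migrated.
assert route("order-42", rollout_percent=0) == "legacy:order-42"
assert route("order-42", rollout_percent=100) == "new:order-42"
```

Raising rollout_percent over time (per endpoint or per data domain) lets the legacy system be retired incrementally, with an instant rollback path if the new implementation misbehaves.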
WE OFFER
- Professional and personal development.
- Opportunity to participate in the projects remotely.
- Loyalty program.
ABOUT PROJECT
Digicode is a custom software, mobile app, and next-generation technologies development company based in Dallas, Texas, with development centers in the US, Costa Rica, Israel, and Ukraine.