Senior Data Consultant
- Role: Senior Data Consultant
- Duration: 12 months
- Experience: 10+ years of IT experience, with at least 6 years in an ETL/Data Migration-focused role.
- Process Automation: Automating processes across file management, file data testing, analysis and design, and configuration management.
- Ingestion Automation: Building tools or automations to process CSV, JSON, XML, and KVP data files into an ingestion data hub (Hive/HDFS, relational).
- Script Automation: Ability to design and create scripts that automate and improve repetitive data copy, migration, and ETL activity.
- Real-Time: Experience designing, creating, and processing real-time data feeds such as Kafka.
- Big Data: Hive (schema design), HDFS, Kafka, Apache Beam
- Solutions: Define solutions from high-level through detailed design to automate ingestion activity; facilitate and confirm requirements with product owners, business team members, and technical associates.
- Ingestion Design: Create and assess low-level source-to-target mapping designs independently, rather than relying on others to produce them.
- Performance: Assess, recommend, and improve Informatica mappings, SQL queries, and batch feeds.
- Platform/OS: Mainframe, mid-range/Unix, Microsoft Windows
- Databases: Relational (e.g. Oracle)
- Languages: Unix shell scripting/commands, Java, Python, SQL
- File formats: Mainframe (format types), Text, CSV, Parquet, JSON
- ETL Tools: Informatica (PowerCenter or BDM)
- Scheduling: Control-M, Jenkins, etc.
- Source/Binary Control Tools: Git, Artifactory
- Delivery Models: Waterfall, Agile, Scrum, Kanban
- Frameworks: Development and delivery frameworks, design patterns
- Activity Reporting/Repository: Jira, Confluence
- Industry: Financial Services/Banking
Nice to have:
- Cloud: Building tools or automations to process data files into an ingestion hub in the cloud (AWS), though not exclusively. Possible use of Docker, Kubernetes, Lambda, cloud databases, queues, elastic architecture solutions, and enterprise application build tooling.
- Real-Time: Monitoring, searching, and acting on streams of real-time data (e.g. Splunk).
- ETL Tools: Apache Beam
- Configuration Tools: Chef, Puppet
- Automation Tools: Other tools that may add value to an automation program and generally support development, e.g. APIs, REST, JDBC, web services, message queues, load balancers.
- DevOps: Microservices in general, Puppet
- Collaborate with application teams on subjects relating to ingestion, firewalls, networks, file orchestration, metadata, domains, and scripts.
- Able to collaborate in an offshore/onshore organisation and environment.
How to apply:
Please apply using the link below or call RAMS on 03 8506 6524 for further details. Applications will close based on the volume of applications received. Only short-listed candidates will be contacted.
Adaps is an equal opportunity employer that actively embraces diversity in its workforce through accurate community representation of gender, culture, thought, and work arrangements.
Connect with Adaps: