Senior Software Engineer
Large Financial Organization
Permanent role @ Melbourne CBD
Excellent salary up to $150k (with some flexibility) plus bonus
Work Life Balance and opportunity to enhance your skills
One of our leading financial organizations in Melbourne is hiring a Senior Software Engineer with a solid understanding of programming languages such as Python, Java, Scala, and shell scripting. Experience with infrastructure tools such as Terraform and Ansible, and with CI/CD pipelines, is highly regarded. Hands-on experience with AWS services, particularly configuring a secure environment, is also required. Key focus areas include:
- Spark (e.g. Scala, PySpark, etc.).
- Data engineering and data pipelines.
- Security, automation, and DevOps.
Our client is building an advanced analytics and data science capability on AWS and is looking for someone with strong technical leadership skills who can help guide the project. This is a key attribute of the candidate we're looking for.
In this role, you will also be responsible for:
- Promoting AWS and cloud best practices to maximise compute performance while minimising infrastructure costs.
- Designing and developing usage patterns and APIs to enable user self-service.
- Writing and optimising complex queries using both SQL and NoSQL paradigms.
- Working with different data types, e.g. streaming, real-time, file-based, RDBMS, and unstructured data.
- Designing and building complex event processing, preferably in a large-data environment.
- Designing and developing NDC core capabilities such as data ingestion, feature engineering, data science pipelines, and model management.
- Actively contributing to planning sessions across the core NDC engineering team, the wider NDC team, and NDC users.
To be successful in this role, you must have the following skills and experience:
- Experience working with large datasets in a restrictive environment (considerations around privacy, limits around user access, encryption, etc.).
- Demonstrable experience with Hadoop-ecosystem technologies, e.g. Spark, PySpark, Kafka, Hive, Flume, Hue, Sqoop, etc.
- Experience working with and configuring AWS services, preferably in a production environment.
- Experience with DevOps and Agile – with demonstrated ability to drive continuous improvement.
- Ability to mentor less experienced colleagues.
- Excellent communication and interpersonal skills, both oral and written.
- Ability to assess and implement new technologies and processes.
- An open mindset and proven ability to innovate and influence.
Preference will be given to candidates with the following:
- Administration of the Hadoop ecosystem.
- Experience with data ingestion technologies, and with capturing metadata and data lineage.
- Experience with ‘infrastructure as code’, e.g. Terraform, Ansible.
- Experience with shell scripting.
- Experience with configuring secure environments with AD groups, SSO, SAML, etc.
- Experience with reporting and analytics, and/or experience working with analysts and data scientists.
- Experience using productivity and collaboration tools such as JIRA and Confluence in a software delivery environment.
- Postgrad qualifications and self-learning courses and certifications (Coursera, Udacity, AWS, etc.) highly regarded.
To be eligible to apply, you must be an Australian / New Zealand citizen or hold permanent residency status in Australia.
How to apply:
Please apply using the link below or call Neil on 03 8506 6522 for further details. Applications will close based on the volume received. Only short-listed candidates will be contacted. Please share your resume in Word format only.
Adaps is an equal opportunity employer that actively embraces diversity in its workforce through accurate community representation of gender, culture, thought, and work arrangements.