I'm a senior applied scientist at Oracle, where I work on natural language processing and reinforcement learning, developing models that support and streamline healthcare. Previously, I worked at Amazon, developing a foundation model for 3D computer vision. I completed my PhD at Victoria University of Wellington (VUW) in New Zealand, where my research focused on meta-learning loss functions for deep neural networks. My current research interests include meta-learning, meta-optimization, hyperparameter optimization, few-shot learning, and continual learning.
- Oracle · Health and AI
- Melbourne, Australia
- decadz.github.io/
Pinned repositories
- Evolved-Model-Agnostic-Loss – [TPAMI 2023] Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning. Paper: https://arxiv.org/abs/2209.08907 (Python)
- Online-Loss-Function-Learning – [TMLR 2025] Meta-Learning Adaptive Loss Functions. Paper: https://arxiv.org/abs/2301.13247 (Python)
- Sparse-Label-Smoothing-Regularization – [TPAMI 2023] PyTorch code for Sparse Label Smoothing Regularization, presented in "Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning". Paper: https://arxiv.org/abs/2209.08907 (Python)
- Genetic-Programming-with-Rademacher-Complexity – [CEC 2019] Genetic Programming with Rademacher Complexity. Paper: https://ieeexplore.ieee.org/document/8790341
- Meta-Learning-Literature-Overview – List of AI/ML papers related to my thesis, "Meta-Learning Loss Functions for Deep Neural Networks". Thesis: https://arxiv.org/abs/2406.09713