Simple Transformer
Simple Transformer is a from-scratch implementation of a decoder-only transformer language model, written with PyTorch and trained on Jane Austen.
- Character tokenisation
- Multi-headed masked self-attention (see the sketch after this list)
- Pre-norm & dropout regularisation
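
A minimal sketch of the pre-norm masked self-attention block, assuming hypothetical hyperparameters (`n_embd`, `n_head`, `dropout`) rather than the repository's exact code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedSelfAttention(nn.Module):
    """Pre-norm multi-head causal self-attention with a residual connection."""
    def __init__(self, n_embd: int = 128, n_head: int = 4, dropout: float = 0.1):
        super().__init__()
        assert n_embd % n_head == 0
        self.n_head = n_head
        self.norm = nn.LayerNorm(n_embd)          # pre-norm: normalise before attention
        self.qkv = nn.Linear(n_embd, 3 * n_embd)  # joint query/key/value projection
        self.proj = nn.Linear(n_embd, n_embd)     # output projection
        self.dropout = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        q, k, v = self.qkv(self.norm(x)).chunk(3, dim=-1)
        # split into heads: (B, n_head, T, head_dim)
        q, k, v = (t.reshape(B, T, self.n_head, C // self.n_head).transpose(1, 2)
                   for t in (q, k, v))
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=1)
        att = att.masked_fill(mask, float("-inf"))    # hide future positions
        out = (F.softmax(att, dim=-1) @ v).transpose(1, 2).reshape(B, T, C)
        return x + self.dropout(self.proj(out))       # residual connection
```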
PPO Sandbox
PPO Sandbox is a lightly optimised implementation of Proximal Policy Optimisation (PPO) using PyTorch and OpenAI Gym.
- Asynchronous vector environments
- Generalised Advantage Estimation (GAE) for advantages and returns (see the sketch after this list)
- Minibatch training
- Comprehensive training statistics logged and progress videos recorded
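
A minimal sketch of the GAE calculation over one rollout, assuming hypothetical NumPy array names (`rewards`, `values`, `dones`) rather than the repository's exact code:

```python
import numpy as np

def compute_gae(rewards, values, dones, last_value, gamma=0.99, lam=0.95):
    """Return per-step GAE advantages and value-function targets (NumPy array inputs)."""
    T = len(rewards)
    advantages = np.zeros(T, dtype=np.float32)
    gae = 0.0
    for t in reversed(range(T)):
        next_value = last_value if t == T - 1 else values[t + 1]
        next_nonterminal = 1.0 - dones[t]            # stop bootstrapping at episode ends
        # TD residual: delta_t = r_t + gamma * V(s_{t+1}) - V(s_t)
        delta = rewards[t] + gamma * next_value * next_nonterminal - values[t]
        gae = delta + gamma * lam * next_nonterminal * gae
        advantages[t] = gae
    returns = advantages + values                    # targets for the value head
    return advantages, returns
```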
COVID Futures
COVID Futures provides forecasts of COVID case numbers in Australian states.
- Simple custom-built ML data pipeline
- ResNet architecture (extended from this paper) implemented with PyTorch (see the sketch after this list)
- Website via Flask / Tailwind / billboard.js
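
An illustrative residual block of the kind the model is built from; the project's actual architecture follows the referenced paper, and the layer sizes here are hypothetical:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """y = relu(x + F(x)): the skip connection lets gradients bypass the block."""
    def __init__(self, dim: int = 64, dropout: float = 0.1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x + self.body(x))   # add the skip connection, then activate
```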
Australian Fungi ID
Fungi ID predicts the species of Australian fungi from images.
- Dataset assembled from iNaturalist
- Image classification with resnet34 via the fastai library (see the sketch after this list)
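
A minimal fastai sketch of the transfer-learning setup, assuming the images sit in per-species folders under a hypothetical `fungi/` directory:

```python
from fastai.vision.all import *

path = Path("fungi")                                      # hypothetical dataset root
dls = ImageDataLoaders.from_folder(
    path, valid_pct=0.2, item_tfms=Resize(224), batch_tfms=aug_transforms()
)
learn = vision_learner(dls, resnet34, metrics=accuracy)   # pretrained ResNet-34 backbone
learn.fine_tune(3)                                        # fine-tune on the fungi images
```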
Research Background
- Towards a Total Synthesis of Strychnine - organic chemistry honours thesis (H1, 2010)