tpu, ensemble_nets, jax, projects, haiku
a 4 (and a bit) part tutorial / colab / screencast series, starting with jax fundamentals and working up to a data parallel approach running on a cloud tpu pod slice... all focused on solving the toughest problem in machine learning: 1d y=mx+b
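as a taste of the fundamentals the series starts from, here's a minimal sketch of fitting y=mx+b with jax.grad; the toy data, learning rate and step count are made up for illustration, and the series itself works up from here to data parallelism across a pod slice.

```python
import jax
import jax.numpy as jnp

# toy data for y = 3x + 2 (values are illustrative)
xs = jnp.linspace(-1.0, 1.0, 32)
ys = 3.0 * xs + 2.0

def predict(params, x):
    m, b = params
    return m * x + b

def loss(params, x, y):
    # mean squared error
    return jnp.mean((predict(params, x) - y) ** 2)

# grad w.r.t. the first argument (params), jit compiled
grad_fn = jax.jit(jax.grad(loss))

params = (jnp.zeros(()), jnp.zeros(()))  # (m, b)
learning_rate = 0.1
for _ in range(200):
    grads = grad_fn(params, xs, ys)
    # plain gradient descent step over the params pytree
    params = jax.tree_util.tree_map(
        lambda p, g: p - learning_rate * g, params, grads)

print(params)  # should approach (3.0, 2.0)
```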
objax, projects, ensemble_nets, jax
ensemble nets: using jax vmap to batch over not just the inputs of a model but also the parameters of multiple models.
random embedding networks can be used to generate weakly labelled data for metric learning, and they benefit substantially from being run in ensembles. can we represent an entire ensemble as a single forward pass in jax? why yes! yes we can!
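here's a minimal sketch of the vmap-over-params trick; it's not the post's actual code, and the tiny linear model, dims and names are all just illustrative. the key move is stacking each model's params along a leading axis and mapping over that axis while broadcasting the same batch of inputs.

```python
import jax
import jax.numpy as jnp

def init_params(key, in_dim=8, out_dim=4):
    # one small linear model; dims are illustrative
    w_key, b_key = jax.random.split(key)
    return {'w': jax.random.normal(w_key, (in_dim, out_dim)),
            'b': jnp.zeros((out_dim,))}

def forward(params, x):
    # forward pass for a single model
    return jnp.tanh(x @ params['w'] + params['b'])

# build an ensemble of 10 models by stacking params along a leading axis
keys = jax.random.split(jax.random.PRNGKey(0), 10)
ensemble_params = jax.vmap(init_params)(keys)

# vmap over the params (axis 0) while broadcasting the same inputs (None)
ensemble_forward = jax.vmap(forward, in_axes=(0, None))

x = jnp.ones((32, 8))          # a batch of 32 inputs
y = ensemble_forward(ensemble_params, x)
print(y.shape)                 # (10, 32, 4): (ensemble, batch, output)
```

since the whole ensemble is now a single jax computation, it jits, differentiates and shards like any single model would.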
ensemble nets : training ensembles as a single model using jax on a tpu pod slice (sept 2020)
bnn : counting bees with a rasp pi (may 2018)
drivebot : learning to do laps with reinforcement learning and neural nets (feb 2016)
wikipedia philosophy : do all first links on wikipedia lead to philosophy? (aug 2011)
cartpole++ : deep RL hacking with a complex 3d cart pole environment (aug 2016)
malmomo : deep RL hacking on minecraft with malmo (jan 2017)
some papers from my time at google research / brain...
- Natural Questions: a Benchmark for Question Answering Research
- Using Simulation and Domain Adaptation to Improve Efficiency of Deep Robotic Grasping
- WikiReading: A Novel Large-scale Language Understanding Task over Wikipedia
my honours thesis
the co-evolution of cooperative behaviour (1997): evolving neural nets with genetic algorithms for communication problems.
- latent semantic analysis via the singular value decomposition (for dummies)
- semi supervised naive bayes
- statistical synonyms
- round the world tweets
- decomposing social graphs on twitter
- do it yourself statistically improbable phrases
- should i burn it?
- the median of a trillion numbers
- deduping with resemblance metrics
- simple supervised learning / should i read it?
- audioscrobbler experiments
- chaoscope experiment