code (and a youtube walkthrough) of a recent port of some numpy code to einsum that i thought was illustrative.
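the flavour of that kind of port, as a minimal sketch (the shapes and names here are made up for illustration): a batched matrix multiply written as an explicit loop, then the same contraction as one einsum.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 5, 3))   # (batch, time, features)
w = rng.normal(size=(3, 4))      # (features, hidden)

# loop-and-stack version: matmul each batch element separately
y_loops = np.stack([x[b] @ w for b in range(x.shape[0])])

# same contraction as a single einsum: sum over the shared feature axis f
y_einsum = np.einsum('btf,fh->bth', x, w)

assert np.allclose(y_loops, y_einsum)
```

the einsum subscripts read directly as the semantics of the op, which is the main reason these ports are worth doing.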
quick example code showing a simple way to gauge baseline random performance.
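the gist of that check, sketched with hypothetical labels: before trusting a model's accuracy, compare it against uniform random guessing and against always predicting the majority class.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical imbalanced labels: roughly 75% class 0
y_true = rng.choice([0, 0, 0, 1], size=1000)

# baseline 1: uniform random guessing
random_preds = rng.integers(0, 2, size=len(y_true))
random_acc = (random_preds == y_true).mean()

# baseline 2: always predict the majority class
majority = np.bincount(y_true).argmax()
majority_acc = (y_true == majority).mean()

print(random_acc, majority_acc)  # majority baseline (~0.75) beats random (~0.5)
```

any model that doesn't clear the majority-class baseline hasn't actually learnt anything, however good its raw accuracy looks.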
quick example code demoing a way to derive class_weights from per-class performance on validation data. this can often speed up training.
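one simple version of the idea, with made-up validation labels and predictions: measure per-class recall on the validation set and upweight the classes the model is currently worst at.

```python
import numpy as np

# hypothetical validation labels and model predictions;
# the model here is struggling on class 1.
y_val  = np.array([0, 0, 0, 0, 1, 1, 1, 2, 2, 2])
y_pred = np.array([0, 0, 0, 0, 1, 0, 0, 2, 2, 0])

n_classes = 3
per_class_recall = np.array([
    (y_pred[y_val == c] == c).mean() for c in range(n_classes)
])

# weight each class by inverse recall (floored to avoid div-by-zero),
# then normalise so the weights average to 1.
weights = 1.0 / np.maximum(per_class_recall, 1e-3)
weights = weights / weights.mean()
class_weight = dict(enumerate(weights))  # e.g. keras model.fit(class_weight=...)
```

the struggling class ends up with the largest weight, so subsequent training focuses effort where the model is weakest.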
some code showing how you can init the bias of a classifier to match the base class distribution of your training data.
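a minimal sketch of the binary case, with hypothetical training labels: for a sigmoid output unit, initialising the bias to the log-odds of the positive rate makes the untrained network predict the base rate from step zero.

```python
import numpy as np

# hypothetical imbalanced training labels: 10% positive
y_train = np.array([0] * 900 + [1] * 100)

p = y_train.mean()
# log-odds of the positive class; with zero weights, a sigmoid unit
# with this bias outputs exactly the base rate p.
bias_init = np.log(p / (1 - p))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

assert np.isclose(sigmoid(bias_init), p)
```

without this, the first chunk of training is wasted just learning the class prior through the bias.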
some code that demos how to run pybullet under google cloud dataflow to generate a truckload of synthetic training data.
a short walkthrough explainer on fully convolutional networks.
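the core trick of fully convolutional networks, as a tiny numpy sketch (weights and shapes invented for illustration): a dense layer applied over a feature vector is equivalent to a 1x1 convolution, so the same weights can run over any spatial input size and emit a spatial map of predictions.

```python
import numpy as np

rng = np.random.default_rng(0)
# "dense" classifier weights: 16 features -> 3 classes
w = rng.normal(size=(16, 3))

def head(feature_map):
    # apply the same dense weights at every spatial position,
    # i.e. a 1x1 convolution over a (h, w, features) map
    return np.einsum('hwf,fc->hwc', feature_map, w)

small = head(rng.normal(size=(7, 7, 16)))    # (7, 7, 3) prediction map
large = head(rng.normal(size=(30, 40, 16)))  # (30, 40, 3); same weights, bigger input
```

this is why an fcn trained on fixed-size crops can be run on full-resolution images at inference time.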
bnn : counting bees with a rasp pi
drivebot : learning to do laps with reinforcement learning and neural nets
wikipedia philosophy : do all first links on wikipedia lead to philosophy?
cartpole++ : deep RL hacking with a complex 3d cart pole environment
malmomo : deep RL hacking on minecraft with malmo
some papers from my time at google research / brain...
- Natural Questions: a Benchmark for Question Answering Research
- Using Simulation and Domain Adaptation to Improve Efficiency of Deep Robotic Grasping
- WikiReading: A Novel Large-scale Language Understanding Task over Wikipedia
my honours thesis
the co-evolution of cooperative behaviour (1997): evolving neural nets with genetic algorithms for communication problems.
- latent semantic analysis via the singular value decomposition (for dummies)
- semi supervised naive bayes
- statistical synonyms
- round the world tweets
- decomposing social graphs on twitter
- do it yourself statistically improbable phrases
- should i burn it?
- the median of a trillion numbers
- deduping with resemblance metrics
- simple supervised learning / should i read it?
- audioscrobbler experiments
- chaoscope experiment