Recent posts

In this post we’ll improve our training algorithm from the previous post. When we’re done, we’ll be able to achieve 98% accuracy on the MNIST data set after just 9 epochs of training, which takes only about 30 seconds to run on my laptop. For comparison, last time we achieved only 92% accuracy after 2,000 epochs of training, which took over an hour! The main driver of this improvement is simply switching from batch gradient descent to mini-batch gradient descent.... Read more
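
To illustrate the difference, here is a minimal sketch of a mini-batch update loop in NumPy. It assumes a hypothetical `loss_grad(w, X, y)` function that returns the gradient of the loss on a batch; the names, learning rate, and batch size are illustrative, not the post’s actual code:

```python
import numpy as np

def minibatch_gd(w, X, y, loss_grad, lr=0.1, batch_size=64, epochs=9):
    """Mini-batch gradient descent: take one gradient step per small batch,
    rather than one step per full pass over the data (batch gradient descent)."""
    n = X.shape[0]
    for _ in range(epochs):
        idx = np.random.permutation(n)            # reshuffle the data each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            w = w - lr * loss_grad(w, X[batch], y[batch])  # update after each batch
    return w
```

With batch gradient descent a full pass over the data yields just one update, so mini-batching gets many more updates per epoch at little extra cost, which is where the speedup comes from.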

In this post we’re going to build a neural network from scratch. We’ll train it to recognize hand-written digits, using the famous MNIST data set. We’ll use just basic Python with NumPy to build our network (no high-level libraries like Keras or TensorFlow). We will dip into scikit-learn, but only to get the MNIST data and to assess our model once it’s built. We’ll start with the simplest possible “network”: a single node that recognizes just the digit 0.... Read more
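
As a rough sketch of where that post ends up, here is a single-node “0 vs. not-0” classifier in plain NumPy, with scikit-learn used only to fetch MNIST. Treat the learning rate, epoch count, and variable names as illustrative assumptions rather than the post’s actual code:

```python
import numpy as np
from sklearn.datasets import fetch_openml

# scikit-learn is used only to fetch the data, as in the post
X, y = fetch_openml('mnist_784', version=1, return_X_y=True, as_frame=False)
X = X / 255.0                        # scale pixel values into [0, 1]
t = (y == '0').astype(float)         # target: 1 if the digit is a 0, else 0

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w = np.zeros(X.shape[1])             # one weight per pixel
b = 0.0
lr = 0.1
for _ in range(5):                   # a few full passes over the data
    p = sigmoid(X @ w + b)           # the node's output: estimated P(digit is 0)
    w -= lr * X.T @ (p - t) / len(t) # gradient of the cross-entropy loss
    b -= lr * np.mean(p - t)

p = sigmoid(X @ w + b)
print("training accuracy:", np.mean((p > 0.5) == t))
```

A single node with a sigmoid output is just logistic regression on the raw pixels, which is why it makes a good starting point before adding layers.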

Location: University of Toronto
Dates: June 12–14, 2018
Keynote Speakers: Lara Buchak and Mike Titelbaum
Submission Deadline: February 12, 2018
Authors Notified: March 31, 2018

We are pleased to invite papers in formal epistemology, broadly construed to include related areas of philosophy as well as cognate disciplines like statistics, psychology, economics, computer science, and mathematics. Submissions should be prepared for anonymous review, no more than 6,000 words, accompanied by an abstract of up to 300 words, and in PDF format.... Read more

Recent papers

I'm an Associate Professor of Philosophy at the University of Toronto. I research uncertainty in human reasoning. I also indulge in some programming and related nerdery.