Workshop: [SOLD OUT] (Deep) Learn You a Neural Net For Great Good!

Location: Bayview B

Time: 1:00pm - 4:00pm

Day of week: Thursday

Level: Beginner

Key Takeaways

  • Neural nets are quite similar to other, simpler methods, and can be written from scratch without deep mathematical knowledge!

  • NNs are extremely composable, making it easy to mix and match modules to create more complex architectures.

  • You (yes you!) can build your own NN from scratch using the tools you already know.
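The "composable modules" takeaway can be sketched in a few lines of numpy: give every layer the same `forward()` interface, and a network is just a list of layers you can mix and match. The class and function names below are illustrative, not the workshop's actual code.

```python
import numpy as np

class Linear:
    """A fully-connected layer: x -> x @ W + b."""
    def __init__(self, W, b):
        self.W, self.b = W, b

    def forward(self, x):
        return x @ self.W + self.b

class Tanh:
    """An elementwise nonlinearity with the same interface as Linear."""
    def forward(self, x):
        return np.tanh(x)

def forward(layers, x):
    # Because every module exposes forward(), composition is just a loop.
    for layer in layers:
        x = layer.forward(x)
    return x

# Stack modules in any order to build a deeper architecture.
net = [Linear(np.ones((2, 3)), np.zeros(3)),
       Tanh(),
       Linear(np.ones((3, 1)), np.zeros(1))]
out = forward(net, np.array([[1.0, -1.0]]))
```

Swapping in a different nonlinearity, or adding more `Linear`/`Tanh` pairs, requires no change to the `forward` loop — that uniform interface is what makes deeper architectures cheap to assemble.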

Prerequisites

  • Participants should have experience using Python interactively (either in a notebook environment or via a terminal such as IPython/BPython).
  • Participants should have a working Python 3 installation on their laptop, with NumPy and scikit-learn installed.


Neural networks and deep learning are fundamental to modern machine learning, yet they often appear scarier than they really are. Many users of scikit-learn and similar libraries can apply ML techniques (perhaps including deep learning) through these tools, but do not always fully grok what happens beneath the surface. Other, more engineering-oriented practitioners are put off entirely by the seeming complexity of DL.

We walk through a live coding practicum (in a Jupyter Notebook) in which we implement a feed-forward, fully-connected neural net in numpy, initially training it via a for-loop to demonstrate core concepts, and finally codifying the NN as a Scikit-learn style classifier with which one can fit & predict on one’s own data. We make iterative improvements to code quality along the way, and reach a level of abstraction suitable for reusable, modular machine learning.
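As a rough illustration of the kind of classifier the practicum builds toward (a sketch only — the workshop's notebook code will differ in its details), here is a one-hidden-layer, fully-connected net in numpy, trained by a plain for-loop and wrapped in a scikit-learn style fit/predict interface:

```python
import numpy as np

class TinyNet:
    """Illustrative sketch: one hidden layer, tanh activation, softmax output,
    trained by full-batch gradient descent in a simple for-loop."""

    def __init__(self, hidden=16, lr=0.5, epochs=5000, seed=0):
        self.hidden, self.lr, self.epochs = hidden, lr, epochs
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n, d = X.shape
        k = len(np.unique(y))
        Y = np.eye(k)[y]                              # one-hot targets
        self.W1 = self.rng.normal(0, 0.5, (d, self.hidden))
        self.b1 = np.zeros(self.hidden)
        self.W2 = self.rng.normal(0, 0.5, (self.hidden, k))
        self.b2 = np.zeros(k)
        for _ in range(self.epochs):                  # the training for-loop
            h = np.tanh(X @ self.W1 + self.b1)        # forward pass
            z = h @ self.W2 + self.b2
            p = np.exp(z - z.max(axis=1, keepdims=True))
            p /= p.sum(axis=1, keepdims=True)         # softmax probabilities
            dz = (p - Y) / n                          # backward pass (cross-entropy)
            dW2, db2 = h.T @ dz, dz.sum(axis=0)
            dh = (dz @ self.W2.T) * (1 - h ** 2)      # tanh derivative
            dW1, db1 = X.T @ dh, dh.sum(axis=0)
            self.W1 -= self.lr * dW1; self.b1 -= self.lr * db1
            self.W2 -= self.lr * dW2; self.b2 -= self.lr * db2
        return self

    def predict(self, X):
        h = np.tanh(X @ self.W1 + self.b1)
        return np.argmax(h @ self.W2 + self.b2, axis=1)

# Usage: fit & predict on a toy non-linear problem (XOR).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0, 1, 1, 0])
clf = TinyNet(hidden=8).fit(X, y)
```

The fit/predict shape is what lets a hand-rolled net slot into the same workflows as any scikit-learn estimator.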

The focus of this workshop is on the practicum of implementing one's own NN algorithm, though we also review the most important mathematical and theoretical components of NNs to ground the practicum for attendees. The mathematical review touches on the nature of gradients, what they are, how they relate to derivatives, and how backpropagation works at a high level. Attendees will leave the workshop with a better understanding of deep learning through iterative optimization.
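As a small taste of that review (an illustrative example, not the workshop materials): a gradient is just the vector of partial derivatives of the loss, so an analytic gradient — here backpropagation through a single logistic "layer" — can be sanity-checked against finite-difference derivatives:

```python
import numpy as np

def loss(w, X, y):
    """Binary cross-entropy of a one-layer logistic model."""
    p = 1 / (1 + np.exp(-(X @ w)))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def analytic_grad(w, X, y):
    """Gradient via the chain rule — backprop for a single layer."""
    p = 1 / (1 + np.exp(-(X @ w)))
    return X.T @ (p - y) / len(y)

def numerical_grad(w, X, y, eps=1e-6):
    """Each partial derivative approximated by a central finite difference."""
    g = np.zeros_like(w)
    for i in range(len(w)):
        e = np.zeros_like(w)
        e[i] = eps
        g[i] = (loss(w + e, X, y) - loss(w - e, X, y)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (rng.random(20) < 0.5).astype(float)
w = rng.normal(size=3)
```

The two gradients agree to numerical precision — the same check scales up to verifying backpropagation through a full network.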

Speaker: Michael Stewart

Machine Learning Engineer @Opendoor

Stu (Michael Stewart) is a machine learning engineer at Opendoor in San Francisco. Previously, he was a data scientist and engineer at Uber, and an economic researcher at the Federal Reserve Bank of New York. He is a National Science Foundation Graduate Research Fellowship Awardee in Economic Sciences.

