Deep Learning From Scratch

Neural Networks continue to be the bleeding edge of Machine Learning, from smashing image benchmarks in the 2010s to powering ever-improving language models today. But how do they work “under the hood”? Are there graduate-level mathematical tricks from physics involved, for example?

In this workshop, we’ll demystify how neural networks work, showing that they consist of simple mathematical building blocks that, when stitched together in the right way, can learn complex relationships between inputs and outputs. To solidify this understanding, we’ll code up some simple neural networks in Python.
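To give a flavor of what "simple building blocks stitched together" means, here is an illustrative sketch (not the workshop's actual code) of a tiny two-layer network trained on XOR with nothing but NumPy: a linear layer, a sigmoid, a mean-squared-error loss, and the chain rule applied block by block. The architecture and hyperparameters here are arbitrary choices for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: a relationship a single linear model cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters for two linear layers (sizes chosen arbitrarily for this sketch).
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    # Forward pass: stitch the simple blocks together.
    a1 = sigmoid(X @ W1 + b1)
    pred = sigmoid(a1 @ W2 + b2)

    # Backward pass: chain rule, one block at a time.
    grad_pred = 2 * (pred - y) / len(X)      # derivative of MSE loss
    grad_z2 = grad_pred * pred * (1 - pred)  # derivative of sigmoid
    grad_W2 = a1.T @ grad_z2
    grad_b2 = grad_z2.sum(axis=0)
    grad_a1 = grad_z2 @ W2.T
    grad_z1 = grad_a1 * a1 * (1 - a1)
    grad_W1 = X.T @ grad_z1
    grad_b1 = grad_z1.sum(axis=0)

    # Gradient descent step on every parameter.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print(np.round(pred.ravel(), 2))  # predictions move toward [0, 1, 1, 0]
```

Nothing here is beyond first-year calculus and matrix multiplication, which is exactly the point the workshop makes.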

Key Takeaways

1 Understand the math behind neural networks

2 Reinforce this understanding by coding up neural networks from scratch in Python

3 Time-permitting, touch on convolutional neural networks


Speaker

Seth Weidman

Data Scientist @SentiLink

Seth has been a Data Scientist at SentiLink, a SaaS company that uses machine learning to detect fraud for the financial services industry, for almost three years, though he recently moved into Product Management. Before SentiLink he worked as a Data Scientist at several other companies, including Facebook, and he authored the book this workshop is based on, 'Deep Learning From Scratch', published by O’Reilly in September 2019.


Date

Thursday Oct 27 / 01:00PM PDT ( 3 hours)

Level

Intermediate

Topics

Machine Learning, Neural Networks


Prerequisites

Jupyter Notebooks with the standard Python data science stack:

  • pandas
  • NumPy
  • scikit-learn