Machine Learning

  • Machine Learning (ML)
  • Deep Machine Learning (DML)
  • Artificial Intelligence (AI)

Master Machine Learning Algorithms

Machine Learning Mastery with Python

Deep Learning With Python
From First Model, To State-of-the-Art Results

Deep Learning (MIT Press) – book by top deep learning scientists
Ian Goodfellow, Yoshua Bengio and Aaron Courville;
includes coverage of all of the main algorithms.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction,
(Springer Series in Statistics) 2nd ed. 7th printing 2013
by Trevor Hastie, Robert Tibshirani, Jerome Friedman

Deep Learning: A Practitioner’s Approach 1st Edition
by Adam Gibson, Josh Patterson

Fundamentals of Deep Learning: Designing Next-Generation
Machine Intelligence Algorithms 1st Edition
by Nikhil Buduma (Author)

Java Deep Learning Essentials

Books by Timothy Masters:
Deep Belief Nets in C++ and CUDA C: Volume 1:
Restricted Boltzmann Machines and Supervised Feedforward Networks
Deep Belief Nets in C++ and CUDA C: Volume II:
Autoencoding in the Complex Domain

Jeff Heaton:
Artificial Intelligence for Humans, Volume 1: Fundamental Algorithms

Artificial Intelligence for Humans, Volume 2: Nature-Inspired Algorithms

Artificial Intelligence for Humans, Volume 3: Deep Learning and Neural Networks

Deep Learning Made Easy with R: A Gentle Introduction for Data Science – by N.D. Lewis

Neural Networks and Deep Learning

Grokking Deep Learning

Several books on TensorFlow:

Machine Learning with TensorFlow

TensorFlow Machine Learning Cookbook

Getting Started with TensorFlow

Hands-On Machine Learning with Scikit-Learn and TensorFlow:
Concepts, Tools, and Techniques for Building Intelligent Systems

DL4J: –


Two main languages used to write DML libraries: C++ and Rust.

About Rust:
Rust was created as a personal project by Mozilla employee Graydon Hoare in 2006.
The "production-ready" stable release, v1.0, shipped in May 2015.
Rust is a compiled language, similar to C/C++, but designed to cause far fewer problems:
it is memory safe and does not permit null pointers or dangling pointers, etc.
Rust is designed for concurrency (parallelism) and speed.
Rust doesn't have classes, and there are no plans to add them
(they were dropped from the language a while ago, along with the garbage collector).
Rust uses traits, impl blocks, and structs to achieve similar results.
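A minimal sketch of how traits, impl blocks, and structs cover the class use case: the struct holds the data, the trait declares the behavior, and an impl ties them together. The names (`Greet`, `Dog`) are made up for illustration.

```rust
// A trait declares behavior (like an interface / abstract base class).
trait Greet {
    fn greet(&self) -> String;
}

// A struct holds data (like class fields).
struct Dog {
    name: String,
}

// An impl block attaches the trait's behavior to the struct.
impl Greet for Dog {
    fn greet(&self) -> String {
        format!("{} says woof", self.name)
    }
}

fn main() {
    let d = Dog { name: String::from("Rex") };
    println!("{}", d.greet()); // method-call syntax, no classes needed
}
```

Any number of types can implement the same trait, which gives polymorphism without inheritance.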

docs: –

Video tutorials for Rust:
– The History of Rust –
– Rust Tutorial

rust book(s):
Programming Rust – by Jim Blandy, Jason Orendorff (2016)

Leaf – an open-source Machine Learning Framework written in Rust –

Educational Videos:
Deep Learning SIMPLIFIED – a series of 30+ YouTube videos

NIPS – Neural Information Processing Systems Foundation, Inc.:

ODSC – Open Data Science Conference – predictive modelling and analytics competitions


– – 7 weeks, zero cost

Deep Learning – –

Alan Turing's 1950 essay – a test for AGI (artificial general intelligence): a 5-minute text exchange in which the machine pretends to be a human.
Frank Rosenblatt (Cornell psychologist, late 1950s) – the Perceptron, an artificial neural network.

The simplest description of a neural network is that it's a machine
that makes classifications or predictions based on its ability to discover patterns in data.
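The simplest such machine is Rosenblatt's perceptron mentioned above: a weighted sum followed by a threshold, with weights adjusted whenever a prediction is wrong. A minimal sketch in Rust, learning the logical AND function (all names and the training set are illustrative, not from any library):

```rust
// Threshold activation: fires (1) when the weighted sum is non-negative.
fn step(x: i32) -> i32 {
    if x >= 0 { 1 } else { 0 }
}

// Prediction = step(w . x + b).
fn predict(w: &[i32; 2], b: i32, x: &[i32; 2]) -> i32 {
    step(w[0] * x[0] + w[1] * x[1] + b)
}

// Perceptron learning rule: on an error, nudge weights toward the target
// (w += err * x, b += err). Integer arithmetic keeps the example exact.
fn train() -> ([i32; 2], i32) {
    let data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]; // AND
    let mut w = [0, 0];
    let mut b = 0;
    for _ in 0..20 {
        for (x, t) in data.iter() {
            let err = t - predict(&w, b, x);
            w[0] += err * x[0];
            w[1] += err * x[1];
            b += err;
        }
    }
    (w, b)
}

fn main() {
    let (w, b) = train();
    for (x, t) in [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)].iter() {
        println!("{:?} -> {} (target {})", x, predict(&w, b, x), t);
    }
}
```

AND is linearly separable, so the perceptron convergence theorem guarantees this loop finds correct weights; a single perceptron cannot learn XOR, which is what multi-layer networks are for.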

Quoc Le et al – the "cat paper" (learning to recognize a cat from unlabeled data)

NVIDIA DGX-1 – a purpose-built system for deep learning and AI, equivalent to 250 servers, price $130K
TPUs = tensor processing units:
– Google TPU chips – –
– Intel Nervana chip coming in 2017 – –

Some people:

Marc Raibert robotics
Andrew Ng Google Brain, chief scientist at Baidu Research in Silicon Valley, cofounder of Coursera
Yann LeCun convolutional neural net (CNN), 1998
optical character recognition and computer vision
in the late 1990s his system was processing up to 20% of all checks in the USA.
NYU-CDS = NYU Center for Data Science
director of Facebook AI Research in New York City
International Conference on Learning Representations
Geoff Hinton “Father of Deep Learning”, Univ of Toronto & Google
Restricted Boltzmann Machines (RBM)
Deep Belief Networks (DBN) – stacked RBMs
Yoshua Bengio Theano
Juergen Schmidhuber,
Sepp Hochreiter
recurrent neural nets (RNNs) and Long short-term memory (LSTM) architecture – Sepp Hochreiter and Jürgen Schmidhuber (1997) – good for time series, handwriting recognition, speech recognition –
Paul Werbos backpropagation
Frank Rosenblatt perceptron
Demis Hassabis, David Silver, Nando De Freitas researchers at DeepMind, combining convnets with reinforcement learning
Quoc Le, Oriol Vinyals, Ilya Sutskever Google Research
Tomas Mikolov created Word2vec at Google, now at Facebook
Ryan Kiros Attention Models and Skip-Thought vectors
Alexey Ivakhnenko published the first general learning algorithms for deep networks (e.g., Ivakhnenko and Lapa, 1965); the first deep learning net with 8 layers was designed by him.
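The LSTM architecture listed above (Hochreiter and Schmidhuber, 1997) keeps a separate cell state protected by gates. A scalar sketch of one LSTM cell step in Rust; real implementations use matrices and learned weights, and all names and numbers here are illustrative:

```rust
// Sigmoid squashes a value into (0, 1), so it can act as a soft gate.
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

struct LstmCell {
    // One weight for the input, one for the hidden state, plus a bias, per gate.
    w_f: (f64, f64, f64), // forget gate: how much old memory to keep
    w_i: (f64, f64, f64), // input gate: how much new input to write
    w_o: (f64, f64, f64), // output gate: how much memory to expose
    w_c: (f64, f64, f64), // candidate cell state
}

impl LstmCell {
    // One time step: takes input x and the previous (hidden, cell) state.
    fn step(&self, x: f64, h: f64, c: f64) -> (f64, f64) {
        let gate = |(wx, wh, b): (f64, f64, f64)| wx * x + wh * h + b;
        let f = sigmoid(gate(self.w_f));
        let i = sigmoid(gate(self.w_i));
        let o = sigmoid(gate(self.w_o));
        let c_new = f * c + i * gate(self.w_c).tanh(); // gated memory update
        let h_new = o * c_new.tanh();                  // gated output
        (h_new, c_new)
    }
}

fn main() {
    let cell = LstmCell {
        w_f: (0.5, 0.5, 0.0),
        w_i: (0.5, 0.5, 0.0),
        w_o: (0.5, 0.5, 0.0),
        w_c: (1.0, 0.0, 0.0),
    };
    let (mut h, mut c) = (0.0, 0.0);
    for &x in &[1.0, 0.5, -0.5] {
        let (h2, c2) = cell.step(x, h, c);
        h = h2;
        c = c2;
        println!("h = {:.4}, c = {:.4}", h, c);
    }
}
```

The additive `f * c + i * ...` update is what lets gradients flow across many time steps, which is why LSTMs work well for time series, handwriting, and speech.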


Google Brain – deep learning research team at Google since 2011
Jeff Dean – head of Google Brain
Greg Corrado
Tomas Mikolov
Mike Schuster, Yonghui Wu, Zhifeng Chen
Macduff Hughes – director of Google Translate

Michael Tetelman –





  • University of Toronto, Canada
  • Google, DeepMind – Google’s AlphaGo AI beat Lee Sedol, the board game Go champion (March 2016)
  • Facebook
  • Baidu Research
  • IBM – SystemML
  • Elon Musk et al – OpenAI
  • Microsoft
  • Apple
  • Intel