Machine Learning

  • Machine Learning (ML)
  • Deep Learning (DL)
  • Artificial Intelligence (AI)

http://machinelearningmastery.com/

Master Machine Learning Algorithms
https://machinelearningmastery.com/master-machine-learning-algorithms/

Machine Learning Mastery with Python
https://machinelearningmastery.com/machine-learning-with-python/

Deep Learning With Python
From First Model, To State-of-the-Art Results
https://machinelearningmastery.com/deep-learning-with-python/

http://www.deeplearningbook.org/
Deep Learning – book by leading deep learning scientists
Ian Goodfellow, Yoshua Bengio, and Aaron Courville;
covers all of the main algorithms.

The Elements of Statistical Learning: Data Mining, Inference, and Prediction,
(Springer Series in Statistics) 2nd ed. 7th printing 2013
by Trevor Hastie, Robert Tibshirani, Jerome Friedman
http://www.amazon.com/dp/0387848576

pre-order:
Deep Learning: A Practitioner’s Approach 1st Edition
by Adam Gibson, Josh Patterson
http://www.amazon.com/dp/1491914254

pre-order:
Fundamentals of Deep Learning: Designing Next-Generation
Machine Intelligence Algorithms 1st Edition
by Nikhil Buduma (Author)
https://www.amazon.com/dp/1491925612

Java Deep Learning Essentials
https://www.amazon.com/dp/1785282190

Books by Timothy Masters:
Deep Belief Nets in C++ and CUDA C: Volume 1:
Restricted Boltzmann Machines and Supervised Feedforward Networks
Deep Belief Nets in C++ and CUDA C: Volume II:
Autoencoding in the Complex Domain

Jeff Heaton:
Artificial Intelligence for Humans, Volume 1: Fundamental Algorithms
https://www.amazon.com/Artificial-Intelligence-Humans-Fundamental-Algorithms/dp/1493682229/

Artificial Intelligence for Humans, Volume 2: Nature-Inspired Algorithms
https://www.amazon.com/Artificial-Intelligence-Humans-Nature-Inspired-Algorithms/dp/1499720572/

Artificial Intelligence for Humans, Volume 3: Deep Learning and Neural Networks
https://www.amazon.com/Artificial-Intelligence-Humans-Learning-Networks/dp/1505714346/

Deep Learning Made Easy with R: A Gentle Introduction for Data Science – by N.D. Lewis
https://www.amazon.com/dp/B01AEXMX34

Neural Networks and Deep Learning
http://neuralnetworksanddeeplearning.com/index.html

Grokking Deep Learning
http://www.amazon.com/dp/1617293709

Several books on TensorFlow:

Machine Learning with TensorFlow
http://www.amazon.com/dp/1617293873

TensorFlow Machine Learning Cookbook
http://www.amazon.com/dp/1786462168

Getting Started with TensorFlow
http://www.amazon.com/dp/B01H1JD6JO

Hands-On Machine Learning with Scikit-Learn and TensorFlow:
Concepts, Tools, and Techniques for Building Intelligent Systems
http://www.amazon.com/dp/1491962291

DL4J: https://deeplearning4j.org/neuralnet-overview

=====================

Two systems languages commonly used to implement deep learning libraries: C++ and Rust.

About Rust:
https://en.wikipedia.org/wiki/Rust_(programming_language)
https://www.rust-lang.org/en-US/
Rust was started as a personal project by Mozilla employee Graydon Hoare in 2006.
The first stable, “production-ready” release, v1.0, shipped in May 2015.
Rust is a compiled language, similar to C/C++, but designed to avoid whole classes of problems:
it is memory safe and does not permit null pointers or dangling pointers.
Rust is also designed for concurrency (parallelism) and speed.
Rust has no classes, and there are no plans to add them
(they were dropped from the language a while ago, along with the garbage collector).
Instead, Rust uses structs, traits, and impl blocks to achieve similar results.
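Since Rust has no classes, a quick sketch may help. The `Layer` trait and `Dense` struct below are invented names for this illustration (not from any real crate): the struct holds the data and the trait/impl pair supplies the behavior that a class would normally bundle together.

```rust
// A trait declares behavior (roughly, an interface); a struct holds data;
// an impl block ties the two together -- Rust's substitute for classes.
trait Layer {
    fn forward(&self, input: &[f64]) -> Vec<f64>;
}

// A fully connected layer: one row of weights (plus a bias) per output unit.
struct Dense {
    weights: Vec<Vec<f64>>,
    bias: Vec<f64>,
}

impl Layer for Dense {
    fn forward(&self, input: &[f64]) -> Vec<f64> {
        self.weights
            .iter()
            .zip(&self.bias)
            .map(|(row, b)| row.iter().zip(input).map(|(w, x)| w * x).sum::<f64>() + b)
            .collect()
    }
}

fn main() {
    let layer = Dense {
        weights: vec![vec![1.0, 2.0], vec![0.5, -1.0]],
        bias: vec![0.0, 1.0],
    };
    println!("{:?}", layer.forward(&[3.0, 4.0])); // prints [11.0, -1.5]
}
```

Any other type that implements `Layer` can then be used interchangeably (via generics or `&dyn Layer`), which is how trait-based designs recover the polymorphism classes usually provide.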

docs:
https://www.rust-lang.org/en-US/documentation.html
http://rustbyexample.com/
https://doc.rust-lang.org/book/ – https://doc.rust-lang.org/stable/book/
https://github.com/ctjhoa/rust-learning

Video tutorials for Rust:
– The History of Rust –
– Rust Tutorial

rust book(s):
Programming Rust – by Jim Blandy, Jason Orendorff (2016)
– https://www.amazon.com/Programming-Rust-Fast-Systems-Development/dp/1491927283/

Leaf – an open-source machine learning framework written in Rust
https://github.com/autumnai/leaf

Educational Videos:
Deep Learning SIMPLIFIED – a series of 30+ YouTube videos

NIPS – Neural Information Processing Systems Foundation, Inc.:
https://nips.cc/Conferences/2016/Schedule?type=Workshop

ODSC – Open Data Science Conference
https://www.odsc.com/boston

https://www.kaggle.com – predictive modelling and analytics competitions

Subscriptions:
– DataScienceWeekly.org

Bootcamps:
– http://insightdatascience.com – 7 weeks, zero cost

Deep Learning – https://en.wikipedia.org/wiki/Deep_learning

Alan Turing’s 1950 essay – proposed a test for machine intelligence (in modern terms, for AGI, artificial general intelligence): in a roughly 5-minute text exchange, a machine passes if it can successfully pretend to be a human.
Frank Rosenblatt (Cornell psychologist, late 1950s) – the Perceptron, an early artificial neural network.

The simplest description of a neural network is that it’s a machine
that makes classifications or predictions based on its ability to discover patterns in data.
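Rosenblatt’s perceptron, the earliest such machine mentioned above, is simple enough to sketch in full. A minimal version in Rust (no external crates; the AND-gate training set, learning rate, and epoch count are invented for this illustration):

```rust
// Rosenblatt's perceptron: a weighted sum followed by a hard threshold,
// trained with the classic error-correction rule w += lr * (target - output) * x.

// w[0] is the bias weight; returns the predicted class label, 0.0 or 1.0.
fn predict(w: &[f64; 3], x: &[f64; 2]) -> f64 {
    let s = w[0] + w[1] * x[0] + w[2] * x[1];
    if s > 0.0 { 1.0 } else { 0.0 }
}

// Repeatedly sweep the labeled examples, nudging the weights after each mistake.
fn train(data: &[([f64; 2], f64)], epochs: usize, lr: f64) -> [f64; 3] {
    let mut w = [0.0; 3];
    for _ in 0..epochs {
        for (x, target) in data {
            let err = target - predict(&w, x);
            w[0] += lr * err;
            w[1] += lr * err * x[0];
            w[2] += lr * err * x[1];
        }
    }
    w
}

fn main() {
    // The linearly separable AND function as four labeled examples.
    let data = [
        ([0.0, 0.0], 0.0),
        ([0.0, 1.0], 0.0),
        ([1.0, 0.0], 0.0),
        ([1.0, 1.0], 1.0),
    ];
    let w = train(&data, 20, 0.1);
    for (x, t) in &data {
        assert_eq!(predict(&w, x), *t); // the learned line separates all four points
    }
    println!("learned weights: {:?}", w);
}
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating line; a single perceptron famously cannot learn XOR, which is the limitation multi-layer networks address.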

Quoc Le et al. – the “cat paper” (using unlabeled data, the network learned to recognize cats)
https://static.googleusercontent.com/media/research.google.com/en//archive/unsupervised_icml2012.pdf

=====================
GPUs, NVIDIA:
– http://www.forbes.com/sites/aarontilley/2016/11/30/nvidia-deep-learning-ai-intel/#51b3ede839cc –
– http://www.nvidia.com/object/deep-learning-system.html –
NVIDIA DGX-1 – purpose-built system for deep learning and AI, computing power equivalent to 250 servers, price $130K
=====================
TPUs = tensor processing units:
– Google TPU chips – https://www.google.com/amp/www.recode.net/platform/amp/2016/5/20/11719392/google-ai-chip-tpu-questions-answers –
– Intel Nervana Chip coming in 2017 – http://www.globalfuturist.org/2016/12/intels-latest-nervana-chip-shows-the-giant-is-serious-about-owning-the-ai-market/ – http://www.nextbigfuture.com/2016/11/intel-will-deliver-100x-increase-in.html
===============================

Some people:

Marc Raibert – robotics
Andrew Ng – Google Brain; chief scientist at Baidu Research in Silicon Valley; cofounder of Coursera
    http://www.mlyearning.org
Yann LeCun – convolutional neural nets (CNNs), 1998;
    optical character recognition and computer vision,
    in the late 1990s processing up to 20% of all checks in the USA;
    NYU-CDS = NYU Center for Data Science;
    director of Facebook AI Research in New York City;
    International Conference on Learning Representations
Geoff Hinton – “Father of Deep Learning”; Univ of Toronto & Google;
    Restricted Boltzmann Machines (RBMs);
    Deep Belief Networks (DBNs) – stacked RBMs
Yoshua Bengio – Theano
Sepp Hochreiter and Jürgen Schmidhuber – recurrent neural nets (RNNs) and the Long Short-Term Memory (LSTM) architecture (1997);
    good for time series, handwriting recognition, speech recognition;
    https://en.wikipedia.org/wiki/Long_short-term_memory
Paul Werbos – backpropagation
Frank Rosenblatt – perceptron
Demis Hassabis, David Silver, Nando de Freitas – researchers at DeepMind, combining convnets with reinforcement learning
Quoc Le, Oriol Vinyals, Ilya Sutskever – Google Research
Tomas Mikolov – Facebook; Word2vec
Ryan Kiros – attention models and Skip-Thought vectors
Ivakhnenko – published the first general learning algorithms for deep networks (e.g., Ivakhnenko and Lapa, 1965);
    designed the first deep learning net, with 8 layers

 

Google Brain – department in Google since 2011
Jeff Dean – head of Google Brain
Greg Corrado
Tomas Mikolov
Mike Schuster, Yonghui Wu, Zhifeng Chen
Macduff Hughes – director of Google Translate

Michael Tetelman – https://www.linkedin.com/in/michaeltetelman, https://plus.google.com/+MichaelTetelman

================
ImageNet – a large-scale labeled image dataset; basis of the annual ILSVRC image-recognition competition

================
Companies and organizations:

  • University of Toronto, Canada
  • Google, DeepMind – DeepMind’s AlphaGo beat Lee Sedol, the Go world champion (March 2016);
    TensorFlow
  • Facebook
  • Baidu_Research
  • IBM – SystemML
  • Elon Musk et al – OpenAI
  • Microsoft
  • Apple
  • Intel