- Prerequisites
  - Basic Statistics
  - Python coding
  - Intro to Linear Algebra
- Basic Statistics
- Feature versus Target
  - Features are the data attributes (variables) we train on in order to predict the target.
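A minimal sketch of separating features from the target, using a made-up toy DataFrame (the column names `rooms`, `age`, and `price` are hypothetical, not the course dataset):

```python
import pandas as pd

# Toy table with hypothetical columns; any tabular dataset works the same way.
data = pd.DataFrame({
    "rooms": [4, 6, 5, 7, 3],
    "age":   [30, 10, 20, 5, 40],
    "price": [250, 460, 340, 520, 190],
})

# Features: the attributes (variables) the model trains on.
X = data[["rooms", "age"]]

# Target: the value we want the model to predict.
y = data["price"]

print(X.shape, y.shape)  # (5, 2) (5,)
```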
- Training versus testing
  - Split the data into training and testing subsets, shuffling before the split
  - Use scikit-learn's train_test_split for the data split
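A minimal split sketch with scikit-learn's `train_test_split` (imported from `sklearn.model_selection` in current versions; `X` and `y` are the feature/target objects from the sketch above, and the 80/20 split and `random_state` are arbitrary choices):

```python
from sklearn.model_selection import train_test_split

# shuffle=True (the default) randomizes the row order before splitting,
# so the test set is not simply the last rows of the file.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=True, random_state=42
)

print(len(X_train), "training rows /", len(X_test), "testing rows")
```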
- Performance Metric
  - Coefficient of Determination (R²)
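R² measures how much better the model's predictions are than always predicting the mean of the true values: R² = 1 − SS_res / SS_tot. A quick sketch with scikit-learn's `r2_score` (the numbers are just an illustration):

```python
from sklearn.metrics import r2_score

# 1.0 = perfect fit, 0.0 = no better than predicting the mean,
# negative = worse than predicting the mean.
y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]

print(r2_score(y_true, y_pred))  # about 0.949
```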
- Decision Tree and maximum depth
  - Not covered in detail online
  - We had a separate in-person lecture on decision trees
  - Entropy - coin example
    - A fair coin is random (50% heads, 50% tails); we can't predict the outcome, so entropy = 1 bit
    - A double-headed coin always comes up heads (100% heads); we can always predict the outcome, so entropy = 0
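A minimal sketch of the coin entropy numbers above, using the Shannon entropy formula H = −Σ p·log₂(p) (the `entropy` helper here is just an illustration, not a library function):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: -sum(p * log2(p)), skipping zero probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: 50% heads, 50% tails -> outcome unpredictable -> 1 bit.
print(entropy([0.5, 0.5]))  # 1.0

# Double-headed coin: 100% heads -> outcome certain -> 0 bits.
print(entropy([1.0]))       # 0.0 (may print as -0.0)
```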
- Learning Curve
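A learning curve scores a model on growing amounts of training data to show whether more data would help. A sketch with scikit-learn's `learning_curve`, using synthetic data from `make_regression` in place of the course dataset (the depth, training sizes, and cv settings are arbitrary):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import learning_curve
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in data, not the course dataset.
X, y = make_regression(n_samples=300, n_features=4, noise=10.0, random_state=0)

train_sizes, train_scores, valid_scores = learning_curve(
    DecisionTreeRegressor(max_depth=4, random_state=0),
    X, y,
    train_sizes=np.linspace(0.1, 1.0, 5),
    cv=5,
    scoring="r2",
)

# A large gap between training and validation R2 suggests overfitting;
# both curves flattening out suggests more data alone won't help much.
for n, tr, va in zip(train_sizes, train_scores.mean(axis=1), valid_scores.mean(axis=1)):
    print(f"{int(n):4d} samples: train R2 = {tr:.2f}, validation R2 = {va:.2f}")
```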
- Model Selection
  - Occam's razor: the simpler model is preferred [Wikipedia: https://en.wikipedia.org/wiki/Occam%27s_razor]
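One way to act on Occam's razor is a validation curve over model complexity: sweep `max_depth` of a decision tree, score each depth with R², and prefer the smallest depth whose validation score is close to the best. A sketch with scikit-learn's `validation_curve` on synthetic data (the 0.01 tolerance and the other settings are arbitrary illustrations, not the course's rule):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import validation_curve
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in data, not the course dataset.
X, y = make_regression(n_samples=300, n_features=4, noise=10.0, random_state=0)

depths = [1, 2, 3, 4, 6, 8, 10]
train_scores, valid_scores = validation_curve(
    DecisionTreeRegressor(random_state=0),
    X, y,
    param_name="max_depth",
    param_range=depths,
    cv=5,
    scoring="r2",
)

mean_valid = valid_scores.mean(axis=1)
for d, s in zip(depths, mean_valid):
    print(f"max_depth={d}: mean validation R2 = {s:.2f}")

# Occam's razor: among depths within 0.01 of the best validation R2,
# pick the smallest (simplest) tree.
best = mean_valid.max()
chosen = min(d for d, s in zip(depths, mean_valid) if s >= best - 0.01)
print("chosen max_depth:", chosen)
```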