
Sunday, March 11, 2018

Entropy, Cross Entropy, Information Gain Explained by Stanford University Lecturer





Entropy: a measure of how spread out a distribution is, or equivalently how many bits you need to transmit information drawn from it. A uniform (fully random) distribution (see our 1 minute post on the uniform distribution) has a lot of entropy; a concentrated, predictable distribution has very little. E.g. if I tell you the price starts at 0 and increases by 5% every day, a few summary stats describe the whole distribution, so the entropy is low. Versus a random sequence, which I can't describe that compactly.
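
As a quick illustration (a sketch in plain Python, not from the lecture itself), here is the Shannon entropy H(p) = -sum(p_i * log2(p_i)) computed for a uniform distribution and a nearly deterministic one:

import math

def entropy(probs):
    # Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 8 outcomes: maximally spread out, needs 3 bits per symbol.
print(entropy([1/8] * 8))                  # 3.0

# Nearly deterministic distribution: almost nothing to transmit.
print(entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits

The uniform case needs the full 3 bits per outcome, while the predictable case can be compressed to a fraction of a bit, matching the price-rule vs. random-sequence intuition above.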

