
Sunday, March 11, 2018

Entropy, Cross Entropy, Information Gain Explained by Stanford University Lecturer





entropy: how spread out the distribution is. The more spread out, the more bits you need to transmit a sample from it. A uniform distribution (see our 1 minute post on randomly uniform distribution) has a lot of entropy; a concentrated, nearly deterministic distribution has very little. E.g., if I tell you the price starts at 0 and increases by 5% every day, a few summary stats describe the whole sequence. A random sequence, by contrast, can't be summarized that way.
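A minimal sketch of this idea: Shannon entropy, H(p) = -Σ pᵢ log₂ pᵢ, measured in bits. The distributions below are illustrative examples, not from the post.

```python
import math

def entropy(p):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Uniform over 8 outcomes: maximally spread out, needs the most bits.
uniform = [1 / 8] * 8

# Nearly deterministic: one outcome dominates, needs very few bits.
peaked = [0.99] + [0.01 / 7] * 7

print(entropy(uniform))  # 3.0 bits (log2 of 8)
print(entropy(peaked))   # much less than 1 bit
```

A uniform distribution over 2^k outcomes has exactly k bits of entropy, which matches the intuition that you need k bits to name which outcome occurred.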

