
Sunday, March 11, 2018

Entropy, Cross Entropy, Information Gain Explained by Stanford University Lecturer





entropy: how spread out is the distribution. The more spread out, the more bits it takes to transmit a sample. A uniform distribution (see our 1 minute post on randomly uniform distribution), where every outcome is equally likely, has a lot of entropy; a concentrated, near-deterministic distribution has very little. E.g. if I tell you the price starts at 0 and increases by 5% every day, a few summary stats describe the whole distribution. Versus a random sequence, which I can't describe compactly at all.
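A minimal sketch of this contrast (the `entropy` helper and the example distributions below are ours, not from the post): the uniform distribution over 4 outcomes hits the maximum of 2 bits, while a near-deterministic one comes out close to 0.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(p) = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform over 4 outcomes: maximally spread out, maximum entropy.
uniform = [0.25, 0.25, 0.25, 0.25]

# Heavily concentrated: almost deterministic, very little entropy.
peaked = [0.97, 0.01, 0.01, 0.01]

print(entropy(uniform))  # 2.0 bits
print(entropy(peaked))   # about 0.24 bits
```

With 4 equally likely outcomes you need log2(4) = 2 bits per sample; the peaked distribution is nearly predictable, so on average far fewer bits suffice.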

