
Friday, February 22, 2019

Udacity Deep Learning Nanodegree Course Summary - Convolutional Neural Network

10,000 images seem hard to train on; cancer is hard to spot; sometimes there are too many moles to check.

3.1.15 Validation loss: first choose a percentage of the training data to hold out as a validation set.
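A minimal sketch of that split, assuming PyTorch; the MNIST dataset, the 20% fraction, and the batch size are illustrative choices, not course requirements.

```python
import numpy as np
from torch.utils.data import DataLoader
from torch.utils.data.sampler import SubsetRandomSampler
from torchvision import datasets, transforms

valid_fraction = 0.2   # illustrative: hold out 20% of the training data

train_data = datasets.MNIST('data', train=True, download=True,
                            transform=transforms.ToTensor())

# Shuffle the indices, then carve off the chosen percentage for validation.
indices = list(range(len(train_data)))
np.random.shuffle(indices)
split = int(valid_fraction * len(train_data))
valid_idx, train_idx = indices[:split], indices[split:]

train_loader = DataLoader(train_data, batch_size=64,
                          sampler=SubsetRandomSampler(train_idx))
valid_loader = DataLoader(train_data, batch_size=64,
                          sampler=SubsetRandomSampler(valid_idx))
```

Validation loss is then computed on `valid_loader` after each epoch to decide when to stop training or which weights to keep.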

3.1.18 LOCAL CONNECTIVITY: improving image classification beyond the MLP. A vanilla MLP is fully connected. MLPs work well on clean, easier datasets like MNIST. Local connectivity addresses two MLP issues: (1) an MLP uses a lot of parameters, easily reaching half a million even for a 28x28 image; (2) it only accepts a vector as input, so spatial information is thrown away. A CNN uses sparsely connected layers and accepts a matrix as input. To visualize the difference: for an MLP the image must be flattened. The fully connected redundancy question: does every hidden node need to be connected to every pixel in an image? Perhaps not. (A rough parameter count is sketched below.)
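The arithmetic below illustrates the parameter-count point; the hidden-layer size and conv filter count are assumptions for illustration, not course numbers.

```python
# Fully connected: every hidden node connects to every pixel.
pixels = 28 * 28              # 784 inputs for a flattened 28x28 image
hidden = 512                  # an illustrative hidden-layer size
mlp_params = pixels * hidden + hidden           # weights + biases
print(mlp_params)             # 401,920 -- roughly the "half a million" above

# Convolutional: weights are shared across the image, so the count
# depends only on kernel size and channel counts, not image size.
kernel, in_ch, out_ch = 3, 1, 16
conv_params = kernel * kernel * in_ch * out_ch + out_ch
print(conv_params)            # 160
```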

3.8.4 Why is this task hard?

3.8.5 How the data is collected and biopsy-confirmed. More than 2,000 disease classes. Melanoma is the most lethal.

3 Project GitHub (can opt out; limited-time availability). 3.github.1 Why GitHub is useful. 3.g.2 Matt points out Udacity courses on GitHub and version control, makes hilarious jokes about his portfolio, and presents GitHub as a host for a technical portfolio.

Best practice:

MLP vs CNN: the CNN has lower test error.

CNNs do much better than MLPs on most datasets, though the difference is not obvious on MNIST. (A minimal CNN sketch follows.)
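A minimal CNN sketch, assuming PyTorch; the layer sizes are illustrative, not the course's exact architecture.

```python
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """A minimal CNN for 28x28 grayscale images (e.g. MNIST)."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)   # keeps 28x28
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)  # keeps 14x14
        self.pool = nn.MaxPool2d(2, 2)                            # halves height/width
        self.fc = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))   # -> 16 x 14 x 14
        x = self.pool(F.relu(self.conv2(x)))   # -> 32 x 7 x 7
        x = x.view(x.size(0), -1)              # flatten for the classifier
        return self.fc(x)
```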

Weight initialization helps the model find a good starting point from which to optimize toward the weights that best fit the mapping between input and output. Transfer learning goes further: it initializes with near-optimal weights already trained in another model.
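A hedged sketch of that idea using a torchvision pretrained VGG16 (the specific model and class count are assumptions for illustration): keep the pretrained feature extractor frozen and retrain only a replaced final layer.

```python
import torch.nn as nn
from torchvision import models

# Start from weights learned on ImageNet instead of random initialization.
model = models.vgg16(pretrained=True)

# Freeze the pretrained feature extractor so only the new head is trained.
for param in model.features.parameters():
    param.requires_grad = False

# Replace the final classifier layer to match our own number of classes.
num_classes = 10   # illustrative
model.classifier[6] = nn.Linear(model.classifier[6].in_features, num_classes)
```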

What’s the relationship between epoch, batch size, and number of records? How about iterations?
Mini-batch gradient descent has two hyperparameters here: the number of epochs and the batch size. One iteration is a single weight update on one batch; one epoch is a full pass over all records, so iterations per epoch = number of records / batch size (rounded up).
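A worked example of that arithmetic; the dataset size, batch size, and epoch count are illustrative.

```python
import math

num_records = 60000   # illustrative dataset size (e.g. MNIST training set)
batch_size = 64       # records processed per weight update
epochs = 10           # full passes over the dataset

iterations_per_epoch = math.ceil(num_records / batch_size)  # 938
total_iterations = iterations_per_epoch * epochs            # 9380
print(iterations_per_epoch, total_iterations)
```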

