machine learning - My 2-layer neural network gets only a 2.05% test error rate on MNIST. Too low to be correct?


Is this a plausibly low test error rate? I doubt it, because the test error rates listed on the MNIST database website are higher than this, yet the evidence otherwise seems irrefutable. I am using a 2-layer net with 100 hidden units and no regularization. I initialize with small random weights and train with the Nesterov momentum method on mini-batches. The training error rate is 0.0000667 (4 training examples misclassified). The average cross-entropy loss is 0.00410 on the training examples and 0.207 on the test examples.
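For reference, here is a minimal sketch of the kind of setup described above: a single hidden layer of 100 units, softmax cross-entropy loss, small random initial weights, and Nesterov momentum updates on mini-batches. This is not the original code; the activation function, learning rate, and momentum coefficient below are assumptions, and synthetic data stands in for the MNIST batches.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(n_in=784, n_hidden=100, n_out=10, scale=0.01):
    # Small random weights, zero biases (scale is an assumed value).
    return {
        "W1": scale * rng.standard_normal((n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": scale * rng.standard_normal((n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(params, X):
    h = np.tanh(X @ params["W1"] + params["b1"])            # hidden activations (tanh assumed)
    logits = h @ params["W2"] + params["b2"]
    logits -= logits.max(axis=1, keepdims=True)              # numerical stability for softmax
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    return h, probs

def loss_and_grads(params, X, y):
    n = X.shape[0]
    h, probs = forward(params, X)
    loss = -np.log(probs[np.arange(n), y]).mean()             # average cross-entropy
    d_logits = probs.copy()
    d_logits[np.arange(n), y] -= 1.0
    d_logits /= n
    grads = {"W2": h.T @ d_logits, "b2": d_logits.sum(axis=0)}
    d_h = (d_logits @ params["W2"].T) * (1.0 - h ** 2)        # backprop through tanh
    grads["W1"] = X.T @ d_h
    grads["b1"] = d_h.sum(axis=0)
    return loss, grads

def nesterov_step(params, velocity, X, y, lr=0.1, momentum=0.9):
    # Nesterov momentum: take the gradient at the "look-ahead" point
    # params + momentum * velocity, then update velocity and parameters.
    lookahead = {k: params[k] + momentum * velocity[k] for k in params}
    loss, grads = loss_and_grads(lookahead, X, y)
    for k in params:
        velocity[k] = momentum * velocity[k] - lr * grads[k]
        params[k] += velocity[k]
    return loss

# Illustrative mini-batch loop on synthetic data (stand-in for MNIST batches).
params = init_params()
velocity = {k: np.zeros_like(v) for k, v in params.items()}
X_batch = rng.standard_normal((64, 784))
y_batch = rng.integers(0, 10, size=64)
for step in range(5):
    print(f"step {step}: loss = {nesterov_step(params, velocity, X_batch, y_batch):.4f}")
```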

Here is a link to the original MetaOptimize post.

Edit:

I found a result I must have missed on the MNIST page: a 2-layer NN with 800 hidden units and cross-entropy loss obtains a test error rate of 1.6%. I guess the other results with higher error rates listed there make the point that certain kinds of fancy modifications to neural nets can hurt performance. Since my net hasn't done better than 1.6%, I now find my result more believable.

