machine learning - The convolutional neural network I'm trying to train is settling at a particular range of loss values; how do I avoid this?


Description: I'm trying to train an AlexNet-like CNN (actually the same architecture, just without the groups) from scratch on 50,000 images, 1,000 classes, with x10 augmentation. Each epoch has 50,000 iterations and the image size is 227x227x3.

There was a smooth decline in cost and an improvement in accuracy for the first few epochs, but now I'm facing a problem: the cost has settled at ~6 (it started at 13) for a long time. It has been a day and the cost keeps oscillating in the range 6.02-6.7, and the accuracy has become stagnant.
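To make "stagnant" concrete, a quick check over the logged losses looks like the sketch below. This is only an illustration: the question includes no code, so the log file name (train_loss.log) and the window size are hypothetical placeholders for whatever logging is actually in place.

```python
import numpy as np

# Hypothetical log of the per-iteration training loss, one value per line.
loss_history = np.loadtxt("train_loss.log")

window = 5000                      # look at the last few thousand iterations
recent = loss_history[-window:]

# A plateau shows up as a small spread and essentially no downward trend
# over the recent window, relative to the loss value itself (~6 here).
spread = recent.max() - recent.min()
slope = np.polyfit(np.arange(recent.size), recent, 1)[0]
print(f"spread={spread:.3f}, net change over window={slope * recent.size:.3f}")
```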

Now I'm not sure what to do, and I don't have proper guidance. Is this a problem of vanishing gradients, or of being stuck in a local minimum? To avoid it, should I decrease the learning rate? I'm using a learning rate of 0.08, ReLU activations (which help avoid vanishing gradients), Glorot initialization, and a batch size of 96. Before making a change and training again for days, I want to make sure I'm moving in the correct direction. What are the possible reasons?
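For context, dropping the learning rate once the loss flattens would look something like the sketch below, assuming PyTorch (the question does not state which framework is used). The tiny model, dummy batch, and patience value are stand-ins, not the actual network or data.

```python
import torch
from torch import nn, optim

# Tiny stand-in model, only to demonstrate the scheduler mechanics;
# substitute the real AlexNet-style network here.
model = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1000))

optimizer = optim.SGD(model.parameters(), lr=0.08, momentum=0.9)

# Cut the learning rate by 10x whenever the monitored loss stops improving,
# rather than keeping it fixed at 0.08 for the whole run.
scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min",
                                                 factor=0.1, patience=2)

criterion = nn.CrossEntropyLoss()
x = torch.randn(8, 3, 32, 32)          # dummy batch standing in for real data
y = torch.randint(0, 1000, (8,))

for epoch in range(20):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step(loss.item())        # in practice, pass the validation loss
    print(epoch, loss.item(), optimizer.param_groups[0]["lr"])
```

The only point of the sketch is the scheduler: ReduceLROnPlateau reduces the learning rate by the given factor whenever the monitored loss stops improving, which is a common first thing to try when training stalls at a fixed learning rate.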

