Is Using Dropout in Training a Deep Neural Network a Waste of GPU Computation?

Let's break this statement down first.

Deep Neural Network: a simulated network of nodes arranged in layers, where each layer performs some operation, most often a matrix multiplication or something similar. The network is called "deep" when it contains many hidden layers. It is trained with backpropagation: a forward pass computes the output, and a backward pass propagates the error back through the network while updating the weights of the nodes.
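The forward pass, backward pass, and weight updates described above can be sketched in a few lines of NumPy. This is a minimal illustrative example (a one-hidden-layer network on a made-up toy regression target), not code from any particular framework:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # 8 toy samples, 3 features
y = X.sum(axis=1, keepdims=True)     # toy target: sum of the features

W1 = rng.normal(scale=0.1, size=(3, 4))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(4, 1))   # hidden -> output weights
lr = 0.05

# Loss before training, for comparison
loss0 = float(np.mean((np.maximum(0, X @ W1) @ W2 - y) ** 2))

for _ in range(1000):
    # Forward pass: a matrix multiplication per layer
    h = np.maximum(0, X @ W1)        # hidden layer with ReLU activation
    pred = h @ W2
    # Backward pass: propagate the error and update the weights
    grad_pred = 2 * (pred - y) / len(X)
    grad_W2 = h.T @ grad_pred
    grad_h = grad_pred @ W2.T
    grad_W1 = X.T @ (grad_h * (h > 0))
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

loss = float(np.mean((np.maximum(0, X @ W1) @ W2 - y) ** 2))
```

After the loop, the mean-squared error should be well below its initial value, which is all "training" means here: repeated forward and backward passes that gradually adjust the weights.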

The image below shows a simple neural network.

Dropout: a mechanism used during the training phase of a deep neural network in which we intentionally and randomly zero out a fraction (generally 50%) of the activation values. For reference, the Wikipedia definition of a neural network: "neural networks (ANNs) or connectionist systems are computing systems inspired by the biological neural networks that constitute animal brains. Such systems learn (progressively improve perfor…"
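A minimal sketch of how that random removal of activations works, using the common "inverted dropout" form and assuming a drop probability of 50%. The function name and the scaling-by-1/(1-p) convention are illustrative choices, not taken from any specific library:

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout(activations, p=0.5, training=True):
    """Randomly zero a fraction p of the activations during training.

    Surviving activations are scaled by 1/(1-p) so their expected value
    is unchanged; at inference time the layer passes values through as-is.
    """
    if not training:
        return activations
    mask = rng.random(activations.shape) >= p   # keep each unit with probability 1-p
    return activations * mask / (1.0 - p)

a = np.ones((4, 4))
dropped = dropout(a, p=0.5)   # roughly half the entries become 0, the rest 2.0
```

Note that the dropped activations still had to be computed in the forward pass before being zeroed, which is the intuition behind the "wasted GPU computation" question in the title.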
