altGrape

As someone in AI, which concept blew your mind away when you first learnt about it?

For me? It was Grad-CAM. It was a game-changer for selling computer vision initiatives internally to non-technical stakeholders.

The gradCAM function computes the importance map by taking the derivative of the reduction layer's output for a given class with respect to a convolutional feature map. If you have a five-layer convolutional neural network, you can apply it to any of those layers and inspect the resulting class activation map.
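For anyone curious about the mechanics, here is a minimal NumPy sketch of the core Grad-CAM computation. The function name and the toy random arrays are mine for illustration; in practice the feature maps and gradients come from hooks on a real network rather than being passed in directly:

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap from a conv layer's activations
    and the gradients of the class score w.r.t. those activations.

    feature_maps: (K, H, W) activations A^k from the chosen conv layer
    gradients:    (K, H, W) dY^c / dA^k for the target class c
    """
    # alpha_k: global-average-pool the gradients over the spatial dims
    weights = gradients.mean(axis=(1, 2))                 # shape (K,)
    # weighted sum of feature maps, ReLU to keep only positive evidence
    cam = np.maximum((weights[:, None, None] * feature_maps).sum(axis=0), 0)
    # normalise to [0, 1] so it can be rendered as a heatmap
    if cam.max() > 0:
        cam /= cam.max()
    return cam

# Toy demo with random activations and gradients (stand-ins only)
rng = np.random.default_rng(0)
acts = rng.random((8, 7, 7))
grads = rng.random((8, 7, 7))
heatmap = grad_cam(acts, grads)   # (7, 7) map in [0, 1]
```

The same function works against any convolutional layer: earlier layers give finer but noisier maps, the last conv layer gives the coarse, class-discriminative map people usually show.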

The best explanation to give is: "Regions in Red are the areas of the image that the neural network is looking at to make a decision"

Well, to be honest, the regions in red represent the class activations arising from those regions, but that nuance would be too much for the normie business guys.
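To actually get those "regions in red" onto a slide, the normalised map has to be upsampled and blended into the image. A rough sketch, assuming the image dimensions are whole multiples of the map dimensions (the function name and nearest-neighbour upsampling via `np.kron` are my choices, not from the post):

```python
import numpy as np

def overlay_red(image, cam, alpha=0.5):
    """Blend a [0, 1] Grad-CAM map into the red channel of an RGB image.

    image: (H, W, 3) float array in [0, 1]
    cam:   (h, w) heatmap in [0, 1], upsampled here by nearest-neighbour
    """
    scale_h = image.shape[0] // cam.shape[0]
    scale_w = image.shape[1] // cam.shape[1]
    cam_up = np.kron(cam, np.ones((scale_h, scale_w)))  # crude upsample
    out = image.copy()
    # push the red channel toward 1 where the map is strong
    out[..., 0] = (1 - alpha * cam_up) * out[..., 0] + alpha * cam_up
    return out

# Demo: a uniform map over a black image tints every pixel half-red
demo = overlay_red(np.zeros((14, 14, 3)), np.ones((7, 7)), alpha=0.5)
```

Real pipelines usually use a proper colormap and bilinear resizing, but the idea is the same: strong activations get painted red on top of the input.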
