Conversation

shaibagon
Member

Adding a generic numerical gradient check for Python layers, much like include/caffe/test/test_gradient_check_util.hpp does for compiled layers.
This utility can be used from the Python interface.

Numerically testing a Python layer's gradients.

How to use:
Suppose you have a python layer

  layer {
    type: "Python"
    bottom: "in_cont"
    bottom: "in_binary"
    top: "out1"
    top: "out2"
    python_param: {
      module: "folder.my_layer_module_name"
      layer: "my_layer_class_name"
      param_str: "some params"
    }
    propagate_down: true
    propagate_down: false
  }

Then you can test its backward() gradients like this:

import numpy as np
from test_gradient_for_python_layer import test_gradient_for_python_layer

# set the inputs
input_names_and_values = [('in_cont', np.random.randn(3, 4)), ('in_binary', np.random.binomial(1, 0.4, (3, 1)))]
output_names = ['out1', 'out2']
py_module = 'folder.my_layer_module_name'
py_layer = 'my_layer_class_name'
param_str = 'some params'
propagate_down = [True, False]

# call the test
test_gradient_for_python_layer(input_names_and_values, output_names, py_module, py_layer, param_str, propagate_down)

# you are done!
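For intuition, the numerical check at the core of such a utility is typically a central finite difference compared against the analytic gradient. Below is a minimal standalone sketch of that idea (not the PR's actual implementation, and with no Caffe dependency; the helper name `numeric_gradient` is made up for illustration):

```python
import numpy as np

def numeric_gradient(f, x, eps=1e-5):
    """Estimate df/dx element-wise with central differences."""
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    while not it.finished:
        idx = it.multi_index
        orig = x[idx]
        x[idx] = orig + eps
        f_plus = f(x)
        x[idx] = orig - eps
        f_minus = f(x)
        x[idx] = orig  # restore the perturbed entry
        grad[idx] = (f_plus - f_minus) / (2 * eps)
        it.iternext()
    return grad

# sanity check: f(x) = sum(x**2) has analytic gradient 2*x
x = np.random.randn(3, 4)
est = numeric_gradient(lambda v: np.sum(v ** 2), x)
```

A gradient check like the one in this PR compares such an estimate against what the layer's backward() produced, and fails when they disagree beyond a tolerance.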

@abhijitkundu
Contributor

@shaibagon Very nice work.
There are couple of small glitches:

  1. missing import os for os.unlink() in line 122
  2. For layers with single top, outputs is not iterable and cannot be used in the zip() in line 115
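A common guard against the single-top glitch is to normalize whatever the forward pass returns to a list before zipping. A hedged sketch (the helper name `as_list` is an assumption, not code from the PR):

```python
def as_list(outputs):
    """Wrap a lone output so zip() always sees an iterable of tops."""
    if isinstance(outputs, (list, tuple)):
        return list(outputs)
    return [outputs]

# single top: one array instead of a list of arrays
single_top = as_list(5)
multi_top = as_list([1, 2])
```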

@shaibagon
Member Author

@intbots thank you for your comment. I am fixing it now.

shaibagon force-pushed the test_gradient_for_python_layer branch from 9308514 to 316a09c on March 13, 2017.
@shaibagon
Member Author

@xdtl it is very difficult to read your code when it is not formatted.

@xdtl

xdtl commented Apr 3, 2017

Thank you for your reply! I found the problem was not caused by the gradient check. It's my bad. Sorry...

@lpsilvestrin

I am getting "failed check" when I run it with EuclideanLossLayer from pyloss.py in the Caffe examples. The gradient test always estimates double the value computed by the layer's backward() function.

AssertionError: Failed check for d[loss][0]/d[in1][0]: computed=0.773203611374, 
estimated=1.5464091301, 0.773205518723 > 0.386602282524 (0.25*1.5464091301)

I am testing using this script:

import numpy as np
from test_gradient_for_python_layer import test_gradient_for_python_layer

data1 = np.random.randn(1,3)
data2 = np.random.randn(1,3)
inputs = [('in1', data1), ('in2', data2)]
output = ['loss']
module = 'pyloss'
layer = 'EuclideanLossLayer'
pstr = ''
propagate_down = [True, False]
test_gradient_for_python_layer(inputs, output, module, layer,  pstr, propagate_down)

@woshichase

I am sorry, but how can I install the 'test_gradient_for_python_layer' module?
When I run the script, it fails with "No module named 'test_gradient_for_python_layer'".
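The utility is a single Python file added by this PR, not an installable package, so the import only works if the file's directory is on `sys.path`. The sketch below demonstrates the mechanism with a throwaway stub module (the stub and its return value are fabricated for illustration; in practice you would point `sys.path` at the directory where you saved the PR's file):

```python
import os
import sys
import tempfile

# create a stand-in module to demonstrate the sys.path mechanism
d = tempfile.mkdtemp()
with open(os.path.join(d, 'test_gradient_for_python_layer.py'), 'w') as f:
    f.write('def test_gradient_for_python_layer(*a, **k):\n    return "stub"\n')

sys.path.insert(0, d)  # in practice: the directory holding the real file
from test_gradient_for_python_layer import test_gradient_for_python_layer

result = test_gradient_for_python_layer()
```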

@vimalthilak

@shaibagon , is there any reason why you hardcoded loss_weight to 2.0? Thanks

@shaibagon
Member Author

@vimalthilak I picked loss_weight=2 in accordance with test_gradient_check_util.hpp.
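The loss weight simply scales both sides of the check: with a top diff of w, the numerical estimate of d(w·L)/dx and a correct backward() output both scale by w, so the check still passes for a layer that multiplies by the top diff (and fails by exactly that factor for one that ignores it). A minimal numpy illustration, not Caffe code:

```python
import numpy as np

def loss(x):
    return 0.5 * np.sum(x ** 2)   # analytic gradient: dL/dx = x

x = np.random.randn(4)
w = 2.0                           # stand-in for the hard-coded loss_weight
eps = 1e-5

# central-difference estimate of d(w * L)/dx
num = np.zeros_like(x)
for i in range(x.size):
    xp, xm = x.copy(), x.copy()
    xp[i] += eps
    xm[i] -= eps
    num[i] = (w * loss(xp) - w * loss(xm)) / (2 * eps)

# a correct backward() multiplies its gradient by the top diff w
analytic = w * x
```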

@williford
Contributor

Is there a reason why cmake/Modules/FindMKL.cmake is in this PR?
