Conversation

bwilbertz
Contributor

When a net is loaded that has already been initialized and contains valid blob data, the LayerSetUp of the scale layer would add the blob for the bias term a second time (resulting in three blobs instead of only two).

This PR adds the blob only if it was not already present.

@jeffdonahue I think the original code of the scale layer was written by you. Could you have a look and check that this fix does not miss any case you had in mind?

Thanks,

Benedikt

@jeffdonahue
Contributor

Thanks for the fix @bwilbertz. Can't this happen for the scale parameter as well, though? It seems like you could just skip all parameter setup whenever this->blobs_.size() > 0 as done in InnerProductLayer, for example. (I could easily be misunderstanding the problem though.)

@bwilbertz
Contributor Author

bwilbertz commented Aug 18, 2016

This is a bit tricky here, because we derive from bottom.size() whether we should learn the scale parameter or receive it externally (via bottom[1], c.f. #3591). Therefore we cannot simply use this->blobs_.size() > 0 as in InnerProductLayer.

Having understood the case bottom.size() == 2 a bit better, I think that my PR only solved the case of a learned scale parameter (bottom.size() == 1), and that we have to change the condition

if ( this->blobs_.size() < 2 )

into

if ( this->blobs_.size() + bottom.size() < 3 )

in order to treat both cases correctly.

If you agree, I would update the PR accordingly.

Regarding the initialization of the scale blob: I think this is OK as it is.

@bwilbertz
Contributor Author

@jeffdonahue I have updated the PR according to my last comment. Have you had a chance to look at it?

Thanks,

Benedikt

    bias_param_id_ = this->blobs_.size();
    this->blobs_.resize(bias_param_id_ + 1);
    this->blobs_[bias_param_id_] = bias_layer_->blobs()[0];
    if ( this->blobs_.size() + bottom.size() < 3 ) {
Contributor


rm leading/trailing whitespace (use if (x) rather than if ( x ))

@jeffdonahue
Contributor

Ok, thanks for the explanation and fix @bwilbertz -- this makes sense now. It seems a bit fragile and difficult to understand, and would ideally have unit tests, but that's unfortunately just kind of the state of things across Caffe right now... Maybe you could add a comment in at least one of the cases, e.g., for the if case, // Bias param not already initialized or a bottom.. Please address the minor whitespace issues to follow our conventions, then I can merge this.

@bwilbertz
Contributor Author

@jeffdonahue PR is updated.

Thanks,

Benedikt

@bwilbertz
Contributor Author

Hi @jeffdonahue. Is there anything I can improve to get this PR merged?

Thanks,
Benedikt

@jeffdonahue jeffdonahue merged commit d208b71 into BVLC:master Sep 20, 2016
@jeffdonahue
Contributor

Nope -- thanks for the fixes, and apologies for the delay!
