Saliency #283


Merged
merged 29 commits into chainer:master on Dec 11, 2018

Conversation

corochann
Member

saliency modules

[Calculator]

  • gradient
  • integrated gradient
  • occlusion

[Visualizer]

  • mol, smiles
  • image
  • table data

[Additional utils]

  • VariableMonitorLinkHook
  • TrainConfigFunctionHook
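
For context, the occlusion calculator idea can be sketched in plain NumPy (function names here are hypothetical illustrations, not this module's API): replace each input feature with a baseline value and record how much the model output changes.

```python
import numpy as np

def occlusion_saliency(predict_fn, x, baseline=0.0):
    # Occlude each feature in turn with `baseline` and measure the
    # change in model output; a larger change means a more salient feature.
    base_out = predict_fn(x)
    saliency = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        occluded = x.astype(float).copy()
        occluded.flat[i] = baseline
        saliency.flat[i] = abs(base_out - predict_fn(occluded))
    return saliency

# Toy linear model: occluding feature i changes the output by |w[i] * x[i]|.
w = np.array([1.0, -2.0, 0.0])
x = np.array([3.0, 1.0, 5.0])
s = occlusion_saliency(lambda v: float(v @ w), x)
```

The gradient and integrated-gradient calculators follow the same calculator interface but use backprop instead of repeated forward passes.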

@corochann
Member Author

corochann commented Dec 7, 2018

TODO:

  • write test
  • add TrainConfigFunctionHook
    --> Decided to skip. Not implemented.
  • write document
  • reference, citation

@codecov-io

Codecov Report

Merging #283 into master will decrease coverage by 6.6%.
The diff coverage is 0%.

@@            Coverage Diff             @@
##           master     #283      +/-   ##
==========================================
- Coverage   81.73%   75.12%   -6.61%     
==========================================
  Files         122      137      +15     
  Lines        6116     6654     +538     
==========================================
  Hits         4999     4999              
- Misses       1117     1655     +538

@codecov-io

codecov-io commented Dec 7, 2018

Codecov Report

Merging #283 into master will increase coverage by 1.51%.
The diff coverage is 92.81%.

@@            Coverage Diff             @@
##           master     #283      +/-   ##
==========================================
+ Coverage   81.73%   83.24%   +1.51%     
==========================================
  Files         122      147      +25     
  Lines        6116     7092     +976     
==========================================
+ Hits         4999     5904     +905     
- Misses       1117     1188      +71

"""

    def __init__(self, target_link, name='VariableMonitorLinkHook',
                 timing='post', extract_fn=None, logger=None):
Member

How about using input and output, instead of pre and post?
It is clear what we would like to get by this class.

Member

So maybe fetch_target is better than timing? How do you feel?

Member Author

Technically, it is actually possible to fetch the input even with timing=post, but that use case is quite limited, so it is also fine to change the name as you suggested.
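
As a sketch of the semantics being discussed (plain Python with hypothetical names, not the actual hook implementation): timing='pre' captures the input of the wrapped call and timing='post' captures its output, which is why a fetch_target-style name could also make sense.

```python
class VariableMonitorHook:
    """Record either the input or the output of a wrapped callable,
    selected by `timing` ('pre' = input, 'post' = output)."""

    def __init__(self, timing='post'):
        assert timing in ('pre', 'post')
        self.timing = timing
        self.result = None

    def wrap(self, fn):
        def wrapped(x):
            if self.timing == 'pre':
                self.result = x   # capture the input before the call
            y = fn(x)
            if self.timing == 'post':
                self.result = y   # capture the output after the call
            return y
        return wrapped

post_hook = VariableMonitorHook(timing='post')
post_hook.wrap(lambda v: v * 2)(3)   # records the output

pre_hook = VariableMonitorHook(timing='pre')
pre_hook.wrap(lambda v: v * 2)(3)    # records the input
```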

return h


if __name__ == '__main__':
Member

Please delete the debug code.

Member Author

Sorry, I should remove this file.

maxv = xp.max(saliency)
minv = xp.min(saliency)
if maxv == minv:
    saliency = xp.zeros_like(saliency)
Member

raise Warning?

Member Author

Actually, this case happened several times with bayesgrad, so maybe debug or info logging is enough.
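
A minimal sketch of the logging-instead-of-raising behavior being discussed (numpy, hypothetical function name):

```python
import logging
import numpy as np

logger = logging.getLogger(__name__)

def normalize_minmax(saliency):
    # Scale saliency to [0, 1]; a constant array (maxv == minv) is
    # mapped to zeros with an info log instead of raising a Warning.
    maxv, minv = saliency.max(), saliency.min()
    if maxv == minv:
        logger.info('saliency is constant; returning zeros')
        return np.zeros_like(saliency)
    return (saliency - minv) / (maxv - minv)
```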

"""
xp = cuda.get_array_module(saliency)
maxv = xp.max(xp.abs(saliency))
if maxv <= 0:
Member

raise warning?

link_hooks = chainer._get_link_hooks()
name = prefix + linkhook.name
if name in link_hooks:
    print('[WARNING] hook {} already exists, overwrite.'.format(name))
Member

logging?

Member Author

done

name = prefix + linkhook.name
link_hooks = chainer._get_link_hooks()
if name not in link_hooks.keys():
    print('[WARNING] linkhook {} is not registered'.format(name))
Member

ditto

Member Author

done

"""
xp = cuda.get_array_module(saliency)
vsum = xp.sum(xp.abs(saliency), axis=axis, keepdims=True)
if vsum <= 0:
Member

ditto



def normalize_scaler(saliency, axis=None):
"""Normalize saliency to be sum=1
Member

Only for saliency whose values are all > 0?

Member Author

Modified to show a warning message.
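
A sketch of the warning behavior the author describes (numpy; details are illustrative, not the merged implementation):

```python
import logging
import numpy as np

logger = logging.getLogger(__name__)

def normalize_scaler(saliency, axis=None):
    # Scale saliency so its absolute values sum to 1 along `axis`.
    # Sum-to-1 normalization is only really meaningful for non-negative
    # saliency, so warn when negative values are present.
    if (saliency < 0).any():
        logger.warning('saliency contains negative values; '
                       'sum normalization may be misleading')
    vsum = np.sum(np.abs(saliency), axis=axis, keepdims=True)
    if (vsum <= 0).any():
        logger.info('saliency sums to zero; returning zeros')
        return np.zeros_like(saliency)
    return saliency / vsum
```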

num_atoms = mol.GetNumAtoms()

# --- type check ---
if not saliency.ndim == 1:
Member

saliency.ndim != 1 ?

Member Author

done

raise ValueError("Unexpected value saliency.shape={}"
                 .format(saliency.shape))

# Cut saliency array for unnecessary tail part
Member

Is this expected behavior? I feel when len(saliency) == num_atoms, raising Warning is better.

Member Author

It is expected: atom_array is often zero-padded by concat_mols, so we need to truncate this padded length.
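
The padding behavior can be illustrated with a small sketch (numpy; the helper name is hypothetical): batching molecules of different sizes zero-pads atom arrays to a common length, so the saliency tail beyond num_atoms carries no information and is cut.

```python
import numpy as np

def truncate_saliency(saliency, num_atoms):
    # Drop the zero-padded tail introduced by batching molecules of
    # different sizes (concat_mols pads atom arrays with zeros).
    if saliency.ndim != 1:
        raise ValueError('Unexpected value saliency.shape={}'
                         .format(saliency.shape))
    return saliency[:num_atoms]

# A 3-atom molecule batched up to length 5: the last 2 entries are padding.
padded = np.array([0.4, 0.1, 0.5, 0.0, 0.0])
s = truncate_saliency(padded, num_atoms=3)
```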

@corochann corochann changed the title [WIP] Saliency Saliency Dec 10, 2018
in_array) for in_array in input_list]

result = [_concat(output) for output in output_list]
if len(result) == 1:
Member

This specification may be confusing, and I think a test is necessary.


# 2. test with `save_filepath=None` runs without error
image = numpy.random.uniform(0, 1, (ch, h, w))
visualizer.visualize(
Member

The visualize method stops unit testing.
Can you insert plt.ion() and plt.close(), like this?

    plt.ion()
    visualizer.visualize(
        saliency, save_filepath=None, feature_names=['hoge', 'huga', 'piyo'],
        num_visualize=2)
    plt.close()

visualizer.visualize(saliency, save_filepath=save_filepath)
assert os.path.exists(save_filepath)
# 2. test with `save_filepath=None` runs without error
visualizer.visualize(
Member

Same as for image_visualizer.

@corochann
Member Author

Updated based on comment!

@mottodora mottodora merged commit 8e1a578 into chainer:master Dec 11, 2018
@corochann corochann deleted the saliency branch December 11, 2018 02:32
@corochann corochann added this to the 0.5.0 milestone Feb 7, 2019
3 participants