
Shearlets #40 (Open)

wants to merge 28 commits into base: master

Changes shown from 1 commit of 28.

Commits:
- bd4ea22: Update unet.py with exact reconstruction (kevinmicha, Apr 26, 2021)
- baa8d4f: Back to previous version (kevinmicha, Apr 26, 2021)
- 4499a95: Merge branch 'zaccharieramzi:master' into master (kevinmicha, May 12, 2021)
- 9d88a3a: modified analysis class to allow fixed filters (kevinmicha, Jun 14, 2021)
- fadd9af: including shearlets and exact_recon parameter (kevinmicha, Jun 14, 2021)
- 774e972: Added extra dot (kevinmicha, Jun 17, 2021)
- ee1c145: adapted tiling part (kevinmicha, Jun 17, 2021)
- 42b7674: fixed indentation for tiling func (kevinmicha, Jun 17, 2021)
- 3b8dc98: Update learning_wavelets/training_scripts/learnlet_training.py (kevinmicha, Jun 17, 2021)
- 3e5652d: Update learning_wavelets/models/learnlet_layers.py (kevinmicha, Jun 17, 2021)
- e8f1649: changed layer name (kevinmicha, Jun 17, 2021)
- 678b328: back to original version (kevinmicha, Jul 23, 2021)
- 0e0eb29: including synthesis filters part (kevinmicha, Jul 23, 2021)
- f7bc4a0: optional import: cadmos (kevinmicha, Jul 23, 2021)
- beedb0b: removed wrong indent (kevinmicha, Jul 23, 2021)
- 6f249cf: used kernel's keywords (kevinmicha, Jul 23, 2021)
- a0589a7: added spaces back (kevinmicha, Jul 23, 2021)
- 3139c2b: added untrainable layers (kevinmicha, Jul 23, 2021)
- 9a923d3: fixed white space (kevinmicha, Jul 23, 2021)
- 8ff2877: not prepared for exact recon True (kevinmicha, Jul 23, 2021)
- 6f8c3f0: changed n_tiling. Doesn't make sense 3 as default (kevinmicha, Jul 23, 2021)
- 5af5fd2: removed last change (kevinmicha, Jul 23, 2021)
- 3001d00: changed some default values for testing matters (kevinmicha, Jul 23, 2021)
- a61f1ef: more def values changed (kevinmicha, Jul 23, 2021)
- 07ee9b1: removing exact recon test from this branch. No sense and tests fail (kevinmicha, Jul 23, 2021)
- 5d928be: exact recon unet test is back (kevinmicha, Jul 23, 2021)
- 19b021a: Removed the part that tests exact recon. No sense (kevinmicha, Jul 23, 2021)
- 7e061ee: added fixed keyword in names (kevinmicha, Jul 23, 2021)
Commit 9d88a3a ("modified analysis class to allow fixed filters"), authored by kevinmicha, Jun 14, 2021.
Full hash: 9d88a3afe5e631359840a29ff2fe5c6563ed7146

File changed: learning_wavelets/models/learnlet_layers.py (22 changes: 19 additions, 3 deletions)
@@ -139,18 +139,32 @@ def __init__(
         if self.tiling_unit_norm:
             constraint = UnitNorm(axis=[0, 1, 2])
         tiling_prefix = 'details_tiling'
-        self.convs_detail_tiling = [
+        self.convs_detail_tiling_fixed = [
             Conv2D(
-                n_tiling,
+                n_tiling//2,
                 self.kernel_size,
                 activation='linear',
                 padding='same',
                 kernel_initializer='glorot_uniform',
                 use_bias=tiling_use_bias,
                 kernel_constraint=constraint,
+                trainable = False,
                 name=f'{tiling_prefix}_{str(K.get_uid(tiling_prefix))}',
Review comment (Owner), suggested change:

-                name=f'{tiling_prefix}_{str(K.get_uid(tiling_prefix))}',
+                name=f'{tiling_prefix}_fixed_{str(K.get_uid(tiling_prefix))}',

Owner: @kevinmicha I don't understand why you resolved this.

kevinmicha (Contributor, Author): That was a mistake. Now really solved.
             ) for i in range(self.n_scales)
         ]
+        self.convs_detail_tiling_train = [
+            Conv2D(
+                n_tiling//2,
+                self.kernel_size,
+                activation='linear',
+                padding='same',
+                kernel_initializer='glorot_uniform',
+                use_bias=tiling_use_bias,
+                kernel_constraint=constraint,
+                name=f'{tiling_prefix}_{str(K.get_uid(tiling_prefix))}',
+            ) for i in range(self.n_scales)
+        ]

         if self.mixing_details:
             mixing_prefix = 'details_mixing'
             self.convs_detail_mixing = [

@@ -172,7 +186,9 @@ def call(self, image):
         wav_coarse = wav_coeffs[-1]
         outputs_list = []
         for i_scale, wav_detail in enumerate(wav_details):
-            details_tiled = self.convs_detail_tiling[i_scale](wav_detail)
+            details_tiled_fixed = self.convs_detail_tiling_fixed[i_scale](wav_detail)
+            details_tiled_train = self.convs_detail_tiling_train[i_scale](wav_detail)
+            details_tiled = Concatenate()([details_tiled_fixed, details_tiled_train])
Review comment (Owner): Do not use the layer Concatenate here: it will create a layer at inference time, which we want to avoid, especially when running in eager mode. I think it's best to use tf.concat.

Owner: Same for below.

kevinmicha (Contributor, Author): Solved.
             if self.mixing_details:
                 details_tiled = self.convs_detail_mixing[i_scale](details_tiled)
             if self.skip_connection:
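The overall pattern this commit introduces, splitting the tiling convolutions into a frozen half (for fixed filters such as shearlets) and a learned half, can be sketched as a self-contained layer. This is a hedged illustration, assuming TensorFlow 2.x; the class name DetailTiling and the default n_tiling=4 are hypothetical, while the Conv2D keywords mirror the diff.

```python
import tensorflow as tf

class DetailTiling(tf.keras.layers.Layer):
    """Tiles detail coefficients with half fixed, half trainable filters."""

    def __init__(self, n_tiling=4, kernel_size=5, **kwargs):
        super().__init__(**kwargs)
        constraint = tf.keras.constraints.UnitNorm(axis=[0, 1, 2])
        # Frozen filters (e.g. pre-set shearlet atoms): trainable=False
        # keeps their kernels out of the gradient updates.
        self.conv_fixed = tf.keras.layers.Conv2D(
            n_tiling // 2, kernel_size, activation='linear', padding='same',
            use_bias=False, kernel_constraint=constraint, trainable=False,
        )
        # Learned filters, optimised as usual during training.
        self.conv_train = tf.keras.layers.Conv2D(
            n_tiling // 2, kernel_size, activation='linear', padding='same',
            use_bias=False, kernel_constraint=constraint,
        )

    def call(self, wav_detail):
        fixed = self.conv_fixed(wav_detail)
        learned = self.conv_train(wav_detail)
        # Plain op rather than a Concatenate layer, per the review comment.
        return tf.concat([fixed, learned], axis=-1)
```

With n_tiling=4 and a single-channel input, the output has 4 channels (2 fixed + 2 learned), and only the trainable convolution contributes weights to the optimiser.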