
Where does inception-resnet-v1 implement shortcuts?

See original GitHub issue

Brilliant work on this! I’ve been working on building a model that implements this in Deeplearning4j. One thing I haven’t been able to figure out is how you’re implementing your residual shortcuts.

For example, this block:

# Inception-Resnet-A
def block35(net, scale=1.0, activation_fn=tf.nn.relu, scope=None, reuse=None):
    """Builds the 35x35 resnet block."""
    with tf.variable_scope(scope, 'Block35', [net], reuse=reuse):
        with tf.variable_scope('Branch_0'):
            tower_conv = slim.conv2d(net, 32, 1, scope='Conv2d_1x1')
        with tf.variable_scope('Branch_1'):
            tower_conv1_0 = slim.conv2d(net, 32, 1, scope='Conv2d_0a_1x1')
            tower_conv1_1 = slim.conv2d(tower_conv1_0, 32, 3, scope='Conv2d_0b_3x3')
        with tf.variable_scope('Branch_2'):
            tower_conv2_0 = slim.conv2d(net, 32, 1, scope='Conv2d_0a_1x1')
            tower_conv2_1 = slim.conv2d(tower_conv2_0, 32, 3, scope='Conv2d_0b_3x3')
            tower_conv2_2 = slim.conv2d(tower_conv2_1, 32, 3, scope='Conv2d_0c_3x3')
        mixed = tf.concat(3, [tower_conv, tower_conv1_1, tower_conv2_2])
        up = slim.conv2d(mixed, net.get_shape()[3], 1, normalizer_fn=None,
                         activation_fn=None, scope='Conv2d_1x1')
        net += scale * up
        if activation_fn:
            net = activation_fn(net)
    return net

How is the actual shortcut implemented? Typically there’s an identity layer or addition somewhere after the concat. I thought perhaps scale * up was it, but it’s not clear to me what that is doing.

Thanks for your insight!
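
For readers who land here with the same question: the line `net += scale * up` is the shortcut itself. The branch outputs are concatenated, projected back to the input's channel depth by the final 1x1 conv, scaled, and added to the unmodified input; there is no separate identity layer. A minimal NumPy sketch of the pattern (the `branch_fn` name and the shapes are illustrative, not the actual model):

```python
import numpy as np

def residual_block(x, branch_fn, scale=1.0):
    """Generic scaled-residual pattern: out = relu(x + scale * F(x)).

    `branch_fn` stands in for the concatenated Inception branches plus
    the 1x1 projection back to x's channel depth (illustrative only).
    """
    up = branch_fn(x)          # branch output, same shape as x
    out = x + scale * up       # the shortcut: plain addition, no extra layer
    return np.maximum(out, 0)  # activation_fn (ReLU)

# Toy usage: an identity branch with scale 0.5, i.e. relu(1.5 * x).
x = np.array([[1.0, -2.0], [3.0, 4.0]])
y = residual_block(x, lambda t: t, scale=0.5)
```

Scaling the residual before the addition (rather than adding it at full strength) is the stabilization trick from the Inception-ResNet paper, which the `scale` argument above mirrors.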

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

1 reaction
davidsandberg commented, Jan 16, 2017
  1. Yes, it’s done using slim’s arg_scope.
  2. If you’re running facenet_train.py there is indeed a parameter embedding_size that does exactly that. But this functionality is kept outside of the inception-resnet-v1 model.
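
For readers unfamiliar with point 1: `slim.arg_scope` installs default keyword arguments (e.g. `normalizer_fn=slim.batch_norm`) for every `slim.conv2d` call made inside its `with` block, which is why no explicit BatchNorm appears in the block code quoted above. A toy re-implementation of the mechanism (this is not slim's actual code, just a sketch of the idea):

```python
import contextlib
import functools

# Per-function default kwargs currently in scope.
_defaults = {}

@contextlib.contextmanager
def arg_scope(func, **kwargs):
    """Toy version of slim.arg_scope: set default kwargs for `func`."""
    old = _defaults.get(func, {})
    _defaults[func] = {**old, **kwargs}
    try:
        yield
    finally:
        _defaults[func] = old

def scoped(func):
    """Decorator that merges the scope's defaults into each call."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        merged = {**_defaults.get(wrapper, {}), **kwargs}
        return func(*args, **merged)
    return wrapper

@scoped
def conv2d(x, filters, normalizer_fn=None):
    # Stand-in for slim.conv2d; just reports which normalizer was applied.
    return (x, filters, normalizer_fn)

with arg_scope(conv2d, normalizer_fn="batch_norm"):
    out = conv2d("input", 32)   # picks up normalizer_fn="batch_norm"
plain = conv2d("input", 32)     # outside the scope: normalizer_fn stays None
```

In the real model the arg_scope is set up once around the whole network, so every convolution gets batch norm without repeating the argument at each call site.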
0 reactions
crockpotveggies commented, Jan 16, 2017

@davidsandberg last questions then I’ll fully understand your architecture 😃 I have indeed read the ResNet paper, although I feel like it lacked much of the same information.

  1. Was there a reason you didn’t implement BatchNorm on your convolution layers? Or is slim doing that automatically?
  2. The logits size depends on the number of identities. However, the beauty of FaceNet was that you could select a specific embeddings size and stick with that. Any reason why you didn’t implement the same elegance in Inception-ResNet? Is that due to constraints of center loss?

Thanks so much for being informative. I’ll probably compile this into a blog post and reference your work.
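
For context on question 2: as the answer above notes, the fixed-size embedding lives outside the backbone. The FaceNet recipe is a fully connected bottleneck that projects the backbone features to `embedding_size`, followed by L2 normalization so embeddings lie on the unit hypersphere. A NumPy sketch under those assumptions (the shapes and names here are illustrative):

```python
import numpy as np

def embed(features, weights):
    """Project backbone features to a fixed embedding size, then
    L2-normalize each row (the FaceNet recipe; illustrative only)."""
    raw = features @ weights                # (batch, feat) @ (feat, embed_dim)
    norm = np.linalg.norm(raw, axis=1, keepdims=True)
    return raw / np.maximum(norm, 1e-12)    # guard against division by zero

rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 1792))      # e.g. pooled backbone output
w = rng.standard_normal((1792, 128))        # embedding_size = 128
emb = rng_out = embed(feats, w)             # each row has unit L2 norm
```

Because the normalization is independent of the logits layer, the embedding size is decoupled from the number of identities, which is the flexibility the question is asking about.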

