latency() got an unexpected keyword argument 'num_outputs' and latency() got multiple values for argument 'num_steps'
See original GitHub issue
- snntorch version: 0.2.11
- Python version: 3.7.10
Description
In the latest updates, I see that the num_outputs parameter has been removed from spikegen, and I have no clue why.
Moreover, even after I remove the num_outputs argument from my call, I get another error from num_steps whenever I set it to any int value.
By the way, the same errors occur if I try rate encoding instead of latency encoding.
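For context, both messages are standard Python TypeErrors: one means the keyword no longer exists in the function's signature, the other means a keyword argument collides with a value already supplied positionally. A minimal sketch of both, using a hypothetical stand-in function rather than the real spikegen API:

```python
def encode(data, num_steps=100):
    """Hypothetical stand-in for a spikegen encoder; like the updated
    snntorch API, it no longer accepts num_outputs."""
    return [data] * num_steps

# Passing a keyword the signature no longer has:
try:
    encode([1, 2, 3], num_outputs=10)
except TypeError as e:
    print(e)  # ... got an unexpected keyword argument 'num_outputs'

# Passing num_steps both positionally and by keyword:
try:
    encode([1, 2, 3], 50, num_steps=50)
except TypeError as e:
    print(e)  # ... got multiple values for argument 'num_steps'
```

The second collision is easy to trigger accidentally after a signature change, since an argument that used to land in one positional slot can shift into another.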
What I Did
Here you will find a quick ipynb file that shows the errors.
Issue Analytics
- State:
- Created 2 years ago
- Comments:5 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Gotcha!
This should work. Note that spikegen.latency is now intended for input features, so including the number of outputs/classes would be meaningless. You can instead pass the targets separately into spikegen.targets_latency.
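To see why num_outputs is irrelevant to the input encoder: latency (time-to-first-spike) coding only needs the input features and the number of time steps — larger values fire earlier. A toy sketch of the idea (not snntorch's actual implementation; the real call is something like spikegen.latency(data, num_steps=...)):

```python
def toy_latency(features, num_steps):
    """Toy time-to-first-spike encoder: larger input values fire earlier.
    Returns a [num_steps x num_features] spike raster. Note it needs no
    notion of output classes -- only the features themselves."""
    spikes = [[0] * len(features) for _ in range(num_steps)]
    for i, x in enumerate(features):
        # Map intensity in [0, 1] to a firing step: 1.0 -> first step,
        # 0.0 -> last step.
        step = round((1.0 - x) * (num_steps - 1))
        spikes[step][i] = 1
    return spikes

spk = toy_latency([1.0, 0.5, 0.0], num_steps=5)
# Brightest feature spikes at step 0, dimmest at the final step.
```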
Hi @Dola47,
Just a reminder that issues are intended for reporting bugs. When in doubt, open a new thread in discussions so that users with the same question can refer to that easily. If it turns out it’s a bug, then we can convert the discussion into an issue!
A 2D target of [batch_size x num_outputs] can be used to apply a separate MSELoss to each output neuron. Such a target represents the 'goal' of the membrane potential of each neuron. The 10 losses are then summed together both over time and over neurons. You can check this notebook out as an example:
https://colab.research.google.com/drive/1j9-ddyadCMyn6N12zjPmu98VAmYF3YIl?usp=sharing
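As a sketch of that loss for a single sample, assuming a membrane-potential trace of shape [num_steps x num_outputs] and one target membrane value per output neuron (plain Python for illustration, not the notebook's actual torch code):

```python
def summed_mse(mem_trace, targets):
    """Squared error between each neuron's membrane potential and its
    target, summed over both time steps and output neurons."""
    total = 0.0
    for mem_t in mem_trace:                 # loop over time steps
        for m, tgt in zip(mem_t, targets):  # loop over output neurons
            total += (m - tgt) ** 2
    return total

# 2 time steps, 3 output neurons; the target pushes the 'correct'
# neuron toward a high membrane value and the others toward a low one.
trace = [[0.2, 0.9, 0.1],
         [0.1, 1.0, 0.0]]
targets = [0.0, 1.0, 0.0]
loss = summed_mse(trace, targets)
```

In practice the same thing is done with torch's MSELoss applied per time step and accumulated, but the arithmetic above is the core of it.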
Another note: I've since added a couple of other neuron types. In my own experiments, Leaky consistently outperforms Synaptic (which used to be Stein) for the right set of parameters. The only difference is that Leaky takes an input, integrates it into the membrane potential (decaying at rate beta), and then emits a spike. Stein/Synaptic take an input, integrate it into a synaptic current (decaying at rate alpha), which is integrated again into the membrane potential (decaying at rate beta), and then emit a spike. The notebook above shows a code sample with Leaky.
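The difference between the two update rules can be sketched as follows (a simplification that ignores the reset mechanism and surrogate-gradient details of the real classes):

```python
def leaky_step(x, mem, beta, threshold=1.0):
    """Leaky: the input feeds straight into the membrane potential,
    which decays at rate beta."""
    mem = beta * mem + x
    spk = 1 if mem > threshold else 0
    return spk, mem

def synaptic_step(x, syn, mem, alpha, beta, threshold=1.0):
    """Synaptic (formerly Stein): the input is first integrated into a
    synaptic current (decay alpha), which then drives the membrane
    potential (decay beta) -- one extra state variable than Leaky."""
    syn = alpha * syn + x
    mem = beta * mem + syn
    spk = 1 if mem > threshold else 0
    return spk, syn, mem

# One step with a constant input of 0.6 and some residual membrane:
spk, mem = leaky_step(0.6, mem=0.5, beta=0.9)
# mem = 0.9 * 0.5 + 0.6 = 1.05 > threshold, so a spike is emitted.
```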
Next question: yeah, you are absolutely right. There is no need for the converted targets for this particular task; you only compare the raw ground-truth targets to the spike count.
There are obviously other tasks that might require the converted targets to measure accuracy, e.g., if you are trying to encourage a neuron to fire at very specific times, or the goal is to achieve a given pattern/evolution of membrane potential.
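For the spike-count case mentioned above, accuracy reduces to taking the output neuron with the highest spike count as the prediction and comparing it against the raw integer labels. A minimal sketch (assuming spike counts have already been summed over time):

```python
def spike_count_accuracy(spike_counts, labels):
    """Predict the class whose output neuron spiked most often,
    then compare against integer ground-truth labels."""
    correct = 0
    for counts, label in zip(spike_counts, labels):
        predicted = counts.index(max(counts))  # winning neuron
        correct += (predicted == label)
    return correct / len(labels)

# Two samples, three output neurons each; neuron 0 wins the first
# sample and neuron 2 wins the second.
acc = spike_count_accuracy([[12, 3, 1], [2, 2, 9]], labels=[0, 2])
# -> 1.0
```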