scattering3d returns a tuple instead of tensor
Each implementation has very different expectations of its inputs and outputs:

- 1D: Expects inputs of size `(B, 1, T)`, outputs `(B, P, T/2**J)`, where `P` is the number of scattering coefficients. Note that if the second dimension of the input is not `1`, it errors.
- 2D: Expects inputs of size `(B, C, M, N)`, outputs `(B, C, P, M/2**J, N/2**J)`. Note that the number of dimensions changes here between input and output (this may be desirable, but it is not consistent with 1D). Also, from what I understand, the second dimension here is essentially treated as a batch dimension.
- 3D: Expects inputs of size `(B, M, N, O)`, outputs a tuple with elements of size `(B, ?, ?, ?)`. Not sure what's going on here.

We should think about harmonizing these to some extent.
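To make the shape conventions above easier to inspect, here is a minimal sketch. It assumes the current `kymatio.torch` frontend (`Scattering1D`, `Scattering2D`, `HarmonicScattering3D` with `J` and `shape` constructor arguments); the 0.1-era code this issue describes may differ in class names, accepted input shapes, and output layout, so the printed shapes are for inspection rather than a specification.

```python
# Shape-inspection sketch; assumes the kymatio.torch frontend.
# Exact constructor arguments and output layouts vary across kymatio
# versions, so treat the commented shapes as approximate.
import torch
from kymatio.torch import Scattering1D, Scattering2D, HarmonicScattering3D

B, T = 4, 1024          # batch size, signal length
M, N, O = 32, 32, 32    # spatial dimensions
J = 3                   # number of scales

# 1D: the 0.1-era API expected (B, 1, T); newer frontends accept (B, T).
x1 = torch.randn(B, T)
print(Scattering1D(J=J, shape=T)(x1).shape)        # roughly (B, P, T / 2**J)

# 2D: (B, M, N) in, with any extra channel axis treated like a batch axis.
x2 = torch.randn(B, M, N)
print(Scattering2D(J=J, shape=(M, N))(x2).shape)   # roughly (B, P, M / 2**J, N / 2**J)

# 3D: (B, M, N, O) in; the 0.1-era solid-harmonic transform returned
# a tuple of tensors instead of a single tensor.
x3 = torch.randn(B, M, N, O)
out = HarmonicScattering3D(J=J, shape=(M, N, O))(x3)
if isinstance(out, tuple):
    print([t.shape for t in out])
else:
    print(out.shape)
```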
Top GitHub Comments
Regarding the relationship between ndims of input and ndims of output, it is true that it would be good to be more consistent. At the beginning I wanted to modify the output of `Scattering1D`, but upon second thought I think it's best to modify the input, so that `(B, T)` maps to `(B, P, T')` just like in the other modules. Then in the future we can think about implementing something more generic, as @janden and @eickenberg suggested:

- `(T,)` -> `(P, T')`
- `(B, T)` -> `(B, P, T')`
- `(B, C, T)` -> `(B, C, P, T')`

and so forth. I can open a PR for that.

FWIW, "local" measures the scattering transform near a single point, much in the same way as the foveal scattering of @AndreuxMath and @beedotkiran, so it makes sense that it does not have spatial coordinates. Likewise for integral. @janden, we might consider implementing integral in 1D in the future, as it runs faster than convolution with phi.
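As an illustration of the "more generic" input handling listed above, one hypothetical approach (not existing kymatio code; `generic_scattering` is a made-up helper name) is to collapse all leading dimensions into a single batch axis, apply a `(B, T)` -> `(B, P, T')` transform, and then restore them:

```python
# Hypothetical helper, not part of kymatio: collapse arbitrary leading
# dimensions into one batch axis, apply a (B, T) -> (B, P, T') transform,
# and restore the leading dimensions afterwards.
import torch

def generic_scattering(scattering, x):
    """Apply `scattering` (a (B, T) -> (B, P, T') callable) to an input
    of shape (..., T) and return an output of shape (..., P, T')."""
    leading, T = x.shape[:-1], x.shape[-1]   # leading may be (), (B,), (B, C), ...
    out = scattering(x.reshape(-1, T))       # (prod(leading), P, T')
    return out.reshape(*leading, *out.shape[1:])

# Usage (shapes are illustrative):
#   x = torch.randn(8, 3, 1024)              # (B, C, T)
#   Sx = generic_scattering(S1, x)           # (B, C, P, T')
```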
@gexarcha I think that there are two issues here. One is how to interface local and integral with standard. The other is the output type: tuple vs. tensor. The former can be postponed to `v0.2`, but the output type issue needs to be resolved before we release `v0.1.0-alpha`.
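On the tuple-vs-tensor question, one possible resolution (a sketch only; whether it applies depends on the actual shapes the 3D transform produces) is to concatenate the per-order coefficient tensors along a feature axis whenever they share their remaining dimensions:

```python
# Sketch of one way to normalize a tuple output into a single tensor:
# concatenate along a feature axis, assuming every element is shaped
# (B, P_i, ...) with identical trailing dimensions.
import torch

def as_single_tensor(out, dim=1):
    """Collapse a tuple/list of (B, P_i, ...) tensors into (B, sum(P_i), ...)."""
    if isinstance(out, (tuple, list)):
        return torch.cat(list(out), dim=dim)
    return out  # already a single tensor
```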