pre_transform does not work as expected
I want to run an expensive transform that modifies data on the vertices, so I thought I should pass it via the pre_transform key – it should run the transform on every graph in the dataset once and cache the processed dataset.
But it does not work as expected. For example, take the OneHotDegree
transform. Note that I delete the dataset folder before every run of the following snippets:
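For reproducibility, deleting the cached folder can be scripted with a small helper. This is a sketch assuming the TUDataset layout root/<name>/{raw, processed}; adjust the path for your setup.

```python
import os
import shutil

def reset_dataset(root, name):
    """Delete a cached dataset folder so that processing (and any
    pre_transform) runs again on the next instantiation.

    Assumes the layout root/<name>/{raw, processed}; adjust for your setup.
    """
    path = os.path.join(root, name)
    if os.path.isdir(path):
        shutil.rmtree(path)
```

Deleting only the processed/ subfolder instead would keep the downloaded raw files while still forcing the pre_transform to rerun.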
No transform
dataset = TUDataset(root="../data/", name="MUTAG")
dataset[0].x
tensor([[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0.],
[0., 1., 0., 0., 0., 0., 0.],
[0., 0., 1., 0., 0., 0., 0.],
[0., 0., 1., 0., 0., 0., 0.]])
Transform
dataset = TUDataset(root="../data/", name="MUTAG", transform=OneHotDegree(5))
dataset[0].x
tensor([[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
[1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0., 0.],
[0., 1., 0., 0., 0., 0., 0., 0., 0., 0., 1., 0., 0.],
[0., 0., 1., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0.],
[0., 0., 1., 0., 0., 0., 0., 0., 1., 0., 0., 0., 0.]])
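For context, here is a minimal pure-Python sketch of what OneHotDegree(max_degree) conceptually computes: a one-hot encoding of each node's degree appended to the existing features (in case 2, the 7 original label columns plus max_degree + 1 = 6 degree columns give the 13 columns above). The function name and list-based representation are illustrative, not the torch_geometric implementation.

```python
def one_hot_degree(x, edge_index, max_degree):
    """Append a one-hot degree encoding to each node's feature row.

    x          -- list of feature rows (one list of floats per node)
    edge_index -- list of (src, dst) pairs; degree counts each node's
                  occurrences as a source, which equals its in-degree for
                  undirected graphs stored as pairs of directed edges
    """
    deg = [0] * len(x)
    for src, _dst in edge_index:
        deg[src] += 1
    return [
        row + [1.0 if i == d else 0.0 for i in range(max_degree + 1)]
        for row, d in zip(x, deg)
    ]
```

For example, a node with degree 3 and max_degree=5 gets [0, 0, 0, 1, 0, 0] appended to its feature row.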
Pre-transform
dataset = TUDataset(root="../data/", name="MUTAG", pre_transform=OneHotDegree(5))
dataset[0].x
tensor([[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 0., 1., 0., 0.],
[0., 0., 0., 1., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 0., 1., 0., 0.],
[0., 0., 0., 1., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 0., 1., 0., 0.],
[0., 0., 1., 0., 0., 0.],
[0., 0., 0., 1., 0., 0.],
[0., 1., 0., 0., 0., 0.],
[0., 1., 0., 0., 0., 0.]])
I expected pre_transform to produce the same result as the transform (case 2). Is this a bug or a feature?
How should I apply expensive transforms and get the data modified as in case 2?
Issue Analytics
- Created: a year ago
- Comments: 5 (1 by maintainers)
Top GitHub Comments
This will be fixed in https://github.com/pyg-team/pytorch_geometric/pull/4669. It was caused by the weird interplay of TUDataset and the detection of “categorical” features induced by the use_node_attr argument.
Moreover, as a pre-transform it accepts only a vector of all ones, truncating the matrix of all ones to a single column.
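Until the fix lands, a library-agnostic way to pay the transform cost only once is to cache the transformed graphs yourself. This is a sketch: the function name and pickle-based storage are illustrative (for tensor data, torch.save/torch.load would be the more idiomatic choice).

```python
import os
import pickle

def cached_transform(items, transform, cache_path):
    """Apply an expensive transform to every item once and cache the
    resulting list on disk; later calls just reload the cache."""
    if os.path.exists(cache_path):
        with open(cache_path, "rb") as f:
            return pickle.load(f)
    out = [transform(item) for item in items]
    with open(cache_path, "wb") as f:
        pickle.dump(out, f)
    return out
```

Unlike the buggy pre_transform path, this keeps the transformed features exactly as the transform produced them, since no dataset-specific post-processing touches the cached result.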