[Optimization] Applying passes multiple times results in a shape-check error
See original GitHub issue
This is a reduced case from a real model, adapted from test_fold_scale_axis.py:
import nnvm
import nnvm.testing.resnet
import numpy as np
from nnvm import symbol as sym
from nnvm.compiler import graph_util, graph_attr

def test_fold_axis_conv():
    def before(x, conv_weight, conv_bias, in_scale, out_scale, channels):
        x = x * sym.expand_dims(in_scale, axis=1, num_newaxis=2)
        y = sym.conv2d(x, conv_weight, conv_bias,
                       channels=channels,
                       kernel_size=(3, 3),
                       padding=(1, 1),
                       groups=54,
                       name="conv")
        y = sym.relu(y)
        y = y * sym.expand_dims(out_scale, axis=1, num_newaxis=2)
        return y

    def expected(x, conv_weight, conv_bias, in_scale, out_scale, channels):
        conv_weight = conv_weight * sym.expand_dims(out_scale, axis=1, num_newaxis=3)
        conv_weight = conv_weight * sym.expand_dims(in_scale, axis=1, num_newaxis=2)
        conv_bias = conv_bias * out_scale
        y = sym.conv2d(x,
                       conv_weight,
                       conv_bias,
                       channels=channels,
                       kernel_size=(3, 3),
                       padding=(1, 1),
                       groups=54,
                       name="conv")
        y = sym.relu(y)
        return y

    # Before simplify
    def check(shape, channels):
        x = sym.Variable("x") + 1
        weight = sym.Variable("weight", shape=(54, 1, 3, 3))
        bias = sym.Variable("bias", shape=(54,))
        in_scale = sym.Variable("in_scale")
        out_scale = sym.Variable("out_scale")
        y1 = before(x, weight, bias, in_scale, out_scale, channels)
        y2 = expected(x, weight, bias, in_scale, out_scale, channels)
        ishape = {"x": shape, "out_scale": (channels,), "in_scale": (shape[1],)}
        g1 = nnvm.graph.create(y1)
        g2 = nnvm.graph.create(y2)
        graph_attr.set_shape_inputs(g1, ishape)
        g1 = g1.apply("InferShape").apply("FoldScaleAxis")
        # The second time
        graph_attr.set_shape_inputs(g1, ishape)
        g1 = g1.apply("InferShape").apply("FoldScaleAxis")

    check((1, 54, 63, 27), 54)

test_fold_axis_conv()
The problem arises the second time the passes are applied, which fails with:
Traceback (most recent call last):
  File "/Users/blue/Documents/dev/tmp/tvm/nnvm/python/nnvm/_base.py", line 75, in check_call
    raise NNVMError(py_str(_LIB.NNGetLastError()))
nnvm._base.NNVMError: [19:28:33] /Users/blue/Documents/dev/tmp/tvm/nnvm/src/top/nn/convolution.cc:91: Operator conv2d(padding=(1, 1), kernel_size=(3, 3), channels=54, name=conv) expects weight's shape to be [54,54,3,3], but got [54,1,3,3].
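For context, the shape the checker demands follows the usual conv2d weight convention: (channels, in_channels // groups, kH, kW). Note that the operator attributes printed in the error above no longer include groups, so the checker falls back to groups=1 and demands a dense weight. A minimal sketch of this arithmetic (the helper name is hypothetical, not part of nnvm):

```python
# Hypothetical helper mirroring the weight-shape convention checked in
# nnvm/src/top/nn/convolution.cc: each group convolves over
# in_channels // groups input channels.
def expected_weight_shape(channels, in_channels, kernel_size, groups=1):
    kh, kw = kernel_size
    return (channels, in_channels // groups, kh, kw)

# Depthwise conv from the repro: 54 input channels, 54 groups.
print(expected_weight_shape(54, 54, (3, 3), groups=54))  # (54, 1, 3, 3)

# If the groups attribute is dropped (defaulting to groups=1), the checker
# demands a dense weight, which matches the error message above.
print(expected_weight_shape(54, 54, (3, 3)))  # (54, 54, 3, 3)
```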
Applying the optimization a second time is the same as applying it to g2, the already-folded graph.
Issue Analytics
- Created: 5 years ago
- Comments: 7 (7 by maintainers)
Top GitHub Comments
@merrymercy I am going through the open-source approval process at my company; could you wait for a while? Sorry for the inconvenience.
The reason is that FoldScaleAxis does not currently support convolution with groups (including depthwise conv2d). The fix can be done by modifying ScaleAxisForward for conv2d. Closing this issue.
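To illustrate why grouped convolution needs a special case in the fold: for a dense conv, a per-input-channel scale folds into weight axis 1 (the in-channel axis), but for depthwise conv (groups == in_channels) each output channel reads exactly one input channel, so the same scale must fold along weight axis 0 instead. A plain-NumPy sketch (not nnvm code; `depthwise_conv` is a hypothetical reference implementation):

```python
import numpy as np

def depthwise_conv(x, w):
    # x: (C, H, W); w: (C, 1, kH, kW); 'valid' convolution per channel.
    C, H, W = x.shape
    kh, kw = w.shape[2], w.shape[3]
    out = np.zeros((C, H - kh + 1, W - kw + 1))
    for c in range(C):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[c, i, j] = np.sum(x[c, i:i + kh, j:j + kw] * w[c, 0])
    return out

rng = np.random.RandomState(0)
x = rng.randn(4, 6, 6)
w = rng.randn(4, 1, 3, 3)
s = rng.randn(4)  # per-input-channel scale, like in_scale in the repro

# Scaling the input channels is equivalent to scaling the weight along
# axis 0 (the output-channel axis), NOT axis 1 as in the dense case.
lhs = depthwise_conv(x * s[:, None, None], w)
rhs = depthwise_conv(x, w * s[:, None, None, None])
assert np.allclose(lhs, rhs)
```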