
Assign requires shapes of both tensors to match. lhs shape= [2048,40] rhs shape= [2048,84]

See original GitHub issue

I am trying to run the demo on another dataset, named “vedai”, which has 9 classes. When I run the demo it doesn’t show the pictures, and when I run it with the new data it fails with the following error:

Assign requires shapes of both tensors to match. lhs shape= [2048,40] rhs shape= [2048,84]

My dataset has 9 classes (9 + 1 for background = 10, and 10 * 4 = 40). I think I should change a 21 to a 9 somewhere, but I don’t know where.
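For reference, here is a minimal sketch for checking which class count the checkpoint was trained with (assuming TensorFlow 1.x; the path below is illustrative and should point at whatever checkpoint the demo restores):

import tensorflow as tf

# Illustrative checkpoint path; adjust it to the one the demo actually loads.
ckpt = 'output/res101/voc_2007_trainval+voc_2012_trainval/default/res101_faster_rcnn_iter_110000.ckpt'

reader = tf.train.NewCheckpointReader(ckpt)
for name, shape in sorted(reader.get_variable_to_shape_map().items()):
    # The detection heads encode the class count: cls_score ends in
    # num_classes and bbox_pred in 4 * num_classes.
    if 'cls_score' in name or 'bbox_pred' in name:
        print(name, shape)

A bbox_pred weight of [2048, 84] would mean the checkpoint was trained with 84 / 4 = 21 classes (Pascal VOC’s 20 classes plus background), while the graph built for vedai expects 10 * 4 = 40.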

This is my code for the demo:

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import _init_paths
from model.config import cfg
from model.test import im_detect
from model.nms_wrapper import nms

from utils.timer import Timer
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np
import os, cv2
import argparse

from nets.vgg16 import vgg16
from nets.resnet_v1 import resnetv1

CLASSES = ('__background__',
           'car', 'pickup', 'tractor', 'boat',
           'truck', 'campingcar', 'van', 'other', 'plane')

NETS = {'vgg16': ('vgg16_faster_rcnn_iter_5000.ckpt',),'res101': ('res101_faster_rcnn_iter_110000.ckpt',)}
DATASETS= {'pascal_voc': ('voc_2007_trainval',),'pascal_voc_0712': ('voc_2007_trainval+voc_2012_trainval',)}

def vis_detections(im, class_name, dets, ax, thresh=0.5):
    """Draw detected bounding boxes."""
    inds = np.where(dets[:, -1] >= thresh)[0]
    if len(inds) == 0:
        return

    for i in inds:
        bbox = dets[i, :4]
        score = dets[i, -1]

        ax.add_patch(
            plt.Rectangle((bbox[0], bbox[1]),
                          bbox[2] - bbox[0],
                          bbox[3] - bbox[1], fill=False,
                          edgecolor='red', linewidth=3.5)
            )
        ax.text(bbox[0], bbox[1] - 2,
                '{:s} {:.3f}'.format(class_name, score),
                bbox=dict(facecolor='blue', alpha=0.5),
                fontsize=14, color='white')

    ax.set_title(('{} detections with '
                  'p({} | box) >= {:.1f}').format(class_name, class_name,
                                                  thresh),
                  fontsize=14)
    plt.axis('off')
    plt.tight_layout()
    plt.draw()

def vedai(sess, net, image_name):
    """Detect object classes in an image using pre-computed object proposals."""

    # Load the vedai image
    im_file = os.path.join(cfg.DATA_DIR, 'vedai', image_name)
    im = cv2.imread(im_file)

    # Detect all object classes and regress object bounds
    timer = Timer()
    timer.tic()
    scores, boxes = im_detect(sess, net, im)
    timer.toc()
    print('Detection took {:.3f}s for {:d} object proposals'.format(timer.total_time, boxes.shape[0]))

    # Visualize detections for each class
    im = im[:, :, (2, 1, 0)]
    fig, ax = plt.subplots(figsize=(12, 12))
    ax.imshow(im, aspect='equal')

    CONF_THRESH = 0.8
    NMS_THRESH = 0.3
    for cls_ind, cls in enumerate(CLASSES[1:]):
        cls_ind += 1 # because we skipped background
        cls_boxes = boxes[:, 4*cls_ind:4*(cls_ind + 1)]
        cls_scores = scores[:, cls_ind]
        dets = np.hstack((cls_boxes,
                          cls_scores[:, np.newaxis])).astype(np.float32)
        keep = nms(dets, NMS_THRESH)
        dets = dets[keep, :]
        vis_detections(im, cls, dets, ax, thresh=CONF_THRESH)

def parse_args():
    """Parse input arguments."""
    parser = argparse.ArgumentParser(description='Tensorflow Faster R-CNN vedai_test')
    parser.add_argument('--net', dest='vedai_net', help='Network to use [vgg16 res101]',
                        choices=NETS.keys(), default='res101')
    parser.add_argument('--dataset', dest='dataset', help='Trained dataset [pascal_voc pascal_voc_0712]',
                        choices=DATASETS.keys(), default='pascal_voc_0712')
    args = parser.parse_args()

    return args

if __name__ == '__main__':
    cfg.TEST.HAS_RPN = True  # Use RPN for proposals
    args = parse_args()

    # model path
    vedainet = args.vedai_net
    dataset = args.dataset
    tfmodel = os.path.join('output', vedainet, DATASETS[dataset][0], 'default',
                              NETS[vedainet][0])


    if not os.path.isfile(tfmodel + '.meta'):
        raise IOError(('{:s} not found.\nDid you download the proper networks from '
                       'our server and place them properly?').format(tfmodel + '.meta'))

    # set config
    tfconfig = tf.ConfigProto(allow_soft_placement=True)
    tfconfig.gpu_options.allow_growth=True

    # init session
    sess = tf.Session(config=tfconfig)
    # load network
    if vedainet == 'vgg16':
        net = vgg16(batch_size=1)
    elif vedainet == 'res101':
        net = resnetv1(batch_size=1, num_layers=101)
    else:
        raise NotImplementedError
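    # The third positional argument to create_architecture() is the number of
    # classes, background included; it has to match both len(CLASSES) and the
    # class count of the checkpoint restored below, otherwise tf.train.Saver
    # raises the "Assign requires shapes of both tensors to match" error.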
    net.create_architecture(sess, "TEST", 10,
                          tag='default', anchor_scales=[8, 16, 32])
    saver = tf.train.Saver()
    saver.restore(sess, tfmodel)

    print('Loaded network {:s}'.format(tfmodel))

    im_names = ['00000024_co.png', '00000028_co.png',
                '00000034_co.png', '00000039_co.png',
                '00000341_co.png', '00000365_co.png',
                '00001233.png', '00001237.png']

    for im_name in im_names:
        print('~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~')
        print('vedai for data/vedai/{}'.format(im_name))
        vedai(sess, net, im_name)

    plt.show()

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 6 (2 by maintainers)

Top GitHub Comments

1 reaction
heyan4869 commented, Jun 13, 2018

@zqdeepbluesky I just had this issue as well; mine was because I didn’t change the number-of-classes argument in net.create_architecture(). In the code above, net.create_architecture(sess, "TEST", 10, tag='default', anchor_scales=[8, 16, 32]) should be net.create_architecture(sess, "TEST", len(CLASSES), tag='default', anchor_scales=[8, 16, 32]).
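As a follow-up sketch (not from the original thread; it reuses the tf, sess, tfmodel, and CLASSES names from the script above): tying the class count to CLASSES only helps if the restored checkpoint was trained with the same count. Here 84 = 21 * 4, which points at a 21-class Pascal VOC model. If you intentionally start from such a checkpoint, one option is to restore only the shape-compatible variables and re-train the detection heads:

# Restore only variables whose shapes match the checkpoint; the mismatched
# cls_score / bbox_pred heads keep their initial values and must be
# re-trained for the new class count.
reader = tf.train.NewCheckpointReader(tfmodel)
ckpt_shapes = reader.get_variable_to_shape_map()
restorable = [v for v in tf.global_variables()
              if v.op.name in ckpt_shapes
              and v.get_shape().as_list() == ckpt_shapes[v.op.name]]
saver = tf.train.Saver(restorable)
saver.restore(sess, tfmodel)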

0 reactions
zqdeepbluesky commented, Jun 10, 2018

@hadi-ghnd Hi, I met the same error. Can you tell me how to fix it? Thanks so much.

Read more comments on GitHub >

Top Results From Across the Web

Tensorflow Assign requires shapes of both tensors to match ...
I am not sure where the RHS parameter is coming from. I’ve looked at all config files and there doesn’t seem to be ...

TensorFlow error: Assign requires shapes of both tensors to ...
When running a TensorFlow job, you sometimes hit this error: Tensorflow Assign requires shapes of both tensors to match. lhs shape= [20] rhs shape= [48].

tflearn error: Assign requires shapes of both tensors to ...
Assign requires shapes of both tensors to match. lhs shape= [32,2] rhs shape= [32,32]. The cause of the error is simple: the network structure was modified.

faster-rcnn error message: tensorflow.python.framework ... - 博客园
... requires shapes of both tensors to match. lhs shape= [21] rhs shape= [2]. Traceback (most recent call last): File ...

'shape' Dialect - MLIR - LLVM
concat (::mlir::shape::ConcatOp). Concatenates two shapes. Syntax: operation ::= `shape.concat` $lhs `,` $rhs attr-dict `:` type($ ...
