Description
Prerequisite
- I have searched Issues and Discussions but cannot get the expected help.
- I have read the FAQ documentation but cannot get the expected help.
- The bug has not been fixed in the latest version (master) or in the latest released version (1.x).
Task
I'm using the official example scripts/configs for the officially supported tasks/models/datasets.
Branch
master branch https://github.com/open-mmlab/mmrotate
Environment
I tried to use dict(type='RRandomCrop') in the train_pipeline list of my configuration .py file, but a bounding-box shape-mismatch error occurs whenever only one valid bounding box remains in the cropped image.
Error message:
File "/home/yechani9/PycharmProjects/AMOD-ExpKit/mmdetection/mmdet/core/bbox/assigners/max_iou_assigner.py", line 111, in assign
overlaps = self.iou_calculator(gt_bboxes, bboxes)
File "/home/yechani9/PycharmProjects/AMOD-ExpKit/mmdetection/mmdet/core/bbox/iou_calculators/iou2d_calculator.py", line 65, in __call__
return bbox_overlaps(bboxes1, bboxes2, mode, is_aligned)
File "/home/yechani9/PycharmProjects/AMOD-ExpKit/mmdetection/mmdet/core/bbox/iou_calculators/iou2d_calculator.py", line 198, in bbox_overlaps
assert bboxes1.shape[:-2] == bboxes2.shape[:-2], f'{bboxes1.shape} != {bboxes2.shape}'
AssertionError: torch.Size([1, 1, 4]) != torch.Size([142110, 4])
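The failing assertion only compares the leading (batch) dimensions of the two box tensors. The sketch below reproduces that check in isolation (it is not the mmdet source, just the same shape comparison): the ground-truth boxes arrive as a 3-D tensor of shape (1, 1, 4) while the priors are 2-D, so the leading dimensions (1,) and () disagree.

```python
def leading_dims_match(shape1, shape2):
    # Mirrors the failing assertion in mmdet's bbox_overlaps:
    #   bboxes1.shape[:-2] == bboxes2.shape[:-2]
    # i.e. everything before the last two dims must be identical.
    return shape1[:-2] == shape2[:-2]

gt_shape = (1, 1, 4)       # single GT box with a spurious leading dim
prior_shape = (142110, 4)  # priors/anchors: (num_priors, 4)

print(leading_dims_match(gt_shape, prior_shape))  # False: (1,) != ()
print(leading_dims_match((1, 4), prior_shape))    # True once the extra dim is dropped
```

This suggests the crop step is emitting gt_bboxes with an extra leading dimension when only one box survives, rather than the expected 2-D (num_boxes, box_dim) array.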
I cannot find any usage examples of RRandomCrop online. How do I use RRandomCrop properly?
Reproduces the problem - code sample
Case 1)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='RResize',
         img_scale=[(1536, 1152), (2340, 1728)],  # 0.8x - 1.2x (1x: 1920x1440)
         multiscale_mode='range'),
    dict(type='RRandomFlip',
         flip_ratio=[0.25, 0.25, 0.25],
         direction=['horizontal', 'vertical', 'diagonal'],
         version=angle_version),
    dict(type='Normalize', **img_norm_cfg),
    dict(type='RRandomCrop', crop_size=(800, 800), allow_negative_crop=False,
         crop_type='absolute', version=angle_version),
    dict(type='Pad', size_divisor=32),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
Case 2)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='RRandomCrop', crop_size=(800, 800), allow_negative_crop=False,
         crop_type='absolute', version=angle_version),
    dict(type='RResize',
         img_scale=[(800, 800)]),
    dict(type='RRandomFlip',
         flip_ratio=[0.25, 0.25, 0.25],
         direction=['horizontal', 'vertical', 'diagonal'],
         version=angle_version),
    dict(type='Normalize', **img_norm_cfg),
    dict(type='Pad', size_divisor=32),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
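As a diagnostic workaround while the root cause is unclear, one could insert a small guard step into the pipeline after the crop. The class below is hypothetical (EnsureBboxNdim is not part of mmrotate or mmdetection); it simply forces gt_bboxes back into the 2-D (num_boxes, box_dim) layout that downstream assigners expect, assuming mmrotate's 5-parameter (cx, cy, w, h, angle) box format.

```python
import numpy as np

class EnsureBboxNdim:
    """Hypothetical pipeline step (not part of mmrotate) that guards against
    gt_bboxes collapsing to an unexpected shape when a crop leaves one box."""

    def __call__(self, results):
        bboxes = np.asarray(results['gt_bboxes'], dtype=np.float32)
        # Force a 2-D (num_boxes, box_dim) layout, even for a single box
        # or an array that picked up a spurious leading dimension.
        results['gt_bboxes'] = bboxes.reshape(-1, bboxes.shape[-1])
        return results

# Usage sketch: a single rotated box that arrived with an extra leading dim.
results = {'gt_bboxes': np.zeros((1, 1, 5), dtype=np.float32)}
fixed = EnsureBboxNdim()(results)
print(fixed['gt_bboxes'].shape)  # (1, 5)
```

In a real config this would need to be registered with the PIPELINES registry before it could be referenced via dict(type='EnsureBboxNdim'); the sketch only demonstrates the reshape itself.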
Reproduces the problem - command or script
No response
Reproduces the problem - error message
Same traceback as in the description above.
Additional information
No response