Research on a Pedestrian Detection Method for Dense Scenes Based on Improved YOLOv5
DOI:
Author:
Affiliation:

1. School of Electrical Engineering and Automation, Tianjin University of Technology, Tianjin 300384; 2. Tianjin Fulaidi Technology Development Co., Ltd., Tianjin 300385

Author biography:

Corresponding author:

CLC number:

TP181

Fund project:


Abstract:

Pedestrian detection accuracy in dense scenes remains low in current research. To improve detection accuracy, this paper proposes V-YOLO, an improved method based on the YOLOv5 network. A weighted-fusion Bi-directional Feature Pyramid Network (BiFPN) replaces the Path Aggregation Network (PANet) in the original network, strengthening multi-scale feature fusion and improving the detection of pedestrian targets. To retain more feature information and improve the feature extraction capability of the backbone network, a residual structure, VBlock, is added. A Selective Kernel Networks (SKNet) attention mechanism is introduced to dynamically fuse feature maps from different receptive fields, improving the utilization of different pedestrian features. The CrowdHuman dataset is used for training and testing. Experimental results show that, compared with the original network, the proposed algorithm improves precision, recall, and mean average precision by 1.8%, 2.3%, and 2.6%, respectively, verifying that it effectively improves pedestrian detection accuracy in dense scenes.

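The BiFPN component mentioned in the abstract combines feature maps of different scales using learnable, normalized non-negative weights. The following is a minimal sketch of that fast normalized fusion step, assuming the standard formulation from the EfficientDet/BiFPN literature rather than the authors' exact implementation; the function name and shapes are illustrative.

```python
import numpy as np

def weighted_fusion(features, weights, eps=1e-4):
    """BiFPN-style fast normalized fusion (illustrative sketch).

    Each input feature map is scaled by a learnable weight; ReLU keeps
    the weights non-negative, and normalization makes them sum to ~1,
    so the output is a convex combination of the inputs.
    """
    w = np.maximum(np.asarray(weights, dtype=np.float64), 0.0)  # ReLU on weights
    w = w / (w.sum() + eps)                                     # normalize to ~1
    # Assumes all feature maps were already resized to a common resolution.
    return sum(wi * f for wi, f in zip(w, features))
```

In a full network the weights are trained per fusion node, and each fused map is passed through a convolution; here the weights are fixed inputs only to show the normalization behavior.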

History
  • Received: 2022-12-29
  • Revised: 2023-03-02
  • Accepted: 2023-03-07
  • Available online:
  • Published: