Research on pedestrian detection method in dense scenes based on improved YOLOv5

DOI:
CSTR:
Author:
Affiliation:
Author biography:
Corresponding author:
CLC number: TP181
Fund project:

Abstract:

In current research, pedestrian detection accuracy in dense scenes is low. To improve detection accuracy, an improved method based on the YOLOv5 network, V-YOLO, is proposed. A weighted bi-directional feature pyramid network (BiFPN) replaces the path aggregation network (PANet) in the original network to strengthen multi-scale feature fusion and improve the detection of pedestrian targets. To retain more feature information and improve the feature extraction capability of the backbone, a residual structure, VBlock, is added, and a select kernel networks (SKNet) attention mechanism is introduced to dynamically fuse feature maps from different receptive fields and make better use of different pedestrian features. The CrowdHuman dataset is used for training and testing. Experimental results show that, compared with the original network, the precision, recall, and average precision of the proposed algorithm increase by 1.8%, 2.3%, and 2.6%, respectively, verifying that the proposed algorithm effectively improves the accuracy of pedestrian detection in dense scenes.
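For reference, the weighted fusion that BiFPN performs at each merge point can be illustrated with the fast normalized fusion rule from the published BiFPN design. The PyTorch sketch below is a minimal illustration under assumed shapes, not the authors' V-YOLO code; the module name `WeightedFusion` and the channel sizes are my own choices.

```python
import torch
import torch.nn as nn


class WeightedFusion(nn.Module):
    """Fast normalized fusion of N same-shape feature maps (BiFPN-style).

    out = sum_i(w_i * x_i) / (sum_j w_j + eps), with the learned weights
    kept non-negative. Minimal sketch, not the paper's implementation.
    """

    def __init__(self, num_inputs: int = 2, eps: float = 1e-4):
        super().__init__()
        # One learnable scalar weight per input branch.
        self.weights = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, inputs):
        # ReLU keeps the fusion weights non-negative so the normalization
        # yields a convex-like combination of the input feature maps.
        w = torch.relu(self.weights)
        w = w / (w.sum() + self.eps)
        return sum(wi * x for wi, x in zip(w, inputs))


if __name__ == "__main__":
    # Two feature maps of the same shape, e.g. a top-down and a lateral path.
    p_td = torch.randn(1, 256, 40, 40)
    p_lat = torch.randn(1, 256, 40, 40)
    fuse = WeightedFusion(num_inputs=2)
    print(fuse([p_td, p_lat]).shape)  # torch.Size([1, 256, 40, 40])
```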

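The abstract also adds a residual structure, VBlock, to the backbone. Its internal layout is not described on this page, so the sketch below only illustrates the generic residual pattern y = x + F(x) that such a block builds on; the conv-BN-SiLU stack inside `ResidualUnit` is an assumption, not the authors' VBlock.

```python
import torch
import torch.nn as nn


class ResidualUnit(nn.Module):
    """Generic residual connection y = x + F(x).

    Illustrative only: the internal design of the paper's VBlock is not
    given here, so F is a plain conv-BN-SiLU stack.
    """

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels), nn.SiLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels))
        self.act = nn.SiLU(inplace=True)

    def forward(self, x):
        # The skip connection passes features through unchanged, which
        # helps retain low-level detail alongside the transformed path.
        return self.act(x + self.body(x))


if __name__ == "__main__":
    x = torch.randn(1, 128, 80, 80)
    print(ResidualUnit(128)(x).shape)  # torch.Size([1, 128, 80, 80])
```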
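The SKNet selection step the abstract refers to can be sketched as follows: two convolution branches with different receptive fields are summed, a channel descriptor is computed by global average pooling, and a softmax across the branches re-weights each branch per channel. This is a minimal two-branch SK block under an assumed reduction ratio and branch design, not the exact module used in V-YOLO.

```python
import torch
import torch.nn as nn


class SKBlock(nn.Module):
    """Minimal two-branch Selective Kernel (SK) attention block.

    Split: 3x3 and dilated 3x3 (5x5 receptive field) branches.
    Fuse:  sum the branches, global average pool, reduce to a vector z.
    Select: per-branch channel logits, softmax across branches, weighted sum.
    Sketch only; reduction ratio and branch design are assumptions.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        mid = max(channels // reduction, 32)
        self.branch3 = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.branch5 = nn.Sequential(  # dilated 3x3 ~ 5x5 receptive field
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, bias=False),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True))
        self.reduce = nn.Sequential(
            nn.Linear(channels, mid), nn.ReLU(inplace=True))
        self.select = nn.Linear(mid, channels * 2)  # logits for 2 branches

    def forward(self, x):
        u3, u5 = self.branch3(x), self.branch5(x)
        u = u3 + u5                                  # fuse both branches
        z = self.reduce(u.mean(dim=(2, 3)))          # (B, mid) descriptor
        logits = self.select(z).view(-1, 2, u.size(1))
        attn = torch.softmax(logits, dim=1)          # softmax across branches
        a3 = attn[:, 0].unsqueeze(-1).unsqueeze(-1)  # (B, C, 1, 1)
        a5 = attn[:, 1].unsqueeze(-1).unsqueeze(-1)
        return a3 * u3 + a5 * u5


if __name__ == "__main__":
    x = torch.randn(1, 256, 40, 40)
    print(SKBlock(256)(x).shape)  # torch.Size([1, 256, 40, 40])
```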
Cite this article:

高强, 唐福兴, 李栋, 吉月辉, 刘俊杰, 史涛, 苏艳杰. 基于改进YOLOv5的密集场景行人检测方法研究[J]. 国外电子测量技术, 2023, 42(4): 125-130.

History
  • Online publication date: 2024-10-29