
An Improved Time Feedforward Connections Recurrent Neural Networks

Type:
Journal article
Authors:
Wang, Jin; Zou, Yongsong; Lim, Se-Jung
Corresponding author:
Lim, S.-J.
Author affiliations:
[Zou, Yongsong; Wang, Jin] Changsha Univ Sci & Technol, Sch Hydraul & Environm Engn, Changsha 410014, Peoples R China.
[Wang, Jin] Changsha Univ Sci & Technol, Sch Comp & Commun Engn, Changsha 410014, Peoples R China.
[Lim, Se-Jung] Honam Univ, Div Convergence, AI Liberal Arts Studies, Gwangju Si 62399, South Korea.
Corresponding institution:
[Lim, S.-J.] AI Liberal Arts Studies, South Korea
Language:
English
Keywords:
gated recurrent unit; long-short term memory; RNNs; SGRU; time feedforward connections
Journal:
Intelligent Automation and Soft Computing
ISSN:
1079-8587
Year:
2023
Volume:
36
Issue:
3
Pages:
2743-2755
Funding:
This work was funded by the National Science Foundation of Hunan Province (2020JJ2029). This work was also supported by a research fund from Honam University, 2022.
Institutional attribution:
Our university is the first institution.
Department:
School of Computer and Communication Engineering
Abstract:
Recurrent Neural Networks (RNNs) have been widely applied to temporal problems such as flood forecasting and financial data processing. On the one hand, traditional RNN models amplify the gradient problem because of their strict serial time dependency, making it difficult to realize a long-term memory function. On the other hand, RNN cells are highly complex, which significantly increases computational complexity and wastes computational resources during model training. In this paper, an improved Time Feedforward Connections Recurrent Neural Networks ...
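
For context on the per-step recurrence the abstract refers to, the following is a minimal NumPy sketch of a standard GRU cell (one of the keyword topics); it is illustrative only, not the paper's SGRU or time-feedforward variant, and all names, shapes, and initializations are assumptions.

# Minimal NumPy sketch of a standard GRU cell (illustrative only; not the
# paper's SGRU / time-feedforward method). Shapes and names are assumptions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, params):
    """One GRU step: h_t depends on h_{t-1}, the strict serial dependency
    discussed in the abstract."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(x_t @ Wz + h_prev @ Uz + bz)               # update gate
    r = sigmoid(x_t @ Wr + h_prev @ Ur + br)               # reset gate
    h_tilde = np.tanh(x_t @ Wh + (r * h_prev) @ Uh + bh)   # candidate state
    return (1.0 - z) * h_prev + z * h_tilde                # new hidden state

# Toy usage: process a sequence of length T one step at a time.
rng = np.random.default_rng(0)
input_dim, hidden_dim, T = 4, 8, 10
params = [rng.normal(scale=0.1, size=s) for s in
          [(input_dim, hidden_dim), (hidden_dim, hidden_dim), (hidden_dim,)] * 3]
h = np.zeros(hidden_dim)
for t in range(T):
    h = gru_cell(rng.normal(size=input_dim), h, params)
print(h.shape)  # (8,)

Because each h_t is computed from h_{t-1}, the steps cannot be parallelized across time, which is the serial bottleneck and gradient-propagation issue the abstract describes.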
