For patients with cognitive impairments, accurately identifying hand gestures is vital to both the assessment and rehabilitation of cognitive function. In recent years, computer vision has increasingly been applied to hand gesture recognition. However, accurately recognizing gestures in complex environments remains challenging, owing to insufficient fusion of global and local semantic information and to the dynamic nature of hand gestures. To address these problems, this paper proposes a novel gesture recognition model based on the You Only Look Once (YOLO) architecture for patients with co...