Visual Computing Research Group
Our research in the Visual Computing Group (VCG) spans both theoretical and practical development in visual data processing and analysis.
Drawing on knowledge and technologies across creative computing, machine learning, computer vision and graphics, and biomedical engineering, our research topics include real-time sensing and 3D vision, 4D facial expression synthesis and perception, facial and human action analysis, affective analysis and brain-computer interfaces. Our work has applications in human-computer interaction, healthcare, entertainment, security, virtual reality and manufacturing.
Our work is supported by a range of research funders, including EPSRC, EU FP7, the Royal Academy of Engineering, the Leverhulme Trust, the Royal Society, Innovate UK and industry partners.
The group has also established long-term collaborations with industrial partners such as Emteq, a leading VR/AR company based in Brighton, as well as long-standing partnerships with academic institutions including Imperial College London, the University of Southampton, the University of Glasgow and Virginia Commonwealth University.
Facial expressions are among the primary non-verbal channels of human communication; a smile, for example, can convey happiness or a positive opinion.
However, existing VR and AR head-mounted displays (HMDs), such as the HTC Vive and Microsoft HoloLens, occlude a large part of the wearer's face, particularly the eye region, one of the most emotion-salient facial areas. This severely limits natural interaction between the user and the virtual environment.
To address this problem, the project integrated advanced biometric sensors into the HMD in an unobtrusive manner to capture fine-scale dynamic facial movements, which are then mapped to a user-specific 3D face model to reconstruct the user's facial expression with high fidelity.
Read the project publication, Realistic Facial Expression Reconstruction for VR HMD Users.
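The sensor-to-expression pipeline described above can be sketched as a linear blendshape mapping: per-frame sensor activations are converted to blendshape weights, which then deform a neutral face mesh. The sensor channels, calibration matrix and blendshape names below are illustrative assumptions for the sketch, not the project's actual implementation.

```python
import numpy as np

# Hypothetical sensor channels and blendshapes for illustration only.
SENSOR_CHANNELS = ["zygomaticus", "corrugator", "orbicularis_oculi"]
BLENDSHAPES = ["smile", "frown", "eye_squint"]

# Per-user calibration matrix (rows: blendshapes, columns: sensor channels);
# in practice this mapping would be learned from a short calibration session.
W = np.array([
    [0.9, -0.1, 0.0],   # smile driven mainly by zygomaticus activity
    [-0.1, 0.8, 0.1],   # frown driven mainly by corrugator activity
    [0.0, 0.2, 0.7],    # eye squint driven mainly by orbicularis activity
])

def sensors_to_blendshapes(emg: np.ndarray) -> np.ndarray:
    """Map one frame of normalized sensor activations to blendshape weights in [0, 1]."""
    return np.clip(W @ emg, 0.0, 1.0)

def apply_blendshapes(neutral: np.ndarray, deltas: np.ndarray,
                      weights: np.ndarray) -> np.ndarray:
    """Deform the neutral mesh by a weighted sum of per-blendshape vertex offsets.

    neutral: (V, 3) vertices; deltas: (K, V, 3) offsets; weights: (K,).
    """
    return neutral + np.tensordot(weights, deltas, axes=1)

# Example frame: strong smile-muscle activation, other channels at rest.
weights = sensors_to_blendshapes(np.array([1.0, 0.0, 0.0]))
```

The clipping keeps weights in a valid blendshape range even when the linear calibration overshoots; a real system would also smooth the weights over time to suppress sensor noise.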
Parkinson's disease (PD) affects 30 million people worldwide and is the second most common neurodegenerative disorder. Symptoms include tremor, muscle rigidity, stooped posture, difficulty walking, loss of facial expressivity and loss of vocal projection.
Experts in the field state that biomarkers (clinical, biochemical, genetic, proteomic or neuroimaging) may support early detection, differentiation of atypical parkinsonian disorders and earlier symptom control.
Clinical biomarkers such as motor function are particularly attractive because they enable early monitoring, are non-invasive and can be applied to a number of clinical conditions affecting a similar demographic. In addition to aiding at-home monitoring and assessment, several studies have shown that gait performance in PD can be improved by applying continuous external rhythmic auditory or visual cues.
This project aims to develop a novel multi-sensory wearable device that detects physical activity, gait, posture, voice and facial EMG.
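As an illustration of the kind of at-home gait monitoring such a device could support, the sketch below estimates walking cadence from an acceleration-magnitude trace by counting threshold crossings. The sampling rate, threshold and synthetic test signal are assumptions for the sketch, not the project's actual method.

```python
import numpy as np

def estimate_cadence(accel_mag: np.ndarray, fs: float, threshold: float = 1.2) -> float:
    """Estimate steps per minute from an acceleration-magnitude trace (in g).

    A step is counted at every upward crossing of `threshold`; both the
    threshold and the sampling rate `fs` (Hz) are illustrative assumptions.
    """
    above = accel_mag > threshold
    # Upward crossings: samples where the signal rises through the threshold.
    steps = np.count_nonzero(above[1:] & ~above[:-1])
    duration_min = len(accel_mag) / fs / 60.0
    return steps / duration_min

# Synthetic 10 s trace at 50 Hz: 1 g baseline plus a brief spike at each
# step of a 2 Hz walking pattern (i.e. 120 steps per minute).
fs = 50.0
t = np.arange(0.0, 10.0, 1.0 / fs)
trace = 1.0 + 0.5 * (np.sin(2.0 * np.pi * 2.0 * t) > 0.95)
cadence = estimate_cadence(trace, fs)  # ≈ 120 steps/min
```

A deployed system would add band-pass filtering and a minimum inter-step interval to reject tremor and sensor noise, which this simple threshold detector does not handle.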
Staff at the Visual Computing Group
Recent publications
Jianyuan Sun, Hui Yu, Guoqiang Zhong, Junyu Dong, Shu Zhang & Hongchuan Yu, 23 Mar 2020 (early online), in: IEEE Transactions on Cybernetics. 10 p.
Xiaoqiang Yan, Yangdong Ye, Xueying Qiu & Hui Yu, 9 Oct 2019 (early online), in: Information Fusion.
Yifan Xia, Jianwen Lou, Junyu Dong, Lin Qi, Gongfa Li & Hui Yu, 1 Jan 2020, in: Multimedia Tools and Applications, 79, p. 805-824.
Jianwen Lou, Xiaoxu Cai, Yiming Wang & Hui Yu, 1 Dec 2019, in: Multimedia Tools and Applications, 78(24), p. 35455-35469.
Muwei Jian, Junyu Dong, Maoguo Gong, Hui Yu, Liqiang Nie, Yilong Yin & Kin Man Lam, 23 Aug 2019 (early online), in: IEEE Transactions on Multimedia.