Radar-based gesture recognition has emerged as a reliable alternative to vision-based systems for human-computer interaction, especially in environments with low illumination, occlusion, or privacy constraints. This study explores the implementation of a radar-based gesture recognition system using advanced signal processing and machine learning techniques to classify dynamic hand movements with high precision. The central challenge addressed involves extracting discriminative features from radar signals and developing robust classifiers capable of performing effectively under real-world conditions. The proposed approach includes preprocessing radar data through bandpass filtering (5-50 Hz) and normalization, followed by the extraction of key features such as signal energy, mean Doppler shift (7.6-7.9 Hz), and spectral centroid. A Support Vector Machine (SVM) classifier with a radial basis function (RBF) kernel is employed and optimized for gesture classification. Comparative analysis reveals that the SVM model outperforms the K-nearest neighbors (KNN) method, achieving a classification accuracy of 86% and an F1-score of 0.89, compared to 82% accuracy and a 0.84 F1-score obtained with KNN. These results demonstrate the effectiveness of radar-based systems in detecting and classifying hand gestures accurately, achieving up to 97.3% accuracy in controlled environments. Unlike traditional camera-based systems, radar maintains functionality in poor lighting and occluded conditions while preserving user privacy by avoiding optical recordings. The system also offers low power consumption and real-time processing capabilities, making it suitable for deployment in privacy-sensitive and resource-constrained applications. This work confirms radar’s potential in fine-grained gesture interpretation and aligns with prior studies in crowd tracking and digit recognition, where similar performance metrics were observed. The integration of radar sensing with machine learning offers a promising path toward more secure, responsive, and environment-agnostic interaction systems.
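The processing chain described in the abstract (5-50 Hz band-pass filtering and normalization, extraction of signal energy, mean Doppler shift, and spectral centroid, then an RBF-kernel SVM compared against a KNN baseline) can be sketched as follows. Only the filter band, feature names, and classifier choices come from the abstract; the sampling rate, filter order, hyperparameter values, and the power-weighted reading of "mean Doppler shift" are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of the described pipeline for a 1-D Doppler radar time series.
# Assumed: 200 Hz sampling rate, 4th-order Butterworth filter, default-ish
# SVM/KNN hyperparameters. These are placeholders, not the paper's settings.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.fft import rfft, rfftfreq
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 200.0  # assumed sampling rate (Hz); not stated in the abstract


def preprocess(x, low=5.0, high=50.0, fs=FS):
    """Band-pass filter (5-50 Hz) and amplitude-normalize one gesture record."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    y = filtfilt(b, a, np.asarray(x, dtype=float))
    return y / (np.max(np.abs(y)) + 1e-12)


def extract_features(x, fs=FS):
    """Signal energy, mean Doppler shift, and spectral centroid of one record."""
    spec = np.abs(rfft(x))
    freqs = rfftfreq(len(x), d=1.0 / fs)
    energy = np.sum(x ** 2)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    # "Mean Doppler shift" is read here as the power-weighted mean frequency
    # inside the pass band -- one plausible interpretation of the abstract.
    band = (freqs >= 5.0) & (freqs <= 50.0)
    mean_doppler = np.sum(freqs[band] * spec[band] ** 2) / (np.sum(spec[band] ** 2) + 1e-12)
    return np.array([energy, mean_doppler, centroid])


def build_classifiers():
    """RBF-kernel SVM (as in the paper) and a KNN baseline for comparison."""
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
    return svm, knn
```

In use, each recorded gesture would be passed through preprocess and extract_features to form a three-dimensional feature vector, and both pipelines would then be fit and scored on the same train/test split to reproduce the SVM-versus-KNN comparison reported above.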
Published in | International Journal of Wireless Communications and Mobile Computing (Volume 12, Issue 2)
DOI | 10.11648/j.wcmc.20251202.12
Page(s) | 72-80
Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.
Copyright | Copyright © The Author(s), 2025. Published by Science Publishing Group
Keywords | Radar-based Gesture Recognition, Support Vector Machine, Contactless Interface, Signal Processing, Human-computer Interaction
Data Representation and Dimensions | Algorithmic Details | Frequency | No. of Gestures | Distance Between Hand and Sensor | Participants | Samples per Gesture | Number of Radars |
---|---|---|---|---|---|---|---|
N/A | Hardware only, no algorithm proposed | 94 GHz | N/A | Not mentioned | Tested only | Hand tracking only | 1 |
Time-Range (2D) | SVM | 7.29 GHz | 5 | 0-1 m | 1 | 500 | 1 |
Time-Amplitude (1D) | Conditional statements | 6.8 GHz | 6 | 1 m | 1 | 50 | 1 |
Time-Range (2D) | Neural Network | 6.8 GHz | 6 | Not specified | 1 | 10 s (unspecified) | 1 |
Time-Doppler (3D-RGB) | DCNN | 5.8 GHz | 10 | 0.1 m | 1 | 500 | 1 |
Time-Range (2D matrix) | K-means (Unsupervised clustering) | 6.8 GHz | 5 | ~1 m | 3 | 50 | 1 |
Time-Amplitude (1D) | 1-D CNN | Not mentioned | 6 | 0.15 m | 5 | 81 | 1 |
Time-Doppler (3D-RGB) | DCNN | 5.8 GHz | 7 | 0.1 m | 1 | 25 | 1 |
Range-Doppler image (2D grayscale) | HMM (Active sensing) | 300 kHz | 7 | Not specified | 9 | 50 | 1 |
Time-Range (2D grayscale) | Deep Learning | 7.29 GHz | 5 | 0.45 m | 3 | 100 | 1 |
Time-Range envelope (1D) | DCNN | 60 GHz | 3 | 0.10-0.30 m | 2 | 180 | 1 |
Range-RCS (1D) | Observing backscattered waves | 60 GHz | 3 | 0.25 m | Not specified | 1000 | 1 |
Time-Range (2D grayscale) | Multiclass SVM | 7.29 GHz | 9 | < 0.5 m | 4 | 100 | 4 |
Time-Range (2D grayscale) | CNN | 7.29 GHz | 10 | 0-1 m | 5 | 400 | 3 |
Time-Range (3D-RGB) | GoogLeNet Framework | 7.29 GHz | 8 | 3-8 m | 3 | 100 | 1 & 2 |
Time-Range (2D grayscale) | DCNN | 7.29 GHz | Drawing gesture | Not specified | 5 | Not specified | 4 |
Time-Range (2D grayscale) | CNN | 7.29 GHz | Digit writing | 0-1 m | 3 | 300 | 4 |
SVM | Support Vector Machine |
RBF | Radial Basis Function |
KNN | K-Nearest Neighbors |
IR-UWB | Impulse Radio Ultra-Wideband |
GPR | Ground-penetrating Radar |
CNN | Convolutional Neural Network |
SHARAD | SHAllow RADar |
FMCW | Frequency-modulated Continuous Wave |
PCA | Principal Component Analysis |
LO | Local Oscillator |
PA | Power Amplifier |
ADC | Analogue-to-Digital Converter |
Tx | Transmitter |
Rx | Receiver |
[1] | J. Lien et al., "Soli: Ubiquitous gesture sensing with millimeter wave radar," ACM Trans. Graph., vol. 35, no. 4, pp. 1-19, 2016.
[2] | C. Liu, Y. Li, D. Ao, and H. Tian, "Spectrum-based hand gesture recognition using millimeter-wave radar parameter measurements," IEEE Access, vol. 7, pp. 79147-79158, 2019.
[3] | E. Miller et al., "RadSense: Enabling one hand and no hands interaction for sterile manipulation of medical images using Doppler radar," Smart Health, vol. 15, Art. no. 100089, 2020.
[4] | S. Mitra and T. Acharya, "Gesture recognition: A survey," IEEE Trans. Syst., Man, Cybern. C, vol. 37, no. 3, pp. 311-324, 2007.
[5] | J. A. Nanzer, "A review of microwave wireless techniques for human presence detection and classification," IEEE Trans. Microw. Theory Techn., vol. 65, no. 5, pp. 1780-1794, 2017.
[6] | S. Pisa et al., "A double-sideband continuous-wave radar sensor for carotid wall movement detection," IEEE Sensors J., vol. 18, no. 20, pp. 8162-8171, 2018.
[7] | N. E. Putzig et al., "Three-dimensional radar imaging of structures and craters in the Martian polar caps," Icarus, vol. 308, pp. 138-147, 2018.
[8] | S. S. Rautaray and A. Agrawal, "Vision based hand gesture recognition for human computer interaction: A survey," Artif. Intell. Rev., vol. 43, no. 1, pp. 1-54, 2015.
[9] | A. Santra, R. V. Ulaganathan, and T. Finke, "Short-range millimetric-wave radar system for occupancy sensing application," IEEE Sensors Lett., vol. 2, no. 1, pp. 1-4, 2018.
[10] | S. Skaria, A. Al-Hourani, M. Lech, and R. J. Evans, "Hand-gesture recognition using two-antenna Doppler radar with deep convolutional neural networks," IEEE Sensors J., vol. 19, no. 8, pp. 3041-3048, 2019.
[11] | N. T. P. Van et al., "Microwave radar sensing systems for search and rescue purposes," Sensors, vol. 19, no. 13, Art. no. 2879, 2019.
[12] | J. P. Wachs et al., "A gesture-based tool for sterile browsing of radiology images," J. Amer. Med. Inform. Assoc., vol. 15, no. 3, pp. 321-323, 2008.
[13] | Y. Wang, S. Wang, M. Zhou, Q. Jiang, and Z. Tian, "TS-I3D based hand gesture recognition method with radar sensor," IEEE Access, vol. 7, pp. 22902-22913, 2019.
[14] | H.-S. Yeo, B.-G. Lee, and H. Lim, "Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware," Multimedia Tools Appl., vol. 74, no. 7, pp. 2687-2715, 2015.
[15] | Z. Zhang, Z. Tian, and M. Zhou, "Latern: Dynamic continuous hand gesture recognition using FMCW radar sensor," IEEE Sensors J., vol. 18, no. 8, pp. 3278-3289, 2018.
[16] | S. Ahmed, K. D. Kallu, S. Ahmed, and S. H. Cho, "Hand gestures recognition using radar sensors for human-computer interaction: A review," Remote Sens., vol. 13, no. 3, Art. no. 527, 2021.
APA Style
Philip-Kpae, F. O., Nkolika, N., Endurance, O. L., Jude, N. B. (2025). Robust Radar-driven Gesture Recognition for Contactless Human-computer Interaction Using Support Vector Machine and Signal Feature Optimization. International Journal of Wireless Communications and Mobile Computing, 12(2), 72-80. https://doi.org/10.11648/j.wcmc.20251202.12