Laboratory mice (Mus musculus) play a crucial role in scientific research, where accurate classification and sorting are essential for reliable experimental results. This study presents an intelligent multi-sensor system for the automated classification and sorting of laboratory mice based on three key parameters: health status, gender, and weight. The system integrates an AMG8833 thermal imaging camera for monitoring health status, a YOLOv8 object detection model for gender classification, and a load cell with an HX711 amplifier for weight measurement. The integration of these sensors leverages sensor fusion techniques to improve classification accuracy and efficiency. Thermal imaging detects physiological anomalies to assess the health condition of the mice, the object detection model identifies gender characteristics in real time with high precision, and the load cell provides accurate weight data for further categorization. The combined system eliminates the need for manual intervention, providing a non-invasive, efficient, and scalable approach to laboratory animal management. The proposed system was evaluated through multiple test scenarios assessing mouse health and classifying weight. Gender detection was evaluated using a dataset of more than 6,722 images stored in the STASRG laboratory. The test results indicated that animal sorting across the three parameters achieved a 100% success rate, gender sorting reached an accuracy of 86.67%, and weight measurements deviated by approximately 0.1 g. The overall response time for sorting was 19 seconds. This multi-sensor integration demonstrates the potential to enhance laboratory workflows, minimize human error, and promote the welfare of laboratory animals through automated, data-driven processes.
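The workflow described above can be summarized as a three-stage decision pipeline. The following Python sketch is purely illustrative: the helper callables (read_thermal_frame, detect_gender, read_weight, route_to_chamber) and the temperature band are assumptions, not the authors' implementation.

```python
# Illustrative sketch of the health -> gender -> weight sorting flow from the abstract.
# All helper callables and the temperature band are hypothetical placeholders.

ASSUMED_HEALTHY_BAND_C = (32.0, 38.0)  # assumed acceptable peak surface temperature, not from the paper

def classify_mouse(read_thermal_frame, detect_gender, read_weight, route_to_chamber):
    """Run one mouse through the three chambers and return its sorting decision."""
    # Stage 1: health screening with the thermal camera (AMG8833 in the paper)
    frame = read_thermal_frame()                        # e.g. an 8 x 8 grid of temperatures in °C
    peak = max(max(row) for row in frame)
    if not (ASSUMED_HEALTHY_BAND_C[0] <= peak <= ASSUMED_HEALTHY_BAND_C[1]):
        return {"status": "Sick"}                       # sick mice are not processed further

    # Stage 2: gender classification with the object detector (YOLOv8 in the paper)
    gender, confidence = detect_gender()

    # Stage 3: weight measurement (load cell + HX711) and chamber routing
    weight_g = read_weight()
    return {"status": "Healthy", "gender": gender, "confidence": confidence,
            "weight_g": weight_g, "chamber": route_to_chamber(weight_g)}
```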
Published in | Journal of Electrical and Electronic Engineering (Volume 13, Issue 1) |
DOI | 10.11648/j.jeee.20251301.16 |
Page(s) | 59-81 |
Creative Commons | This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited. |
Copyright | Copyright © The Author(s), 2025. Published by Science Publishing Group |
Mice, Object Detection, Multi-Sensor, Integration, Fusion Sensor, Deep Learning, Mus Musculus, YOLOv8
Sensor | Technology | Accuracy | Range | Resolution | Main Application |
---|---|---|---|---|---|
AMG8833 | Infrared grid | ±2.5°C | 1-5 m | 8 x 8 | Small animal monitoring |
MLX90614 | Infrared | ±0.5°C | 0.5-2 m | Single point | Body temperature |
MLX90640 | Infrared | ±1.5°C | 0.5-4 m | 32 x 24 | Thermal analysis |
FLIR Lepton | Thermal imaging | ±1°C | 0.5-10 m | 80 x 60 | Temperature visualization |
D6T (Omron) | Thermopile | ±0.5°C | 0.5-3 m | Multi-point | Human/animal presence |
TMP006 | Thermopile | ±0.5°C | 0.1-0.5 m | Single point | Skin surface temperature |
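As a point of reference for how the AMG8833 row above might be used in firmware, here is a hedged sketch based on Adafruit's CircuitPython driver; the driver choice and the healthy temperature band are assumptions rather than details confirmed by the paper.

```python
# Hedged sketch: read the AMG8833's 8 x 8 thermal grid and flag a mouse as sick when
# its hottest pixel falls outside an assumed healthy band. Driver and band are assumptions.
import board
import busio
import adafruit_amg88xx  # Adafruit CircuitPython driver for the AMG88xx family

ASSUMED_HEALTHY_BAND_C = (32.0, 38.0)  # illustrative band, not taken from the paper

i2c = busio.I2C(board.SCL, board.SDA)
sensor = adafruit_amg88xx.AMG88XX(i2c)

def health_status():
    frame = sensor.pixels                      # 8 x 8 list of temperatures in °C
    peak = max(max(row) for row in frame)      # hottest pixel, assumed to be the mouse body
    return "Healthy" if ASSUMED_HEALTHY_BAND_C[0] <= peak <= ASSUMED_HEALTHY_BAND_C[1] else "Sick"
```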
Characteristic | Male Mice | Female Mice |
---|---|---|
Anogenital distance | Longer | Shorter |
Nipples | Not visible | Visible, especially after maturation |
Body size | Larger | Smaller |
Testes | Visible (in mature males) | Absent |
Aggressiveness | More aggressive, especially toward other males | Less aggressive, more social |
Reproductive cycle | Absent | Has an estrous cycle (4-5 days) |
Dominant hormones | Testosterone | Estrogen and progesterone |
Behavioural pattern | Territorial, frequent urine marking | Less territorial, less frequent marking |
Response to mating | Actively seeks mates | Receptive only during estrus |
Use in research | Common in studies of aggression, metabolism, and testosterone | Common in studies of reproduction and estrogen-related effects |
Aspect | YOLO | Faster R-CNN | SSD |
---|---|---|---|
Speed | Very fast | Slow | Moderate |
Detection Accuracy | Good, especially for larger objects | Excellent, especially for small objects | Good, but sometimes inconsistent |
Small Object Detection | Improved in newer versions | Highly accurate | Good, but not always consistent |
Overlapping Object Detection | Limited | Very good | Limited |
Computational Requirement | Low to moderate | High | Moderate |
Bounding Box Precision | Good, but less precise | Highly precise | Good |
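For context on how the YOLO column above is typically used in practice, the sketch below runs single-frame inference with the Ultralytics YOLOv8 API; the weights file name and class labels are assumptions, not the model trained by the authors.

```python
# Hedged sketch of YOLOv8 inference with the Ultralytics API; "mice_gender.pt" and the
# class labels are hypothetical, standing in for a custom-trained gender-detection model.
from ultralytics import YOLO

model = YOLO("mice_gender.pt")                 # hypothetical custom weights

def detect_gender(frame):
    """Return (label, confidence) for the highest-confidence detection, or (None, 0.0)."""
    results = model(frame)                     # frame: image path, NumPy array, etc.
    boxes = results[0].boxes
    if len(boxes) == 0:
        return None, 0.0
    best = boxes[int(boxes.conf.argmax())]     # keep the most confident box
    return model.names[int(best.cls)], float(best.conf)
```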
Age (weeks) | Male Weight (g) | Female Weight (g) |
---|---|---|
6 | 18-22 | 15-19 |
8 | 20-25 | 17-21 |
10 | 22-30 | 18-23 |
12 | 25-35 | 20-25 |
Aspect | Load Cell | Strain Gauge | Capacitive Sensor |
---|---|---|---|
Accuracy | Very high | High | Moderate |
Response Time | Fast | Fast | Fast |
Measurement Range | Low to high range | Low to high range | Limited to light weights |
Sensitivity | Good for small weights | Excellent for small weights | Low for small weights |
Suitability for mice | Highly suitable | Suitable | Less suitable |
Weakness | Sensitive to vibrations | Requires signal amplification | Prone to environmental noise |
Cost | Moderate | Moderate | Relatively low |
Implementation | Easy | Requires additional circuitry | Easy |
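Regardless of the driver used, converting a load-cell/HX711 reading to grams reduces to a tare offset and a scale factor. The sketch below shows that arithmetic; read_raw() is a hypothetical stand-in for whatever driver call returns the raw ADC count, and the calibration routine is an assumption, not the authors' code.

```python
# Minimal sketch of load-cell calibration and conversion: weight = (raw - offset) / scale.
# read_raw() is a hypothetical stand-in for the HX711 driver call that returns a raw count.

def calibrate(read_raw, known_mass_g, samples=20):
    """Derive the tare offset and the counts-per-gram scale from a known reference mass."""
    offset = sum(read_raw() for _ in range(samples)) / samples      # average with an empty pan
    input("Place the reference mass on the pan and press Enter...")
    loaded = sum(read_raw() for _ in range(samples)) / samples
    scale = (loaded - offset) / known_mass_g                        # raw counts per gram
    return offset, scale

def weight_g(read_raw, offset, scale, samples=10):
    """Averaged raw reading converted to grams."""
    raw = sum(read_raw() for _ in range(samples)) / samples
    return (raw - offset) / scale
```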
Mice | Predicted Male | Predicted Female | Avg Response Time (s) | Avg Confidence Value (%) |
---|---|---|---|---|
Actual Male | 13 | 2 | 3.38 | 91% |
Actual Female | 2 | 13 | 3.20 | 87% |
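Treating "male" as the positive class, the confusion matrix above (TP = 13, FP = 2, FN = 2, TN = 13) yields the 86.67% gender-classification accuracy reported in the abstract; the short computation below makes that arithmetic explicit.

```python
# Metrics implied by the confusion matrix above, with "male" as the positive class.
TP, FP, FN, TN = 13, 2, 2, 13

accuracy  = (TP + TN) / (TP + TN + FP + FN)    # (13 + 13) / 30 = 0.8667
precision = TP / (TP + FP)                     # 13 / 15 = 0.8667
recall    = TP / (TP + FN)                     # 13 / 15 = 0.8667
f1        = 2 * precision * recall / (precision + recall)

print(f"accuracy={accuracy:.4f}  precision={precision:.4f}  recall={recall:.4f}  f1={f1:.4f}")
# accuracy=0.8667, matching the 86.67% gender-sorting accuracy reported in the abstract
```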
No. of Mice | Chamber 1 Status (Temperature Check) | Chamber 1 Time (s) | Chamber 2 Gender (Male/Female) | Chamber 2 Time (s) | Chamber 3 Weight (g) | Chamber 3 Class | Chamber 3 Time (s) | Conveyor Time (s) | Total Process (s) |
---|---|---|---|---|---|---|---|---|---|
1 | Healthy | 2.26 | Male | 2.32 | 9.95 | Right | 3.92 | 3.13 | 11.63 |
2 | Healthy | 3.01 | Male | 2.59 | 19.9 | Left | 4.17 | 6.06 | 15.82 |
3 | Healthy | 2.55 | Female | 2.71 | 39.75 | Centre | 4.27 | 7.58 | 17.11 |
4 | Healthy | 2.43 | Male | 2.55 | 9.88 | Right | 4.30 | 3.55 | 12.83 |
5 | Healthy | 2.63 | Female | 2.66 | 19.85 | Left | 4.34 | 4.09 | 13.72 |
6 | Sick | 2.34 | N/A | N/A | N/A | N/A | N/A | N/A | 2.34 |
7 | Healthy | 2.77 | Female | 2.73 | 9.68 | Right | 4.62 | 2.55 | 12.67 |
8 | Healthy | 2.62 | Male | 2.45 | 14.15 | Left | 4.64 | 4.64 | 14.35 |
9 | Healthy | 2.69 | Female | 2.92 | 29.66 | Centre | 4.33 | 7.46 | 17.40 |
10 | Healthy | 2.74 | Male | 2.75 | 9.91 | Right | 4.22 | 2.43 | 12.14 |
11 | Healthy | 2.85 | Female | 2.88 | 16.8 | Left | 3.38 | 5.14 | 14.82 |
12 | Healthy | 2.82 | Female | 3.17 | 26.74 | Centre | 4.42 | 6.92 | 17.33 |
13 | Healthy | 2.56 | Male | 3.06 | 29.88 | Centre | 4.87 | 7.36 | 17.85 |
14 | Healthy | 2.41 | Male | 3.13 | 10.12 | Right | 4.44 | 3.22 | 13.20 |
15 | Healthy | 2.60 | Male | 3.20 | 20.23 | Left | 4.82 | 5.80 | 16.42 |
16 | Healthy | 2.53 | Male | 3.04 | 34.31 | Centre | 4.55 | 7.54 | 17.66
17 | Sick | 2.39 | N/A | N/A | N/A | N/A | N/A | N/A | 2.39 |
18 | Healthy | 2.21 | Female | 3.28 | 16.54 | Left | 3.82 | 5.03 | 14.34 |
19 | Healthy | 2.25 | Male | 3.57 | 25.34 | Centre | 4.52 | 5.56 | 15.90 |
20 | Sick | 3.19 | N/A | N/A | N/A | N/A | N/A | N/A | 3.19 |
21 | Sick | 3.40 | N/A | N/A | N/A | N/A | N/A | N/A | 3.40 |
22 | Healthy | 3.27 | Female | 3.59 | 9.32 | Right | 4.65 | 2.28 | 13.79 |
23 | Healthy | 3.57 | Female | 4.12 | 36.76 | Centre | 4.77 | 6.68 | 19.14 |
24 | Sick | 3.11 | N/A | N/A | N/A | N/A | N/A | N/A | 3.11 |
25 | Healthy | 3.49 | Female | 3.33 | 18.69 | Left | 4.98 | 4.47 | 16.27 |
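The weight column in the table above suggests a simple three-way routing rule: the lightest mice go to the Right chamber, mid-weight mice to the Left, and the heaviest to the Centre. The cut-off values in the sketch below are read off the table and are assumptions, not thresholds stated by the authors.

```python
# Hedged sketch of the weight-to-chamber routing implied by the table; the cut-offs are
# illustrative assumptions inferred from the reported data, not values given in the paper.
ASSUMED_LIGHT_MAX_G = 12.0   # heaviest weight routed Right in the table is ~10.1 g
ASSUMED_MID_MAX_G   = 24.0   # heaviest weight routed Left in the table is ~20.2 g

def route_to_chamber(weight_g: float) -> str:
    if weight_g <= ASSUMED_LIGHT_MAX_G:
        return "Right"
    if weight_g <= ASSUMED_MID_MAX_G:
        return "Left"
    return "Centre"

# Consistency check against three rows of the table:
assert [route_to_chamber(w) for w in (9.95, 19.9, 39.75)] == ["Right", "Left", "Centre"]
```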
YOLOv8 | You Only Look Once Version 8 |
Kg | Kilograms |
s | Second |
AGD | Anogenital Distance |
R-CNN | Region-based Convolutional Neural Network |
SSD | Single Shot Detector |
WiFi | Wireless Fidelity |
OpenCV | Open Source Computer Vision Library |
STASRG | Smart Technology and Applied Sciences the Rapid Research Generator |
mAP | Mean Average Precision |
N/A | Not Applicable |
TP | True Positive |
TN | True Negative |
FP | False Positive |
FN | False Negative |
APA Style
Mutiara, G. A., Periyadi, Alfarisi, M. R., & Meisaroh, L. (2025). Intellimice Classifier: Towards Smart Object Detection and Classification of Laboratory Mice Using Multi-Sensor Integration. Journal of Electrical and Electronic Engineering, 13(1), 59-81. https://doi.org/10.11648/j.jeee.20251301.16
ACS Style
Mutiara, G. A.; Periyadi; Alfarisi, M. R.; Meisaroh, L. Intellimice Classifier: Towards Smart Object Detection and Classification of Laboratory Mice Using Multi-Sensor Integration. J. Electr. Electron. Eng. 2025, 13(1), 59-81. doi: 10.11648/j.jeee.20251301.16
AMA Style
Mutiara GA, Periyadi, Alfarisi MR, Meisaroh L. Intellimice Classifier: Towards Smart Object Detection and Classification of Laboratory Mice Using Multi-Sensor Integration. J Electr Electron Eng. 2025;13(1):59-81. doi: 10.11648/j.jeee.20251301.16