-
Research Article
Performance Evaluation of Custom A-, D-, and I-Optimal Designs for Non-Standard Second-Order Models
Iwundu Mary Paschal,
Israel Chinomso Fortune*
Issue:
Volume 13, Issue 5, October 2024
Pages:
92-114
Received:
14 July 2024
Accepted:
12 August 2024
Published:
26 September 2024
Abstract: The performances of Custom A-, D-, and I-optimal designs on non-standard second-order models are examined using the alphabetic A-, D-, and G-optimality efficiencies, as well as the Average Variance of Prediction. Designs of varying sizes are constructed with the help of JMP Pro 14 software and are customized for specified non-standard models, optimality criteria, prespecified experimental runs, and a specified range of input variables. The results reveal that Custom-A optimal designs generally perform better in terms of G-efficiency: they are markedly superior to A-efficiency, as the worst G-efficiency value among the constructed Custom-A optimal designs exceeds their best A-efficiency value, and they also do well in terms of D-efficiency. Custom-D optimal designs generally perform best in terms of G-efficiency, as the worst G-efficiency value exceeds all A- and D-efficiency values. Custom-I optimal designs likewise perform best in terms of G-efficiency, as the worst G-efficiency value is better than the best A-efficiency value and generally better than the corresponding D-efficiency values. For the Average Variance of Prediction, Custom A- and I-optimal designs perform competitively well, with relatively low Average Variances of Prediction; by contrast, the Average Variance of Prediction is generally larger for Custom-D optimal designs. Hence, when seeking designs that minimize the variance of the predicted response, it suffices to construct Custom A-, D-, or I-optimal designs, with a preference for Custom A- or I-optimal designs.
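The efficiency measures compared in this article can be sketched numerically. The snippet below is a minimal illustration, not the paper's procedure: it builds the model matrix for a full second-order model in two factors and evaluates a 3² factorial design using the standard textbook definitions, D-efficiency = 100·|X′X/n|^(1/p) and G-efficiency = 100·p divided by the maximum scaled prediction variance over the design points.

```python
import numpy as np

def model_matrix(points):
    # Expand two-factor design points into the full second-order model:
    # intercept, x1, x2, x1*x2, x1^2, x2^2 (p = 6 parameters)
    return np.array([[1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2]
                     for x1, x2 in points])

def d_efficiency(X):
    # D-efficiency: 100 * |X'X / n|^(1/p)
    n, p = X.shape
    return 100.0 * np.linalg.det(X.T @ X / n) ** (1.0 / p)

def g_efficiency(X):
    # G-efficiency: 100 * p / max scaled prediction variance n*x'(X'X)^-1 x,
    # with the maximum taken over the design points themselves
    n, p = X.shape
    xtx_inv = np.linalg.inv(X.T @ X)
    d_max = max(n * x @ xtx_inv @ x for x in X)
    return 100.0 * p / d_max

# A 3^2 factorial (9 runs) supports the six-parameter quadratic model
X = model_matrix([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)])
```

Since the average scaled prediction variance over the design points always equals p, G-efficiency computed this way is at most 100.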
-
Research Article
Non-Parametric Fuzzy Regression Discontinuity Design in Modelling Viral Load Suppression Among People Living With HIV/AIDS (PLWHA)
Issue:
Volume 13, Issue 5, October 2024
Pages:
115-126
Received:
12 August 2024
Accepted:
10 September 2024
Published:
10 October 2024
Abstract: Adherence to antiretroviral therapy (ART) is a significant determinant of viral load suppression in HIV patients. There are inadequate statistical models that bring out the direct effects of ART on the suppression of HIV/AIDS. Traditional regression models address the general determinants of viral load suppression; regression discontinuity designs, on the other hand, bring out the causal effects of ART on viral load suppression at various thresholds. This study used the non-parametric fuzzy regression discontinuity design (FRDD) to model viral load suppression in PLWHA. The study began by developing a non-parametric FRDD, simulating the model to assess its performance, and then applying it to the Quality-of-Care dataset from Kaggle. The study used viral load suppression as the outcome variable, CD4 count and age as the running variables, and gender and receipt of counseling as additional covariates. The optimal thresholds were 40.5 years for age and 320 cells per cubic millimeter for CD4 count. There was an increasingly negative treatment effect of ART on viral load suppression as the cutoff points for CD4 count increased, and likewise with increasing age. Compliance ratios for respondents rose as the treatment effect became more strongly negative. Further analyses, including the McCrary density test, bunching test, and manipulation test, indicated that the non-parametric fuzzy regression discontinuity design is effective in modeling viral load suppression.
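The fuzzy RD logic described above can be sketched with the basic Wald-type estimator: the jump in the outcome at the cutoff divided by the jump in treatment probability. The example below uses simulated data with a CD4-like running variable, not the Kaggle Quality-of-Care dataset, and uses simple local means for clarity; a full analysis would use local linear regression and a data-driven bandwidth.

```python
import numpy as np

def fuzzy_rd_estimate(running, treated, outcome, cutoff, bandwidth):
    # Wald-type fuzzy RD estimator: the jump in the outcome at the cutoff
    # divided by the jump in treatment probability, using local means
    # within the bandwidth on either side of the cutoff.
    left = (running >= cutoff - bandwidth) & (running < cutoff)
    right = (running >= cutoff) & (running <= cutoff + bandwidth)
    jump_y = outcome[right].mean() - outcome[left].mean()
    jump_d = treated[right].mean() - treated[left].mean()
    return jump_y / jump_d

# Simulated data (flat conditional means near the cutoff for clarity):
# crossing the cutoff raises treatment probability from 0.2 to 0.8 (fuzzy
# compliance), and treatment shifts the outcome by a true effect of -2.0.
rng = np.random.default_rng(0)
n = 20000
cd4 = rng.uniform(200, 500, size=n)                   # running variable
treated = rng.random(n) < np.where(cd4 >= 320, 0.8, 0.2)
outcome = np.where(treated, -2.0, 0.0) + rng.normal(0, 0.5, n)
tau_hat = fuzzy_rd_estimate(cd4, treated.astype(float), outcome,
                            cutoff=320, bandwidth=40)
```

With this sample size, `tau_hat` recovers the simulated negative treatment effect to within sampling error.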
-
Research Article
Comparative Analysis of Cross-Validation Techniques: LOOCV, K-folds Cross-Validation, and Repeated K-folds Cross-Validation in Machine Learning Models
Issue:
Volume 13, Issue 5, October 2024
Pages:
127-137
Received:
4 September 2024
Accepted:
23 September 2024
Published:
10 October 2024
Abstract: Effective model evaluation is crucial for robust machine learning, and cross-validation techniques play a significant role. This study compares repeated k-fold cross-validation, k-fold cross-validation, and leave-one-out cross-validation (LOOCV) on imbalanced and balanced datasets across four models: Support Vector Machine (SVM), K-Nearest Neighbors (K-NN), Random Forest (RF), and Bagging, both with and without parameter tuning. On imbalanced data without parameter tuning, repeated k-fold cross-validation demonstrated strong performance for SVM, with a sensitivity of 0.541 and a balanced accuracy of 0.764. K-fold cross-validation showed a higher sensitivity of 0.784 for RF and a balanced accuracy of 0.884. In contrast, LOOCV achieved notable sensitivity for RF and Bagging at 0.787 and 0.784, respectively, but at the cost of lower precision and higher variance, as detailed in Table 1. When parameter tuning was applied to balanced data, the performance metrics improved: sensitivity for SVM reached 0.893 with LOOCV, and balanced accuracy for Bagging increased to 0.895. Stratified k-fold provided enhanced precision and F1-score for SVM and RF. Notably, processing times varied significantly: k-fold was the most efficient, with SVM taking 21.480 seconds, while repeated k-fold showed higher computational demands, with RF taking approximately 1986.570 seconds, as shown in Table 4. This analysis underscores that while k-fold and repeated k-fold are generally efficient, LOOCV and balanced-data approaches offer enhanced accuracy for specific models but require greater computational resources. The choice of cross-validation technique should thus be tailored to the dataset characteristics and computational constraints to ensure sound model evaluation.
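The relationship between the three resampling schemes compared in this article is mechanical: LOOCV is k-fold with k equal to the sample size, and repeated k-fold reruns the partition with a fresh shuffle. A minimal sketch of the index generation (illustrative, not the paper's implementation):

```python
import random

def k_fold_indices(n, k, seed=None):
    # Shuffle indices and split them into k (nearly) equal folds;
    # each fold serves once as the held-out test set.
    idx = list(range(n))
    if seed is not None:
        random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i, test in enumerate(folds):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

def loocv_indices(n):
    # LOOCV is the special case k = n: each observation is held out once,
    # which is why its cost grows with the sample size.
    return k_fold_indices(n, n)

def repeated_k_fold_indices(n, k, repeats):
    # Repeat the k-fold split with a fresh shuffle each round,
    # averaging out the variance of any single partition.
    for r in range(repeats):
        yield from k_fold_indices(n, k, seed=r)

splits = list(k_fold_indices(20, 5, seed=0))
```

The fold counts make the computational trade-off in the abstract concrete: 5-fold fits 5 models, repeated 5-fold with 3 repeats fits 15, and LOOCV on n observations fits n.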
-
Research Article
Application of Deterministic Multiple Linear Regression and Stochastic Analysis Through Monte Carlo Simulation to Model Loan Consumption Assuming Kenyan Data
Conlet Biketi Kikechi*,
Dau Malek Dau
Issue:
Volume 13, Issue 5, October 2024
Pages:
138-156
Received:
28 August 2024
Accepted:
24 September 2024
Published:
18 October 2024
Abstract: The main focus of this survey is to examine the key determinants of loan consumption in Kenya, with a keen focus on variables such as monthly expenditure per adult, deposits, economic strength index, and diverse economic opportunities. The study's target population is Kenyan citizens aged 18 years and above, with data obtained from the Kenya national census, aggregated at the county level. The literature review presents sufficient support for the study's hypothesis; in particular, the authors highlight Kenya as having one of the highest rates of loan consumption in Africa, which serves as motivation for conducting this survey. The methodology involves quantitative analysis using descriptive statistics such as tables, graphs and charts, deterministic multiple linear regression, and stochastic analysis through Monte Carlo simulation. The results show that monthly expenditure per adult, deposits, and economic strength index have a significant impact on loan consumption amount, with the fitted model attaining an R-squared of 0.88. The significance of this survey lies in its contribution to the understanding of loan consumption behaviour in Kenya, presenting relevant insights for policymakers, financial institutions, and other stakeholders in the lending and borrowing processes. The outcome has the potential to inform loan structure policies that promote responsible and sustainable loan consumption.
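The combination of a deterministic regression equation with Monte Carlo simulation described above can be sketched as follows. The coefficients and input distributions below are purely illustrative assumptions, not the paper's fitted values: a fixed linear equation is evaluated on randomly drawn predictor values plus a stochastic error term, producing a simulated distribution of loan consumption.

```python
import random
import statistics

def simulate_loan_consumption(n_sims, seed=0):
    # Hypothetical deterministic regression equation (illustrative
    # coefficients, NOT the study's estimates):
    # loans = 500 + 1.2*expenditure + 0.8*deposits + 300*strength + error
    rng = random.Random(seed)
    draws = []
    for _ in range(n_sims):
        expenditure = rng.gauss(4000, 600)   # monthly expenditure per adult
        deposits = rng.gauss(2500, 400)      # deposits
        strength = rng.uniform(0.3, 0.9)     # economic strength index
        error = rng.gauss(0, 150)            # stochastic error term
        draws.append(500 + 1.2 * expenditure + 0.8 * deposits
                     + 300 * strength + error)
    return draws

draws = simulate_loan_consumption(10000)
mean_loan = statistics.fmean(draws)
```

Summarizing the simulated draws (mean, percentiles, exceedance probabilities) is what turns the deterministic equation into a stochastic statement about loan consumption.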
-
Research Article
Modelling the Relationship Between Merchandise Trade Flows and Some Macroeconomic Variables in Ghana
Azebre Abu Ibrahim*,
Anuwoje Ida Logubayom Abonongo
Issue:
Volume 13, Issue 5, October 2024
Pages:
157-174
Received:
2 September 2024
Accepted:
25 September 2024
Published:
29 October 2024
Abstract: Macroeconomic variables serve as economic indicators that offer valuable insights into the overall health and stability of an economy. Changes in these variables can have significant impacts on a country's trade balance and overall economic performance. This study employed multivariate time series analysis to study the relationship between Merchandise Trade Flows (MTF), Monetary Policy Rate (MPR), Commercial Lending Rate (CLR), Nominal Growth Rate (NGR) and Consumer Price Index (CPI), with Money Supply (MoS) as an exogenous variable. The nature of the trend in each series was investigated. The results revealed that a quadratic trend model best fits MTF, MPR, CLR and NGR, while an exponential trend best fits CPI. Johansen's co-integration test with unrestricted trend revealed the existence of long-run equilibrium relationships between the variables, with three (3) co-integrating equations describing this long-run relationship. In terms of short-run relationships, the VEC(2) model revealed that CLR, NGR and MoS have positive and significant impacts on MTF; CLR, NGR and MoS have positive and significant impacts on MPR; NGR has a positive and significant impact on CLR; CPI and MoS have significant impacts on NGR; while NGR and MoS have significant impacts on CPI. Model diagnostics performed on the VEC(2) model showed that all the model parameters are structurally stable over time and the residuals of the individual models are free from serial correlation and conditional heteroscedasticity. Forecast error variance decomposition (FEVD) analysis showed that each variable primarily explains its own variance, with the influence of other variables increasing over time. Hence, adopting a broad perspective on macroeconomic variables can help policymakers anticipate and mitigate ripple effects across various economic sectors.
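The trend-selection step reported above, choosing between a quadratic and an exponential trend model per series, can be sketched by fitting both forms and comparing residual sums of squares. The data below are synthetic (a CPI-like exponential series, not the study's data); the exponential trend is fitted as a linear model on the log scale.

```python
import numpy as np

def fit_quadratic_trend(t, y):
    # Least-squares quadratic trend: y = a + b*t + c*t^2
    coef = np.polyfit(t, y, deg=2)
    return np.polyval(coef, t)

def fit_exponential_trend(t, y):
    # Exponential trend y = A * exp(b*t), fitted by OLS on log(y)
    b, log_a = np.polyfit(t, np.log(y), deg=1)
    return np.exp(log_a + b * t)

# Synthetic CPI-like series with an exponential trend (60 periods)
t = np.arange(1, 61, dtype=float)
cpi = 100.0 * np.exp(0.01 * t)

sse_quad = float(np.sum((cpi - fit_quadratic_trend(t, cpi)) ** 2))
sse_exp = float(np.sum((cpi - fit_exponential_trend(t, cpi)) ** 2))
```

As expected, the exponential trend wins on this series; applying the same comparison to each of MTF, MPR, CLR, NGR and CPI is the selection logic the abstract describes.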
-
Research Article
Applying Fractal Theory: Solving the Geometric Challenge of Price Change and Scaling in Economics
Issue:
Volume 13, Issue 5, October 2024
Pages:
175-180
Received:
24 September 2024
Accepted:
15 October 2024
Published:
31 October 2024
Abstract: Background: Price changes in economics present significant geometric challenges due to sharp discontinuities, which cannot be efficiently described by continuous processes such as Brownian motion. Traditional models often rely on linear assumptions, yet financial data frequently exhibit irregular, complex patterns. Fractal theory, a mathematical framework, offers a more accurate way to describe these fluctuations by revealing the underlying self-similar structures in price changes and scaling phenomena. This study explores the use of fractal geometry to gain deeper insights into market behavior. Objective: The objective is to demonstrate that an alternative model, constructed on geometric scaling assumptions, offers a more accurate description of price changes in competitive markets. Method: The study combined the scaling principle from fractal geometry with a stable Lévy model to formulate an integrated model. The logarithmic transformation of the model was applied over successive price changes to observe the behavior of market prices. Result: The scaling principle asserts that no specific time interval (such as a day or a week) holds inherent significance in competitive markets; instead, these time features are compensated or arbitraged away, supporting the idea that market behavior is self-similar across different time scales. Conclusion: The scaling principle provides a more reliable framework for modeling price changes and is recommended for consideration in economic analyses.
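The self-similarity claim above can be checked empirically: for an alpha-stable process, the distribution of k-step log-price changes scales like k^(1/alpha). The sketch below uses the Brownian special case (alpha = 2, so the scale grows like sqrt(k)) because Gaussian increments are easy to simulate; for the heavy-tailed stable Lévy case with alpha < 2 the same aggregation test applies with a robust scale estimate in place of the standard deviation.

```python
import random
import statistics

def scaling_ratio(n_steps=200000, lag=16, seed=1):
    # Simulate one-step log-price changes as Gaussian increments
    # (the alpha = 2 stable case), aggregate them into non-overlapping
    # lag-step changes, and compare the two scales. Self-similarity
    # predicts the ratio lag**(1/alpha) = sqrt(lag).
    rng = random.Random(seed)
    steps = [rng.gauss(0.0, 1.0) for _ in range(n_steps)]
    agg = [sum(steps[i:i + lag]) for i in range(0, n_steps, lag)]
    return statistics.stdev(agg) / statistics.stdev(steps)

ratio = scaling_ratio()  # predicted value: sqrt(16) = 4
```

The measured ratio matching sqrt(lag) is exactly the sense in which no time interval is privileged: rescaled, the 16-step changes look statistically like the 1-step changes.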