Hyperparameter Optimization Using Grid Search and Random Search to Improve the Performance of Prediction Models with Decision Trees
DOI: https://doi.org/10.59653/jimat.v3i03.2025

Keywords: hyperparameter tuning, grid search, random search, decision tree, breast cancer dataset

Abstract
Hyperparameter selection is an important factor in improving model performance in data science. This study compares two hyperparameter optimization methods, Grid Search and Random Search, applied to the Decision Tree Classifier algorithm using the Breast Cancer Wisconsin (Diagnostic) Dataset from the UCI Machine Learning Repository. The dataset contains 569 samples with 30 numerical features describing the characteristics of breast cancer cells, such as mean radius, texture, perimeter, area, and smoothness, classified into two classes: malignant and benign. The study follows the CRISP-DM approach, covering the stages of business understanding, data understanding, data preparation, modeling, and evaluation. In the modeling stage, three scenarios were tested: a Decision Tree model without tuning, a model optimized with Grid Search, and a model optimized with Random Search. Performance was evaluated using accuracy, precision, recall, and F1-score. The results show that hyperparameter optimization has a significant effect on model performance. The Decision Tree model without tuning produced an accuracy of 92.98%, the model with Grid Search achieved an accuracy of 95.61%, and the model with Random Search achieved the highest accuracy of 97.37%. It can therefore be concluded that Random Search found the best parameter combination in this experiment, while also requiring less computation time than Grid Search.
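The three scenarios described in the abstract can be sketched with scikit-learn, which ships the same Breast Cancer Wisconsin (Diagnostic) dataset. The abstract does not specify the search space, train/test split, or cross-validation settings, so the parameter grid, `test_size`, `cv`, `n_iter`, and `random_state` below are illustrative assumptions, not the authors' configuration.

```python
# Sketch of the paper's three scenarios: no tuning, Grid Search, Random Search.
# The search space and split settings are assumptions; the abstract does not give them.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# 569 samples, 30 numerical features, two classes (malignant/benign)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Assumed search space (4 * 3 * 2 = 24 candidate combinations)
param_grid = {
    "max_depth": [3, 5, 7, None],
    "min_samples_split": [2, 5, 10],
    "criterion": ["gini", "entropy"],
}

results = {}

# Scenario 1: baseline Decision Tree without tuning
base = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)
results["baseline"] = accuracy_score(y_test, base.predict(X_test))

# Scenario 2: Grid Search exhaustively evaluates every combination in the grid
grid = GridSearchCV(DecisionTreeClassifier(random_state=42), param_grid, cv=5)
grid.fit(X_train, y_train)
results["grid_search"] = accuracy_score(y_test, grid.predict(X_test))

# Scenario 3: Random Search samples only n_iter combinations from the same space,
# which is why it typically needs less computation time than Grid Search
rand = RandomizedSearchCV(
    DecisionTreeClassifier(random_state=42), param_grid,
    n_iter=10, cv=5, random_state=42,
)
rand.fit(X_train, y_train)
results["random_search"] = accuracy_score(y_test, rand.predict(X_test))

for name, acc in results.items():
    print(f"{name}: {acc:.4f}")
```

Because Random Search evaluates only a sample of the grid (here 10 of 24 combinations), its runtime advantage grows with the size of the search space, at the cost of possibly missing the single best combination.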
References
Anggreani, D. (2024). Grid Search Hyperparameter Analysis in Optimizing The Decision Tree Method for Diabetes Prediction. Indonesian Journal of Data and Science, 5(3), 190–197.
Anugerah Simanjuntak, Rosni Lumbantoruan, Kartika Sianipar, Rut Gultom, Mario Simaremare, Samuel Situmeang, & Erwin Panggabean. (2024). Research and Analysis of IndoBERT Hyperparameter Tuning in Fake News Detection. Jurnal Nasional Teknik Elektro Dan Teknologi Informasi, 13(1), 60–67. https://doi.org/10.22146/jnteti.v13i1.8532
Arifin, M., & Adiyono, S. (2024). Hyperparameter Tuning in Machine Learning to Predicting Student Academic Achievement. International Journal of Artificial Intelligence Research, 8(1), 1–8.
Cielen, D., Meysman, A. D. B., & Ali, M. (2016). Introducing Data Science.
Cielen, D., Meysman, A. D. B., & Ali, M. (2018). Introducing Data Science: Big Data, Machine Learning, and more, using Python tools 1st Edition. Manning Publications.
Dalal, S., Onyema, E. M., & Malik, A. (2022). Hybrid XGBoost model with hyperparameter tuning for prediction of liver disease with better accuracy. World Journal of Gastroenterology, 28(46), 6551–6563. https://doi.org/10.3748/wjg.v28.i46.6551
Elgeldawi, E., Sayed, A., Galal, A. R., & Zaki, A. M. (2021). Hyperparameter Tuning for Machine Learning Algorithms Used for Arabic Sentiment Analysis. Informatics Article, 1–21.
Fajri, M., & Primajaya, A. (2023). Komparasi Teknik Hyperparameter Optimization pada SVM untuk Permasalahan Klasifikasi dengan Menggunakan Grid Search dan Random Search. Journal of Applied Informatics and Computing, 7(1), 14–19. https://doi.org/10.30871/jaic.v7i1.5004
Firgiawan, W., Yustianisa, D., & Nur, N. A. (2025). Hyperparameter Tuning for Optimizing Stunting Classification with KNN, SVM, and Naïve Bayes Algorithms. Jurnal TEKNO KOMPAK, 19(1), 92–104.
Fordana, M. D. Y., & Rochmawati, N. (2022). Optimisasi Hyperparameter CNN Menggunakan Random Search Untuk Deteksi COVID-19 Dari Citra X-Ray Dada. Journal of Informatics and Computer Science (JINACS), 4(01), 10–18. https://doi.org/10.26740/jinacs.v4n01.p10-18
G, S. G. C. (2020). Grid Search Tuning of Hyperparameters in Random Forest Classifier for Customer Feedback Sentiment Prediction. 11(9), 173–178.
Géron, A. (2019). Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow Concepts, Tools, and Techniques to Build Intelligent Systems. O’Reilly Media, Inc.
Gupta, S. C., & Goel, N. (2023). Predictive Modeling and Analytics for Diabetes using Hyperparameter tuned Machine Learning Techniques. Procedia Computer Science, 218, 1257–1269. https://doi.org/10.1016/j.procs.2023.01.104
Hendradinata, N., Gede, I., & Astawa, S. (2022). Hyperparameter Tuning Algoritma KNN Untuk Klasifikasi Kanker Payudara Dengan Grid Search CV. JNATIA, 1(November), 397–402.
Khatib, J., & Dalam, S. (2023). Indonesian Journal of Computer Science. Indonesian Journal of Computer Science, 12(1), 1351–1365.
Lindawati, L., Fadhli, M., & Wardana, A. S. (2023). Optimasi Gaussian Naïve Bayes dengan Hyperparameter Tuning dan Univariate Feature Selection dalam Prediksi Cuaca. Edumatic: Jurnal Pendidikan Informatika, 7(2), 237–246. https://doi.org/10.29408/edumatic.v7i2.21179
Massahiro, A., Instituto, S., Tecnológicas, D. P., Paulo, D. S., Univer-, F., Paulo, S., Cordeiro, R., & Paulo, S. (2023). The evolution of CRISP-DM for Data Science: Methods, Processes and Frameworks. SBC Reviews on Computer Science, 4(1). https://doi.org/10.5753/reviews.2024.3757
Nugraha, W., & Sasongko, A. (2022). Hyperparameter Tuning pada Algoritma Klasifikasi dengan Grid Search. SISTEMASI: Jurnal Sistem Informasi, 11, 391–401.
Prabu, S., Thiyaneswaran, B., Sujatha, M., Nalini, C., & Rajkumar, S. (2022). Grid Search for Predicting Coronary Heart Disease by Tuning. Computer Systems Science & Engineering, 43(2). https://doi.org/10.32604/csse.2022.022739
Pramudhyta, N. A., & Rohman, M. S. (2024). Perbandingan Optimasi Metode Grid Search dan Random Search dalam Algoritma XGBoost untuk Klasifikasi Stunting. Jurnal Media Informatika Budidarma, 8(1), 19. https://doi.org/10.30865/mib.v8i1.6965
Rizky, M. H., Faisal, M. R., Budiman, I., & Kartini, D. (2024). Effect of Hyperparameter Tuning Using Random Search on Tree-Based Classification Algorithm for Software Defect Prediction. 18(1), 95–106.
Saputra, A. G. (2024). Hyperparameter Tuning Decision Tree and Recursive Feature Elimination Technique for Improved Chronic Kidney Disease Classification. Scientific Journal of Informatics, 11(3), 821–830. https://doi.org/10.15294/sji.v11i3.12990
Science, A. C. (2023). DATA ENGINEERING IN CRISP-DM PROCESS PRODUCTION DATA – CASE STUDY. Applied Computer Science, 19(3), 83–95. https://doi.org/10.35784/acs-2023-26
Shaik, S., & Sreeja, D. (2025). Ensemble Machine Learning-based Heart Disease Prediction with Hyper-tuning Parameters. Asian Journal of Research in Computer Science, 18(5). https://doi.org/10.9734/ajrcos/2025/v18i5642
License
Copyright (c) 2025 Muhammad Sholeh, Uning Lestari, Dina Andayati

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).