Article Details
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).