References
Atenas, J., & Havemann, L. (2014). Questions of quality in repositories of open educational resources: a literature review. Research in Learning Technology, 22.
Bosch, A., Zisserman, A., & Munoz, X. (2007, October). Image classification using random forests and ferns. In 2007 IEEE 11th International Conference on Computer Vision (ICCV) (pp. 1-8). IEEE.
Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5-32.
Chen, M., Mao, S., & Liu, Y. (2014). Big data: a survey. Mobile Networks and Applications, 19(2), 171-209.
Cutler, D. R., Edwards, T. C., Beard, K. H., Cutler, A., Hess, K. T., Gibson, J., & Lawler, J. J. (2007). Random forests for classification in ecology. Ecology, 88(11), 2783-2792.
Davenport, T. H., & Prusak, L. (1998). Working knowledge: How organizations manage what they know. Harvard Business Press.
Dias, P., & Sousa, A. P. (1997). Understanding navigation and disorientation in hypermedia learning environments. Journal of Educational Multimedia and Hypermedia, 6, 173-186.
Díaz-Uriarte, R., & De Andres, S. A. (2006). Gene selection and classification of microarray data using random forest. BMC Bioinformatics, 7(1), 3.
García-Peñalvo, F. J., García de Figuerola, C., & Merlo, J. A. (2010). Open knowledge: Challenges and facts. Online Information Review, 34(4), 520-539.
Gardner, J. W., Craven, M., Dow, C., & Hines, E. L. (1998). The prediction of bacteria type and culture growth phase by an electronic nose with a multi-layer perceptron network. Measurement Science and Technology, 9(1), 120.
John, G. H., & Langley, P. (1995, August). Estimating continuous distributions in Bayesian classifiers. In Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence (pp. 338-345). Morgan Kaufmann Publishers Inc.
Su, J., Shirab, J. S., & Matwin, S. (2011). Large scale text classification using semi-supervised multinomial Naive Bayes. In Proceedings of the 28th International Conference on Machine Learning (ICML-11) (pp. 97-104).
Izenman, A. J. (2013). Linear discriminant analysis. In Modern multivariate statistical techniques (pp. 237-280). Springer New York.
Lawrence, R. L., & Wright, A. (2001). Rule-based classification systems using classification and regression tree (CART) analysis. Photogrammetric Engineering and Remote Sensing, 67(10), 1137-1142.
Liu, C., & Wechsler, H. (2002). Gabor feature based classification using the enhanced Fisher linear discriminant model for face recognition. IEEE Transactions on Image Processing, 11(4), 467-476.
Lotte, F., Congedo, M., Lécuyer, A., Lamarche, F., & Arnaldi, B. (2007). A review of classification algorithms for EEG-based brain–computer interfaces. Journal of Neural Engineering, 4(2), R1.
Manek, A. S., Shenoy, P. D., Mohan, M. C., & Venugopal, K. R. (2017). Aspect term extraction for sentiment analysis in large movie reviews using Gini Index feature selection method and SVM classifier. World Wide Web, 20(2), 135-154.
Maroco, J., Silva, D., Rodrigues, A., Guerreiro, M., Santana, I., & de Mendonça, A. (2011). Data mining methods in the prediction of dementia: A real-data comparison of the accuracy, sensitivity and specificity of linear discriminant analysis, logistic regression, neural networks, support vector machines, classification trees and random forests. BMC Research Notes, 4(1), 299.
Morgan, N., & Bourlard, H. (1990, April). Continuous speech recognition using multilayer perceptrons with hidden Markov models. In 1990 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-90) (pp. 413-416). IEEE.
Nijhuis, J. A. G., Ter Brugge, M. H., Helmholt, K. A., Pluim, J. P. W., Spaanenburg, L., Venema, R. S., & Westenberg, M. A. (1995, November). Car license plate recognition with neural networks and fuzzy logic. In Proceedings of the 1995 IEEE International Conference on Neural Networks (Vol. 5, pp. 2232-2236). IEEE.
Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1(1), 81-106.
Rokach, L., & Maimon, O. (2014). Data mining with decision trees: Theory and applications. World Scientific.
Samant, A., & Adeli, H. (2000). Feature extraction for traffic incident detection using wavelet transform and linear discriminant analysis. Computer‐Aided Civil and Infrastructure Engineering, 15(4), 241-250.
Sebastiani, F. (2002). Machine learning in automated text categorization. ACM Computing Surveys (CSUR), 34(1), 1-47.
Romero, C., López, M. I., Luna, J. M., & Ventura, S. (2013). Predicting students' final performance from participation in on-line discussion forums. Computers & Education, 68, 458-472.
Shelton, B. E., Duffin, J., Wang, Y., & Ball, J. (2010). Linking open course wares and open education resources: creating an effective search and recommendation system. Procedia Computer Science, 1(2), 2865-2870.
Statnikov, A., Wang, L., & Aliferis, C. F. (2008). A comprehensive comparison of random forests and support vector machines for microarray-based cancer classification. BMC Bioinformatics, 9(1), 319.
Xanthopoulos, P., Pardalos, P. M., & Trafalis, T. B. (2013). Linear discriminant analysis. In Robust Data Mining (pp. 27-33). Springer New York.
Ikonomakis, M., Kotsiantis, S., & Tampakas, V. (2005). Text classification using machine learning techniques. WSEAS Transactions on Computers, 4(8), 966-974.
Thaoroijam, K. (2014). A Study on Document Classification using Machine Learning Techniques. International Journal of Computer Science Issues (IJCSI), 11(2), 217.
Baldwin, R. A. (2009). Use of maximum entropy modeling in wildlife research. Entropy, 11(4), 854-866.
Yao, Y., Welp, T., Liu, Q., Niu, N., Wang, X., Britto, C. J., ... & Montgomery, R. R. (2017). Multiparameter single cell profiling of airway inflammatory cells. Cytometry Part B: Clinical Cytometry.
Grzymala-Busse, J. W., & Hu, M. (2000, October). A comparison of several approaches to missing attribute values in data mining. In International Conference on Rough Sets and Current Trends in Computing (pp. 378-385). Springer, Berlin, Heidelberg.
Bottou, L. (2010). Large-scale machine learning with stochastic gradient descent. In Proceedings of COMPSTAT'2010 (pp. 177-186). Physica-Verlag HD.
Henson, J. M., Reise, S. P., & Kim, K. H. (2007). Detecting mixtures from structural model differences using latent variable mixture modeling: A comparison of relative model fit statistics. Structural Equation Modeling: A Multidisciplinary Journal, 14(2), 202-226.
Roweis, S. T., & Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500), 2323-2326.
Zhen, X., Zheng, F., Shao, L., Cao, X., & Xu, D. (2017). Supervised Local Descriptor Learning for Human Action Recognition. IEEE Transactions on Multimedia.
Li, Z., Liu, J., Tang, J., & Lu, H. (2015). Robust structured subspace learning for data representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 37(10), 2085-2098.
Belkin, M., & Niyogi, P. (2003). Laplacian eigenmaps for dimensionality reduction and data representation. Neural computation, 15(6), 1373-1396.
Wang, Q., Wu, Y., Shen, Y., Liu, Y., & Lei, Y. (2015). Supervised sparse manifold regression for head pose estimation in 3D space. Signal Processing, 112, 34-42.