Master's Thesis 100423050: Detailed Record




Author: Wei-Cheng Lo (羅偉成)    Department: Department of Information Management
Thesis title: A Study on Multi-class Classification Using Complex Neuro-Fuzzy System
(Chinese title: 複數模糊類神經系統於多類別分類問題之研究)
Related theses
★ A Study on Variable Selection in Intelligent Systems and Applications
★ Parameter Estimation in Intelligent Systems: A New DE Method
★ A Study on Ensemble-Learning Intelligent Systems for Classification Problems
★ Complex Fuzzy Cognitive Maps with Metacognitive Strategies for Classification Problems
★ A Study on Classification Problems Using Complex Neuro-Fuzzy Systems
★ Intelligent ARIMA Models for Time Series Forecasting
★ Computational Intelligence and Complex Fuzzy Sets for Adaptive Image Processing
★ An Intelligent Neuro-Fuzzy Computing Model Using Complex Fuzzy Sets and the ARIMA Model
★ Empirical Study on IEEE 802.11 Wireless Signal – A Case Study at the NCU Campus
★ A Study on Self-Constructing Complex Fuzzy ARIMA for Index Volatility Forecasting
★ A Study on Data Preprocessing: A Genetic Algorithm Example
★ Support-Vector-Oriented Instance Selection for Text Classification
★ A Study on Intelligent Interval Prediction Using Complex Neuro-Fuzzy Systems, Support Vector Regression, and Bootstrap Statistics
★ Complex Neuro-Fuzzy Networks for Multi-Objective Financial Forecasting
★ Complex Neuro-Fuzzy Systems with Continuous Multi-Ant-Colony Evolution for Time Series Forecasting
★ Multi-Swarm Genetic Evolution and Complex Neuro-Fuzzy Systems for Multi-Objective Data Prediction
Files: full text permanently restricted (not available for browsing)
Abstract (Chinese) This study proposes a classifier architecture, CNFS-OAA: a modeling procedure based on a complex neuro-fuzzy system (CNFS) that uses the one-against-all (OAA) method to decompose a dataset into multiple binary-class subsets and handles classification by mining fuzzy rules dynamically. In the CNFS modeling process, standard particle swarm optimization (SPSO) adjusts the premise parameters and the recursive least squares estimator (RLSE) adjusts the consequent parameters. The number of rules grows dynamically according to the classification accuracy in the training stage: when the training accuracy has not reached the threshold, more rules are mined, and data that can already be classified correctly are removed from the training set. To improve modeling efficiency, this study applies F-score feature selection to reduce the dimensionality of the dataset, saving computational cost while maintaining or even improving accuracy. Finally, eleven real-world datasets from the UCI machine learning repository are used to examine the proposed method and to compare it with classification algorithms proposed by other researchers. The experimental results show that the proposed method achieves good classification accuracy.
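The F-score feature ranking mentioned in the abstract can be sketched as follows. This is a minimal illustration assuming the common two-class F-score definition (squared distances of the class means from the overall mean, divided by the sum of within-class sample variances); the function name and toy data are not from the thesis:

```python
import numpy as np

def f_score(X, y):
    """Per-feature F-score for a binary-labeled dataset.

    F(i) = ((mean_pos_i - mean_i)^2 + (mean_neg_i - mean_i)^2)
           / (var_pos_i + var_neg_i)
    where the variances are the sample variances (ddof=1) of each class.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    pos, neg = X[y == 1], X[y == 0]
    mean_all = X.mean(axis=0)
    num = (pos.mean(axis=0) - mean_all) ** 2 + (neg.mean(axis=0) - mean_all) ** 2
    den = pos.var(axis=0, ddof=1) + neg.var(axis=0, ddof=1)
    return num / den

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[1.0, 0.3], [1.1, 0.9], [0.9, 0.5], [5.0, 0.4], [5.2, 0.8], [4.8, 0.6]]
y = [0, 0, 0, 1, 1, 1]
scores = f_score(X, y)
# Feature 0 scores far higher than feature 1.
```

Features can then be ranked by score and only the top-scoring ones kept, which is how the dimensionality reduction described above saves computational cost.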
Abstract (English) This study presents a classifier called CNFS-OAA, whose modeling procedure is based on a complex neuro-fuzzy system (CNFS). As training proceeds, the training dataset is gradually divided into multiple binary-class subsets using the one-against-all (OAA) method, and the fuzzy rules of the CNFS are mined dynamically. In the CNFS modeling procedure, standard particle swarm optimization (SPSO) adjusts the premise parameters and the recursive least squares estimator (RLSE) adapts the consequent parameters. Rule mining increments the number of fuzzy IF-THEN rules dynamically according to the classification accuracy during training: when the training accuracy does not reach the threshold, more rules are mined, and the tuples already classified correctly are removed from the training dataset. To improve modeling performance, F-score feature selection chooses useful features and thereby reduces the feature dimensionality of the dataset; in this way, computational cost is saved while accuracy is kept or even improved. Eleven datasets from the UCI machine learning repository are used to evaluate the proposed approach, and its results are compared with those of other noted approaches. The experimental results show that the proposed approach performs well on classification.
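The one-against-all decomposition described in the abstract can be sketched as follows; the function name and toy data are illustrative, not taken from the thesis:

```python
import numpy as np

def one_against_all(X, y):
    """Split a K-class dataset into K binary-labeled copies.

    For each class c, samples of class c are labeled 1 and all others 0,
    so that one binary classifier can be trained per class.
    """
    X = np.asarray(X)
    y = np.asarray(y)
    return {c: (X, (y == c).astype(int)) for c in np.unique(y)}

X = [[0.1], [0.2], [0.8], [0.9], [1.5]]
y = [0, 0, 1, 1, 2]
subsets = one_against_all(X, y)
# Three binary-labeled subsets, one per class.
```

Each binary subset can then drive the training of one binary classifier for its class (one CNFS per class, in the setup the abstract describes).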
Keywords (Chinese) ★ multi-class classification problems
★ complex neuro-fuzzy system
★ one-against-all
★ hybrid learning
★ F-score
★ feature selection
★ standard particle swarm optimization
★ recursive least squares estimator
Keywords (English) ★ multi-class classification
★ complex neuro-fuzzy system (CNFS)
★ one-against-all
★ hybrid learning
★ F-score
★ feature selection
★ standard particle swarm optimization
★ recursive least squares estimator
Table of Contents
Abstract (Chinese)
Abstract (English)
Acknowledgements
Table of Contents
List of Figures
List of Tables
List of Symbols
Chapter 1 Introduction
1.1 Background, Motivation, and Objectives
1.2 Research Method
1.3 Thesis Organization
Chapter 2 Literature Review
2.1 Complex Fuzzy Sets
2.2 Complex Neuro-Fuzzy Systems
2.3 F-score
2.4 Parameter Learning (I): Standard Particle Swarm Optimization
2.5 Parameter Learning (II): Recursive Least Squares Estimator
2.6 Data Preprocessing: One-Against-All
2.7 Dynamic Rule Mining
Chapter 3 System Architecture
3.1 The CNFS-OAA Model
3.2 Data Preprocessing
3.3 Rule Mining
3.4 Data Correction
3.5 Classifier Testing
Chapter 4 Experiments
4.1 Experiment 1: Breast Cancer Wisconsin (Original) Dataset
4.2 Experiment 2: Congressional Voting Records Dataset
4.3 Experiment 3: Fertility Dataset
4.4 Experiment 4: Echocardiogram Dataset
4.5 Experiment 5: Parkinsons Dataset
4.6 Experiment 6: Iris Dataset
4.7 Experiment 7: Lung Cancer Dataset
4.8 Experiment 8: Wine Dataset
4.9 Experiment 9: Hayes-Roth Dataset
4.10 Experiment 10: Breast Tissue Dataset
4.11 Experiment 11: Zoo Dataset
4.12 Summary of Experimental Results
Chapter 5 Discussion and Conclusion
5.1 Discussion
5.1.1 Other Modeling Approaches
5.1.2 Research Limitations
5.2 Conclusion
Chapter 6 Future Work
References
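The outline's Section 2.5 covers the recursive least squares estimator used for the consequent parameters. A minimal numpy sketch of the standard RLS update, assuming a large initial covariance matrix (the variable names and toy data are illustrative, not from the thesis):

```python
import numpy as np

def rlse(B, y, alpha=1e6):
    """Recursive least squares estimation of theta in y ≈ B @ theta.

    Processes one row of B at a time; with a large initial covariance
    alpha * I, the result converges to the batch least-squares solution.
    """
    n = B.shape[1]
    theta = np.zeros(n)
    P = alpha * np.eye(n)
    for b, t in zip(B, y):
        Pb = P @ b
        # Covariance update (matrix inversion lemma for a rank-1 update).
        P = P - np.outer(Pb, Pb) / (1.0 + b @ Pb)
        # Parameter update driven by the prediction error on this sample.
        theta = theta + P @ b * (t - b @ theta)
    return theta

# Toy linear model y = 2*x + 1, written with a bias column.
x = np.array([0.0, 1.0, 2.0, 3.0])
B = np.column_stack([x, np.ones_like(x)])
y = 2.0 * x + 1.0
theta = rlse(B, y)
# theta is approximately [2, 1].
```

Because each update touches only one sample, the consequent parameters can be refined incrementally as training proceeds, which is the usual reason to prefer RLSE over re-solving the batch least-squares problem at every step.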
References
[1] R. Karchin, K. Karplus, and D. Haussler, “Classifying G-protein coupled receptors with
support vector machines,” Bioinformatics, vol. 18, no. 1, pp. 147-159, Jan. 2002.
[2] C. S. Leslie, E. Eskin, A. Cohen, J. Weston, and W. S. Noble, “Mismatch string kernels for
discriminative protein classification,” Bioinformatics, vol. 20, no. 4, pp. 467-476, Jan. 2004.
[3] K. Tsuda, H. Shin, and B. Scholkopf, “Fast protein classification with multiple networks,”
Bioinformatics, vol. 21, no. 2, pp. ii59-ii65, Sep. 2005.
[4] C. P. Lee, W. S. Lin, Y. M. Chen, and B. J. Kuo, “Gene selection and sample classification on
microarray data based on adaptive genetic algorithm/k-nearest neighbor method,” Expert
Systems with Applications, vol. 38, no. 5, pp. 4661-4667, May 2011.
[5] K. Moorthy, M. S. Bin Mohamad, and S. Deris, “Multiple gene sets for cancer classification
using gene range selection based on random forest,” Lecture Notes in Computer Science, vol.
7802, pp. 385-393, Mar. 2013.
[6] G. Shuster et al., “Classification of breast cancer precursors through exhaled breath,”
Breast Cancer Research and Treatment, vol. 126, no. 3, pp. 791-796, 2011.
[7] J. Zheng and B. L. Lu, “A support vector machine classifier with automatic confidence
and its application to gender classification,” Neurocomputing, vol. 74, pp. 1926–1935, May
2011.
[8] V. K. Anagnostou et al., “Molecular classification of nonsmall cell lung cancer using a
4-protein quantitative assay,” Cancer, vol. 118, no. 6, pp. 1607–1618, Mar. 2012.
[9] C. Huang, D. Yang, and Y. Chuang, “Application of wrapper approach and composite
classifier to the stock trend prediction,” Expert Systems with Applications, vol. 34, no. 4, pp.
2870-2878, May 2008.
[10] K. Assaleh, H. El-Baz, and S. Al-Salkhadi, “Predicting stock prices using polynomial
classifiers: the case of Dubai financial market,” Journal of Intelligent Learning Systems and
Applications, vol. 3, no. 2A, pp. 82-89, May 2011.
[11] Y. Son, D. J. Noh, and J. Lee, “Forecasting trends of high-frequency KOSPI200 index data
using learning classifiers,” Expert Systems with Applications, vol. 39, no. 14, pp. 11607–
11615, Oct. 2012.
[12] C. Cortes and V. Vapnik, “Support-vector networks,” Machine Learning, vol. 20, no.
3, pp. 273-297, Sep. 1995.
[13] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag New York, Inc.,
1995.
[14] J. R. Quinlan, “Induction of Decision Trees,” Machine Learning, vol. 1, no. 1, pp. 81-106,
1986.
[15] I. Rish, “An empirical study of the naive Bayes classifier,” IJCAI 2001 Workshop on
Empirical Methods in Artificial Intelligence, vol. 3, no. 22, pp. 41-46, 2001.
[16] T. Cover and P. Hart, “Nearest neighbor pattern classification,” IEEE Transactions on
Information Theory, vol. 13, no. 1, pp. 21-27, Jan. 1967.
[17] L. Jiang, Z. Cai, D. Wang, and S. Jiang, “Survey of improving k-nearest-neighbor for
classification,” Proceedings of the 4th International Conference on Fuzzy Systems and
Knowledge Discovery, vol. 1, pp. 679-683, Aug. 2007.
[18] L. Rokach, “Ensemble-based classifiers,” Artificial Intelligence Review, vol. 33, no. 1-2,
pp. 1-39, Feb. 2010.
[19] M.-L. Huang, H.-Y. Chen, and J.-J. Huang, “Glaucoma detection using adaptive neuro-fuzzy
inference system,” Expert Systems with Applications, vol. 32, no. 2, pp. 458-468, Feb. 2007.
[20] A. Das and M. Bhattacharya, “GA based neuro fuzzy techniques for breast cancer
identification,” Proceedings of the 2008 International Machine Vision and Image Processing
Conference, pp. 136-141, Sep. 2008.
[21] A. Das and M. Bhattacharya, “A study on prognosis of brain tumors using fuzzy logic and
genetic algorithm based techniques,” Proceedings of the 2009 International Joint Conference
on Bioinformatics, Systems Biology and Intelligent Computing, pp. 348-351, Aug. 2009.
[22] L. A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, no. 3, pp. 338-353, Jun. 1965.
[23] D. Ramot, R. Milo, M. Friedman, and A. Kandel, “Complex fuzzy sets,” IEEE Transactions
on Fuzzy Systems, vol. 10, no. 2, pp. 171-186, Apr. 2002.
[24] D. Ramot, M. Friedman, G. Langholz, and A. Kandel, “Complex fuzzy logic,” IEEE
Transactions on Fuzzy Systems, vol. 11, no. 4, pp. 450-461, Aug. 2003.
[25] C. Li and T. Chiang, “Function approximation with complex neuro-fuzzy system using
complex fuzzy sets – a new approach,” New Generation Computing, vol. 29, no. 3, pp.
261-276, Jul. 2011.
[26] C. Li and T. W. Chiang, “Complex neuro-fuzzy ARIMA forecasting — a new approach
using complex fuzzy sets,” IEEE Transactions on Fuzzy Systems, vol. 21, no. 3, pp. 567-584,
Jun. 2013.
[27] C. Li and T. W. Chiang, “Complex fuzzy computing to time series prediction—a
multi-swarm PSO learning approach,” Lecture Notes in Computer Science, vol. 6592, pp.
242-251, Apr. 2011.
[28] C. Li and T. W. Chiang, “Complex fuzzy model with PSO-RLSE hybrid learning approach
to function approximation,” International Journal of Intelligent Information and Database
Systems, vol. 5, no. 4, pp. 409-430, Jul. 2011.
[29] C. Li and F. Chan, “Complex-Fuzzy Adaptive Image Restoration – An
Artificial-Bee-Colony-Based Learning Approach,” Lecture Notes in Computer Science, vol.
6592, pp. 90-99, Apr. 2011.
[30] G. Ou and Y.L. Murphey, “Multi-class pattern classification using neural networks,” Pattern
Recognition, vol. 40, no. 1, pp. 4-18, Jan. 2007.
[31] T. C. Chen and H. L. Tsao, “Using a hybrid meta-evolutionary rule mining approach as
a classification response model,” Expert Systems with Applications, vol. 36, no. 2, pt. 1,
pp. 1999-2007, Mar. 2009.
[32] Y. W. Chen, and C. J. Lin, “Combining SVMs with Various Feature Selection Strategies,”
Studies in Fuzziness and Soft Computing, vol. 207, pp. 315-324, 2006.
[33] M. Clerc, A method to improve Standard PSO, 2009. [Online]. Available:
http://clerc.maurice.free.fr/pso/Design_efficient_PSO.pdf [Accessed: Jul. 28, 2012].
[34] M. Clerc, Standard Particle Swarm Optimisation, Sep. 2012. [Online]. Available:
http://clerc.maurice.free.fr/pso/SPSO_descriptions.pdf [Accessed: Mar. 26, 2013].
[35] J. S. R. Jang, C. T. Sun, and E. Mizutani, “Least-squares methods for system identification,”
Neuro-Fuzzy and Soft Computing, NJ: Prentice Hall, 1997, pp. 95-125.
[36] S. Dick, “Toward complex fuzzy logic,” IEEE Transactions on Fuzzy Systems, vol. 13, no. 3,
pp. 405-414, Jun. 2005.
[37] E. Mamdani, and S. Assilian, “An experiment in linguistic synthesis with a fuzzy logic
controller,” International Journal of Man-Machine Studies, vol. 7, no. 1, pp. 1-13, Jan. 1975.
[38] T. Takagi and M. Sugeno, “Fuzzy identification of systems and its applications to modeling
and control,” IEEE Transactions on Systems, Man and Cybernetics, vol. 15, no. 1, pp.
116-132, Jan. 1985.
[39] J. S. R. Jang, “ANFIS: Adaptive-network-based fuzzy inference system,” IEEE Transactions
on Systems, Man and Cybernetics, vol. 23, no. 3, pp. 665-685, May 1993.
[40] D. Nauck and R. Kruse, “A neuro-fuzzy method to learn fuzzy classification rules from data,”
Fuzzy Sets and Systems, vol. 89, no. 3, pp. 277-288, Aug. 1997.
[41] D. Nauck, A. Nurnberger, and R. Kruse, “Neuro-fuzzy classification,” Proceedings of the 6th
Conference of the International Federation of Classification Societies, pp. 287-294, Jul.
1998.
[42] D. Nauck and R. Kruse, “Obtaining interpretable fuzzy classification rules from medical
data,” Artificial Intelligence in Medicine, pp. 146-169, Jun. 1999.
[43] G. H. John, R. Kohavi, and K. Pfleger, “Irrelevant features and the subset selection
problem,” Proceedings of the 11th International Conference on Machine Learning, pp.
121-129, 1994.
[44] J. Kennedy and R. Eberhart, “Particle swarm optimization,” Proceedings of the IEEE
International Conference on Neural Networks, vol. 4, pp. 1942-1948, Nov. 1995.
[45] F. van den Bergh and A. P. Engelbrecht, “A cooperative approach to particle swarm
optimization,” IEEE Transactions on Evolutionary Computation, vol. 8, no. 3, pp. 225-239,
Jun. 2004.
[46] B. Liu, L. Wang, Y. H. Jin, F. Tang, and D. X. Huang, “Improved particle swarm
optimization combined with chaos,” Chaos, Solitons & Fractals, vol. 25, no. 5, pp.
1261-1271, Sep. 2005.
[47] D. Bratton and J. Kennedy, “Defining a standard for particle swarm optimization,” IEEE
Swarm Intelligence Symposium, pp. 120-127, Apr. 2007.
[48] M. Clerc, Back to random topology, Feb. 2007. [Online]. Available:
http://clerc.maurice.free.fr/pso/random_topology.pdf [Accessed: Mar. 26, 2013].
[49] W. H. Wolberg, Breast Cancer Wisconsin (Original) Data Set, Jul. 1992.
[Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+(Original) [Accessed:
May 14, 2013].
[50] J. Schlimmer, Congressional Voting Records Data Set, Apr. 1987. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Congressional+Voting+Records [Accessed: May 14,
2013].
[51] D. Gil and J. L. Girela, Fertility Data Set, Jan. 2013. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Fertility [Accessed: May 14, 2013].
[52] S. Salzberg, Echocardiogram Data Set, Feb. 1989. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Echocardiogram [Accessed: May 14, 2013].
[53] M. Little, Parkinsons Data Set, Jun. 2008. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Parkinsons [Accessed: May 14, 2013].
[54] R. A. Fisher, Iris Data Set, Jul. 1988. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Iris [Accessed: May 14, 2013].
[55] Z. Q. Hong and J. Y. Yang, Lung Cancer Data Set, May 1992. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Lung+Cancer [Accessed: May 14, 2013].
[56] M. Forina et al., Wine Data Set, Jul. 1991. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Wine [Accessed: May 14, 2013].
[57] B. Hayes-Roth and F. Hayes-Roth, Hayes-Roth Data Set, Mar. 1989. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Hayes-Roth [Accessed: May 14, 2013].
[58] J. P. Marques de Sá, Breast Tissue Data Set, May 2010. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Breast+Tissue [Accessed: May 14, 2013].
[59] R. Forsyth, Zoo Data Set, May 1990. [Online]. Available:
http://archive.ics.uci.edu/ml/datasets/Zoo [Accessed: May 14, 2013].
[60] J. Weston and C. Watkins, “Multi-class support vector machines,” Royal Holloway,
University of London, Department of Computer Science, Tech. Report CSD-TR-98-04,
1998.
[61] M. Ashraf, K. Le, and X. Juang, “Iterative weighted k-NN for constructing missing
feature values in Wisconsin breast cancer dataset,” 2011 3rd International Conference
on Data Mining and Intelligent Information Technology Applications, pp. 23-27, Oct.
2011.
[62] G. I. Salama, M. B. Abdelhalim, and M. A. Zeid, “Breast cancer diagnosis on three
different datasets using multi-classifiers,” International Journal of Computer and
Information Technology, vol. 1, no. 1, pp. 36-43, Sep. 2012.
[63] A. Marcano-Cedeño, J. Quintanilla-Domínguez, and D. Andina, “WBCD breast cancer
database classification applying artificial metaplasticity neural network,” Expert
Systems with Applications, vol. 38, no. 8, pp. 9573-9579, Aug. 2011.
[64] M. Karabatak, and M. Cevdet Ince, “An expert system for detection of breast cancer
based on association rules and neural network,” Expert Systems with Applications, vol.
36, no. 2, pp. 3465-3469, Mar. 2009.
[65] T. Kiyan and T. Yildirim, “Breast cancer diagnosis using statistical neural networks,”
IU-Journal of Electrical & Electronics Engineering, vol. 4, no. 2, pp. 1149-1153, Jun.
2004.
[66] K. M. Salama and A. A. Freitas, “ABC-Miner: An ant-based bayesian classification
algorithm,” Lecture Notes in Computer Science, vol. 7461, pp. 13-24, Sep. 2012.
[67] M. Grochowski and W. Duch, “Fast projection pursuit based on quality of projected
clusters,” Lecture Notes in Computer Science, vol. 6594, pp. 89-97, Apr. 2011.
[68] L. Li, “Perceptron learning with random coordinate descent,” Computer Science
Technical Report CaltechCSTR:2005.006, California Institute of Technology, Aug.
2005.
[69] D. Gil, J. L. Girela, J. D. Juan, M. J. Gomez-Torres, and M. Johnsson, “Predicting seminal
quality with artificial intelligence methods,” Expert Systems with Applications, vol. 39, no.
16, pp. 12564-12573, Nov. 2012.
[70] M. Sebban, R. Nock, S. Lallich, E. Brodley, and A. Danyluk, “Stopping Criterion for
Boosting-Based Data Reduction Techniques: from Binary to Multiclass Problems,” Journal of
Machine Learning Research, vol. 3, no. 4, pp. 863-885, 2002.
[71] G. Melli, A Lazy Model-Based Approach to On-Line Classification, Simon Fraser University,
Apr. 1998.
[72] Z. H. Zhou and X. Y. Liu, “Training cost-sensitive neural networks with methods addressing
the class imbalance problem,” IEEE Transactions on Knowledge and Data Engineering, vol.
18, no. 1, Jan. 2006.
[73] F. Divina and E. Machiori, “Handling continuous attributes in an evolutionary inductive
learner,” IEEE Transactions on Evolutionary Computation, vol. 9, no. 1, Feb. 2005.
[74] M. A. Little, P. E. McSharry, S. J. Roberts, D. AE Costello, and I. M. Moroz, “Exploiting
Nonlinear Recurrence and Fractal Scaling Properties for Voice Disorder Detection,”
BioMedical Engineering OnLine, vol. 6, no. 23, Jun. 2007.
[75] P. D. Acton and A. Newberg, “Artificial neural network classifier for the diagnosis of
Parkinson’s disease using [99mTc]TRODAT-1 and SPECT,” Physics in Medicine and Biology,
vol. 51, no. 12, Jun. 2006.
[76] M. Fallahnezhad, M. H. Moradi, and S. Zaferanlouei, “A hybrid higher order neural
classifier for handling classification problems,” Expert Systems with Applications, vol. 38, no.
1, pp. 386-393, Jan. 2011.
[77] S. J. Wang, A. Mathew, Y. Chen, L. F. Xi, L. Ma, and J. Lee, “Empirical analysis of support
vector machine ensemble classifiers,” Expert Systems with Applications, vol. 36, no. 3, pp.
6466-6476, Apr. 2009.
[78] R. Maclin and D. Opitz, “Popular ensemble methods: an empirical study,” Journal of
Artificial Intelligence Research, vol. 11, pp. 169-198, Aug. 1999.
[79] J. J. Rodriguez and L. I. Kuncheva, “Rotation forest: a new classifier ensemble method,”
IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 10, pp.
1619-1630, Oct. 2006.
[80] Z. Q. Hong and J. Y. Yang, “Optimal Discriminant Plane for a Small Number of Samples
and Design Method of Classifier on the Plane,” Pattern Recognition, vol. 24, no. 4, pp.
317-324, 1991.
[81] L. Yu and H. Liu, “Feature selection for high-dimensional data: A fast correlation-based
filter solution,” Proceedings of the 12th International Conference on Machine Learning,
vol. 2, no. 2, p. 856, 2003.
[82] H. Mohamadi, J. Habibi, M. S. Abadeh, and H. Saadi, “Data mining with a simulated
annealing based fuzzy classification system,” Pattern Recognition, vol. 41, no.5, pp.
1824-1833, May 2008.
[83] G.H. John and P. Langley, “Estimating continuous distributions in Bayesian classifiers,”
Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, pp. 338–
345, Aug. 1995.
[84] J.R. Quinlan, C4.5: Programs for machine learning, Morgan Kaufman Publishers Inc.,
1993.
[85] J. Bacardit and J.M. Garrell, “Bloat control and generalization pressure using the
minimum description length principle for a pittsburgh approach learning classifier
system,” Proceedings of the 2003-2005 International Conference on Learning
Classifier Systems, pp. 59-79, 2007.
[86] Y. Jiang and Z. H. Zhou, “Editing Training Data for kNN classifiers with neural
network ensemble,” Lecture Notes in Computer Science, vol. 3173, pp. 356-361, Aug.
2004.
[87] J. Estrela da Silva, J. P. Marques de Sá, and J. Jossinet, “Classification of breast
tissue by electrical impedance spectroscopy,” Medical and Biological Engineering and
Computing, vol. 38, no. 1, pp. 26-30, Jan. 2000.
[88] Y.
Advisor: Chunshien Li (李俊賢)    Date of approval: 2013-07-12
