Master's/Doctoral Thesis 101423020: Detailed Record




Author: 蘇蔚廷 (Wei-Ting Su)    Department: 資訊管理學系 (Information Management)
Thesis title (Chinese): 競賽式架構應用於進化式演算法之解空間縮減
Thesis title (English): A competitive based solution space selection framework of the evolutionary algorithm
Electronic full text: never open to the public.
Abstract (Chinese)  Evolutionary algorithms have been widely applied in many fields, such as high-energy physics analysis, weather forecasting, genetic analysis, biological research, and financial or business data analysis. Although these methods can provide satisfactory solutions, they have been shown to be time-consuming, so a demand for large amounts of computing power follows. While cloud computing has greatly improved computational performance, it has also raised the complexity of the problems being solved, for example through higher data dimensionality or larger data volumes. In addition, according to the No Free Lunch (NFL) theorem, every evolutionary algorithm must pay the same overall price for solving a problem; in other words, obtaining a better solution requires a correspondingly higher cost, such as more operations, changes to the process, more time, or more individuals.
This study therefore aims to provide a framework, based on the Divide and Conquer (D&C) idea, that can be applied to any evolutionary algorithm to overcome the problems caused by an overly large search space (such as a huge search space, premature convergence, data screening, and black-box behaviour), and that can help the evolutionary algorithm record sensitivity data for each dimension during the search, allowing users to understand the problem more deeply.
Abstract (English)  Evolutionary algorithms have been widely used in many fields, such as high-energy physics analysis, weather forecasting, genetic analysis, biological research, and financial or business information analysis. Although these methods can provide satisfactory solutions, they have been shown to be time-consuming, so computational efficiency needs to be taken into account. Although cloud computing can significantly improve computing performance, it also increases the complexity of the problems, for example through higher dimensionality or larger numbers of data samples. In addition, based on the No Free Lunch (NFL) theorem, all evolutionary algorithms must pay the same price for solving problems; in other words, obtaining a better solution requires paying a higher cost, such as more operations, process changes, more time, or more individuals.
Therefore, this thesis provides a framework based on the Divide and Conquer (D&C) principle that can be applied to any evolutionary algorithm to overcome the problems caused by a large search space that degrade computational efficiency (such as a huge search space, premature convergence, data screening, and black-box behaviour). The framework also helps the evolutionary algorithm record sensitivity data for each dimension during the search process, which helps users understand the problem more fully.
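The abstracts describe the framework only at a high level (divide the search space, let sub-regions compete, keep the winners, and record per-dimension sensitivity); the actual voting, cutting, and region-expansion procedures are detailed in Chapter 3 and are not reproduced in this record. The following Python sketch is a rough, hypothetical illustration of that general Divide-and-Conquer idea rather than the thesis's method: it uses plain random sampling as a stand-in for a short EA run, the Sphere function as a toy objective, and invented names such as reduce_search_space and micro_search.

```python
import random

def sphere(x):
    """Toy objective (minimisation): the Sphere function."""
    return sum(v * v for v in x)

def micro_search(bounds, fitness, samples=2000):
    """Cheap stand-in for a short EA run inside a given (sub)space:
    uniform random sampling that returns the best point and its fitness."""
    best_x, best_f = None, float("inf")
    for _ in range(samples):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        f = fitness(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

def reduce_search_space(bounds, fitness, cuts=4, voters=2000, elite_ratio=0.1):
    """Divide-and-conquer style space reduction (illustrative only):
    1. cut every dimension into `cuts` equal sub-intervals,
    2. sample the whole space and let the best samples 'vote' for the
       sub-interval they fall into on each dimension (a crude stand-in
       for a competitive voting-and-cutting step),
    3. keep the winning sub-interval per dimension as the reduced space,
       and report how concentrated the votes were as a sensitivity score."""
    samples = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(voters)]
    samples.sort(key=fitness)
    elites = samples[: max(1, int(len(samples) * elite_ratio))]

    reduced, sensitivity = [], []
    for d, (lo, hi) in enumerate(bounds):
        width = (hi - lo) / cuts
        votes = [0] * cuts
        for x in elites:
            idx = min(int((x[d] - lo) / width), cuts - 1)
            votes[idx] += 1
        winner = max(range(cuts), key=lambda i: votes[i])
        reduced.append((lo + winner * width, lo + (winner + 1) * width))
        sensitivity.append(votes[winner] / len(elites))
    return reduced, sensitivity

if __name__ == "__main__":
    random.seed(0)
    full_space = [(-100.0, 100.0)] * 5          # 5-dimensional toy problem
    reduced_space, sens = reduce_search_space(full_space, sphere)
    print("reduced space:", reduced_space)
    print("per-dimension sensitivity:", sens)
    # A full EA (GA, SPSO, SA, ...) would now search `reduced_space`
    # instead of `full_space`; the cheap sampler stands in for it here.
    _, best_f = micro_search(reduced_space, sphere)
    print("best fitness found in reduced space:", best_f)
```

In the thesis's setting, the reduced space produced by such a step would be handed to a full EA (GA, SPSO, SA, and so on) for the actual optimization run, and the per-dimension vote concentration plays the role of the recorded sensitivity information mentioned in the abstract.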
Keywords (Chinese)
★ 搜尋空間縮減 (search space reduction)
★ 巨量資料 (big data)
★ 進化式演算法 (evolutionary algorithm)
★ Divide and Conquer
★ 黑箱 (black box)
Keywords (English)
Table of Contents

Abstract (in Chinese)
Abstract (in English)
Acknowledgements
Chapter 1  Introduction
1.1 Background
1.1.1 Search Space Reduction
1.1.2 Evolutionary Algorithms
1.2 Problem Definition
1.3 Research Contributions
1.4 Thesis Organization
Chapter 2  Literature Review
2.1 Genetic Algorithm
2.1.1 Hybrid Genetic Algorithm (HGA)
2.1.2 Interval Genetic Algorithm (IGA)
2.1.3 Hybrid Interval Genetic Algorithm (HIGA)
2.2 Particle Swarm Optimization (PSO)
2.2.1 Cooperative Particle Swarm Optimization (CPSO)
2.2.2 Fully Informed Particle Swarm Optimization (FIPSO)
2.2.3 Comprehensive Learning Particle Swarm Optimization (CLPSO)
2.3 Simulated Annealing (SA)
2.3.1 Genetic Simulated Annealing Algorithm (GSAA)
2.3.2 Improved Simulated Annealing Algorithm (ISAA)
2.4 Evolutionary Algorithms for Data Reduction
2.5 Coverage Rate
2.6 Summary
Chapter 3  Framework Design
3.1 Divide and Conquer
3.2 Selection
3.3 Initialization
3.4 Voting and Cutting
3.4.1 Basic Concepts of Voting and Cutting
3.4.2 Voting and Cutting Based on Space Discretization
3.4.3 Region Expansion
Chapter 4  Experimental Results
4.1 Experimental Environment
4.2 Time Sensitivity Experiments on the Framework
4.2.1 Parameter Time Sensitivity Tests
4.2.2 Impact of the Framework on EA Running Time
4.3 Experiments on Mathematical Benchmark Functions
4.3.1 Experimental Parameters
4.3.2 Experimental Results
4.3.3 The Second Cycle
4.4 Experiments on Large Datasets
4.4.1 Experimental Parameters
4.4.2 Datasets
4.4.3 Experimental Results
Chapter 5  Conclusions
5.1 Conclusions and Contributions
5.2 Future Research Directions and Suggestions

References





List of Tables

Table 4-1 Experimental parameter settings for the mathematical benchmark functions
Table 4-2 Best solutions and execution times in the mathematical benchmark experiments (unit: seconds)
Table 4-3 Coverage rate of the best solution and reduction rate of the remaining space (unit: percent)
Table 4-4 GA Step 2 results
Table 4-5 SPSO Step 2 results
Table 4-6 Experimental parameter settings for the large datasets
Table 4-7 Summary of the datasets
Table 4-8 Classification accuracy and data reduction rate on the datasets (unit: percent)
Table 4-9 Computation time on the datasets (unit: seconds)





List of Figures

Figure 1 Evolution process of EAs
Figure 2 How MapReduce works
Figure 3 Process architecture
Figure 4 Individuals generated from a random sequence
Figure 5 Individuals generated from a Sobol sequence
Figure 6 Cutting scheme proposed in this thesis
Figure 7 Region near the best solution
Figure 8 Time sensitivity of GA solving the Trid function
Figure 9 Time sensitivity of SPSO solving the Trid function
Figure 10 Execution time of the standard GA versus GA+S3
Figure 11 Execution time of the standard GA versus GA+S3
Figure 12 Execution time difference between the standard SPSO and SPSO+S3
Figure 13 Execution time difference between the standard SPSO and SPSO+S3
Figure 14 Evolution of EAs


References
〔1〕 D. Pyle, “Data preparation for data mining,” Morgan Kaufmann, 1999.
〔2〕 X.-B. Li and V.S. Jacob, “Adaptive data reduction for large-scale transaction data,” European Journal of Operational Research, Vol. 188(3), pp. 910-924, 2008.
〔3〕 D.R. Wilson and T.R. Martinez, “Reduction techniques for instance-based learning algorithms,” Machine Learning, Vol. 38, pp. 257-286, 2000.
〔4〕 T. Reinartz, “A unifying view on instance selection,” Data Mining and Knowledge Discovery, Vol. 6, pp. 191-210, 2002.
〔5〕 Richard Ernest Bellman; Rand Corporation. “Dynamic programming,” Princeton University Press, 1957.
〔6〕 Republished: Richard Ernest Bellman. “Dynamic Programming,” Courier Dover Publications, 2003.
〔7〕 Richard Ernest Bellman. “Adaptive control processes: a guided tour,” Princeton University Press. 1961
〔8〕 F. Solis and R. Wets, “Minimization by random search techniques,” Math. Oper. Res., vol. 6, pp. 19–30, 1981
〔9〕 林豐澤,演化式計算上篇:演化式演算法的三種理論模式,智慧科技與應用統計學報,第三卷,第一期,2005。
〔10〕 Z. Michalewicz, “Genetic Algorithms Plus Data Structures Equals Evolution Programs 2nd,” Springer-Verlag New York, Inc. Secaucus, NJ, USA. 1994.
〔11〕 T. Back, D. B. Fogel and Z. Michalewicz, “Handbook of Evolutionary Computation,” IOP Publishing Ltd., Bristol, UK, 1997.
〔12〕 D.H. Wolpert and W.G. Macready, “No free lunch theorems for optimization,” IEEE Transactions on Evolutionary Computation, Vol. 1(1), pp.67-82, April 1997.
〔13〕 J.J. Liang, A.K. Qin, P.N. Suganthan and S. Baskar. “Comprehensive Learning Particle Swarm Optimizer for Global Optimization of Multimodal Functions,”. IEEE Transactions on Evolutionary Computation, volume 10, pages 281-296, 2006.
〔14〕 H. Liu and A. Abraham, “ Fuzzy Adaptive Turbulent Particle Swarm Optimization,” International Journal of Innovative Computing and Applications, volume 1, pages 39-47,2007.
〔15〕 T. White, “Hadoop: The Definitive Guide,” O'Reilly Media, p. 3, 2012.
〔16〕 Manyika, James; Chui, Michael; Bughin, Jaques; Brown, Brad; Dobbs, Richard; Roxburgh, Charles; Byers, Angela Hung, “Big Data: The next frontier for innovation, competition, and productivity,” McKinsey Global Institute, 2011
〔17〕 T. Back, D. B. Fogel and Z. Michalewicz, “Handbook of Evolutionary Computation,” IOP Publishing Ltd. Bristol, UK, 1997.
〔18〕 Jiang-wei Zhang and Wen-jian Si, “Improved Enhanced Self-Tentative PSO algorithm for TSP,” 2010 Sixth International Conference on Natural Computation (ICNC), Vol. 5, 2010.
〔19〕 J. Kennedy and R. C. Eberhart, “Particle swarm optimization,” IEEE Int. Conf. on Neural Networks, Perth, Australia, vol. 4, pp. 1942-1948, 1995.
〔20〕 R. C. Eberhart, J. Kennedy, “A new optimizer using particle swarm theory,” IEEE Int. Symposium on Micro Machine and Human Science, Nagoya, Japan, pp. 39-43, 1995.
〔21〕 R. C. Eberhart and Y. H. Shi, “Particle swarm optimization: developments, applications and resources,” IEEE Int. Cong. on Evolutionary Computation, vol. 1, pp. 81-86, Seoul, Korea, 2001.
〔22〕 C. K. Zhang, H. Shao, and Y. Li, “Particle swarm optimization for evolving artificial neural network,” IEEE Int. Conf. on Systems, Man, and Cybernetics, vol. 4, pp. 2487-2490, Nashville, Tennessee, 2000.
〔23〕 V. Tandon, H. El-Mounayri and H. Kishawy, “NC end milling optimization using evolutionary computation,” International Journal of Machine Tools & Manufacture 42, pp. 595-605, 2002.
〔24〕 K.P. Wang, L. Huang, C.G. Zhou, and W. Pang, “Particle swarm optimization for traveling salesman problem,” in Proceedings of the Second International Conference on Machine Learning and Cybernetics, vol. 3, pp.1583-1585, 2003.
〔25〕 X. Hu, R. C. Eberhart and Y. Shi, “Swarm intelligence for permutation optimization: a case study of n-queens problem,” Proceedings of the 2003 IEEE Swarm Intelligence Symposium (SIS'03), pp. 243-246, 2003.
〔26〕 Esmin, A.A.A., A.R. Aoki, G. Lambert-Torres, “Particle Swarm Optimization for Fuzzy Membership Functions Optimization,” IEEE International Conference, Vol. 3, 2002.
〔27〕 J Salerno, “Using the particle swarm optimization technique to train a recurrent neural model,”the Ninth IEEE International Conference on Tools with Artificial Intelligence, pp. 45-49, 1997.
〔28〕 T. Sousa, A. Silva, and A. Neves, “Particle swarm based data mining algorithms for classification tasks,” ELSEVIER on Parallel Computing, NO.30, pp.767-783, 2003.
〔29〕 Ayed Salman, Imtiaz Ahmad, and Sabah Al-Madani, “Particle swarm optimization for task assignment problem,” Microprocessors and Microsystems 26, pp. 363-371, 2002.
〔30〕 S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi, “Optimization by simulated annealing,” Science, vol. 220, pp. 671-680, 1983.
〔31〕 X. Yao, “New simulated annealing algorithm,” International Journal of Computer Mathematics, vol. 56, no. 3-4, pp. 161-168, 1995.
〔32〕 V. Fabian, “Simulated annealing simulated,” Computers & Mathematics with Applications., vol. 33, no. 1-2, pp. 81-94, 1997.
〔33〕 Jorge Haddock and John Mittenthal, “Simulation optimization using simulated annealing”, Computers & Industrial Engineering, vol. 22, no. 4, pp. 387-395 , 1992.
〔34〕 F. Yang, Z. Zhuang and Y. Dai, “Using simulated annealing,” International Conference on Circuits and Systems, Nanjing, China, pp. 175, 1989.
〔35〕 L. Ingber, “Very fast simulated re-annealing,” Mathematical and Computer Modeling, vol. 12, no. 8, pp. 967-973, 1989.
〔36〕 T. Hatay and Y. C. Toklu, “Optimization of trusses using the simulated annealing method,” ARI Bulletin of the Istanbul Technical University, vol. 54, no. 1, pp. 67-71, 2004.
〔37〕 E. N. Reyes and C. Steidley, “Optimization using simulated annealing,” Northcon Conference Record, pp. 120-126, 1998.
〔38〕 N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller and E. Teller,“Equations of state calculations by fast computing machines,” Journal of Chemical Physics 21, pp. 1087-1092, 1953.
〔39〕 S. N. Twohig and S. O. Aletan, “Traveling Salesman Problem,” Annual Computer Science Conference Proceedings, Washington, DC, pp. 437, Feb. 1990.
〔40〕 Gao and Shang, “ Solving TSP with simulated annealing algorithm,” Journal of East China Shipbuilding Institute, vol. 17, no. 3, pp. 13, June, 2003.
〔41〕 Stella Sofianopoulou, “Simulated annealing applied to the process allocation problem,” European Journal of Operational Research, vol. 60, no. 3, pp. 327-334, 1992.
〔42〕 Catoni and Olivier, “Solving scheduling problems by simulated annealing,” SIAM Journal on Control and Optimization, vol. 36, no. 5, pp. 1539-1575, 1998.
〔43〕 G. K. Hutchinson and B. E. Wynne, “Flexible manufacturing system,” Industrial Engineering, vol. 5, no. 12, pp. 10-17, 1973.
〔44〕 A. S. Carrie, E. Adhami, A. Stephens and I. C. Murdoch, “Introducing a flexible manufacturing system,” International Journal of Production Research, vol. 22, no. 6, pp. 907-916, 1984.
〔45〕 Pengfei Guo, Xuezhi Wang, Yingshi Han, “The enhanced genetic algorithms for the optimization design, ” Liaoning University of Technology, p2991-2993, 2010.
〔46〕 M.A. Potter and K.A. de Jong, “A Cooperative Coevolutionary Approach to Function Optimization, ” In The Third Parallel Problem Solving From Nature, pages 249-257, 1994.
〔47〕 F. van den Bergh and A.P. Engelbrecht, “ A Cooperative Approach to Particle Swarm Optimization, ” IEEE Transactions on Evolutionary Computation, volume 8, pages 225-239, 2004.
〔48〕 J. Kennedy and R. Mendes, “Neighborhood Topologies in Fully-Informed and Best-of-Neighborhood Particle Swarms, ” IEEE International Workshop on Soft Computing in Industrial Applications, pages 45-50, 2003.
〔49〕 J.J. Liang, A.K. Qin, P.N. Suganthan and S. Baskar, “Comprehensive Learning Particle Swarm Optimizer for Global Optimization of Multimodal Functions,” IEEE Transactions on Evolutionary Computation, volume 10, pages 281-296, 2006.
〔50〕 Shan Hongbo ,Li Shuxia ,Gong Degang and Lou Peng, “Genetic simulated annealing algorithm-based assembly sequence planning, ” Technology and Innovation Conference, 2006.
〔51〕 QI Ji-Yang, “Application of Improved Simulated Annealing Algorithm in Facility Layout Design, ” Control Conference (CCC), 2010 29th Chinese, 2010.
〔52〕 J.R. Cano, F. Herrera, M. Lozano, “Using evolutionary algorithms as instance selection for data reduction: an experimental study,” IEEE Transactions on Evolutionary Computation ,Vol. 7(6), pp. 561-575, 2003.
〔53〕 I. Guyon and A. Elisseeff, “An introduction to variable and feature selection,” Journal of Machine Learning Research, Vol. 3, pp. 1157–1182, 2003.
〔54〕 Shang Lei, “A Feature Selection Method Based on Information Gain and Genetic Algorithm,” 2012 International Conference on Computer Science and Electronics Engineering (ICCSEE), Vol. 2, 2012.
〔55〕 L. Nanni and A. Lumini, “Prototype reduction techniques: a comparison among different approaches,” Expert Systems with Applications, Vol. 38, pp. 11820-11828, 2011.
〔56〕 Horst, R., Tuy, H. “Global Optimization (Deterministic Approaches) , ” Springer-Verlag, Berlin,1996.
〔57〕 Fletcher, R., “Practical Methods of Optimization, ” Wiley, Patstow,2001.
〔58〕 Sobol, I., “Uniformly Distributed Points in a Multidimensional Cube, ” Moscow,Znanie (‘‘Mathematics and Cybernetics” series in Russian).,1985.
〔59〕 Liberti, L., Kucherenko, S., “Comparison of deterministic and stochastic approaches to global optimization, ” International Transactions in Operational Research 12, 263–285.,2005.
〔60〕 Kucherenko, S., Sytsko, Y., “Application of deterministic low-discrepancy sequences in global optimization, ” Computational Optimization and Applications 30, 297–318.,2005.
〔61〕 Millie Pant, Radha Thangaraj and Ajith Abraham, “Improved Particle Swarm Optimization with Low-Discrepancy Sequences,” in Proc. of IEEE Congress on Evolutionary Computation, 2008.
〔62〕 Nguyen X. H., Nguyen Q. Uy., R. I. Mckay and P.M. Tuan, “Initializing PSO with Randomized Low-Discrepancy Sequences: The Comparative Results,” In Proc. of IEEE Congress on Evolutionary Algorithms, 2007, pp. 1985 – 1992.
〔63〕 Millie Pant, Radha Thangaraj, Ved Pal Singh and Ajith Abraham, “ Particle Swarm Optimization Using Sobol Mutation,” Emerging Trends in Engineering and Technology, 2008.
〔64〕 Radu Rugina and Martin Rinard, “Recursion Unrolling for Divide and Conquer Programs,”13th International Workshop on Languages and Compilers for Parallel Computing-Revised Papers, Pages 34 - 48 ,2000
〔65〕 F. Gustavson. “Recursion leads to automatic variable blocking for dense linear-algebra algorithms,” IBM Journal of Research and Development, 41(6):737-755,November 1997.
〔66〕 J. Frens and D. Wise. “Auto-blocking matrix-multiplication or tracking BLAS3 performance from source code,” In Proceedings of the 6th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming, Las Vegas, NV, June 1997.
〔67〕 S. Chatterjee, A. Lebeck, P. Patnala, and M. Thottethodi. “Recursive array layouts and fast matrix multiplication,” In Proceedings of the 11th Annual ACM Symposium on Parallel Algorithms and Architectures, Saint Malo, France, June 1999.
〔68〕 Wil M.P. van der Aalst, “A general divide and conquer approach for process mining,” Computer Science and Information Systems (FedCSIS), 2013 Federated Conference, 2013
〔69〕 Qiu, Y. , & Frei, H. P., “Concept-based query expansion,” In Proceedings of SIGIR-93, 16th ACM International Conference on Research and Development in Information Retrieval, pp. 160-169, Pittsburgh,1993.
〔70〕 李世炳、鄒忠毅,簡介導引模擬退火法及其應用,物理雙月刊,第二十四卷,第二期,2002年。
〔71〕 Glover, F., “Tabu Search, part i.” ORSA Journal Of Computing, vol 1, 1989.
〔72〕 黃衍明,基因演算法之基本概念、方法與國內相關研究概況,國立成功大學建築研究所博士班都市與設計運算方法專題期末報告,2003。
〔73〕 郭信川、張建仁、劉仁清,粒子群演算法於最佳化問題之研究,第一屆台灣作業研究學會學術研討會暨2004年科技與管理學術研討會,2004。
Advisor: 蔡志豐    Date of approval: 2014-06-30