Master's/Doctoral Thesis 109225022: Complete Metadata Record

DC field  Value  Language
dc.contributor  統計研究所 (Graduate Institute of Statistics)  zh_TW
dc.creator  黃雅若  zh_TW
dc.creator  Ya-Jo Huang  en_US
dc.date.accessioned  2022-07-21T07:39:07Z
dc.date.available  2022-07-21T07:39:07Z
dc.date.issued  2022
dc.identifier.uri  http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=109225022
dc.contributor.department  統計研究所 (Graduate Institute of Statistics)  zh_TW
dc.description  國立中央大學 (National Central University)  zh_TW
dc.description  National Central University  en_US
dc.description.abstract  In machine learning, hyperparameter tuning is an important step for deep learning algorithms, since different hyperparameter settings directly affect model performance. Bayesian optimization has long been a popular method for hyperparameter tuning: it iteratively updates the prior and posterior distributions to find the best hyperparameter combination. This study proposes a new hyperparameter optimization method that combines Bayesian optimization with the concept of robust parameter design. During optimization, the method takes control factors and noise factors (for example, initial weights and the selection of training samples) into account, in order to improve the accuracy of the resulting optimal hyperparameter combination. In both simulated and empirical examples, across different types of problems, the proposed method finds settings closer to the true hyperparameter combination than conventional Bayesian optimization does.  zh_TW
dc.description.abstract  Tuning hyperparameters is crucial to the success of deep learning algorithms because it directly affects model performance; hyperparameter tuning has therefore received great attention. Bayesian optimization has long been a popular option for hyperparameter tuning, obtaining optimal hyperparameter values in a sequential manner. This thesis presents a new hyperparameter optimization method using the concept of robust parameter design. We identify several noise factors (e.g., initial weights or the random splitting of training samples) for optimization. Simulations show that the proposed method can find hyperparameter settings that are closer to the true hyperparameter setting.  en_US
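The noise-factor idea described in the abstracts (evaluate each hyperparameter candidate under several noise settings, such as different initial weights or data splits, then optimize a criterion that accounts for both average loss and its variability) can be sketched in a few lines. The toy loss surface, the mean-plus-standard-deviation score, and the candidate grid below are all illustrative assumptions, not the thesis's actual Bayesian-optimization procedure.

```python
import random
import statistics

def validation_loss(hparams, seed):
    # Toy stand-in for training a network and measuring validation loss.
    # The hypothetical hyperparameter is the learning rate `lr`; `seed`
    # controls the noise factors (initial weights, train/validation split).
    rng = random.Random(seed)
    lr = hparams["lr"]
    deterministic = (lr - 0.01) ** 2 * 1e4   # best lr is 0.01 by construction
    noise = rng.gauss(0, 0.5)                # noise-factor effect
    return deterministic + noise

def robust_score(hparams, n_replicates=5):
    # Evaluate each candidate under the SAME set of noise-factor settings
    # and penalize variability across replicates, in the spirit of
    # robust parameter design.
    losses = [validation_loss(hparams, seed) for seed in range(n_replicates)]
    return statistics.mean(losses) + statistics.stdev(losses)

candidates = [{"lr": lr} for lr in (0.001, 0.005, 0.01, 0.05, 0.1)]
best = min(candidates, key=robust_score)
print(best)  # {'lr': 0.01}
```

Reusing the same seeds for every candidate mirrors the crossed-array layout of robust parameter design: each control-factor setting is tested against the same noise-factor settings, so comparisons between candidates are not confounded by different noise draws.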
dc.subject  類神經網路 (Neural network)  zh_TW
dc.subject  超參數優化 (Hyperparameter optimization)  zh_TW
dc.subject  貝氏優化 (Bayesian optimization)  zh_TW
dc.subject  穩健參數設計 (Robust parameter design)  zh_TW
dc.subject  Neural network  en_US
dc.subject  hyperparameter optimization  en_US
dc.subject  Bayesian optimization  en_US
dc.subject  expected improvement  en_US
dc.subject  robust parameter design  en_US
dc.title  Bayesian Optimization for Hyperparameter Tuning with Robust Parameter Design  en_US
dc.language.iso  en_US  en_US
dc.type  博碩士論文 (thesis)  zh_TW
dc.type  thesis  en_US
dc.publisher  National Central University  en_US
