NCU Institutional Repository (theses, past exam papers, journal articles, and research projects): Item 987654321/86509
RC Version 7.0 © Powered By DSPACE, MIT. Enhanced by NTU Library IR team.


    Please use this permanent URL to cite or link to this document: http://ir.lib.ncu.edu.tw/handle/987654321/86509


    Title: The Design and Implementation of Machine Learning Hyperparameter Optimization with Experiment Tracking and Model Restoring
    Author: Tu, Pei-Jung (涂珮榕)
    Contributor: Department of Computer Science and Information Engineering
    Keywords: Machine Learning; Hyperparameter Tuning; Interactive Machine Learning; Experiment Tracking
    Date: 2021-07-21
    Uploaded: 2021-12-07 12:54:59 (UTC+8)
    Publisher: National Central University
    Abstract: Building a machine learning model is an experiment-driven process: hyperparameters are tuned iteratively until the model meets its acceptance criteria, which typically produces a large number of trial models. Existing approaches support tuning hyperparameters during training, but they still impose constraints on model construction. Moreover, developers commonly record these experiments in tables, which takes extra effort and does not reveal the relationships between experiments intuitively.
    To address this, this thesis presents RETUNE, a tool that assists hyperparameter tuning by combining callback functions with a checkpoint mechanism. It lets users adjust the optimizer's hyperparameters during training, guided by visualized real-time feedback on model evaluation metrics, and automatically records the model configuration and related training data at each adjustment. A restore feature rolls the model back to a previous training state so that multiple candidate models can be compared. Finally, the tuning history is rendered as a tree graph, helping users summarize past experiments, understand how hyperparameter settings affect training, and carry out optimizer hyperparameter tuning and model optimization more effectively.
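    The RETUNE source is not reproduced here, but the mechanism the abstract describes (a per-epoch callback hook that can adjust an optimizer hyperparameter mid-training, plus per-epoch checkpoints that allow restoring an earlier model state) can be sketched framework-free. All class and function names below are hypothetical illustrations, not RETUNE's API:

    ```python
    # Minimal sketch of callback-based mid-training tuning with checkpoints.
    # Model: a single parameter w minimizing loss = w^2 by gradient descent.
    import copy

    class Trainer:
        def __init__(self, w, lr):
            self.w = w             # model parameter
            self.lr = lr           # optimizer hyperparameter open to tuning
            self.checkpoints = {}  # epoch -> snapshot of (w, lr)

        def train(self, epochs, on_epoch_end=None):
            for epoch in range(epochs):
                grad = 2 * self.w            # d(w^2)/dw
                self.w -= self.lr * grad     # gradient-descent step
                # Record a checkpoint so this epoch's state can be restored.
                self.checkpoints[epoch] = copy.deepcopy((self.w, self.lr))
                if on_epoch_end:
                    on_epoch_end(self, epoch)  # callback may adjust self.lr

        def restore(self, epoch):
            # Roll the model back to a previously recorded training state.
            self.w, self.lr = self.checkpoints[epoch]

    def halve_lr_at_epoch_2(trainer, epoch):
        # Example callback: tune the optimizer hyperparameter during training.
        if epoch == 2:
            trainer.lr *= 0.5

    trainer = Trainer(w=1.0, lr=0.1)
    trainer.train(epochs=5, on_epoch_end=halve_lr_at_epoch_2)
    trainer.restore(1)  # compare against the model as it was after epoch 1
    ```

    In a real framework such as Keras, the same pattern would map to a `Callback` subclass and saved checkpoints; the tree graph of tuning history would then branch whenever training resumes from a restored state.
    
    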
    Appears in Collections: [Graduate Institute of Computer Science and Information Engineering] Master's and Doctoral Theses

    Files in this item:

    File        Description    Size    Format    Views
    index.html                 0Kb     HTML      48


    All items in NCUIR are protected by copyright.

