Master's/Doctoral Thesis 110423070: Full Metadata Record

DC Field | Value | Language
dc.contributor | 資訊管理學系 | zh_TW
dc.creator | 葉詠心 | zh_TW
dc.creator | Ip Weng Sam | en_US
dc.date.accessioned | 2023-06-30T07:39:07Z
dc.date.available | 2023-06-30T07:39:07Z
dc.date.issued | 2023
dc.identifier.uri | http://ir.lib.ncu.edu.tw:88/thesis/view_etd.asp?URN=110423070
dc.contributor.department | 資訊管理學系 | zh_TW
dc.description | 國立中央大學 | zh_TW
dc.description | National Central University | en_US
dc.description.abstract | 在今天競爭激烈的商業環境中,組織可以從透過文本分類進行的主題分析中獲益良多。雖然有多種方法可供選擇,但BERT是自然語言處理中最有效的技術之一。BERT通常被用作特定領域的分類模型,但模型通常沒有超出其訓練數據的知識,例如像人類一般對事情的常識及對事物之間關聯性的認知,因此限制了它與人類智能的相似度。為了解決這個限制,本研究探討了把BERT與另一個有價值的工具「知識圖譜」相結合,以擴展分類模型的能力。通過融入知識圖譜,BERT模型可以像人類一樣獲得一般知識,提升其分類能力。BERT和知識圖譜的結合有潛力顯著提升組織從大量文本數據中提取有價值洞察的能力。經過實驗測試,本研究發現BERT模型在加入了不同種類的知識圖譜後,對於不同的分類任務帶來的成效不一。另外,本研究亦發現加入知識圖譜的BERT模型會面臨不同的挑戰:如訓練模型的複雜度提高、長短文本應用上的挑戰,及確保句子與知識表示模型(知識三元組)之關聯性。 | zh_TW
dc.description.abstract | In today's highly competitive business environment, organizations can benefit greatly from subject analysis through text classification. While several methods are available, BERT is one of the most effective techniques in natural language processing. However, BERT is typically used as a domain-specific classification model and has no knowledge beyond its training data, such as the human-like common sense about things and the awareness of how they relate to one another, which limits its similarity to human intelligence. To address this limitation, this study explores combining BERT with another valuable tool, the knowledge graph, to extend the capabilities of the classification model. By incorporating a knowledge graph, the BERT model can acquire general knowledge much as humans do, enhancing its classification capabilities. Through experiments, the study found that the BERT model performs differently on different classification tasks after various types of knowledge graphs are added. The study also found that a knowledge-graph-augmented BERT model faces several challenges, such as the increased complexity of training the model, difficulties in applying it to both long and short texts, and ensuring the relevance between sentences and the knowledge representation used, namely knowledge triples. | en_US
dc.subject | 文本分類 | zh_TW
dc.subject | 知識圖譜 | zh_TW
dc.subject | 情感分析 | zh_TW
dc.subject | 主題分類 | zh_TW
dc.subject | text classification | en_US
dc.subject | knowledge graph | en_US
dc.subject | sentiment analysis | en_US
dc.subject | topic categorization | en_US
dc.title | 針對特定領域任務—基於常識的BERT模型之應用 | zh_TW
dc.language.iso | zh-TW | zh-TW
dc.title | The Application of Common-Sense Knowledge-based BERT on Domain-Specific Tasks | en_US
dc.type | 博碩士論文 | zh_TW
dc.type | thesis | en_US
dc.publisher | National Central University | en_US
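
The abstract above describes augmenting a BERT text classifier with knowledge-graph facts (knowledge triples). As a rough illustration of that general idea only, the following minimal Python sketch appends flattened triples to the input text before classification with the Hugging Face transformers library; the toy triple store, the entity matching, and the two-class label count are hypothetical placeholders, not the pipeline used in the thesis.

    import torch
    from transformers import BertTokenizerFast, BertForSequenceClassification

    # Toy knowledge store: entity -> (head, relation, tail) triples.
    # Purely illustrative data, not taken from the thesis.
    TRIPLES = {
        "apple": [("apple", "is a", "fruit"), ("apple", "contains", "vitamin C")],
    }

    def inject_triples(text):
        """Append flattened triples for any entity mentioned in the text."""
        facts = []
        for entity, triples in TRIPLES.items():
            if entity in text.lower():
                facts.extend(" ".join(t) for t in triples)
        # Separate the original sentence from the injected facts with [SEP].
        return text if not facts else text + " [SEP] " + " ; ".join(facts)

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    # num_labels=2 is arbitrary; the classification head is untrained here and
    # would need fine-tuning on a labelled corpus before the scores mean anything.
    model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
    model.eval()

    sentence = "I had an apple and some coffee for breakfast."
    inputs = tokenizer(inject_triples(sentence), return_tensors="pt", truncation=True)
    with torch.no_grad():
        probs = model(**inputs).logits.softmax(dim=-1)
    print(inject_triples(sentence))
    print(probs)

Plain text concatenation is only the simplest way to expose triples to the model; after fine-tuning, the classifier sees the injected facts alongside the original sentence within BERT's input length limit, which is one source of the long/short-text challenges the abstract mentions.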
