In current practice, bolted connections are typically inspected by a technician tapping each bolt with a rubber hammer and judging from the resulting sound whether the bolt has loosened. This manual process is time-consuming, relies on the inspector's subjective judgment, and some inspection sites are difficult or even dangerous to access. A mechanical device that performs the tapping and classifies the response would reduce the labor and time cost of inspection, improve inspector safety, and make the judgment objective, thereby improving detection accuracy. This study therefore applies the concept of Tiny Machine Learning (TinyML), deploying Mel-frequency cepstral coefficients (MFCCs) and a convolutional neural network (CNN) on a microcontroller to develop a bolt-loosening detection system based on a robotic arm with real-time acoustic recognition. Bolts are classified into two states: loosened and tightened. The audio signal produced by tapping a bolt is converted into an MFCC feature map, which is then fed to the CNN for training. The trained model is combined with the MFCC front end to form an acoustic recognition model, which is deployed on a microcontroller equipped with a microphone module to create an AI acoustic recognition module. This module is integrated with a robotic arm, a miniature tapping device, and a main controller for experimental verification. The trained acoustic recognition model achieved 100% accuracy on the validation set and 99.57% accuracy on the test set. Deployed on the microcontroller, the AI acoustic recognition module achieved 75.7% accuracy and 77.3% recall under low ambient noise, and 72% accuracy and 68.7% recall under high ambient noise. With this system, inspectors no longer need to perform the tapping themselves; they can remotely operate the robotic arm to carry out both tapping and recognition, reducing the risks of inspection work and providing an objective, rapid way to determine whether a bolt has loosened.
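The front end of the pipeline summarized above (tap audio → MFCC feature map → CNN classifier) can be sketched in NumPy/SciPy. This is a minimal illustration of how MFCC feature maps are computed from a recorded tap, not the thesis's actual implementation; all parameters (16 kHz sample rate, 512-point FFT, 256-sample hop, 40 mel bands, 13 coefficients) are assumptions, since the abstract does not specify them.

```python
import numpy as np
from scipy.fftpack import dct

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(signal, sr=16000, n_fft=512, hop=256, n_mels=40, n_coeffs=13):
    """Compute an MFCC feature map (frames x coefficients) from a 1-D signal.

    Parameter values are illustrative assumptions, not the thesis's settings.
    """
    # 1. Frame the signal and apply a Hann window to each frame.
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop : i * hop + n_fft] for i in range(n_frames)])
    frames = frames * np.hanning(n_fft)

    # 2. Power spectrum of each frame.
    power = np.abs(np.fft.rfft(frames, n_fft)) ** 2 / n_fft  # (n_frames, n_fft//2+1)

    # 3. Triangular mel filterbank, equally spaced on the mel scale.
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        left, center, right = bins[m - 1], bins[m], bins[m + 1]
        for k in range(left, center):
            fbank[m - 1, k] = (k - left) / max(center - left, 1)
        for k in range(center, right):
            fbank[m - 1, k] = (right - k) / max(right - center, 1)

    # 4. Log mel energies, then a DCT to decorrelate -> MFCC feature map.
    mel_energy = np.log(power @ fbank.T + 1e-10)
    return dct(mel_energy, type=2, axis=1, norm="ortho")[:, :n_coeffs]

# One second of a synthetic "tap-like" tone stands in for a real recording.
sig = np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)
feature_map = mfcc(sig)  # 2-D array suitable as CNN input
```

In the study's system, a feature map like this would be computed on the microcontroller from each microphone capture and passed to the trained CNN, which outputs one of the two classes (loosened or tightened).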