Abstract: Remote ultrasound examination has emerged as a promising solution to address the shortage of medical specialists and diagnostic equipment in rural healthcare systems. By enabling physicians to observe real-time ultrasound images from a remote location, this approach improves accessibility to diagnostic services. However, image quality is highly dependent on the probe's contact condition and orientation relative to the skin surface; excessive contact force may cause tissue deformation and image distortion, while insufficient force can result in weak echo signals. Furthermore, single-view reconstruction is insufficient to provide accurate surface normal information, which further limits the stability of probe alignment and scanning. To address these challenges, this study proposes an integrated Fuzzy Neural Network (FNN) control framework for remote robotic ultrasound scanning, combining multi-view 3D surface reconstruction with dual-force sensing to achieve precise probe guidance and stable contact control.
The system adopts a master–slave architecture, where the master side provides real-time image visualization and haptic feedback, while the slave side integrates dual cameras, dual force sensors, and a UR5e robotic arm, communicating through WebSocket and RTDE protocols for real-time data exchange. During the modeling phase, multi-view RGB-D data captured by dual cameras are processed through depth filtering and hand–eye calibration to establish geometric correspondence between the cameras and the robot. Subsequently, a point-to-plane ICP algorithm is employed for point cloud registration, and Poisson surface reconstruction is used to generate a smooth and continuous 3D model. Experimental results demonstrate that the reconstructed surface achieves high accuracy in both plane error and normal estimation, effectively compensating for boundary occlusions and missing information in single-view reconstruction. The resulting surface normals provide reliable guidance for probe posture alignment. In the force control layer, dual force sensors mounted on both sides of the probe allow real-time measurement of contact forces and estimation of differential forces, enabling angle compensation and symmetric force distribution. The control core adopts an FNN architecture, consisting of two sub-controllers for displacement and angle regulation. By combining the interpretability of fuzzy logic with the adaptive learning capabilities of neural networks, the FNN enables online adjustment of membership functions and weights, adapting effectively to nonlinear contact variations. Experimental results indicate that, compared to traditional control methods, the FNN significantly reduces both mean force error and bilateral force difference, achieving smoother and more stable control performance under complex contact conditions. 
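The registration stage described above (point-to-plane ICP over matched point clouds) can be sketched as a single Gauss–Newton step. This is a minimal illustration, not the thesis's implementation: it assumes correspondences are already given, uses the standard small-angle linearization of the rotation, and omits the correspondence search and outlier rejection a full ICP loop requires.

```python
import numpy as np

def point_to_plane_icp_step(src, dst, normals):
    """One Gauss-Newton step of point-to-plane ICP.

    Minimizes sum_i ((R @ p_i + t - q_i) . n_i)^2 using the
    small-angle approximation R ~ I + [r]_x, solving a linear
    least-squares problem for x = (rx, ry, rz, tx, ty, tz).
    src, dst: (N, 3) matched point arrays; normals: (N, 3) unit
    normals at the destination points.
    """
    A = np.hstack([np.cross(src, normals), normals])  # (N, 6) Jacobian
    b = -np.einsum('ij,ij->i', src - dst, normals)    # (N,) signed residuals
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    rx, ry, rz = x[:3]
    # First-order rotation from the small-angle vector; a full ICP
    # would re-orthogonalize (e.g. via Rodrigues' formula) each step.
    R = np.array([[1.0, -rz,  ry],
                  [ rz, 1.0, -rx],
                  [-ry,  rx, 1.0]])
    return R, x[3:]
```

In practice this step is iterated, re-associating correspondences and composing the incremental transforms, before handing the fused cloud to Poisson surface reconstruction.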
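The displacement sub-controller can likewise be sketched as a zero-order Takagi–Sugeno fuzzy system with online consequent learning, in the spirit of the FNN described above: the input is the contact-force error, the output is a probe displacement increment along the surface normal, and the consequent weights are updated online from the error signal. The membership centers, widths, learning rate, and initial weights below are illustrative assumptions, not the thesis's tuned parameters, and the membership functions themselves are held fixed for brevity.

```python
import numpy as np

class AdaptiveFuzzyForceController:
    """Zero-order Takagi-Sugeno fuzzy controller with online learning
    of the rule consequents (a simplified stand-in for the FNN
    displacement sub-controller). Input: force error e = F - F_ref [N];
    output: displacement increment along the probe axis [mm]."""

    def __init__(self, centers=(-4, -2, 0, 2, 4), sigma=1.5, lr=0.02):
        self.c = np.asarray(centers, float)  # Gaussian membership centers
        self.sigma = sigma                   # shared membership width
        self.w = -0.05 * self.c              # consequents: act against the error
        self.lr = lr                         # consequent learning rate

    def _firing(self, e):
        mu = np.exp(-((e - self.c) / self.sigma) ** 2)  # rule activations
        return mu / mu.sum()                            # normalized firing

    def step(self, e):
        phi = self._firing(e)
        u = float(phi @ self.w)  # weighted-average defuzzification
        # Online gradient step on 0.5*e^2, assuming pressing deeper
        # monotonically increases contact force (dF/du > 0).
        self.w -= self.lr * e * phi
        return u
```

Against a simple elastic-contact model (force proportional to indentation depth), this controller drives the measured force to the reference; the thesis's angle sub-controller follows the same pattern with the bilateral force difference as input.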
In summary, the proposed robotic ultrasound scanning system not only provides accurate surface normal information for posture alignment but also achieves real-time force compensation and stable contact behavior, thereby improving scanning stability and image quality. Future work will focus on the integration of multi-axis force sensing, automated scanning trajectory planning, and AI-assisted diagnostic imaging to support the development of intelligent remote ultrasound examination systems.