This thesis combines bipedal robotics with machine learning, using supervised learning to realize a novel approach to modeling a biped robot system: a deep neural network is trained to approximate the mass matrix of the system in Cholesky-decomposed form. Because of its speed advantage over traditional analytical modeling methods, this approach is expected to improve the computational efficiency of robotics research. The thesis considers a biped robot with 5 degrees of freedom during the single support phase of walking. Based on the Euler–Lagrange equation, a dynamic model of the robot during walking is established, and the mass matrix in the single support phase is derived analytically. This analytical mass matrix serves as the ground truth of the supervised learning task: a deep neural network is trained to map the generalized coordinates of the system to the mass matrix in Cholesky-decomposed form. After 32 hours and 50 minutes of training, the resulting network approximates the system's mass matrix with an error of less than 5% during the testing stage. To demonstrate the advantage of this method, the thesis compares the computation speed of three ways of obtaining the constraint impulse during the impact phase of bipedal walking from the single-support-phase mass matrix via the system constraint equations. Evaluating the constraint equations with the Cholesky-decomposed mass matrix approximated by the deep neural network is 5.375 milliseconds faster than evaluating them with the analytical mass matrix. The practicality of the method is also demonstrated by verifying its approximation error on actual robot gait trajectories.
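For context, the dynamics referred to above take the standard Euler–Lagrange (manipulator) form; the notation below is the conventional one and is an assumption insofar as the abstract does not reproduce the thesis's exact symbols:

```latex
% Standard Euler--Lagrange dynamics of the 5-DOF single-support model
\frac{d}{dt}\frac{\partial \mathcal{L}}{\partial \dot{q}}
  - \frac{\partial \mathcal{L}}{\partial q} = \tau,
\qquad \mathcal{L}(q,\dot{q}) = T(q,\dot{q}) - V(q)
% which, for a rigid-link robot, reduces to the manipulator equation
% with M(q) the symmetric, positive-definite mass matrix:
M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + G(q) = \tau
```

Positive definiteness of M(q) is what makes the Cholesky parameterization M = L Lᵀ well defined.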
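As a rough illustration of the learning setup, the following is a minimal PyTorch sketch. The architecture, layer sizes, and training details are illustrative assumptions, not the configuration used in the thesis; it shows only the general idea of mapping the 5 generalized coordinates to the 15 entries of a lower-triangular Cholesky factor L and reconstructing the mass matrix as M = L Lᵀ:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

N_DOF = 5                               # DOF in the single support phase
N_TRIL = N_DOF * (N_DOF + 1) // 2       # 15 entries of a 5x5 lower triangle

class CholeskyMassNet(nn.Module):
    """Maps generalized coordinates q to the Cholesky factor L of M(q).

    Hypothetical architecture; the thesis does not specify it in the abstract.
    """
    def __init__(self, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(N_DOF, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, N_TRIL),
        )
        self.register_buffer("tril_idx", torch.tril_indices(N_DOF, N_DOF))

    def forward(self, q):                       # q: (batch, 5)
        raw = self.mlp(q)                       # (batch, 15)
        r, c = self.tril_idx
        # Softplus keeps the diagonal positive, so M = L L^T stays
        # symmetric positive definite by construction.
        vals = torch.where(r == c, F.softplus(raw), raw)
        L = q.new_zeros(q.shape[0], N_DOF, N_DOF)
        L[:, r, c] = vals
        return L

def mass_matrix(L):
    """Reconstruct M(q) = L L^T from the predicted Cholesky factor."""
    return L @ L.transpose(-1, -2)

# Hypothetical supervised step -- analytical_mass_matrix(q) stands for the
# ground truth derived from the Euler-Lagrange model (not shown here):
#   loss = F.mse_loss(mass_matrix(net(q)), analytical_mass_matrix(q))
```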
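The speed comparison can likewise be understood from how the impulse is computed. Under the standard rigid-impact model M(q)(q̇⁺ − q̇⁻) = Jᵀλ with J q̇⁺ = 0 (J being the constraint Jacobian), the impulse is λ = −(J M⁻¹ Jᵀ)⁻¹ J q̇⁻, and every product with M⁻¹ reduces to two cheap triangular solves once the Cholesky factor is available. The sketch below assumes this standard model; the three specific methods compared in the thesis are not reproduced here:

```python
import numpy as np
from scipy.linalg import cho_solve

def constraint_impulse(L, J, qdot_minus):
    """Constraint impulse at impact from the Cholesky factor L of M(q).

    Standard rigid-impact model (an assumption of this sketch):
        M (qdot_plus - qdot_minus) = J^T lam,   J qdot_plus = 0
        =>  lam = -(J M^{-1} J^T)^{-1} J qdot_minus
    With L available, each M^{-1} x is two triangular solves instead of a
    general linear solve -- the source of the speed advantage of predicting
    the mass matrix directly in Cholesky-decomposed form.
    """
    Minv_JT = cho_solve((L, True), J.T)          # M^{-1} J^T via L and L^T
    A = J @ Minv_JT                              # small n_c x n_c system
    return -np.linalg.solve(A, J @ qdot_minus)

def post_impact_velocity(L, J, qdot_minus):
    """qdot_plus = qdot_minus + M^{-1} J^T lam."""
    lam = constraint_impulse(L, J, qdot_minus)
    return qdot_minus + cho_solve((L, True), J.T @ lam)

# Example with arbitrary numbers (not a real robot configuration):
M = np.diag([2.0, 1.5, 1.2, 1.0, 0.8]) + 0.1     # any SPD 5x5 mass matrix
L = np.linalg.cholesky(M)
J = np.random.default_rng(0).standard_normal((2, 5))  # two impact constraints
qdot_plus = post_impact_velocity(L, J, np.ones(5))
assert np.allclose(J @ qdot_plus, 0.0)           # constraints hold post-impact
```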