In object tracking, when a target moves into an image region with strong specular reflections, the drastic change in target appearance degrades tracking accuracy. In addition, multiple-object tracking requires data association between measurements and objects, and associating a wrong measurement with a track likewise degrades accuracy. This thesis therefore proposes a sample-based multiple-object tracking algorithm for mixed images. First, the thesis proposes a simplified RANSAC scheme for estimating camera motion, which improves the performance of the compensated motion model and of motion-compensated layer separation. The thesis then adopts a sample-based joint probabilistic data association filter that incorporates the object states obtained from co-inference tracking when computing the association between objects and measurements, improving the correctness of data association. In addition, the thesis proposes maximizing, at the correction stage, a joint likelihood computed from both appearance and trajectory information, and introduces an occlusion confidence indicator that supplies occlusion information to improve the accuracy of the co-inference-tracking-based correction stage. Finally, the target appearance model is updated according to appearance similarity. Experimental results show that the proposed multiple-object tracking algorithm effectively handles specular reflections of varying intensity as well as occlusion, improving both the robustness and the accuracy of the tracking system.
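The abstract does not detail the simplified RANSAC scheme, so the following is only a rough sketch of RANSAC-style camera-motion estimation from feature matches between consecutive frames. It assumes a global 2-D translation model; the model choice, thresholds, and function names are illustrative assumptions, not the thesis's design.

```python
import numpy as np

def ransac_translation(pts_prev, pts_curr, n_iters=100, inlier_thresh=2.0):
    """Estimate a global 2-D camera translation between frames with RANSAC.

    pts_prev, pts_curr: (N, 2) arrays of matched feature coordinates in the
    previous and current frame. Returns the translation refined over the
    inliers of the best hypothesis.
    """
    rng = np.random.default_rng(0)
    best_inliers, best_count = None, 0
    for _ in range(n_iters):
        i = rng.integers(len(pts_prev))          # one match fixes a translation
        t = pts_curr[i] - pts_prev[i]            # hypothesised camera motion
        residuals = np.linalg.norm(pts_curr - (pts_prev + t), axis=1)
        inliers = residuals < inlier_thresh      # consensus set of this hypothesis
        if inliers.sum() > best_count:
            best_count, best_inliers = inliers.sum(), inliers
    # Refine: least-squares translation (the mean) over the consensus set.
    return (pts_curr[best_inliers] - pts_prev[best_inliers]).mean(axis=0)
```

The estimated translation can then be subtracted from tracked positions to realize the compensated motion model, and used to align frames for motion-compensated layer separation.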
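For the data-association step, the sketch below is a minimal per-track weighting in the spirit of probabilistic data association: each measurement is weighted by its likelihood against competing measurements plus a missed-detection term. A full sample-based JPDAF, as the thesis uses, enumerates joint association events over all tracks and samples; that enumeration is omitted here, and all parameters are illustrative.

```python
import numpy as np

def association_weights(likelihood, p_detect=0.9, clutter_density=1e-3):
    """Per-track association weights in the spirit of probabilistic data
    association. likelihood[j, t] = p(measurement j | track t), shape (M, T).
    Returns an (M + 1, T) array whose last row is the probability that the
    track produced no measurement (missed detection / occlusion).
    """
    M, T = likelihood.shape
    scores = np.vstack([
        p_detect * likelihood,                                # measurement j explains track t
        np.full((1, T), (1.0 - p_detect) * clutter_density),  # track undetected
    ])
    return scores / scores.sum(axis=0, keepdims=True)         # normalise per track
```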
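The joint likelihood combining appearance and trajectory cues admits, under a common conditional-independence assumption (not necessarily the thesis's exact form), the factorization below, with the corrected state taken as the maximizing sample:

```latex
% Assumed conditional-independence factorization of the joint likelihood
% for measurement z_t under particle x_t^{(i)}:
p\bigl(z_t \mid x_t^{(i)}\bigr) \propto
    p_{\mathrm{app}}\bigl(z_t \mid x_t^{(i)}\bigr)\,
    p_{\mathrm{traj}}\bigl(z_t \mid x_t^{(i)}\bigr),
\qquad
\hat{x}_t = \arg\max_{i}\, p\bigl(z_t \mid x_t^{(i)}\bigr)
```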
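Finally, the similarity-gated appearance update the abstract describes can be sketched as below; the blending rule, rate, and threshold are assumptions for illustration.

```python
import numpy as np

def update_appearance(model, observed, similarity, alpha=0.1, thresh=0.8):
    """Blend the observed appearance (e.g., a colour histogram) into the
    stored model only when it is similar enough to the current model, so
    that occluders or strong reflections are not learned into the template.
    alpha and thresh are illustrative values, not the thesis's."""
    if similarity > thresh:
        return (1.0 - alpha) * np.asarray(model) + alpha * np.asarray(observed)
    return np.asarray(model)
```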