Title: New Local Training Rules for Higher-Order Associative Memories
Authors: Wei-Hsien Liu
Jyh-Yeong Chang
Institute of Electrical and Control Engineering
Keywords: associative memory; gradient descent method; basin of attraction
Issue Date: 1993
Abstract: In this thesis, we first derive the necessary and sufficient
conditions on the correlation matrix and thresholds of a higher-order
associative memory (HOAM) that guarantee the recall of all training
patterns. Based on violations of this complete recall theorem, a cost
function is introduced to measure the performance of the HOAM. In terms
of the cost function, local training is formulated as a minimization
problem, which is solved by a gradient descent search. The local
training rules iteratively adjust the correlation matrix and the
thresholds until the HOAM satisfies the complete recall conditions.
Furthermore, a design procedure is proposed to ensure that each stored
pattern has as large a basin of attraction as possible. Computer
simulation results demonstrate the effectiveness of the local training
rules.
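
Note: To make the complete recall condition concrete, the following is a
sketch in standard HOAM notation; the thesis's exact symbols may differ.
For a second-order memory storing bipolar patterns x^(k) in {-1,+1}^n
with correlation tensor T and thresholds theta, pattern k is a fixed
point of the recall dynamics exactly when every unit's net input agrees
in sign with its target bit:

    x_i^{(k)} \Big( \sum_{j,l} T_{ijl}\, x_j^{(k)} x_l^{(k)} - \theta_i \Big) > 0
    \qquad \text{for all } i, k.

A cost function measuring violations of this condition can be written in
hinge form, for instance E(T,\theta) = \sum_{k,i} \max\{0,\; \kappa -
x_i^{(k)} h_i^{(k)}\}, where h_i^{(k)} is the net input above and
\kappa > 0 is an assumed margin; gradient descent on E then yields
perceptron-like local updates.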
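Note: Below is a minimal runnable sketch of such a gradient-descent local
training loop, assuming the second-order formulation above. The function
names, margin, and learning rate are illustrative assumptions rather than
the thesis's actual procedure, and the basin-enlarging design procedure
is not reproduced here.

    import numpy as np

    def net_input(T, theta, x):
        # h_i = sum_{j,l} T[i,j,l] * x[j] * x[l] - theta[i]
        return np.einsum('ijl,j,l->i', T, x, x) - theta

    def train_hoam(patterns, margin=1.0, lr=0.1, max_epochs=1000):
        # patterns: (m, n) array of bipolar vectors in {-1, +1}.
        n = patterns.shape[1]
        T = np.zeros((n, n, n))
        theta = np.zeros(n)
        for _ in range(max_epochs):
            violations = 0
            for x in patterns:
                h = net_input(T, theta, x)
                bad = x * h < margin        # complete-recall condition violated
                violations += bad.sum()
                for i in np.where(bad)[0]:
                    # Negative gradient of the hinge cost w.r.t. T[i], theta[i]:
                    T[i] += lr * x[i] * np.outer(x, x)
                    theta[i] -= lr * x[i]
            if violations == 0:             # all training patterns are fixed points
                break
        return T, theta

    def recall(T, theta, x, steps=10):
        # Synchronous recall dynamics iterated from a probe pattern.
        for _ in range(steps):
            x = np.sign(net_input(T, theta, x))
            x[x == 0] = 1
        return x

For example, training on a few random bipolar patterns and checking that
recall(T, theta, x) returns x unchanged verifies that the complete recall
conditions hold after training.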
URI: http://140.113.39.130/cdrfb3/record/nctu/#NT820327037
http://hdl.handle.net/11536/57753
Appears in Collections: Thesis