Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | 劉為賢 | en_US |
dc.contributor.author | Wei-Hsien Liu | en_US |
dc.contributor.author | 張志永 | en_US |
dc.contributor.author | Chang Jyh Yeong | en_US |
dc.date.accessioned | 2014-12-12T02:11:46Z | - |
dc.date.available | 2014-12-12T02:11:46Z | - |
dc.date.issued | 1993 | en_US |
dc.identifier.uri | http://140.113.39.130/cdrfb3/record/nctu/#NT820327037 | en_US |
dc.identifier.uri | http://hdl.handle.net/11536/57753 | - |
dc.description.abstract | 在本論文中,首先我們導出高階關聯性記憶體的相關矩陣及臨界值可以保證將全部訓練圖形回憶出來的充份及必要條件。根據違反完全回憶定理,我們可以建立一個代價函數用以評估該記憶體之性能,再利用傾斜下降搜尋法導出局部訓練法則,並由此法則來解最小值的問題。局部訓練法則反覆地訓練相關矩陣及臨界值,直到符合完全回憶條件止。此外,我們提供一套設計程序將使得每個儲存的圖形擁有較大的吸引範圍。電腦模擬的結果將可証明局部訓練法則的效果。 In this paper, we derive the necessary and sufficient conditions on the correlation matrix and thresholds of the higher-order associative memory (HOAM) that guarantee the recall of all training patterns. Based on violations of the complete recall theorem, a cost function is introduced to measure the performance of the HOAM. In terms of this cost function, the local training rule is formulated as a minimization problem, which is solved by gradient-descent search. We use the local training rule to iteratively train the correlation matrix and the thresholds until the HOAM satisfies the complete recall conditions. Furthermore, a design procedure is proposed to ensure that each training pattern is stored with as large a basin of attraction as possible. Simulation results demonstrate the effectiveness of the local training rule. | zh_TW |
dc.language.iso | en_US | en_US |
dc.subject | 關聯性記憶體; 傾斜下降搜尋法; 吸引範圍 | zh_TW |
dc.subject | associative memory; gradient descent method; basin of attraction | en_US |
dc.title | 新局部訓練法則運用於高階關聯性記憶體 | zh_TW |
dc.title | New Local Training Rules for Higher-Order Associative Memories | en_US |
dc.type | Thesis | en_US |
dc.contributor.department | 電控工程研究所 | zh_TW |
Appears in Collections: | Thesis |
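
The abstract's local training rule — iteratively adjusting the correlation matrix and thresholds until every stored pattern satisfies the complete recall conditions — can be illustrated with a minimal first-order sketch. This is an assumption-laden simplification, not the thesis's actual higher-order formulation: the function names, the `margin` parameter, and the perceptron-style update form are illustrative choices standing in for the cost function and gradient-descent search described above.

```python
import numpy as np

def train_local_rule(patterns, lr=0.1, margin=0.5, max_epochs=200):
    """Illustrative local training of an associative memory (first-order).

    For each stored bipolar pattern x, "complete recall" is taken to mean
    x_i * (W @ x - theta)_i > margin for every neuron i.  Violated rows of
    W and entries of theta are corrected in the gradient-descent direction
    until all patterns satisfy the condition or max_epochs is reached.
    """
    P, n = patterns.shape
    W = np.zeros((n, n))
    theta = np.zeros(n)
    for _ in range(max_epochs):
        stable = True
        for x in patterns:
            h = W @ x - theta              # local field at each neuron
            bad = x * h <= margin          # neurons violating recall margin
            if bad.any():
                stable = False
                # correction increases x_i * h_i on the violated units
                W[bad] += lr * np.outer(x[bad], x)
                theta[bad] -= lr * x[bad]
                np.fill_diagonal(W, 0.0)   # keep no self-coupling
        if stable:
            break
    return W, theta

def recall(W, theta, x, steps=10):
    """Synchronous recall dynamics: iterate the sign of the local field."""
    for _ in range(steps):
        x = np.sign(W @ x - theta)
        x[x == 0] = 1                      # break ties toward +1
    return x
```

With a positive margin, each stored pattern ends up as a strict fixed point of the recall dynamics, which is the role the complete recall conditions play in the thesis.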