Full metadata record
DC Field | Value | Language
dc.contributor.author | 劉為賢 | en_US
dc.contributor.author | Wei-Hsien Liu | en_US
dc.contributor.author | 張志永 | en_US
dc.contributor.author | Chang Jyh Yeong | en_US
dc.date.accessioned | 2014-12-12T02:11:46Z | -
dc.date.available | 2014-12-12T02:11:46Z | -
dc.date.issued | 1993 | en_US
dc.identifier.uri | http://140.113.39.130/cdrfb3/record/nctu/#NT820327037 | en_US
dc.identifier.uri | http://hdl.handle.net/11536/57753 | -
dc.description.abstract | In this thesis, we derive the necessary and sufficient conditions on the correlation matrix and thresholds of the higher-order associative memory (HOAM) that guarantee the recall of all training patterns. Based on violations of the complete recall theorem, a cost function is introduced to measure the performance of the HOAM. In terms of this cost function, the local training rule is formulated as a minimization problem, which is solved by a gradient descent search. The local training rules iteratively adjust the correlation matrix and the thresholds until the HOAM satisfies the complete recall conditions. Furthermore, a design algorithm is proposed to ensure that each training pattern is stored with as large a basin of attraction as possible. Simulation results demonstrate the effectiveness of the local training rules. | en_US
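The training loop described in the abstract — iteratively adjusting a correlation matrix and thresholds until every stored pattern satisfies the recall condition — can be sketched as below. This is only an illustration under my own assumptions (a second-order memory, bipolar ±1 patterns, and a perceptron-style local update); it is not the thesis's exact rule, and the names `net_input`, `train`, and `recall` are hypothetical.

```python
# Minimal sketch, assuming a second-order associative memory with
# bipolar (+1/-1) patterns and a perceptron-style local update.
# Not the thesis's exact algorithm.
import numpy as np

def net_input(W, theta, x):
    # Second-order net input: n_i = sum_{j,k} W[i,j,k] * x_j * x_k - theta_i
    return np.einsum('ijk,j,k->i', W, x, x) - theta

def train(patterns, eta=0.1, max_epochs=100):
    """Adjust W and theta until every stored pattern is a fixed point."""
    P = np.asarray(patterns, dtype=float)   # shape (m, n), entries +1/-1
    n = P.shape[1]
    W = np.zeros((n, n, n))
    theta = np.zeros(n)
    for _ in range(max_epochs):
        stable = True
        for x in P:
            # Component i fails recall when x_i and its net input disagree.
            viol = x * net_input(W, theta, x) <= 0
            if viol.any():
                stable = False
                # Local rule: update only the violated rows of W and theta.
                outer = eta * np.einsum('i,j,k->ijk', x, x, x)
                W[viol] += outer[viol]
                theta[viol] -= eta * x[viol]
        if stable:          # all patterns recalled exactly -> done
            break
    return W, theta

def recall(W, theta, x):
    # One-step recall: threshold the second-order net input.
    return np.sign(net_input(W, theta, x))
```

Each update strictly increases `x_i * net_i` for a violated component, so the loop terminates once every pattern recalls itself; the gradient-descent formulation in the thesis plays the role of this per-component correction.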
dc.language.iso | en_US | en_US
dc.subject | 關聯性記憶體; 傾斜下降搜尋法; 吸引範圍 (associative memory; gradient descent method; basin of attraction) | zh_TW
dc.subject | associative memory; gradient descent method; basin of attraction | en_US
dc.title | 新局部訓練法則運用於高階關聯性記憶體 (New Local Training Rules for Higher-Order Associative Memories) | zh_TW
dc.title | New Local Training Rules for Higher-Order Associative Memories | en_US
dc.type | Thesis | en_US
dc.contributor.department | 電控工程研究所 (Institute of Electrical and Control Engineering) | zh_TW
Appears in Collections:Thesis