Full metadata record
DC Field    Value    Language
dc.contributor.author    林耀中    en_US
dc.contributor.author    Yaw-Chung Linn    en_US
dc.contributor.author    吳炳飛    en_US
dc.contributor.author    Bing-Fei Wu    en_US
dc.date.accessioned    2014-12-12T02:11:46Z    -
dc.date.available    2014-12-12T02:11:46Z    -
dc.date.issued    1993    en_US
dc.identifier.uri    http://140.113.39.130/cdrfb3/record/nctu/#NT820327039    en_US
dc.identifier.uri    http://hdl.handle.net/11536/57755    -
dc.description.abstract    For a general nonlinear estimation problem, we develop an upper bound on the correlation coefficient in terms of the mutual entropy. This upper bound may be reached by means of a nonlinear transformation, after which the processes are jointly Gaussian. The relationship between the minimum mean-squared estimation error and the mutual entropy is also discussed. Moreover, given a correlation coefficient, ergodic and jointly Gaussian signals can be generated easily, so that the simulation can be done by computer. (A worked Gaussian-case sketch follows this record.)    zh_TW
dc.language.iso    en_US    en_US
dc.subject    mutual entropy; correlation coefficient; jointly Gaussian; minimum mean-squared estimation error    zh_TW
dc.subject    mutual entropy; correlation coefficient; jointly Gaussian; minimum mean-square estimation error    en_US
dc.title    互關訊息於非線性估計之探討    zh_TW
dc.title    Mutual Entropy to Nonlinear Estimation    en_US
dc.type    Thesis    en_US
dc.contributor.department    電控工程研究所 (Institute of Electrical and Control Engineering)    zh_TW
Appears in Collections: Thesis
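
The thesis itself is not reproduced in this record, so the following is only a reading aid. In the jointly Gaussian case, which the abstract reaches via a nonlinear transformation, the standard identity linking mutual information (the "mutual entropy" above) to the correlation coefficient \rho is

    \[
      I(X;Y) \;=\; -\tfrac{1}{2}\,\ln\!\bigl(1-\rho^{2}\bigr)
      \quad\Longrightarrow\quad
      \rho^{2} \;\le\; 1 - e^{-2\,I(X;Y)} .
    \]

The thesis's general upper bound presumably specializes to something of this form after the transformation, but that is an inference from the abstract, not a quotation from the work.

The abstract's last point, generating ergodic, jointly Gaussian signals with a prescribed correlation coefficient for computer simulation, can be illustrated with a minimal sketch. The function name and parameters below are illustrative and not taken from the thesis; only standard NumPy calls are used.

    # Minimal sketch (not from the thesis): two zero-mean, unit-variance,
    # jointly Gaussian sequences with a prescribed correlation coefficient rho.
    import numpy as np

    def correlated_gaussian_pair(rho, n, seed=0):
        """Return two length-n jointly Gaussian sequences with correlation rho."""
        rng = np.random.default_rng(seed)
        z1 = rng.standard_normal(n)
        z2 = rng.standard_normal(n)
        x = z1
        # Cholesky-style mixing; samples are i.i.d. over time, hence stationary and ergodic.
        y = rho * z1 + np.sqrt(1.0 - rho**2) * z2
        return x, y

    x, y = correlated_gaussian_pair(rho=0.8, n=100_000)
    print(np.corrcoef(x, y)[0, 1])        # empirical correlation, approximately 0.8
    print(-0.5 * np.log(1 - 0.8**2))      # Gaussian mutual information in nats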