Full metadata record
dc.contributor.author: 王聖賢 (en_US)
dc.contributor.author: Sheng-Shian Wang (en_US)
dc.contributor.author: 陳伯寧 (en_US)
dc.contributor.author: Po-Ning Chen (en_US)
dc.date.accessioned: 2014-12-12T02:28:30Z
dc.date.available: 2014-12-12T02:28:30Z
dc.date.issued: 2001 (en_US)
dc.identifier.uri: http://140.113.39.130/cdrfb3/record/nctu/#NT900435035 (en_US)
dc.identifier.uri: http://hdl.handle.net/11536/68910
dc.description.abstract: 迴旋碼編碼器和班特比解碼器是現今非常普遍的一種編碼系統的組合。這是因為在適當的設計下,這樣的組合可以提供可實現的複雜度和可接受的性能。在這樣的系統下,複雜度和性能的權衡取決於限制長度的選取。值得注意的是,班特比解碼器的解碼錯誤率隨著限制長度的增加而成指數性地降低,而複雜度卻同樣地成指數性地成長。現今的實作能力僅能提供不超過9的限制長度,這同時也限制了所能達到的效能。 另一方面,非常大限制長度的迴旋碼架構在理論上及實作上都已經不是問題,雖然蒙地卡羅模擬在技術上仍然無法估計其最大近似的效能。不過,一個稱為「重要取樣」的新模擬技巧被提出來,可以正確地模擬大限制長度迴旋碼 (24或更高) 的最大近似效能。而模擬結果也顯示,大限制長度迴旋碼的最大近似效能相當接近雪農極限,雖然並沒有可實現的解碼器可以將之解碼。 在這篇論文中,我們提出了具有固定複雜度、針對大限制長度迴旋碼的低狀態數班特比解碼器。基於大限制長度迴旋碼的最大近似效能非常好,我們認為狀態降低的次佳解碼器依舊可以提供可接受的效能。其他參數的選擇,例如狀態規模和滑動視窗的大小,對效能的影響也會在論文中加以檢驗。 (zh_TW)
dc.description.abstract: A popular combination in modern coding systems is the convolutional encoder paired with the Viterbi decoder. With a proper design, they can jointly provide acceptable performance at feasible decoding complexity. In such a combination, the tradeoff between error performance and decoding complexity resides in the choice of the code constraint length. Specifically, the probability of Viterbi decoding failure decreases exponentially as the constraint length increases; however, increasing the constraint length also exponentially increases the computational effort of the Viterbi decoder. Current implementation technology for the Viterbi decoder can only accommodate convolutional codes with constraint lengths no greater than nine, which limits the achievable error performance. On the other hand, the construction of convolutional codes with very large constraint lengths is now possible in both theory and practice, yet Monte Carlo simulation of their maximum-likelihood performance is technically infeasible. The authors of "An efficient new technique for accurate bit error probability estimation of ZJ decoders" presented a simulation technique called Importance Sampling, which can accurately estimate the maximum-likelihood error performance of convolutional codes with constraint lengths up to 24 or higher. Their Importance Sampling simulations showed that the error performance of convolutional codes with such constraint lengths can actually approach the Shannon limit, although no feasible decoder can decode these codes. In this thesis, we propose a reduced-state Viterbi decoder with fixed decoding complexity for codes with large constraint lengths. Since the maximum-likelihood error performance of codes with large constraint lengths is very good, the sub-optimal state reduction at the decoder, despite some degradation, can still provide acceptably good performance. The performance impact of decoder parameters, such as state size and sliding window size, is also examined in this thesis. (en_US)
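The state-reduction idea described in the abstract can be sketched in code. The following is a minimal, hypothetical Python illustration only, not the thesis's actual decoder design: it prunes the trellis to the M lowest-metric survivors per step (the classic M-algorithm flavor of state reduction), and assumes a toy rate-1/2, constraint-length-3 code (octal generators 7, 5) with hard-decision Hamming metrics purely for brevity.

```python
# Sketch of a reduced-state Viterbi-style decoder: per-step work is capped by
# keeping only the M best survivors, so complexity is fixed regardless of the
# code's constraint length. Toy (7, 5) octal code, hard decisions assumed.

def popcount(x):
    return bin(x).count("1")

def encode(bits, g=(0o7, 0o5), K=3):
    """Rate-1/2 convolutional encoder; returns the coded bit stream."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state      # [current bit | K-1 past bits]
        out += [popcount(reg & gi) & 1 for gi in g]
        state = reg >> 1                  # shift the newest bit into the state
    return out

def decode_reduced(rx, M, g=(0o7, 0o5), K=3):
    """Viterbi decoding that prunes to the M best states after every step."""
    survivors = {0: (0, [])}              # state -> (path metric, decoded bits)
    for t in range(len(rx) // 2):
        r = rx[2 * t:2 * t + 2]
        nxt = {}
        for state, (metric, path) in survivors.items():
            for b in (0, 1):              # hypothesize the next input bit
                reg = (b << (K - 1)) | state
                branch = [popcount(reg & gi) & 1 for gi in g]
                m = metric + sum(a != c for a, c in zip(branch, r))
                s2 = reg >> 1
                if s2 not in nxt or m < nxt[s2][0]:
                    nxt[s2] = (m, path + [b])
        # State reduction: keep only the M lowest-metric survivors, so the
        # per-step cost is O(M) instead of O(2^(K-1)).
        survivors = dict(sorted(nxt.items(), key=lambda kv: kv[1][0])[:M])
    return min(survivors.values())[1]     # bits of the best surviving path
```

With M equal to the full state count this reduces to ordinary Viterbi decoding; shrinking M trades some error performance for fixed complexity, which is the tradeoff the abstract describes.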
dc.language.iso: zh_TW (en_US)
dc.subject: 大限制長度迴旋碼 (zh_TW)
dc.subject: convolutional code with large constraint length (en_US)
dc.title: 針對大限制長度迴旋碼的低狀態數班特比解碼器 (zh_TW)
dc.title: A state-reduction Viterbi decoder for convolutional code with large constraint length (en_US)
dc.type: Thesis (en_US)
dc.contributor.department: 電信工程研究所 (zh_TW)
Appears in Collections: Thesis