Title: | On the Management of In-Device Coexistence Interference |
Authors: | Lai, Yung-Shian; Hsieh, Shih-Fu; Su, Yu-Te; Institute of Communications Engineering |
Keywords: | in-device coexistence interference (IDCI); DRX; LTE; Wi-Fi; throughput |
Issue Date: | 2012 |
Abstract: | To make the most of a heterogeneous network in which multiple radio systems coexist, a user equipment (UE) should be able to select the most efficient system, where efficiency is measured by the throughput-to-power ratio. Such a capability requires that the UE be equipped with multiple built-in transceivers so that it can adapt to the environment and choose, for example, Wi-Fi or Bluetooth instead of a regular macro-cellular system to connect to the network, saving power and bandwidth for both the UE and the network. A device with such multi-radio capability can also serve as a relay between a macro system and a small cell to offload network traffic. Unless the bands assigned to these radios are sufficiently separated, however, a serious interference problem arises in such applications: when two transceivers in the same device are active, the transmitting radio's signal interferes with the other radio's receiver. The purpose of this thesis is to find solutions for mitigating this in-device coexistence interference (IDCI). We focus on IDCI between the LTE and Wi-Fi systems. This issue has been discussed intensively in 3GPP technical report 36.816 [1], from which three major candidate solutions have emerged: power control (PC), frequency-division multiplexing (FDM), and time-division multiplexing (TDM).
An optimal solution would combine the flavors of all three approaches: invoke the FDM solution to place the two radios' operating bands as far apart as possible, use the TDM solution to avoid simultaneous activation, and control the transmit power within a tolerable range. Unfortunately, such a solution is often not realizable, as it may not be compatible with the existing LTE standard and would require modifications. As a result, the Discontinuous Reception (DRX) based TDM solution is much more practical, since DRX has already been standardized in LTE. DRX was originally designed for energy saving: the UE communicates only during a fixed wake-up interval within each predetermined period and sleeps for the rest. Such a periodic on-off clock, however, can also serve two radios in disjoint time intervals. Since the DRX-based solution divides the time resource between the LTE and Wi-Fi transceivers, it degrades the throughput of both systems. Hence we do not want to activate this IDCI solution too early, so as to avoid throughput degradation as much as possible; on the other hand, if we invoke it too late, the unmitigated coexistence interference will reduce the signal-to-interference-plus-noise ratio (SINR) and cause packet loss. In this study, we optimize both the activation and the deactivation times of the DRX-based solution, sending a trigger signal to the eNB at the right moment. Based on the SINR measured over a proper time period, we decide whether to trigger the IDC solution and report the associated DRX parameters to the serving eNB. Since this decision requires accurate signal-quality measurement, how to measure the SINR accurately in the physical layer is another issue we address. We use the regression-model-based approach of [3] to jointly estimate the channel gain and the corresponding SINR; simulations show that this approach outperforms the conventional least-squares (LS) channel estimator.
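The activation/deactivation decision described above can be sketched as a hysteresis rule on windowed SINR measurements. This is a minimal illustration only: the threshold values, the window handling, and the dB-domain averaging are assumptions for the sketch, not parameters taken from the thesis.

```python
def average_db(samples_db):
    """Arithmetic mean of SINR samples in dB (simplified; a real
    receiver would typically average in the linear domain)."""
    return sum(samples_db) / len(samples_db)

def update_drx_state(drx_active, sinr_window_db,
                     trigger_db=-3.0, release_db=3.0):
    """Hysteresis rule (illustrative thresholds): activate the
    DRX-based IDC solution when the windowed SINR falls below
    trigger_db; deactivate once it recovers above release_db.
    Returns the new activation state."""
    avg = average_db(sinr_window_db)
    if not drx_active and avg < trigger_db:
        return True   # send the IDC trigger indication to the eNB
    if drx_active and avg > release_db:
        return False  # request deactivation of the DRX-based solution
    return drx_active  # otherwise keep the current state
```

The gap between `trigger_db` and `release_db` prevents the UE from toggling the solution on and off when the measured SINR hovers near a single threshold.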
Our simulations indicate that the LTE downlink (DL) throughput of the proposed solution is better than that of two alternatives: one that triggers the DRX-based solution whenever Wi-Fi is turned on, and one that employs no IDCI solution at all. Our method not only satisfies both the LTE and Wi-Fi throughput constraints but also maximizes the LTE DL throughput. The Wi-Fi and LTE throughput performance is estimated by both computer simulation and analysis; the analytical results let us assess the effects of the DRX parameters on throughput. We derive the probability density function (pdf) of the packet transmission time and evaluate the mean throughput and the associated variance. Although Wi-Fi can use only the off-duration of the DRX clock, it can adjust its packet size according to the known time remaining before the off-duration ends. In that case, the average Wi-Fi throughput is approximately equal to the ratio of the off-duration to the DRX cycle; this theoretical result is verified by simulation. For the LTE throughput analysis, we adopt the fairness-aware FD-PF scheduling scheme [15]. The simulation results are consistent with the expectation that throughput improves as more of the time resource becomes available to LTE. From the simulation results we identify characteristics of this scheduler and use them to establish guidelines for selecting DRX parameters that meet both the LTE and Wi-Fi throughput requirements. |
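The claim that Wi-Fi's average throughput approaches the off-duration fraction of the DRX cycle can be checked with a toy model. The cycle lengths, the fixed nominal packet airtime, and the shrink-to-fit policy below are illustrative assumptions, not the thesis's simulation setup:

```python
def wifi_duty_fraction(cycle_ms, on_ms, pkt_tx_ms):
    """Fraction of the link rate Wi-Fi achieves when it transmits only
    during the DRX off-duration and shrinks the last packet to fit the
    time remaining before the next on-duration (toy model)."""
    off_ms = cycle_ms - on_ms
    used, t = 0.0, 0.0
    while t < off_ms:
        pkt = min(pkt_tx_ms, off_ms - t)  # adapt final packet size
        used += pkt
        t += pkt
    # With packet-size adaptation, the whole off-duration is usable,
    # so the achieved fraction equals off_ms / cycle_ms.
    return used / cycle_ms

# Example: a 40 ms DRX cycle with a 10 ms on-duration leaves a 0.75
# duty fraction for Wi-Fi, regardless of the nominal packet airtime.
```

Without the size adaptation (dropping any packet that does not fit), the achieved fraction would fall short of off/cycle, which is why the knowledge of the remaining off time matters in the analysis above.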
URI: | http://140.113.39.130/cdrfb3/record/nctu/#GT079913527 http://hdl.handle.net/11536/49307 |
Appears in Collections: | Thesis |