In this paper, we propose two spectrally efficient adaptive partial decode-and-forward (DF) cooperative communication schemes: adaptive partial repetition DF with quantized feedback (APR-DF-QF) and adaptive partial coded cooperation DF with quantized feedback (APCC-DF-QF). We assume the relay node has only partial channel-state information, obtained via a quantized feedback link. We use the so-called mutual information (MI) model to adaptively optimize the amount of data that needs to be forwarded by the relay node under a given block-error-rate constraint. Simulation results show that, with the optimized feedback, the MI model can accurately predict the optimal amount of information that needs to be forwarded by the relay node, and that the two proposed schemes can substantially increase the spectral efficiency.
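The idea of using an accumulated mutual-information model to size the relay's contribution can be illustrated with a small Monte Carlo sketch. Everything below (the rate, SNRs, the Rayleigh-fading model, and the name `min_forward_fraction`) is a hypothetical stand-in under simplified assumptions, not the optimization actually derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
R = 2.0            # target rate in bits/s/Hz (hypothetical)
EPS = 0.01         # block-error-rate constraint (hypothetical)
SNR_SD, SNR_RD = 4.0, 100.0  # average source-dest / relay-dest SNRs, linear
TRIALS = 20000

# Per-link mutual informations under Rayleigh fading (exponential power).
i_sd = np.log2(1 + SNR_SD * rng.exponential(size=TRIALS))
i_rd = np.log2(1 + SNR_RD * rng.exponential(size=TRIALS))

def min_forward_fraction():
    """Smallest fraction of the codeword the relay must forward so that
    the accumulated mutual information supports rate R with BLER <= EPS
    (toy model: destination accumulates i_sd plus alpha * i_rd)."""
    for alpha in np.linspace(0.0, 1.0, 101):
        bler = np.mean(i_sd + alpha * i_rd < R)
        if bler <= EPS:
            return alpha
    return 1.0
```

In this toy model the direct link alone misses the rate target roughly half the time, so the search returns a strictly positive fraction; a stronger relay-to-destination link drives that fraction down, which is the trade-off the MI model is used to capture.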
In this paper, we present an improved detector for ACK/NACK message detection in the LTE uplink control channel with imperfect channel state information at the receiver. The detector is based on the generalized likelihood-ratio test (GLRT) paradigm. We derive detection metrics for the cases when the noise variances at the receiver are known and unknown. Noise here may comprise both thermal noise and interference. Simulation results show remarkable performance gains of the GLRT-based detector with unknown noise variances compared to the training-based maximum-likelihood detector with unknown noise variances when the noise variances in two slots are different. Furthermore, the performance of the GLRT-based detector with unknown noise variances is nearly the same as that of the training-based maximum-likelihood detector with known noise variances.
In this paper, we study ACK/NACK message detection in the LTE physical uplink control channel (PUCCH) with multiple receive antennas. The LTE PUCCH is typically characterized by high interference variability due to severe inter-user interference and slot-level frequency hopping. We present detection methods applicable for the cases when the noise variances at the receiver are known and unknown. Noise here may comprise both thermal noise and interference. The proposed detection technique is based on the generalized likelihood-ratio test (GLRT) paradigm. Simulation results show that the GLRT-based detector offers a significant gain over the training-based maximum-likelihood detector when the noise variances in two slots are different and unknown. For the case when the noise variances at the receiver are known, the GLRT-based detector has nearly the same performance as the training-based maximum-likelihood detector.
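The GLRT idea with unknown noise variance can be illustrated with a toy single-antenna sketch: maximizing the Gaussian likelihood over the unknown variance reduces the test to comparing residual energies after a least-squares channel fit under each message hypothesis. The BPSK mapping, sequence lengths, and the name `glrt_detect` are hypothetical simplifications, not the actual PUCCH signal formats or the metrics derived in the papers:

```python
import numpy as np

rng = np.random.default_rng(1)
Np, Nd = 4, 8
pilot = np.ones(Np)                # known reference symbols (hypothetical)
spread = np.ones(Nd)               # data spreading sequence (hypothetical)
hyps = {"ACK": 1.0, "NACK": -1.0}  # toy BPSK mapping of the two messages

def glrt_detect(y_p, y_d):
    """Decide for the hypothesis with the smallest residual energy after a
    least-squares channel fit; with unknown Gaussian noise variance the
    GLRT reduces to this residual-energy comparison."""
    y = np.concatenate([y_p, y_d])
    best, best_res = None, np.inf
    for msg, s in hyps.items():
        x = np.concatenate([pilot, s * spread])  # signal under hypothesis
        h = (x.conj() @ y) / (x.conj() @ x)      # ML/LS channel estimate
        res = np.sum(np.abs(y - h * x) ** 2)     # residual noise energy
        if res < best_res:
            best, best_res = msg, res
    return best

# Simulate an ACK over a flat channel with complex Gaussian noise.
h_true = 0.9 * np.exp(1j * 0.4)
noise = lambda n: 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
y_p = h_true * pilot + noise(Np)
y_d = h_true * hyps["ACK"] * spread + noise(Nd)
print(glrt_detect(y_p, y_d))
```

The key point of the sketch is that no noise-variance estimate is needed: the unknown variance is maximized out of the likelihood, leaving a metric that depends only on how well each hypothesis explains the received samples.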
Dynamic OFDMA has been recognized as a promising technique for improving the performance of future wireless cellular systems. However, this potential performance improvement comes at the cost of additional signaling overhead, which can have a non-negligible effect on the system efficiency. In this paper, we propose a new method for optimizing the frame length for the downlink in OFDMA systems. The method maximizes the system efficiency by taking into account both the channel conditions and the amount of signaling overhead needed to deliver scheduling maps to the users. We formulate the frame length optimization problem mathematically. By exploiting the structure of this problem, we develop an algorithm that solves a sequence of dynamic programming problems. Simulation results reveal some insight into fundamental limitations as well as provide guidelines for the design of dynamic OFDMA systems.
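The kind of dynamic program alluded to above can be sketched with a toy cost model: each frame pays a fixed signaling overhead for its scheduling map, while longer frames let the channel information grow stale. All constants, the quadratic staleness model, and the name `best_partition` are hypothetical placeholders, not the cost structure derived in the paper:

```python
import numpy as np

OVERHEAD = 2.0           # signaling cost per frame (hypothetical units)

def staleness(l):
    """Scheduling-map mismatch cost, growing with frame length (toy model)."""
    return 0.1 * l * l

def best_partition(T):
    """Split a horizon of T slots into frames so that the total
    overhead-plus-staleness cost is minimized, via a standard
    shortest-path dynamic program over frame boundaries."""
    best = [0.0] + [np.inf] * T   # best[t] = min cost to cover t slots
    choice = [0] * (T + 1)        # length of the last frame at t
    for t in range(1, T + 1):
        for l in range(1, t + 1):
            c = best[t - l] + OVERHEAD + staleness(l)
            if c < best[t]:
                best[t], choice[t] = c, l
    frames, t = [], T             # backtrack the optimal frame lengths
    while t > 0:
        frames.append(choice[t])
        t -= choice[t]
    return best[T], frames[::-1]
```

The per-frame cost `OVERHEAD + staleness(l)` makes very short frames overhead-dominated and very long frames staleness-dominated, so the optimum sits in between; for a 12-slot horizon this toy model settles on three 4-slot frames.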