cdma2000 channel modeling based on the Gilbert–Elliott algorithm within urban infrastructure

Yu.Ya. Vakula

Lviv Polytechnic National University

Any radio channel that transfers information is subject to data corruption. This is especially important for 3G mobile networks, which use a high-frequency carrier wave and a significantly increased channel data rate. In this case a data-loss prediction mechanism offers many opportunities both for designing new networks and for improving existing ones. Statistically, a communication channel with error processes can be modeled by one of several predefined patterns that describe, with sufficient accuracy, the influence of a particular environment on the measured parameters of the transferred data. For channels carrying digital packet data, improved models with memory can also be applied. This paper describes the main advantages of the Gilbert-Elliott algorithm for simulating bit errors in a communication channel of a cdma2000 mobile network; the algorithm is widely used in the development of data simulations. The Gilbert-Elliott model is a simple binary case of a finite-state Markov channel with two states, marked "good" (G) and "bad" (B). Each state may generate errors with its own error rate, eG and eB respectively. The result is a binary symmetric channel (BSC) with transition probability q for switching from the "good" state to the "bad" state and p for the reverse transition. The article shows that the grouping of errors into bursts can be reproduced only if the transition probabilities are much smaller than the probabilities of remaining in the same state; otherwise errors occur in random order. The error probability in the current state depends on the previous state, and in this way the channel acquires a memory property. Several external factors can cause data corruption in the radio channel of a cdma2000 system. They arise from user movement, multipath propagation, interaction with large obstacles, and interaction with other sources of radio waves or noise: free-space loss, slow fading, fast fading, interference, and noise.
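As an illustration of the two-state model described above, the per-bit simulation can be sketched as follows. The paper's implementation is in MATLAB; this is an equivalent Python sketch with function and parameter names of our choosing (q is the G-to-B transition probability, p the reverse, eG and eB the per-state error rates).

```python
import random

def gilbert_elliott(n_bits, q, p, e_g, e_b, seed=None):
    """Per-bit Gilbert-Elliott channel simulation.

    Returns a list of booleans, one per transmitted bit,
    where True marks a bit received in error.
    q    - probability of switching from "good" (G) to "bad" (B)
    p    - probability of switching from "bad" back to "good"
    e_g, e_b - bit-error rates in states G and B
    """
    rng = random.Random(seed)
    state = 'G'                     # start in the good state
    errors = []
    for _ in range(n_bits):
        # the error decision depends on the current state: this is
        # the "memory" that distinguishes the model from a plain BSC
        rate = e_g if state == 'G' else e_b
        errors.append(rng.random() < rate)
        # state transition for the next bit
        if state == 'G':
            if rng.random() < q:
                state = 'B'
        else:
            if rng.random() < p:
                state = 'G'
    return errors
```

When q and p are much smaller than the probabilities 1 - q and 1 - p of remaining in the same state, the errors produced by this sketch cluster into bursts during B-state sojourns, matching the grouping behavior noted above.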
A review of these factors causing bit errors, and of approaches to further mathematical modeling, is given. A CDMA system incorporates many recently developed advances in radio engineering and telecommunications. These are the result of impressive improvements in technical characteristics and software acceleration, which have moved today's cellular systems to an entirely new performance level, but which at the same time demand professionally developed, high-quality data models. As a result, a mathematical model of a state machine for a communication channel with error bursts has been developed in the MATLAB R2013b environment. A simple version of the developed model provides good statistical data, but it requires several calculations for each bit of data, which leads to low performance. Based on this investigation, a model with a significant performance improvement over the basic version is presented.
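The abstract does not detail how the faster model avoids the per-bit cost. One common optimization, sketched below under that assumption, is to sample the whole sojourn time in each state from a geometric distribution, so that transition draws happen once per state run rather than once per bit, and error draws can be skipped entirely in error-free runs. The names here are ours; the original model was built in MATLAB.

```python
import math
import random

def _geometric(rng, p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    if p >= 1.0:
        return 1
    return 1 + int(math.log(1.0 - rng.random()) / math.log(1.0 - p))

def gilbert_elliott_fast(n_bits, q, p, e_g, e_b, seed=None):
    """Run-length Gilbert-Elliott simulation (sketch of one common speed-up).

    Instead of one transition draw per bit, the sojourn time in each state
    is sampled geometrically; the result is statistically equivalent to
    the per-bit model but needs far fewer random draws when the channel
    stays in the error-free good state for long stretches.
    """
    rng = random.Random(seed)
    state_g = True                      # start in the "good" state
    errors = []
    while len(errors) < n_bits:
        leave = q if state_g else p     # probability of leaving this state
        rate = e_g if state_g else e_b  # error rate inside this state
        remaining = n_bits - len(errors)
        run = remaining if leave <= 0.0 else min(_geometric(rng, leave), remaining)
        if rate <= 0.0:
            errors.extend([False] * run)    # whole run error-free: no draws
        else:
            errors.extend(rng.random() < rate for _ in range(run))
        state_g = not state_g               # next sojourn is in the other state
    return errors
```

With typical burst-channel parameters (eG near zero and long good-state sojourns), most bits fall into runs that cost a single random draw for the run length, which is the kind of per-bit saving the abstract attributes to the improved model.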