The common denominator of all wireless sensor networks (WSNs), regardless of their underlying application, is the use of the radio to communicate information extracted from the sensed environment and, more importantly, to coordinate with other nodes. Consequently, intelligent use of the radio is a critical component of wireless distributed systems in general and WSNs in particular. Because of the low-power nature of WSNs, the radio used for communication is especially susceptible to changes in the quality of the wireless medium; the resulting packet losses can be attributed to limited transmission power levels as well as to multipath effects arising from the lack of frequency diversity. Studies have confirmed that low-power wireless communication is unpredictable, is sensitive to changes in the environment, and changes significantly over different time scales.
Recent studies have revealed a wide chasm between real-world radio channel behavior and the radio channel models used in existing wireless simulators. This leads to significant differences between the performance of a system in simulation and in a real-world deployment. Improving wireless simulators by incorporating accurate and robust radio channel models will therefore narrow the gap between simulation and real-world performance. To reach this goal, we believe it is necessary to collect traces of packet reception information over long periods of time at fine granularity. Such data would be the seed for radio channel models that simulate more realistic packet losses, helping application designers increase the robustness of their applications by accounting in simulation for losses in the wireless medium.
The fundamental motivation for our modeling approach is that observed traces display structure at different temporal scales. In the figure, for example, one can see that over a period of minutes the link appears to switch between two states: one with PRR of approximately 0.6 and the other with PRR of approximately 0.8. We call this the long-term dynamics, wherein the PRR stays roughly constant for a period of a few seconds. Within these regimes of near-constant PRR, it is more likely to observe a bursty sequence such as 0000111111 than a wildly oscillating sequence such as 1010101101. We call this the short-term dynamics. In order to simulate the behavior of links realistically, we want a model that is flexible enough to replicate this multiscale structure, and we want to estimate its parameters (which determine its typical PRRs and its local burstiness) from observed traces.
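The two scales described above can be made concrete with simple statistics on a binary trace: a windowed packet reception rate for the long-term regimes, and the conditional probability of a reception following a reception as a proxy for short-term burstiness. The following sketch is illustrative only; the traces, window size, and burstiness measure are our own assumptions, not data or definitions from the paper.

```python
def windowed_prr(trace, window):
    """Long-term view: PRR over non-overlapping windows of the trace."""
    return [sum(trace[i:i + window]) / window
            for i in range(0, len(trace) - window + 1, window)]

def conditional_prob(trace, given, then):
    """Short-term view: P(next bit = then | current bit = given)."""
    num = sum(1 for a, b in zip(trace, trace[1:]) if a == given and b == then)
    den = sum(1 for a in trace[:-1] if a == given)
    return num / den if den else 0.0

# A bursty trace and an oscillating trace with the same overall PRR (0.6).
bursty = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1] * 5
oscillating = [1, 0, 1, 0, 1, 0, 1, 1, 0, 1] * 5

print(windowed_prr(bursty, 10))            # roughly constant PRR per window
print(conditional_prob(bursty, 1, 1))      # high: receptions cluster in runs
print(conditional_prob(oscillating, 1, 1)) # lower: receptions alternate
```

Although both traces have identical PRR, the conditional probability separates them, which is exactly the distinction a per-packet loss model must capture.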
We propose a novel multilevel approach involving Hidden Markov Models (HMMs) and Mixtures of Multivariate Bernoullis (MMBs) for modeling the long and short time scale behavior of links in wireless sensor networks, that is, the binary sequence or trace of packet receptions (1s) and losses (0s) on the link. We call our model the Multi-level Markov model or, simply, the M&M Model. In our approach, an HMM models the long-term evolution of the trace (level 1) as transitions among a set of unobserved level-1 states. These states typically correspond to a roughly constant packet reception rate (as determined by the data) and might correspond to different regimes of the link. Within each level-1 state, the short-term evolution of the trace (level 2) is modeled by either another HMM or an MMB. This captures the faster, yet not random, variations in the sequence of packet receptions and losses. For more details regarding the model, links to preprints of our paper are available.
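The two-level structure can be sketched as a generative process: a level-1 Markov chain switches between link regimes, and within each regime a level-2 chain emits the 0/1 packet trace. The sketch below is a minimal illustration of that structure, not the paper's fitted model; the number of regimes, the transition probabilities, and the per-regime burstiness values are all assumptions chosen for demonstration.

```python
import random

# Level 1: regimes persist for ~100 packets on average before switching.
# Row r gives P(next regime = 0) and P(next regime = 1) given regime r.
LEVEL1_TRANS = [[0.99, 0.01],
                [0.01, 0.99]]

# Level 2: P(next bit = 1 | current bit), per regime. With these values the
# stationary PRR is 0.6 in regime 0 and 0.8 in regime 1, and both are bursty.
LEVEL2_P1 = [{0: 0.3, 1: 0.8},
             {0: 0.4, 1: 0.9}]

def generate_trace(n, seed=0):
    """Generate a synthetic packet trace from the two-level sketch."""
    rng = random.Random(seed)
    regime, bit, trace = 0, 1, []
    for _ in range(n):
        # Level 1: possibly switch to the other regime.
        regime = 0 if rng.random() < LEVEL1_TRANS[regime][0] else 1
        # Level 2: emit the next bit conditioned on the previous bit.
        bit = 1 if rng.random() < LEVEL2_P1[regime][bit] else 0
        trace.append(bit)
    return trace

trace = generate_trace(5000)
print(sum(trace) / len(trace))  # overall PRR blends the two regimes
```

Fitting such a model to real traces (rather than fixing its parameters by hand, as here) is precisely the estimation problem the M&M Model addresses.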