Scientific approach:
Traditional queuing theory techniques based on Poisson traffic models were
essential for the development of telephone networks. Today's multimedia
applications produce complex traffic patterns that arise from the
statistical multiplexing of data, voice, image, and video streams. For
networks carrying such diverse applications, traditional traffic models
have proved inadequate, failing to capture essential characteristics of
the traffic. In such an environment, computer simulation and empirical
techniques have begun to play an important role in designing current and
future networks. We use hardware and software tools to capture traffic
traces from local and wide area networks, and we apply modern statistical
techniques to characterize the collected data; one such technique is
sketched below.
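
As one illustration, the variance-time (aggregated variance) method
estimates the Hurst parameter H, the standard measure of self-similarity,
from how the variance of block averages decays with block size. The
Python sketch below is a minimal, illustrative version; the function and
variable names (hurst_aggregated_variance, counts) are my own, not the
specific tooling used in this work.

# A minimal sketch of the aggregated-variance method for estimating the
# Hurst parameter H of a packet-count series. The input `counts` (packets
# per time bin) stands in for data captured from a real trace.
import numpy as np

def hurst_aggregated_variance(counts, block_sizes=None):
    """Estimate H from the scaling of the variance of block averages.

    For a self-similar series, Var(X^(m)) ~ m^(2H - 2), so the slope of
    log Var(X^(m)) versus log m gives beta = 2H - 2, i.e. H = 1 + beta/2.
    """
    counts = np.asarray(counts, dtype=float)
    n = len(counts)
    if block_sizes is None:
        # Logarithmically spaced block sizes up to n / 10.
        block_sizes = np.unique(
            np.logspace(0, np.log10(n / 10), 20).astype(int))
    log_m, log_var = [], []
    for m in block_sizes:
        k = n // m                               # number of complete blocks
        blocks = counts[: k * m].reshape(k, m).mean(axis=1)
        v = blocks.var()
        if v > 0:
            log_m.append(np.log(m))
            log_var.append(np.log(v))
    beta, _ = np.polyfit(log_m, log_var, 1)      # slope of variance-time plot
    return 1.0 + beta / 2.0

# Example: independent (Poisson) counts should give H close to 0.5.
rng = np.random.default_rng(0)
print(hurst_aggregated_variance(rng.poisson(10, 100_000)))

For independent traffic the estimate falls near H = 0.5; estimates
approaching 1 indicate long-range dependence, the signature reported for
measured network traffic.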
Novelty of research:
Internet traffic characterization has only recently been shown to be
promising, owing to the traffic "invariants" detected in measured traces.
Even with emerging traffic models now available, it is not yet known what
impact they will have on the design and provisioning of data networks, or
on the selection of connection admission and congestion control
algorithms. The goal of my research is to answer some of these questions.
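
One well-known example of such a model is the heavy-tailed on/off
construction: aggregating many sources whose on and off period lengths
are Pareto-distributed with tail index 1 < alpha < 2 yields approximately
self-similar traffic with H = (3 - alpha) / 2. The sketch below is a
minimal illustration with made-up parameters (onoff_trace, alpha = 1.5,
200 sources), not the specific models studied in this research.

# A minimal sketch of the classical heavy-tailed on/off construction.
# All parameter values here are illustrative.
import numpy as np

def onoff_trace(n_sources=200, n_slots=50_000, alpha=1.5, rng=None):
    """Aggregate packet counts per slot from Pareto on/off sources."""
    if rng is None:
        rng = np.random.default_rng()
    counts = np.zeros(n_slots)
    for _ in range(n_sources):
        t, on = 0, rng.random() < 0.5            # random initial state
        while t < n_slots:
            # Pareto(alpha) period length, minimum one slot.
            period = int(np.ceil(rng.pareto(alpha) + 1))
            if on:
                counts[t : t + period] += 1      # one packet per slot while on
            t += period
            on = not on
    return counts

trace = onoff_trace(rng=np.random.default_rng(1))
# Using hurst_aggregated_variance from the earlier sketch:
print(hurst_aggregated_variance(trace))          # expect H near (3 - 1.5) / 2 = 0.75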
Significance:
The essential difference between traditional and self-similar traffic
models has practical implications for the engineering of communication
networks (buffer requirements in routers and Web servers, admission and
congestion control algorithms, traffic management, and Quality of Service
requirements); the queueing sketch below illustrates the effect on buffer
sizing. Other important areas that may benefit from reliable and
meaningful traffic traces are novel pricing policies and tariffing
strategies for existing and future Internet services.
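
As a minimal illustration of the buffer-requirements point, the sketch
below feeds a discrete-time single-server queue (a Lindley recursion)
with a Poisson trace and with the self-similar on/off trace generated
above, at the same mean load. The service rate and utilization are
illustrative choices, not figures from this research; under long-range
dependent input the queue-length tail is typically far heavier, so
meeting the same loss target requires much larger buffers.

# A minimal sketch: compare tail queue lengths under Poisson and
# self-similar input at the same mean load (~90% utilization).
import numpy as np

def queue_lengths(arrivals, service_rate):
    """Lindley recursion: q[t] = max(0, q[t-1] + arrivals[t] - service_rate)."""
    q, out = 0.0, np.empty(len(arrivals))
    for t, a in enumerate(arrivals):
        q = max(0.0, q + a - service_rate)
        out[t] = q
    return out

rng = np.random.default_rng(2)
lrd = onoff_trace(rng=rng)                       # long-range dependent trace
poisson = rng.poisson(lrd.mean(), len(lrd))      # Poisson trace, same mean

rate = 1.1 * lrd.mean()                          # fixed service rate
for name, a in [("poisson", poisson), ("self-similar", lrd)]:
    q = queue_lengths(a, rate)
    print(f"{name:>12}: mean queue {q.mean():.1f}, "
          f"99.9th pct {np.percentile(q, 99.9):.1f}")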