Document Type

Technical Report

Publication Date

2004-05-21

Filename

wucse-2004-28.pdf

Technical Report Number

WUCSE-2004-28

Abstract

Due to the importance and complexity of the packet classification problem, a myriad of algorithms and resulting implementations exist. The performance and capacity of many algorithms and classification devices, including TCAMs, depend upon properties of the filter set and query patterns. Unlike microprocessors in the field of computer architecture, there are no standard performance evaluation tools or techniques available to evaluate packet classification algorithms and products. Network service providers are reluctant to distribute copies of real filter sets for security and confidentiality reasons, hence realistic test vectors are a scarce commodity. The small subset of the research community that obtains real filter sets either limits performance evaluation to the small sample space or employs ad hoc methods of modifying those filter sets. In response to this problem, we present ClassBench, a suite of tools for benchmarking packet classification algorithms and devices. ClassBench includes a Filter Set Generator that produces synthetic filter sets that accurately model the characteristics of real filter sets. Along with varying the size of the filter sets, we provide high-level control over the composition of the filters in the resulting filter set. The tool suite also includes a Trace Generator that produces a sequence of packet headers to exercise the synthetic filter set. Along with specifying the relative size of the trace, we provide a simple mechanism for controlling locality of reference in the trace. While we have already found ClassBench to be very useful in our own research, we seek to initiate a broader discussion and solicit input from the community to guide the refinement of the tools and codification of a formal benchmarking methodology.
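
For readers unfamiliar with the problem domain, the following is a minimal, hypothetical sketch of the kind of 5-tuple filters and header lookups that synthetic filter sets and header traces are meant to exercise. It is not part of the ClassBench tool suite; all names, field values, and the naive linear-search classifier are illustrative assumptions only.

```python
# Illustrative sketch (not ClassBench): a 5-tuple filter and a naive
# first-match classifier, to make concrete what a "filter set" and a
# "header trace" contain. All values here are hypothetical examples.
from dataclasses import dataclass
from ipaddress import ip_network, ip_address

@dataclass
class Filter:
    src_prefix: str        # source IP prefix, e.g. "10.0.0.0/8"
    dst_prefix: str        # destination IP prefix
    src_ports: tuple       # inclusive source-port range (lo, hi)
    dst_ports: tuple       # inclusive destination-port range (lo, hi)
    protocol: int          # IP protocol number, e.g. 6 for TCP
    action: str            # action bound to the filter

    def matches(self, hdr):
        # A header matches when every field falls within the filter's ranges.
        return (ip_address(hdr["src"]) in ip_network(self.src_prefix)
                and ip_address(hdr["dst"]) in ip_network(self.dst_prefix)
                and self.src_ports[0] <= hdr["sport"] <= self.src_ports[1]
                and self.dst_ports[0] <= hdr["dport"] <= self.dst_ports[1]
                and self.protocol == hdr["proto"])

def classify(filters, hdr):
    """Return the action of the first (highest-priority) matching filter."""
    for f in filters:
        if f.matches(hdr):
            return f.action
    return "default"

# One synthetic-style filter and one packet header drawn from a trace.
filters = [Filter("10.0.0.0/8", "192.168.1.0/24", (0, 65535), (80, 80), 6, "permit")]
hdr = {"src": "10.1.2.3", "dst": "192.168.1.7", "sport": 4242, "dport": 80, "proto": 6}
print(classify(filters, hdr))  # -> "permit"
```

Real classifiers avoid this linear search; the point of ClassBench is to supply realistic filter sets and traces against which such algorithms and devices can be measured.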

Comments

Permanent URL: http://dx.doi.org/10.7936/K7P55KV9
