On Sat, Jan 27, 2001 at 10:54:18AM -0500, Ed Warnicke wrote:
> Is there anyone else interested in looking at what it would take to
> allow ethereal to function efficiently within a memory budget for
> very large captures?
I already know a lot of what it would take.
It'd take replacing the GtkCList with a widget that doesn't allocate a
copy of every single string in every single row and column of the packet
list, but instead calls a callback function to get the text to put in a
given row and column. That callback function would then load the packet
for that row (if it's not already loaded), dissect it (if it's not
already dissected), and return the string for the column in question.
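To illustrate the idea (in Python rather than C/GTK+, and with made-up
names - this is not the actual widget, just a sketch of the callback
pattern), the list stores no cell text at all; it asks the application
for a cell's text on demand, and the application loads and dissects a
packet only the first time one of its cells is drawn:

```python
# Sketch of a "virtual" packet list: the widget keeps no copies of the
# cell strings; it calls back into the application when drawing a cell.
# All names here are hypothetical - this is not Ethereal code.

class VirtualList:
    def __init__(self, n_rows, get_cell_text):
        self.n_rows = n_rows
        self.get_cell_text = get_cell_text  # callback(row, col) -> str

    def draw_rows(self, first, last):
        # Called when rows [first, last] scroll into view.
        return [[self.get_cell_text(r, c) for c in range(3)]
                for r in range(first, last + 1)]

class CaptureFile:
    """Pretend capture: loads and dissects packets lazily, caching results."""
    def __init__(self, n_packets):
        self.n_packets = n_packets
        self.cache = {}  # row -> dissected column strings

    def cell_text(self, row, col):
        if row not in self.cache:
            # In real life: seek to the packet's offset in the capture
            # file, read it, dissect it, and keep the column strings.
            self.cache[row] = [str(row + 1), "10.0.0.%d" % row, "TCP"]
        return self.cache[row][col]

cap = CaptureFile(1000000)   # a million packets, none loaded up front
plist = VirtualList(cap.n_packets, cap.cell_text)
print(plist.draw_rows(0, 1))
print(len(cap.cache))        # only the two visible rows got dissected
```

The point being that memory use scales with what's on screen (plus
whatever cache you choose to keep), not with the size of the capture.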
I have such a widget, and a very much *N*O*T*-ready-for-prime-time
version of Ethereal that uses it; however, making Ethereal handle that
involves a lot more than just sticking that widget in there. For
example, in order to make scrolling through a large compressed capture
not suck, we'd have to replace the current code, which just uses
"gzread()" and company to read stuff from a capture file, with code
that (along the lines of what the code to read a compressed Sniffer
file does) keeps track of the offsets, in both the raw file and the
uncompressed data stream, of the beginning of each compressed chunk.
A seek to a random offset in the uncompressed stream would then figure
out which compressed chunk that offset falls in, read and decompress
that entire chunk if it's not already in the buffer, and move to the
appropriate offset within that buffer.
*That* code isn't written yet.
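A rough sketch of what it'd have to do, though (in Python for brevity,
using independently-compressed zlib chunks as a stand-in for the
Sniffer-style chunked format - none of this is the real code):

```python
import zlib

CHUNK = 4096  # uncompressed bytes per independently-compressed chunk

def write_chunked(data):
    """Compress 'data' as independent chunks; return (raw_bytes, index).
    index[i] = (raw_offset, uncompressed_offset) of chunk i."""
    raw, index, raw_off = b"", [], 0
    for uoff in range(0, len(data), CHUNK):
        comp = zlib.compress(data[uoff:uoff + CHUNK])
        index.append((raw_off, uoff))
        raw += comp
        raw_off += len(comp)
    return raw, index

def read_at(raw, index, target):
    """Read from uncompressed offset 'target', decompressing one chunk."""
    # Find the last chunk whose uncompressed offset is <= target.
    # (Real code would binary-search the index, not scan it.)
    i = max(n for n, (_, uoff) in enumerate(index) if uoff <= target)
    raw_off, uoff = index[i]
    raw_end = index[i + 1][0] if i + 1 < len(index) else len(raw)
    chunk = zlib.decompress(raw[raw_off:raw_end])  # just this chunk
    return chunk[target - uoff:]

data = bytes(range(256)) * 100   # 25600 bytes of sample "capture" data
raw, index = write_chunked(data)
assert read_at(raw, index, 10000)[:16] == data[10000:10016]
```

The index is what makes the random seek cheap: one chunk gets
decompressed per seek, rather than everything from the start of the
file up to the target offset, which is what you'd get with plain
"gzseek()"-style forward reading.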