Ethereal-users: RE: [Ethereal-users] Help automating Historical network capture-rollover

Note: This archive is from the project's previous web site, ethereal.com. This list is no longer active.

From: "Cory Perry (SNL:434-951-7463)" <CPerry@xxxxxxx>
Date: Fri, 18 Nov 2005 15:49:55 -0500
Not sure what I might be troubleshooting at any point in time, so it's
difficult to create a filter ahead of time. Same goes for packet size: if
I'm troubleshooting URL strings or session information, that data could
be deep within the packet.

I have thought of setting unlimited rollover, but I have been hitting my
noggin against a wall trying to figure out the best way to handle file
and space management: how to automatically delete older files without
running out of space for new files.
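
As a rough sketch of the clean-up side (the directory, file pattern, and
thresholds below are placeholder assumptions, not anything tested here),
a small Python script run from cron could delete the oldest capture files
whenever free disk space drops below a limit:

#!/usr/bin/env python3
"""Delete the oldest capture files when free disk space runs low.

Illustrative sketch only: the capture directory, file pattern, and
thresholds are assumptions, not values from this thread.
"""
import glob
import os
import shutil

CAPTURE_DIR = "/var/captures"      # assumed location of the rollover files
FILE_PATTERN = "trace_*.cap"       # assumed naming scheme of the capture files
MIN_FREE_BYTES = 5 * 1024 ** 3     # keep at least roughly 5 GB free
KEEP_AT_LEAST = 10                 # never delete below this many files

def free_bytes(path):
    """Free space on the filesystem holding path."""
    return shutil.disk_usage(path).free

def main():
    files = sorted(
        glob.glob(os.path.join(CAPTURE_DIR, FILE_PATTERN)),
        key=os.path.getmtime,      # oldest first
    )
    # Delete the oldest files until enough space is free or the floor is hit.
    while free_bytes(CAPTURE_DIR) < MIN_FREE_BYTES and len(files) > KEEP_AT_LEAST:
        victim = files.pop(0)
        os.remove(victim)
        print("removed", victim)

if __name__ == "__main__":
    main()

Run every few minutes from cron, something like this would keep the
capture directory from filling up while the capture itself keeps rolling
to new files.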

Like several people, I would like to take a vacation once in a blue
moon. ;)

Thanks for the response.

>Message: 1
>Date: Thu, 17 Nov 2005 11:00:30 -0500
>From: "David DuPre" <david@xxxxxxxxxxxxxxxx>
>Subject: RE: [Ethereal-users] Help automating Historical network capture-rollover
>
>To: "'Ethereal user support'" <ethereal-users@xxxxxxxxxxxx>
>Message-ID: <00e901c5eb90$09543a20$6a00a8c0@DellTechsup>
>
>You might consider capturing only partial packets. Try some tests with
>capturing only the first 90 bytes of each packet.
>Then analyze it... if that isn't enough, expand it to 180 bytes and check.
>You might find that you only need the first XXX bytes of the 1500-byte
>packet to understand the problem you are researching. This could reduce
>the amount of data.
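
As a rough illustration of that test (the interface, output file, and
60-second duration are placeholder assumptions), a short snaplen-limited
capture could be kicked off with tethereal's -s option:

import subprocess

# Placeholder values: interface, output file, and test duration are assumptions.
SNAPLEN = 90          # try 90 bytes first; raise to 180 if fields get cut off
INTERFACE = "eth0"
OUTFILE = "/tmp/snaplen_test.cap"

# -s limits each captured packet to SNAPLEN bytes; -a duration:60 stops the
# test capture after a minute so the file can be opened in Ethereal.
subprocess.run(
    ["tethereal", "-i", INTERFACE, "-s", str(SNAPLEN),
     "-a", "duration:60", "-w", OUTFILE],
    check=True,
)

If the fields of interest are truncated in the resulting file, raise the
snaplen and repeat.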
>
>Another possible option is to only capture packets with a payload... so
>nothing smaller than XX bytes would be captured.
>This could hide a network error, though...
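
The payload-only idea maps to a libpcap capture filter: the "greater"
primitive matches packets whose length is at least N bytes. A minimal
sketch, with an arbitrary 128-byte cutoff standing in for the XX above:

import subprocess

# Same placeholder setup as the sketch above; the 128-byte cutoff is arbitrary.
# "greater 128" is the libpcap capture-filter primitive for "length >= 128".
subprocess.run(
    ["tethereal", "-i", "eth0", "-f", "greater 128",
     "-w", "/tmp/payload_only.cap"],
    check=True,
)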
>
>Hope that helps,

>David

>P.S. I run Ethereal on Linux 24x7, capturing filtered traffic. I set it
>up for unlimited rollover at a specific file size. Then, if I need to
>analyze a certain part of a day, I use mergecap to put the files
>together and look at them as one large file.
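
To sketch that last step (the capture directory, file pattern, and time
window below are placeholder assumptions), the rollover files covering a
window of interest can be picked by modification time and handed to
mergecap, whose -w option names the merged output file:

import glob
import os
import subprocess
import time

# Placeholder assumptions: capture directory, naming pattern, and time window.
CAPTURE_DIR = "/var/captures"
FILE_PATTERN = "trace_*.cap"
WINDOW_START = time.mktime((2005, 11, 17, 9, 0, 0, 0, 0, -1))   # 09:00 local
WINDOW_END   = time.mktime((2005, 11, 17, 12, 0, 0, 0, 0, -1))  # 12:00 local

# Pick the rollover files whose modification times fall inside the window.
selected = [
    f for f in sorted(glob.glob(os.path.join(CAPTURE_DIR, FILE_PATTERN)))
    if WINDOW_START <= os.path.getmtime(f) <= WINDOW_END
]

# mergecap -w writes the merged capture built from the selected inputs.
subprocess.run(["mergecap", "-w", "merged.cap"] + selected, check=True)

mergecap merges the inputs by packet timestamp, so the result reads as
one continuous capture when opened in Ethereal.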