Wireshark-users: Re: [Wireshark-users] how to handle big files in wireshark
From: "j.snelders" <j.snelders@xxxxxxxxxx>
Date: Sat, 10 Jul 2010 12:15:02 +0200
Hi MK,

Did you try version 1.4.0rc1?
https://www.wireshark.org/lists/wireshark-dev/201006/msg00095.html
ftp://ftp.uni-kl.de/pub/wireshark/

Or the latest Automated Build?
http://www.wireshark.org/download/automated/

CACE Pilot
"Quickly open and analyze multi-gigabyte trace files"
"CACE Pilot is the only analysis tool to offer full integration with Wireshark."
http://www.cacetech.com/products/cace_pilot.html
Attend a QuickStart Webinar for the full hour and get a 10% discount off
the CACE Pilot or WiFi Pilot SRP.

Get a Full-Featured 10-Day Trial of CACE Pilot
http://www.cacetech.com/products/CACE_Pilot_eval_request.html

My best
Joke

On Fri, 9 Jul 2010 20:03:17 -0400 Maverick wrote:
>I am trying to extract application-level protocol information (HTTP, SSH,
>P2P, chat), and I am not good enough at programming to roll out my own
>solution using the libpcap library, so that's why I am relying on
>Wireshark's user interface. Is there an easier way I can learn to write my
>own solution? I tried some modules in Python and Perl, but they lack
>documentation. That's why I want to do my analysis in Wireshark: a lot of
>things are already implemented, and it gives me results in the shape of
>nice summarized reports.
>
>On Fri, Jul 9, 2010 at 7:51 PM, Bryan Hoyt | Brush Technology <
>bryan@xxxxxxxxxxx> wrote:
>
>> Yeah, those are big files. I work with files of hundreds of megabytes, so I
>> know how slow it can be. But I can imagine 7 GB files would be a
>> show-stopper.
>>
>> What sort of analysis are you wanting to do? Is it possible that a
>> roll-your-own solution using libpcap to iterate through the file would do
>> the trick? Or do you really need the interactive UI goodness of Wireshark?
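Bryan's roll-your-own suggestion is less daunting than it may sound: the classic pcap container is simple enough to walk without libpcap at all. The sketch below is my own illustration, not something from this thread; it is pure-stdlib Python and assumes a little-endian, microsecond-resolution classic pcap file (not pcapng):

```python
import struct

def iter_pcap_packets(path):
    """Yield (timestamp, raw_bytes) for each packet in a classic pcap file.

    Reads sequentially, one packet at a time, so memory use stays constant
    even for multi-gigabyte captures. Only handles the little-endian magic
    0xA1B2C3D4 (microsecond timestamps); pcapng is a different format.
    """
    with open(path, "rb") as f:
        header = f.read(24)                       # 24-byte global file header
        magic = struct.unpack("<I", header[:4])[0]
        if magic != 0xA1B2C3D4:
            raise ValueError("not a little-endian classic pcap file")
        while True:
            rec = f.read(16)                      # 16-byte per-packet record header
            if len(rec) < 16:
                break                             # end of file
            ts_sec, ts_usec, incl_len, _orig_len = struct.unpack("<IIII", rec)
            yield ts_sec + ts_usec / 1e6, f.read(incl_len)

# Example: count packets without ever loading the whole file into memory.
# count = sum(1 for _ in iter_pcap_packets("big.pcap"))
```

Because it streams one record at a time, memory use stays flat no matter how large the capture is, which is exactly the property Wireshark's in-memory model lacks here.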
>>
>>  - Bryan
>>
>> --
>> PS. Check out the Brush newsletter: *Subscribe or read our previous
>> newsletters* <http://brush.co.nz/articles>
>>
>> Bryan Hoyt, *Web Development Manager*  --  Brush Technology
>> *Ph:* +64 3 942 7833     *Mobile:* +64 21 238 7955
>> *Web:* brush.co.nz
>> On Sat, Jul 10, 2010 at 11:40, Maverick <myeaddress@xxxxxxxxx> wrote:
>>
>>> Bryan, you are right that that would improve performance a little, but
>>> in my case the pcap files are 6 or 7 GB, so disabling those features is
>>> not making much of a difference.
>>>
>>> MK
>>>
>>>
>>> On Fri, Jul 9, 2010 at 7:36 PM, Bryan Hoyt | Brush Technology <
>>> bryan@xxxxxxxxxxx> wrote:
>>>
>>>> I'm not an expert here, but isn't it possible to reduce the amount of
>>>> memory used by disabling all the protocols that you don't use (or even the
>>>> ones you do use, if you can live without them)?
>>>>
>>>> I think a lot of the memory usage comes from the specific protocols, not
>>>> just the Wireshark core.
>>>>
>>>> Correct me if I'm wrong.
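Bryan's suggestion can also be scripted rather than clicked through: Wireshark persists the set of disabled dissectors in a plain-text `disabled_protos` file in its profile directory (the exact location varies by version and platform; the GUI equivalent is Analyze → Enabled Protocols). A sketch of such a file — the protocol filter names below are examples I chose, not a recommendation from this thread:

```
# disabled_protos: one protocol filter name per line
ipx
llc
ipv6
```

Disabling dissectors you never look at can reduce per-packet dissection state, though as MK notes below, it may not help much on multi-gigabyte captures.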
>>>>
>>>>  - Bryan
>>>>
>>>> On Sat, Jul 10, 2010 at 08:10, Maverick <myeaddress@xxxxxxxxx> wrote:
>>>>
>>>>> Thanks for the response. If I break the capture down into many pcap
>>>>> files, is there any way to have access to all of those files at once?
>>>>> For example, if I select the Follow Stream option, would it be possible
>>>>> to get streams that continue in the other files?
>>>>>
>>>>> Thanks
>>>>> MK
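One way to keep a conversation intact across a split is to pre-filter it out of the big capture from the command line (tshark's read filter or editcap can do this) rather than splitting blindly. If you do split, note that each piece needs its own copy of the 24-byte pcap global header to open in Wireshark on its own. A stdlib-Python sketch of such a splitter — my own illustration, assuming classic little-endian pcap input; `split_pcap` and the chunk-naming scheme are invented here:

```python
import struct

def split_pcap(path, packets_per_file=100_000):
    """Split a classic little-endian pcap into valid smaller pcap files.

    Each output chunk gets its own copy of the original 24-byte global
    header, so every piece opens in Wireshark by itself. Returns the list
    of chunk paths (named <path>.001, <path>.002, ...).
    """
    out_paths = []
    with open(path, "rb") as src:
        global_hdr = src.read(24)                 # reused for every chunk
        out, n, idx = None, 0, 0
        while True:
            rec = src.read(16)                    # per-packet record header
            if len(rec) < 16:
                break                             # end of input
            incl_len = struct.unpack("<IIII", rec)[2]
            data = src.read(incl_len)
            if out is None or n == packets_per_file:
                if out:
                    out.close()
                idx += 1
                chunk_path = f"{path}.{idx:03d}"
                out = open(chunk_path, "wb")
                out.write(global_hdr)             # make the chunk a valid pcap
                out_paths.append(chunk_path)
                n = 0
            out.write(rec + data)
            n += 1
        if out:
            out.close()
    return out_paths
```

Splitting on packet count this way still cuts TCP streams at arbitrary boundaries, which is why filtering the interesting traffic out first is usually the better move.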
>>>>>
>>>>>
>>>>> On Fri, Jul 9, 2010 at 3:57 PM, Guy Harris <guy@xxxxxxxxxxxx> wrote:
>>>>>
>>>>>>
>>>>>> On Jul 9, 2010, at 12:46 PM, Maverick wrote:
>>>>>>
>>>>>> > I have huge pcap files, gigabytes in size, which I want to analyze
>>>>>> > using Wireshark, but Wireshark is extremely slow and crashes while
>>>>>> > opening those files. I tried breaking those files into smaller files,
>>>>>> > but that's not a very good solution, as I have to open up each file
>>>>>> > and sometimes the relationship between files gets lost.
>>>>>> >
>>>>>> > Is there a decent way to handle huge files in Wireshark?
>>>>>>
>>>>>> For now, the only way is "use a 64-bit version of Wireshark, make sure
>>>>>> you have enough disk space/swap space to back up a large virtual address
>>>>>> space, and live with the slowness".
>>>>>>
>>>>>> There may be changes in the future to reduce the memory requirements,
>>>>>> but they're not trivial to make.