> I even put some printf statements in packet-eth.c, right before it calls
> ethertype(). When I decode with ethereal, I see it, when I decode with
> tethereal -V, I don't.
Try running tethereal under a debugger, with a breakpoint set at that
point in "packet-eth.c", and then step through to see what's happening.
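A session along these lines should do it (this is a sketch: it assumes
tethereal was built with debugging symbols, and the capture path is a
placeholder, not a real file):

```
$ gdb tethereal
(gdb) break ethertype
(gdb) run -V -r /path/to/your/capture.pcap
(gdb) step
```

If the breakpoint never fires with "-V", something is going wrong
before the call; if it does fire, you can step into "ethertype()" and
watch the dispatch happen.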
Given that "tethereal -V" *does* recognize IP packets using an ethertype
of 0x0800:
tooting$ tethereal -V -r /u/guy/captures/dhcp-crap.pcap.gz
...
Frame 4 (78 on wire, 78 captured)
Arrival Time: May 19, 1999 17:48:41.0524
Time delta from previous packet: 0.100770 seconds
Frame Number: 4
Packet Length: 78 bytes
Capture Length: 78 bytes
Ethernet II
Destination: XX:XX:XX:XX:XX:XX (XX:XX:XX:XX:XX:XX)
Source: YY:YY:YY:YY:YY:YY (YY:YY:YY:YY:YY:YY)
Type: IP (0x0800)
Internet Protocol
Version: 4
Header length: 20 bytes
Differentiated Services Field: 0xc0 (DSCP 0x30: Class Selector 6)
1100 00.. = Differentiated Services Codepoint: Class Selector 6 (0x30)
.... ..00 = Currently Unused: 0
Total Length: 64
Identification: 0x2571
Flags: 0x04
.1.. = Don't fragment: Set
..0. = More fragments: Not set
Fragment offset: 0
Time to live: 1
Protocol: OSPF (0x59)
Header checksum: 0x64b6 (correct)
Source: XXX.netapp.com (XX.XX.XX.XX)
...
it's next to impossible that, in a correctly-compiled Tethereal, it
would fail to call "ethertype()" if "-V" is specified, as the only way
for it *to* dissect that packet as IP is to call "ethertype()"....
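That dispatch is just a switch on the two-byte type field that follows
the destination and source addresses in the Ethernet II header. A
minimal, self-contained sketch of the idea (the names here are
illustrative, not the actual Ethereal API):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define ETHERTYPE_IP  0x0800
#define ETHERTYPE_ARP 0x0806

/* Read the big-endian type field at offset 12 of an Ethernet II frame
 * (6-byte destination + 6-byte source precede it). */
static uint16_t eth_type(const uint8_t *frame)
{
    return (uint16_t)((frame[12] << 8) | frame[13]);
}

/* Map an ethertype to the protocol a dissector would hand off to;
 * a hypothetical stand-in for what ethertype() selects. */
static const char *ethertype_name(uint16_t etype)
{
    switch (etype) {
    case ETHERTYPE_IP:  return "IP";
    case ETHERTYPE_ARP: return "ARP";
    default:            return "Unknown";
    }
}
```

So for the frame shown above, the 0x0800 in the type field is what
routes the payload to the IP dissector; there is no other path to an
"Internet Protocol" line in the output.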