On Nov 12, 2003, at 4:39 PM, Brad Hards wrote:
> That is in trouble if the g_malloc fails (which I assume it can - glib
> documentation isn't that great).
No - "g_malloc()" aborts on failure:
http://developer.gnome.org/doc/API/glib/glib-memory-allocation.html
"Note: If any call to allocate memory fails, the application is
terminated. This also means that there is no need to check if the call
succeeded."
and
http://developer.gnome.org/doc/API/2.0/glib/glib-Memory-Allocation.html
"If any call to allocate memory fails, the application is terminated.
This also means that there is no need to check if the call succeeded."
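For instance (just a sketch, with a made-up helper name; the point is
only that the return value of "g_malloc()" never needs a NULL check):

    #include <glib.h>
    #include <string.h>

    /* Copy "len" bytes from a packet into a NUL-terminated buffer.
       g_malloc() aborts the whole application if the allocation
       fails, so the return value is never NULL and needs no check. */
    static gchar *
    copy_field(const guint8 *src, gsize len)
    {
        gchar *buf = g_malloc(len + 1);

        memcpy(buf, src, len);
        buf[len] = '\0';
        return buf;
    }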
> Also, is the output of sprintf assured to be only the size of the
> formatted input strings?
No, but the output of "snprintf()" is guaranteed not to exceed the
specified buffer length; "snprintf()" should probably be used instead
of "sprintf()" (or use "g_strdup_printf()" or "g_string_sprintf()",
which allocate the result rather than writing into a fixed-size
buffer).
> I'm thinking about platforms that use unicode variations...
No strings should be Unicode inside Ethereal - "sprintf()" and company
should never use Unicode.
In the long term, Ethereal should probably use UTF-8 and/or 2-byte
Unicode internally for string values from the packet and for the
"representation" string for protocol tree entries, and
1) let the GUI toolkit handle displaying it for GTK+ 2.x (and other
GUI toolkits that use Unicode or UTF-8 natively) and, for GTK+ 1.2[.x],
do whatever is appropriate to display whatever characters it can;
2) deal with text output to files, and printed output, at the time
they're generated. (Text output to files is tricky - on Windows it
should perhaps use Unicode, but I don't know whether all the editors
one might use on them can handle 2-byte Unicode text files; on Unix, it
might be UTF-8, or some other character set.)
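As a rough sketch of what the string-handling part might look like
(the source character set here is just an assumption - in practice
each dissector would have to know, or guess, the encoding of the
field it's extracting):

    #include <glib.h>

    /* Convert a string taken from a packet into UTF-8 for use in
       the protocol tree.  The source is assumed to be ISO-8859-1;
       that's a placeholder, not a claim about any real protocol. */
    static gchar *
    packet_string_to_utf8(const guint8 *data, gsize len)
    {
        GError *err = NULL;
        gchar *utf8;

        utf8 = g_convert((const gchar *)data, (gssize)len,
            "UTF-8", "ISO-8859-1", NULL, NULL, &err);
        if (utf8 == NULL) {
            /* Fall back to a placeholder if the conversion fails. */
            g_clear_error(&err);
            utf8 = g_strdup("<unconvertible string>");
        }
        return utf8;
    }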