[webkit-dev] type of JSChar
Artem Ananiev
Artem.Ananiev at Sun.COM
Thu Aug 9 02:30:38 PDT 2007
Darin Adler wrote:
> On Jul 27, 2007, at 4:03 AM, Alexey Proskuryakov wrote:
>
>> On 7/27/07 1:51 PM, "Simon Hausmann" <hausmann at kde.org> wrote:
>>
>>> Does anybody know/remember why JSChar is defined to wchar_t on
>>> Windows and if
>>> it is still needed?
>>
>> I think this was/is needed to match ICU's definition of UChar ("Define
>> UChar to be wchar_t if that is 16 bits wide; always assumed to be
>> unsigned. If wchar_t is not 16 bits wide, then define UChar to be
>> uint16_t. This makes the definition of UChar platform-dependent but
>> allows direct string type compatibility with platforms with 16-bit
>> wchar_t types.")
>
> That's correct. And for the same reasons we should follow the same
> pattern for UChar, even when ICU is not involved.
>
> I think that UnicodeQt4.h is the file that should be changed, not
> JSStringRef.h.
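For reference, the pattern that ICU comment describes boils down to
something like this (a sketch, not ICU's exact source; U_SIZEOF_WCHAR_T
is the macro ICU's build configuration sets to sizeof(wchar_t)):

    #include <stdint.h>

    #if U_SIZEOF_WCHAR_T == 2
        typedef wchar_t  UChar;   /* 16-bit wchar_t, e.g. Windows */
    #else
        typedef uint16_t UChar;   /* everything else */
    #endif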
You say WebKit tries to follow ICU's UChar. If so, the actual size of
wchar_t must be examined instead of just checking for WIN32 or _WIN32;
otherwise the problem reappears on every Windows/g++ build where
neither of those two macros is defined. In the case of Qt (which
doesn't use ICU), the problem is easily solved: define UChar in
UnicodeQt4.h the same way ICU does (see Simon's last email). But what
about other platforms that don't have their own Unicode implementation?
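One portable option would be to test WCHAR_MAX rather than the platform
macros. A sketch, assuming a C99 toolchain where WCHAR_MAX (from
<wchar.h>) is usable in #if directives, as the standard requires:

    #include <wchar.h>    /* WCHAR_MAX (C99) */
    #include <stdint.h>

    #if WCHAR_MAX == 0xFFFF
        /* wchar_t is 16 bits wide (true for both MSVC and MinGW/g++
           on Windows), so JSChar stays compatible with wchar_t. */
        typedef wchar_t  JSChar;
    #else
        /* wchar_t is wider (e.g. 32 bits on most Unix systems);
           fall back to an explicit 16-bit type. */
        typedef uint16_t JSChar;
    #endif

That would make the definition depend on the property we actually care
about (the width of wchar_t) instead of on which compiler defines which
Windows macro.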
Thanks,
Artem