[Webkit-unassigned] [Bug 34560] Typedef JSChar to wchar_t in RVCT.

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Tue Feb 9 06:17:12 PST 2010


https://bugs.webkit.org/show_bug.cgi?id=34560

--- Comment #8 from Janne Koskinen <koshuin at gmail.com>  2010-02-09 06:17:10 PST ---
(In reply to comment #7)
> the type compatibility in C++ mode.
> 
> "a.cpp", line 13: Error:  #167: argument of type "JSChar *" is incompatible
> with parameter of type "UChar *"
>       foo(j);
>           ^
> a.cpp: 0 warnings, 1 error
> 
> 
> wchar_t in C++ is not a simple typedef of short int, it is a unique type with
> size of 2. So we need to typedef both JSChar and UChar to wchar_t for
> compatibility.

I find this piece odd. I ran your example with --cpp and --c90 and observed the
same results. However, if you look at the changes made to WebKit by the Symbian
port, you can see we don't define this and it works :S
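
Roughly the kind of test case that trips that #167 error in C++ mode (the
typedef names here are just for illustration, not taken from the actual patch):

// a.cpp
typedef wchar_t JSChar;        // JSChar typedef'd to wchar_t, as the patch proposes
typedef unsigned short UChar;  // UChar left as a plain 16-bit integer type

void foo(UChar*);

void bar(JSChar* j)
{
    foo(j); // RVCT in C++ mode: #167: argument of type "JSChar *" is
            // incompatible with parameter of type "UChar *", because
            // wchar_t is a distinct built-in type in C++
}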

Note that with Qt in JavaScriptCore/wtf/unicode/UnicodeQt4.h we have:

// ugly hack to make UChar compatible with JSChar in API/JSStringRef.h
#if defined(Q_OS_WIN) || COMPILER(WINSCW)
typedef wchar_t UChar;
#else
typedef uint16_t UChar;
#endif

That part is probably missing from your patch, i.e. use the 'ugly hack' for RVCT
as well.
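
In other words, something along these lines (untested; assuming RVCT is
detected with COMPILER(RVCT), the same way WINSCW is):

// ugly hack to make UChar compatible with JSChar in API/JSStringRef.h
#if defined(Q_OS_WIN) || COMPILER(WINSCW) || COMPILER(RVCT)
typedef wchar_t UChar;
#else
typedef uint16_t UChar;
#endif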

-Janne

-- 
Configure bugmail: https://bugs.webkit.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
