[Webkit-unassigned] [Bug 50794] [chromium] const char* used for strings in a few places in the WebKit API

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Thu Dec 9 18:26:17 PST 2010


https://bugs.webkit.org/show_bug.cgi?id=50794


James Robinson <jamesr at chromium.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
            Summary|[chromium] Change           |[chromium] const char* used
                   |WebGraphicsContext3D to use |for strings in a few places
                   |const WebString& instead of |in the WebKit API
                   |const char*                 |
          Component|WebGL                       |New Bugs
         AssignedTo|webkit-unassigned at lists.web |jamesr at chromium.org
                   |kit.org                     |
                 CC|                            |fishd at chromium.org




--- Comment #1 from James Robinson <jamesr at chromium.org>  2010-12-09 18:26:17 PST ---
Broadening the scope.  I think it'll be easier to do a change like this across all Chromium WebKit APIs at once (which I don't mind doing).

A few bits of the chromium WebKit API use const char* for string data of a known encoding rather than something like WebString/WebCString.  I think we should eliminate these and only use const char*/size_t pairs for arbitrary binary data.  Current cases:

WebKitClient uses const char* for strings in the stats counter, tracing, and histogram APIs.  It looks like the original use of const char* was inherited from the initial import into the WebKit repo and then more callers followed suit.  The callers to this API typically pass in string literals, so WebCString seems like a good fit especially as strlen() will be a compile-time constant for these callers.
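The benefit for string-literal callers can be sketched with a hypothetical, simplified stand-in for WebCString (not the real class): the length travels with the pointer, so the implementation never has to call strlen() itself.

```cpp
#include <cstring>

// Hypothetical simplified stand-in for WebKit's WebCString. It carries
// the bytes together with their length, so the length is computed once
// at the call site; for string literals the compiler can often fold the
// strlen() into a constant.
class CStringParam {
public:
    // Implicit construction from a C string, so call sites that pass
    // literals (as the stats counter / histogram callers do) keep working.
    CStringParam(const char* s) : data_(s), length_(std::strlen(s)) {}

    const char* data() const { return data_; }
    size_t length() const { return length_; }

private:
    const char* data_;
    size_t length_;
};

// Hypothetical histogram-style entry point mirroring the shape of the
// WebKitClient counter APIs: it reads the stored length instead of
// re-running strlen() on every call.
size_t recordedNameLength(const CStringParam& name) {
    return name.length();
}
```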

WebGraphicsContext3D uses const char* to mirror OpenGL APIs.  In this case the input is sometimes a string literal and sometimes a programmatically computed string (possibly generated from JavaScript), but the strings always travel through a WTF::String before reaching WebGraphicsContext3D.  It looks like callers compute this string by calling str.utf8().data() but the implementation then uses strlen() on the string, assuming only ASCII data.  I'm pretty sure that's harmless in this case, but it is a bit odd.  The typical use case here is ASCII, so my inclination is to define the API as WebCString and make sure that callers convert from WTF::String to ascii().
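A minimal sketch of the mismatch (using std::string as a stand-in, not the real WTF::String API): the implementation's strlen() on the utf8().data() buffer counts UTF-8 bytes. For ASCII input that equals the character count; for non-ASCII it silently diverges, which is why pinning the API to a length-carrying type and converting via ascii() would make the assumption explicit.

```cpp
#include <cstring>
#include <string>

// Stand-in for what the WebGraphicsContext3D implementation does today:
// it receives the raw bytes from str.utf8().data() and recomputes the
// length with strlen(), which counts bytes, not characters.
size_t lengthSeenByImplementation(const std::string& utf8Bytes) {
    return std::strlen(utf8Bytes.c_str());
}
```

For a plain-ASCII shader source like "void main() {}" the byte count and character count agree (14), but for "café" encoded as UTF-8 ("caf\xC3\xA9") strlen() reports 5 bytes for 4 characters.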

My plan of attack:

1.) Add implementations for affected functions in the chromium repo that accept WebCString alongside the implementations that accept const char*.  The new implementations will be (temporarily) unreachable but should compile fine alongside the old ones since they differ in signature.
2.) Wait for rolls
3.) Switch the WebKit APIs over to use the new types
4.) Wait for rolls pt 2
5.) Remove the const char* implementations from Chromium.
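Step 1 relies on ordinary overload resolution, which can be sketched like this (hypothetical names, not the actual Chromium entry points): the old const char* function and the new string-type function coexist because their signatures differ, and existing callers keep resolving to the old one until step 3 switches them over.

```cpp
#include <string>

// Hypothetical stand-in for a WebCString-style parameter type.
struct CStringParam {
    CStringParam(const std::string& s) : value(s) {}
    std::string value;
};

// Old entry point, kept temporarily so existing WebKit callers still link.
int traceEventVersion(const char* /*name*/) { return 1; }

// New entry point added alongside it in the Chromium repo; it is
// (temporarily) unreachable but compiles fine since the signature differs.
int traceEventVersion(const CStringParam& /*name*/) { return 2; }
```

A string literal still picks the const char* overload (the conversion chain to CStringParam would need two user-defined conversions), so nothing changes for callers until the WebKit side is updated.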

Darin, as arbiter of the chromium WebKit API, what do you think of all the above?  Should we also add a lint rule for const char* without size_t?

-- 
Configure bugmail: https://bugs.webkit.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
