[webkit-dev] 8 Bit Strings Turned On in JavaScriptCore

Adam Roben aroben at apple.com
Fri Nov 18 11:18:25 PST 2011


On Nov 18, 2011, at 2:14 PM, Michael Saboff wrote:

> Although the UChar* characters() method on the various string classes still works, all new code should check which "flavor" a string was constructed with by using the new is8Bit() method on those classes.  After determining the flavor, call either LChar* characters8() or UChar* characters16(), as appropriate, to access the raw characters of the string.  Calling characters() on an 8-bit string will allocate a 16-bit buffer, convert the native 8-bit string into it, and keep the converted copy for future use before returning the 16-bit result.  Obviously the expense of this conversion grows with the string's length, and it increases the memory footprint beyond what the original 16-bit string implementation required.
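
For reference, here's a minimal sketch of the pattern Michael describes, assuming the WTF::String API (is8Bit() / characters8() / characters16()); the two process* helpers are hypothetical placeholders, not real WebKit functions:

#include <wtf/text/WTFString.h>

// Hypothetical helpers standing in for whatever actually consumes the characters.
static void processLatin1(const LChar*, unsigned length);
static void processUTF16(const UChar*, unsigned length);

static void processString(const String& string)
{
    unsigned length = string.length();
    if (string.is8Bit()) {
        // 8-bit path: no conversion, no extra allocation.
        processLatin1(string.characters8(), length);
    } else {
        // Native 16-bit path.
        processUTF16(string.characters16(), length);
    }
    // By contrast, calling string.characters() here would allocate and cache
    // a 16-bit copy of an 8-bit string before returning it.
}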

I wonder what we could do to make it more obvious what the correct usage is. For example, we could rename characters() to make it clear that it might allocate a new buffer. And we could make it an error to call characters8() or characters16() on the wrong kind of string.

-Adam
