[webkit-dev] 8 Bit Strings Turned On in JavaScriptCore

Michael Saboff msaboff at apple.com
Fri Nov 18 11:31:20 PST 2011


On Nov 18, 2011, at 11:18 AM, Adam Roben wrote:

> On Nov 18, 2011, at 2:14 PM, Michael Saboff wrote:
> 
>> Although the UChar* characters() method for the various string classes still works, all new code should check which "flavor" a string was constructed with by using the new is8Bit() method on the various string classes. After determining the flavor, call either LChar* characters8() or UChar* characters16() as appropriate to access the raw characters of the string. Calling characters() on an 8 bit string will allocate a 16 bit buffer, convert the native 8 bit contents, and keep the conversion for future use before returning the 16 bit result. Obviously the expense of this conversion grows with a string's length, and it increases the memory footprint beyond what the original 16 bit string implementation required.
> 
> I wonder what we could do to make it more obvious what the correct usage is. For example, we could rename characters() to make it clear that it might allocate a new buffer. And we could make it an error to call characters8() or characters16() on the wrong kind of string.

We have talked about renaming and ultimately eliminating characters(). At this point, it may make sense to keep it but drastically reduce its use (to error messages and the like). I think a little more progress will help sort this out.

Concerning characters8() and characters16(), they have appropriate ASSERT()s that fire if they are called on the wrong flavor of string.
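
Roughly, the intended pattern looks like this (a sketch, not a definitive implementation; the include path and the per-branch processing are placeholders, while is8Bit(), characters8(), characters16(), and characters() are the methods discussed above):

    #include <wtf/text/WTFString.h>

    static void useString(const String& string)
    {
        if (string.is8Bit()) {
            // 8 bit flavor: access the raw LChar data directly.
            // characters8() ASSERT()s if called on a 16 bit string.
            const LChar* characters = string.characters8();
            // ... process 'characters' as 8 bit data ...
        } else {
            // 16 bit flavor: access the raw UChar data directly.
            // characters16() ASSERT()s if called on an 8 bit string.
            const UChar* characters = string.characters16();
            // ... process 'characters' as 16 bit data ...
        }
        // By contrast, calling string.characters() on an 8 bit string
        // would allocate and cache a 16 bit copy, costing both time
        // and memory.
    }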

> -Adam
> 

- Michael


