[Webkit-unassigned] [Bug 79353] New: Use Unicode 6.1.0 when determining whether an identifier is acceptable or not

bugzilla-daemon at webkit.org
Thu Feb 23 02:48:31 PST 2012


https://bugs.webkit.org/show_bug.cgi?id=79353

           Summary: Use Unicode 6.1.0 when determining whether an
                    identifier is acceptable or not
           Product: WebKit
           Version: 528+ (Nightly build)
          Platform: All
        OS/Version: All
            Status: UNCONFIRMED
          Severity: Normal
          Priority: P2
         Component: JavaScriptCore
        AssignedTo: webkit-unassigned at lists.webkit.org
        ReportedBy: mathias at qiwi.be


JavaScriptCore currently uses an older version of the Unicode Character Database. Here are some examples of identifiers that currently fail as a result, even though they’re valid according to ES 5.1 with Unicode 6.1:

* `var \u0cf1;` — http://mothereff.in/js-variables#%5Cu0cf1
* `var \ua7aa;` — http://mothereff.in/js-variables#%5Cua7aa
* `var \u1bba;` — http://mothereff.in/js-variables#%5Cu1bba
* `var a\ua674;` — http://mothereff.in/js-variables#a%5Cua674

Of course, there are many more.

Updating to Unicode 6.1 would improve interoperability with other engines.

Is the list of allowed characters in `IdentifierStart` and `IdentifierPart` auto-generated from a UnicodeData.txt file, or how is this handled in JavaScriptCore?
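For reference, ES 5.1 defines `IdentifierStart` as `$`, `_`, a Unicode escape, or any character in the categories Lu, Ll, Lt, Lm, Lo, or Nl; `IdentifierPart` additionally allows Mn, Mc, Nd, Pc, ZWNJ (U+200C), and ZWJ (U+200D). A minimal sketch of that check, using the `\p{ID_Start}`/`\p{ID_Continue}` property escapes available in modern engines (these approximate, but are not identical to, the ES 5.1 category lists, and are not how JSC implements the check internally):

```javascript
// Sketch: validate a string as a JS identifier using Unicode property
// escapes (ES2018+). `ID_Start`/`ID_Continue` track the engine's Unicode
// database version, which is the point of this bug report.
const idStart = /^[$_\p{ID_Start}]$/u;
const idPart = /^[$_\u200C\u200D\p{ID_Continue}]$/u;

function isValidIdentifier(name) {
  if (name.length === 0) return false;
  const chars = [...name]; // iterate by code point, not UTF-16 unit
  return idStart.test(chars[0]) && chars.slice(1).every(c => idPart.test(c));
}
```

On an engine whose Unicode data is at least 6.1, `isValidIdentifier('\u0cf1')` returns true, while engines with an older database reject it, which is the interoperability gap described above.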
