[Webkit-unassigned] [Bug 67335] WebSocket binaryType should not appear until it is functional
bugzilla-daemon at webkit.org
Tue Sep 13 06:50:29 PDT 2011
https://bugs.webkit.org/show_bug.cgi?id=67335
--- Comment #6 from Joel Martin <webkit at martintribe.org> 2011-09-13 06:50:29 PST ---
Okay, that's very good to hear that the binaryType attribute won't be visible until it is functional.
However, it still doesn't fully solve my problem, due to the WebIDL prototype-attributes issue. Without the attributes visible on the prototype, object detection can't be done without instantiating the object first. This isn't a problem for most objects, because you can instantiate them without side effects. However, in the case of a WebSocket object, instantiation triggers the opening of a network connection.
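For clarity, here is a minimal sketch of the prototype-based detection being described. Per this comment it works in Firefox and Opera but not in WebKit at the time, since WebKit did not expose attributes on the prototype; the helper and sub-protocol names are illustrative, not from any spec:

```javascript
// Hypothetical helper: detect binary-frame support by inspecting the
// WebSocket prototype, without opening a network connection.
function supportsBinaryType() {
  return typeof WebSocket !== 'undefined' &&
         'binaryType' in WebSocket.prototype;
}

// Choose a sub-protocol up front based on the result
// (the protocol names here are made up for illustration).
var subProtocol = supportsBinaryType() ? 'binary' : 'base64';
```

The point is that the check touches only `WebSocket.prototype`, so no connection is ever opened as a side effect of the detection.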
In my real-world case, I need to determine, before I create a connection, whether the WebSocket object supports binary data, because this affects the values that I place in the sub-protocol field when I instantiate it.
Right now, the WebSocket prototype looks the same no matter what functionality is supported.
So yes, in this case at least, I very much object to binaryType not being visible in the prototype. Beyond that, for general efficiency of object detection, I think the WebIDL spec as written is correct: attributes should be visible in the prototype, even if they are object-specific. Firefox and Opera get this right, so I will file a separate bug for it.
Can you suggest an alternative mechanism for detecting binary support without instantiating a WebSocket object (and thus triggering an unwanted network connection)?
--
Configure bugmail: https://bugs.webkit.org/userprefs.cgi?tab=email