[webkit-dev] An oddity with windowScriptObject:setValue and Apple events
Ryan Grimm
grimm at xqdev.com
Sun Feb 1 14:34:29 PST 2009
This may sound very strange (it does to me), but I've been stumped by
this problem for weeks now, so I figured I'd ask the list.
I'm working on embedding WebKit into an existing Mac application that
exposes an API. The API gives me a CGrafPtr to render into. So to
get WebKit into the window, I simply call:
#include <Carbon/Carbon.h>
#include <WebKit/CarbonUtils.h>
#include <WebKit/HIWebView.h>

HIViewRef webView, contentView;
WindowRef window = GetWindowFromPort(myCGrafPtr);
HIRect bounds = CGRectMake(0, 0, 640, 480); /* sized to the host window in the real code */

WebInitForCarbon(); /* must run before any other WebKit call */
HIWebViewCreate(&webView);
HIViewFindByID(HIViewGetRoot(window), kHIViewWindowContentID, &contentView);
HIViewSetFrame(webView, &bounds);
HIViewAddSubview(contentView, webView);
HIViewSetVisible(webView, true);
This works fantastically. I then try to expose some new functions to
JavaScript through the view's windowScriptObject:
WebView *nativeView = HIWebViewGetWebView(webView);

/* Metadata is an instance of my own class, exposed to scripts as
   window.metadata */
[[nativeView windowScriptObject] setValue:Metadata forKey:@"metadata"];
Adding my Metadata object to JavaScript also succeeds.
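In case it matters, Metadata is a plain Objective-C object; for its
methods to be visible from JavaScript, it opts selectors in through
WebKit's WebScripting informal protocol (WebKit excludes every selector
by default). A trimmed-down sketch of that shape (the real class does
more, and the return value here is a placeholder):

#import <Foundation/Foundation.h>

@interface Metadata : NSObject
- (NSString *)getUser;
@end

@implementation Metadata

/* WebScripting informal protocol: selectors are excluded from
   JavaScript by default, so getUser has to be opted in explicitly. */
+ (BOOL)isSelectorExcludedFromWebScript:(SEL)selector
{
    return selector != @selector(getUser);
}

- (NSString *)getUser
{
    return @"someUser"; /* placeholder; the real method asks the host app */
}

@end

With that in place, window.metadata.getUser() is callable from any page
loaded in the view.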
However, here's the crux of my problem. Whenever one of these
JavaScript functions is called (e.g. window.metadata.getUser()), all of
the Apple events that the main application has registered for the
window become unusable. For example, if you execute the following
AppleScript:
tell application "ApplicationWithWebKit" to set user to "foobar"
It fails with error -1728: Can't get <reference>. If you run the same
AppleScript before the JavaScript function has been called, it executes
correctly.
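For anyone who wants to poke at this locally, a catch-all logging
handler is one way to see whether the events still reach the process at
all. This is only a debugging sketch: a typeWildCard handler fires when
no more specific handler claims an event, so if the application's own
handlers stop being consulted after the JavaScript call, it should
start logging:

#include <Carbon/Carbon.h>
#include <stdio.h>

static OSErr LogUnhandledAppleEvent(const AppleEvent *event, AppleEvent *reply,
                                    SRefCon refcon)
{
    fprintf(stderr, "Apple event fell through to the wildcard handler\n");
    return errAEEventNotHandled; /* keep the sender's error behavior intact */
}

/* during application startup: */
AEInstallEventHandler(typeWildCard, typeWildCard,
                      NewAEEventHandlerUPP(LogUnhandledAppleEvent), 0, false);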
Anyone have any thoughts on what is going on here?
Thanks.
--Ryan