[webkit-reviews] review denied: [Bug 80444] Add a Setting to expose quantized, rate-limited MemoryInfo values : [Attachment 130487] Patch
bugzilla-daemon at webkit.org
Wed Mar 14 08:40:09 PDT 2012
Tony Gentilcore <tonyg at chromium.org> has denied Adam Barth
<abarth at webkit.org>'s request for review:
Bug 80444: Add a Setting to expose quantized, rate-limited MemoryInfo values
https://bugs.webkit.org/show_bug.cgi?id=80444
Attachment 130487: Patch
https://bugs.webkit.org/attachment.cgi?id=130487&action=review
------- Additional Comments from Tony Gentilcore <tonyg at chromium.org>
View in context: https://bugs.webkit.org/attachment.cgi?id=130487&action=review
I like the approach. Clever.
A few minor questions and comments. In particular, it would be great if we
could test this. r- for this version until the comments are addressed.
> Source/WebCore/ChangeLog:32
> + information to web pages. Web pages can only learn new data every 20
Does this restriction really do anything? Can't I just create as many iframes
as I want and check once in each?
> Source/WebCore/page/MemoryInfo.cpp:44
> +class HeapSizeCache {
Probably worth making this noncopyable.
Also, do you think it is sizable enough to warrant moving to its own file?
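The noncopyable suggestion could look like the following minimal sketch. Actual WebKit code would typically use the WTF_MAKE_NONCOPYABLE macro; plain C++11 deleted copy operations stand in for it here:

```cpp
#include <type_traits>

// Sketch only: HeapSizeCache made noncopyable with deleted copy
// operations (WebKit itself would use WTF_MAKE_NONCOPYABLE).
class HeapSizeCache {
public:
    HeapSizeCache() = default;
    HeapSizeCache(const HeapSizeCache&) = delete;
    HeapSizeCache& operator=(const HeapSizeCache&) = delete;
};

static_assert(!std::is_copy_constructible<HeapSizeCache>::value,
              "HeapSizeCache must not be copyable");
```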
> Source/WebCore/page/MemoryInfo.cpp:54
> + void getCachedHeapSize(size_t& usedJSHeapSize, size_t& totalJSHeapSize, size_t& jsHeapSizeLimit)
Even though this is only called in one place, the API makes it really easy to
get wrong. Should we consider either a struct return value or three separate
calls that return size_t's? I don't think the perf here is particularly
critical.
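The struct-return alternative could be sketched like this. HeapSizeInfo and the placeholder member values are hypothetical illustration, not patch code:

```cpp
#include <cstddef>

// Hypothetical sketch of returning one aggregate instead of filling
// three out-parameters; a single return value makes it impossible to
// swap arguments at the call site.
struct HeapSizeInfo {
    size_t usedJSHeapSize;
    size_t totalJSHeapSize;
    size_t jsHeapSizeLimit;
};

class HeapSizeCache {
public:
    HeapSizeInfo cachedHeapSize() const { return m_info; }

private:
    HeapSizeInfo m_info { 100, 200, 300 }; // Placeholder values.
};
```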
> Source/WebCore/page/MemoryInfo.cpp:65
> + const double TwentyMinutesInSeconds = 20 * 60;
Similar to the comment above the quantize() method, probably worth a comment
explaining why we do this.
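The rate-limiting idea behind that constant could be sketched as follows (times in seconds; the class and its shape are an assumption for illustration, matching the patch's 20-minute interval rather than quoting it):

```cpp
#include <cstddef>

// Sketch: a cached value that only refreshes once the update interval
// has elapsed, so pages cannot observe fine-grained changes over time.
class RateLimitedValue {
public:
    size_t value(double nowInSeconds, size_t freshValue)
    {
        const double updateIntervalInSeconds = 20 * 60; // 20 minutes.
        if (!m_hasValue || nowInSeconds - m_lastUpdateTime >= updateIntervalInSeconds) {
            m_cachedValue = freshValue;
            m_lastUpdateTime = nowInSeconds;
            m_hasValue = true;
        }
        return m_cachedValue;
    }

private:
    bool m_hasValue = false;
    double m_lastUpdateTime = 0;
    size_t m_cachedValue = 0;
};
```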
> Source/WebCore/page/MemoryInfo.cpp:96
> + const float largestBucketSize = 4000000000.0; // Roughly 4GB.
This size is, quite unfortunately, not entirely unrealistic. Should we allow
the largest bucket to be a bit bigger in order to future-proof a little better?
> Source/WebCore/page/MemoryInfo.cpp:101
> + size_t grainularity = nextPowerOfTen / 1000; // We want 3 signficant digits.
s/grainularity/granularity/
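For context, the quantization step under discussion could be sketched like this: round a heap size down to a multiple of nextPowerOfTen / 1000, so only about three significant digits survive. The names mirror the quoted patch, but the body is a reconstruction, not the actual WebKit code:

```cpp
#include <cmath>
#include <cstddef>

// Reconstruction for illustration: keep ~3 significant digits by
// rounding down to a multiple of nextPowerOfTen / 1000.
static size_t quantize(size_t value)
{
    if (!value)
        return 0;
    double nextPowerOfTen = std::pow(10.0, std::ceil(std::log10(static_cast<double>(value))));
    size_t granularity = static_cast<size_t>(nextPowerOfTen / 1000);
    if (!granularity)
        granularity = 1; // Small values keep full precision.
    return (value / granularity) * granularity;
}
```

For example, 123456789 falls below 10^9, so the granularity is 10^6 and the value quantizes to 123000000.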
> Source/WebCore/page/MemoryInfo.cpp:103
> + for (int i = 0; i < numberOfBuckets; ++i) {
The buckets could be layout tested pretty easily, right?
Just add a test with multiple iframes that each create a reference to a string
of a different size, then print the heap size in each. Our expectations would
just check that they fall into expected buckets.
I ask mostly because I didn't review the math super closely, and it seems best
to just have a test do that for us.
> Source/WebCore/page/MemoryInfo.cpp:119
> + // We cross 2GB around bucket 79.
I don't follow; I thought the largest bucket was 4GB.
> Source/WebCore/page/Settings.cpp:214
> + , m_quantizedMemoryInfoEnabled(false)
Perhaps there should just be a tri-state: m_memoryInfoAPI with possible values
of MemoryInfoAPI::disabled, MemoryInfoAPI::quantized, and
MemoryInfoAPI::precise.
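The suggested tri-state could be sketched as below. The MemoryInfoAPI name and its values come from the review comment itself, not from shipping WebKit code, and a scoped enum stands in for whatever enum style the tree would actually use:

```cpp
// Sketch of the reviewer-suggested tri-state replacing a boolean setting.
enum class MemoryInfoAPI { disabled, quantized, precise };

struct Settings {
    MemoryInfoAPI memoryInfoAPI = MemoryInfoAPI::disabled;
};
```

A single tri-state avoids the invalid combinations that two independent booleans (enabled + quantized) would permit.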