[webkit-reviews] review denied: [Bug 34033] svn-apply: Change svn-apply and svn-unapply to use new parsePatch(). : [Attachment 47299] Proposed patch

bugzilla-daemon at webkit.org bugzilla-daemon at webkit.org
Thu Feb 4 15:14:08 PST 2010


Eric Seidel <eric at webkit.org> has denied Chris Jerdonek
<cjerdonek at webkit.org>'s request for review:
Bug 34033: svn-apply: Change svn-apply and svn-unapply to use new parsePatch().
https://bugs.webkit.org/show_bug.cgi?id=34033

Attachment 47299: Proposed patch
https://bugs.webkit.org/attachment.cgi?id=47299&action=review

------- Additional Comments from Eric Seidel <eric at webkit.org>
I tend to make long loop blocks into their own functions (well, long blocks in
general):
 616	 for my $diffHashRef (@diffHashRefs) {

But perhaps this one makes sense inline like this.


Why is shouldForce supported here?
 627	     die "Index path $indexPath appears twice as a copy target." unless $shouldForce;

Would that ever be a valid diff that we wanted to support processing of?  It
seems more likely that such a diff would be caused by programmer error.  I
would probably have written it as a one-line Perl if.

prepareParsedPatch seems to be mostly a sanity-checking method, one which
splits the diffs out into the various arrays of hashes.

Maybe it should be called something with "sort" in the name, since it sorts
into buckets.  Although that might get confusing, as it's not sorting the
order of the diffs in any way.


I guess if you did split the for innards out into a separate function, it could
return the name of the hash and the outer loop could actually do the push.

copiedFiles seems unneeded.  You could do a linear search in all 3 cases
instead of having the special copiedFiles?  Or you could have a separate pass
over the list of hashes (either before or after splitting them out into
buckets) which makes sure that none of the index paths are doubled.  A separate
pass strikes me as cleaner than wrapping it into this loop here.  Maybe Python
has just gotten me used to that pattern with all of its
map/reduce/list-comprehension style processing, where you tend to make lots of
separate passes over your data to do different things.
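The separate-pass idea could look something like this (a sketch only; the function name is invented, and the bucketing details of the real patch are not reproduced here): walk the diff hashes once, before or after bucketing, and die on the first doubled index path.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical separate pass: ensure no index path appears twice as a
# copy target.  Runs independently of the bucketing loop.
sub assertNoDoubledIndexPaths
{
    my (@diffHashRefs) = @_;
    my %seen;
    for my $diffHashRef (@diffHashRefs) {
        my $indexPath = $diffHashRef->{indexPath};
        die "Index path $indexPath appears twice as a copy target.\n"
            if $seen{$indexPath}++;
    }
}

# Unique paths pass quietly; a doubled path dies.
assertNoDoubledIndexPaths({ indexPath => "a.txt" }, { indexPath => "b.txt" });
eval { assertNoDoubledIndexPaths({ indexPath => "a.txt" }, { indexPath => "a.txt" }) };
print $@ ? "duplicate detected\n" : "no duplicate\n";
```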

I assume we don't support applying multiple patch files, or do we?
29 my @diffHashRefs = parsePatch(*ARGV);
 130 
 131 print "Parsed " . @diffHashRefs . " diffs from patch file.\n";

If we do, we need to update the print message.


I'm sad that we do so much of svn-apply at the root level instead of using
nicely named subroutines.

The move-copy for loop is one example of a case where I'm surprised we don't
just make it a subroutine and call it:
 151 for my $copyDiffHashRef (@copyDiffHashRefs) {
Maybe that's a separate patch. :)

Can't Perl return a list and decompose it on the fly?
 133 my $preparedPatchHash = prepareParsedPatch($force, @diffHashRefs);

So you could write that:

my (@copyDiffHashRefs, @nonCopyDiffHashRefs, %sourceRevisions) =
prepareParsedPatch();

Or am I dreaming of python?
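For what it's worth, a Perl sub returns one flat list, so `my (@a, @b, %c) = f()` would slurp everything into `@a`.  Returning references gets the decompose-on-the-fly effect.  A hedged sketch (the bucketing logic below is guessed from the field names discussed in this review, not copied from the patch):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch only: return references so the caller can decompose the
# result in one list assignment.  The internals are an assumption
# about prepareParsedPatch()'s shape, for illustration.
sub prepareParsedPatch
{
    my (@diffHashRefs) = @_;
    my (@copyDiffHashRefs, @nonCopyDiffHashRefs);
    my %sourceRevisions;
    for my $diffHashRef (@diffHashRefs) {
        if (defined $diffHashRef->{copiedFromPath}) {
            push @copyDiffHashRefs, $diffHashRef;
        } else {
            push @nonCopyDiffHashRefs, $diffHashRef;
        }
        my $revision = $diffHashRef->{sourceRevision};
        $sourceRevisions{$diffHashRef->{indexPath}} = $revision if defined $revision;
    }
    return (\@copyDiffHashRefs, \@nonCopyDiffHashRefs, \%sourceRevisions);
}

# Caller decomposes the returned references on the fly.
my ($copyDiffHashRefs, $nonCopyDiffHashRefs, $sourceRevisions) = prepareParsedPatch(
    { indexPath => "a.txt", copiedFromPath => "b.txt", sourceRevision => 1234 },
    { indexPath => "c.txt", copiedFromPath => undef, sourceRevision => undef },
);
print scalar(@$copyDiffHashRefs), " copy, ",
      scalar(@$nonCopyDiffHashRefs), " non-copy\n";
```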

Why don't we just call the hash-key "svnFormattedDiff":
 305	 my $patch = $diffHashRef->{svnConvertedText}; # SVN-formatted diff

instead of adding a comment?

Does this work on Windows too?  I guess we use Cygwin?
 54 # Relative to root
 55 my $pattern = "WebKitTools/Scripts/webkitperl/*_unittest/*.pl";

Is this semantically different from having never defined the key?
 34	copiedFromPath => undef,

I'm not sure my little brain can parse those unit tests.  I guess I'll have to
trust you that they're sane.

Overall this looks like a great change.  I'd love to see your response to my
naming thoughts/questions before giving this a final r+, though.  Marking this
r- for now; you can re-mark it r? if you believe it should land unchanged, or
I'm happy to review any revised version now that I've read this one through and
understand it! :)

