#!/usr/bin/perl
# Copyright (C) 2006, Eric Wong <normalperson@yhbt.net>
# License: GPL v2 or later
use 5.008;
use warnings $ENV{GIT_PERL_FATAL_WARNINGS} ? qw(FATAL all) : ();
use strict;
use vars qw/ $AUTHOR $VERSION
		$oid $oid_short $oid_length
		$_revision $_repository
		$_q $_authors $_authors_prog %users/;
$AUTHOR = 'Eric Wong <normalperson@yhbt.net>';
$VERSION = '@@GIT_VERSION@@';

use Carp qw/croak/;
use File::Basename qw/dirname basename/;
use File::Path qw/mkpath/;
use File::Spec;
use Getopt::Long qw/:config gnu_getopt no_ignore_case auto_abbrev/;
use Memoize;
use Git::SVN;
use Git::SVN::Editor;
use Git::SVN::Fetcher;
use Git::SVN::Ra;
use Git::SVN::Prompt;
use Git::SVN::Log;
use Git::SVN::Migration;

use Git::SVN::Utils qw(
	fatal
	can_compress
	canonicalize_path
	canonicalize_url
	join_paths
	add_path_to_url
	join_paths
);

use Git qw(
	git_cmd_try
	command
	command_oneline
	command_noisy
	command_output_pipe
	command_close_pipe
	command_bidi_pipe
	command_close_bidi_pipe
	get_record
);

BEGIN {
	Memoize::memoize 'Git::config';
	Memoize::memoize 'Git::config_bool';
}

# From which subdir have we been invoked?
my $cmd_dir_prefix = eval {
	command_oneline([qw/rev-parse --show-prefix/], STDERR => 0)
} || '';

$Git::SVN::Ra::_log_window_size = 100;

if (! exists $ENV{SVN_SSH} && exists $ENV{GIT_SSH}) {
	$ENV{SVN_SSH} = $ENV{GIT_SSH};
}

if (exists $ENV{SVN_SSH} && $^O eq 'msys') {
	$ENV{SVN_SSH} =~ s/\\/\\\\/g;
	$ENV{SVN_SSH} =~ s/(.*)/"$1"/;
}

$Git::SVN::Log::TZ = $ENV{TZ};
$ENV{TZ} = 'UTC';
$| = 1; # unbuffer STDOUT

# All SVN commands do it. Otherwise we may die on SIGPIPE when the remote
# repository decides to close the connection which we expect to be kept alive.
$SIG{PIPE} = 'IGNORE';

# Given a dot separated version number, "subtract" it from
# the SVN::Core::VERSION; non-negative return means the SVN::Core
# is at least at the version the caller asked for.
sub compare_svn_version {
	my (@ours) = split(/\./, $SVN::Core::VERSION);
	my (@theirs) = split(/\./, $_[0]);
	my ($i, $diff);

	for ($i = 0; $i < @ours && $i < @theirs; $i++) {
		$diff = $ours[$i] - $theirs[$i];
		return $diff if ($diff);
	}
	return 1 if ($i < @ours);
	return -1 if ($i < @theirs);
	return 0;
}

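# Load the SVN Perl bindings on demand and make sure they are new enough;
# loading them at compile time has been problematic (see comment below).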
sub _req_svn {
	require SVN::Core; # use()-ing this causes segfaults for me... *shrug*
	require SVN::Ra;
	require SVN::Delta;
	if (::compare_svn_version('1.1.0') < 0) {
		fatal "Need SVN::Core 1.1.0 or better (got $SVN::Core::VERSION)";
	}
}

$oid = qr/(?:[a-f\d]{40}(?:[a-f\d]{24})?)/;
$oid_short = qr/[a-f\d]{4,64}/;
$oid_length = 40;
my ($_stdin, $_help, $_edit,
	$_message, $_file, $_branch_dest,
	$_template, $_shared,
	$_version, $_fetch_all, $_no_rebase, $_fetch_parent,
	$_before, $_after,
	$_merge, $_strategy, $_rebase_merges, $_dry_run, $_parents, $_local,
	$_prefix, $_no_checkout, $_url, $_verbose,
	$_commit_url, $_tag, $_merge_info, $_interactive, $_set_svn_props);

# This is a refactoring artifact so Git::SVN can get at this git-svn switch.
sub opt_prefix { return $_prefix || '' }

$Git::SVN::Fetcher::_placeholder_filename = ".gitignore";
$_q ||= 0;
my %remote_opts = ( 'username=s' => \$Git::SVN::Prompt::_username,
		'config-dir=s' => \$Git::SVN::Ra::config_dir,
		'no-auth-cache' => \$Git::SVN::Prompt::_no_auth_cache,
		'ignore-paths=s' => \$Git::SVN::Fetcher::_ignore_regex,
		'include-paths=s' => \$Git::SVN::Fetcher::_include_regex,
		'ignore-refs=s' => \$Git::SVN::Ra::_ignore_refs_regex );
my %fc_opts = ( 'follow-parent|follow!' => \$Git::SVN::_follow_parent,
		'authors-file|A=s' => \$_authors,
		'authors-prog=s' => \$_authors_prog,
		'repack:i' => \$Git::SVN::_repack,
		'noMetadata' => \$Git::SVN::_no_metadata,
		'useSvmProps' => \$Git::SVN::_use_svm_props,
		'useSvnsyncProps' => \$Git::SVN::_use_svnsync_props,
		'log-window-size=i' => \$Git::SVN::Ra::_log_window_size,
		'no-checkout' => \$_no_checkout,
		'quiet|q+' => \$_q,
		'repack-flags|repack-args|repack-opts=s' =>
			\$Git::SVN::_repack_flags,
		'use-log-author' => \$Git::SVN::_use_log_author,
		'add-author-from' => \$Git::SVN::_add_author_from,
		'localtime' => \$Git::SVN::_localtime,
		%remote_opts );

my ($_trunk, @_tags, @_branches, $_stdlayout);
my %icv;
my %init_opts = ( 'template=s' => \$_template, 'shared:s' => \$_shared,
		'trunk|T=s' => \$_trunk, 'tags|t=s@' => \@_tags,
		'branches|b=s@' => \@_branches, 'prefix=s' => \$_prefix,
		'stdlayout|s' => \$_stdlayout,
		'minimize-url|m!' => \$Git::SVN::_minimize_url,
		'no-metadata' => sub { $icv{noMetadata} = 1 },
		'use-svm-props' => sub { $icv{useSvmProps} = 1 },
		'use-svnsync-props' => sub { $icv{useSvnsyncProps} = 1 },
		'rewrite-root=s' => sub { $icv{rewriteRoot} = $_[1] },
		'rewrite-uuid=s' => sub { $icv{rewriteUUID} = $_[1] },
		%remote_opts );
my %cmt_opts = ( 'edit|e' => \$_edit,
		'rmdir' => \$Git::SVN::Editor::_rmdir,
		'find-copies-harder' => \$Git::SVN::Editor::_find_copies_harder,
		'l=i' => \$Git::SVN::Editor::_rename_limit,
		'copy-similarity|C=i'=> \$Git::SVN::Editor::_cp_similarity
);

my %cmd = (
	fetch => [ \&cmd_fetch, "Download new revisions from SVN",
			{ 'revision|r=s' => \$_revision,
			  'fetch-all|all' => \$_fetch_all,
			  'parent|p' => \$_fetch_parent,
			  %fc_opts } ],
	clone => [ \&cmd_clone, "Initialize and fetch revisions",
			{ 'revision|r=s' => \$_revision,
			  'preserve-empty-dirs' =>
				\$Git::SVN::Fetcher::_preserve_empty_dirs,
			  'placeholder-filename=s' =>
				\$Git::SVN::Fetcher::_placeholder_filename,
			  %fc_opts, %init_opts } ],
	init => [ \&cmd_init, "Initialize a repo for tracking" .
			  " (requires URL argument)",
			  \%init_opts ],
	'multi-init' => [ \&cmd_multi_init,
			  "Deprecated alias for ".
			  "'$0 init -T<trunk> -b<branches> -t<tags>'",
			  \%init_opts ],
	dcommit => [ \&cmd_dcommit,
		     'Commit several diffs to merge with upstream',
			{ 'merge|m|M' => \$_merge,
			  'strategy|s=s' => \$_strategy,
			  'verbose|v' => \$_verbose,
			  'dry-run|n' => \$_dry_run,
			  'fetch-all|all' => \$_fetch_all,
			  'commit-url=s' => \$_commit_url,
			  'set-svn-props=s' => \$_set_svn_props,
			  'revision|r=i' => \$_revision,
			  'no-rebase' => \$_no_rebase,
			  'mergeinfo=s' => \$_merge_info,
			  'interactive|i' => \$_interactive,
			  %cmt_opts, %fc_opts } ],
	branch => [ \&cmd_branch,
		    'Create a branch in the SVN repository',
		    { 'message|m=s' => \$_message,
		      'destination|d=s' => \$_branch_dest,
		      'dry-run|n' => \$_dry_run,
		      'parents' => \$_parents,
		      'tag|t' => \$_tag,
		      'username=s' => \$Git::SVN::Prompt::_username,
		      'commit-url=s' => \$_commit_url } ],
	tag => [ sub { $_tag = 1; cmd_branch(@_) },
		 'Create a tag in the SVN repository',
		 { 'message|m=s' => \$_message,
		   'destination|d=s' => \$_branch_dest,
		   'dry-run|n' => \$_dry_run,
		   'parents' => \$_parents,
		   'username=s' => \$Git::SVN::Prompt::_username,
		   'commit-url=s' => \$_commit_url } ],
	'set-tree' => [ \&cmd_set_tree,
			"Set an SVN repository to a git tree-ish",
			{ 'stdin' => \$_stdin, %cmt_opts, %fc_opts, } ],
	'create-ignore' => [ \&cmd_create_ignore,
			     'Create a .gitignore per svn:ignore',
			     { 'revision|r=i' => \$_revision
			     } ],
	'mkdirs' => [ \&cmd_mkdirs ,
		      "recreate empty directories after a checkout",
		      { 'revision|r=i' => \$_revision } ],
	'propget' => [ \&cmd_propget,
		       'Print the value of a property on a file or directory',
		       { 'revision|r=i' => \$_revision } ],
	'propset' => [ \&cmd_propset,
		       'Set the value of a property on a file or directory - will be set on commit',
		       {} ],
	'proplist' => [ \&cmd_proplist,
		        'List all properties of a file or directory',
		        { 'revision|r=i' => \$_revision } ],
	'show-ignore' => [ \&cmd_show_ignore, "Show svn:ignore listings",
			   { 'revision|r=i' => \$_revision
			   } ],
	'show-externals' => [ \&cmd_show_externals, "Show svn:externals listings",
			      { 'revision|r=i' => \$_revision
			      } ],
	'multi-fetch' => [ \&cmd_multi_fetch,
			   "Deprecated alias for $0 fetch --all",
			   { 'revision|r=s' => \$_revision, %fc_opts } ],
	'migrate' => [ sub { },
		       # no-op, we automatically run this anyways,
		       'Migrate configuration/metadata/layout from
			previous versions of git-svn',
		       { 'minimize' => \$Git::SVN::Migration::_minimize,
			 %remote_opts } ],
	'log' => [ \&Git::SVN::Log::cmd_show_log, 'Show commit logs',
		   { 'limit=i' => \$Git::SVN::Log::limit,
		     'revision|r=s' => \$_revision,
		     'verbose|v' => \$Git::SVN::Log::verbose,
		     'incremental' => \$Git::SVN::Log::incremental,
		     'oneline' => \$Git::SVN::Log::oneline,
		     'show-commit' => \$Git::SVN::Log::show_commit,
		     'non-recursive' => \$Git::SVN::Log::non_recursive,
		     'authors-file|A=s' => \$_authors,
		     'color' => \$Git::SVN::Log::color,
		     'pager=s' => \$Git::SVN::Log::pager
		   } ],
	'find-rev' => [ \&cmd_find_rev,
			"Translate between SVN revision numbers and tree-ish",
			{ 'B|before' => \$_before,
			  'A|after' => \$_after } ],
	'rebase' => [ \&cmd_rebase, "Fetch and rebase your working directory",
			{ 'merge|m|M' => \$_merge,
			  'verbose|v' => \$_verbose,
			  'strategy|s=s' => \$_strategy,
			  'local|l' => \$_local,
			  'fetch-all|all' => \$_fetch_all,
			  'dry-run|n' => \$_dry_run,
			  'rebase-merges|p' => \$_rebase_merges,
			  %fc_opts } ],
	'commit-diff' => [ \&cmd_commit_diff,
			   'Commit a diff between two trees',
			{ 'message|m=s' => \$_message,
			  'file|F=s' => \$_file,
			  'revision|r=s' => \$_revision,
			  %cmt_opts } ],
	'info' => [ \&cmd_info,
		    "Show info about the latest SVN revision
		     on the current branch",
		    { 'url' => \$_url, } ],
	'blame' => [ \&Git::SVN::Log::cmd_blame,
		     "Show what revision and author last modified each line of a file",
		     { 'git-format' => \$Git::SVN::Log::_git_format } ],
	'reset' => [ \&cmd_reset,
		     "Undo fetches back to the specified SVN revision",
		     { 'revision|r=s' => \$_revision,
		       'parent|p' => \$_fetch_parent } ],
	'gc' => [ \&cmd_gc,
		  "Compress unhandled.log files in .git/svn and remove " .
		  "index files in .git/svn",
		  {} ],
);

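# FakeTerm is a minimal Term::ReadLine stand-in used when the real module
# cannot be loaded; any attempt to read from it dies with the saved error.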
package FakeTerm;
sub new {
	my ($class, $reason) = @_;
	return bless \$reason, shift;
}
sub readline {
	my $self = shift;
	die "Cannot use readline on FakeTerm: $$self";
}
package main;

my $term;
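# Lazily construct the readline object, falling back to FakeTerm (and thus
# non-interactive operation) when Term::ReadLine is unavailable.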
sub term_init {
	$term = eval {
		require Term::ReadLine;
		$ENV{"GIT_SVN_NOTTY"}
			? new Term::ReadLine 'git-svn', \*STDIN, \*STDOUT
			: new Term::ReadLine 'git-svn';
	};
	if ($@) {
		$term = new FakeTerm "$@: going non-interactive";
	}
}

my $cmd;
for (my $i = 0; $i < @ARGV; $i++) {
	if (defined $cmd{$ARGV[$i]}) {
		$cmd = $ARGV[$i];
		splice @ARGV, $i, 1;
		last;
	} elsif ($ARGV[$i] eq 'help') {
		$cmd = $ARGV[$i+1];
		usage(0);
	}
};

# make sure we're always running at the top-level working directory
if ($cmd && $cmd =~ /(?:clone|init|multi-init)$/) {
	$ENV{GIT_DIR} ||= ".git";
	# catch the submodule case
	if (-f $ENV{GIT_DIR}) {
		open(my $fh, '<', $ENV{GIT_DIR}) or
			die "failed to open $ENV{GIT_DIR}: $!\n";
		$ENV{GIT_DIR} = $1 if <$fh> =~ /^gitdir: (.+)$/;
	}
} elsif ($cmd) {
	my ($git_dir, $cdup);
	git_cmd_try {
		$git_dir = command_oneline([qw/rev-parse --git-dir/]);
	} "Unable to find .git directory\n";
	git_cmd_try {
		$cdup = command_oneline(qw/rev-parse --show-cdup/);
		chomp $cdup if ($cdup);
		$cdup = "." unless ($cdup && length $cdup);
	} "Already at toplevel, but $git_dir not found\n";
	$ENV{GIT_DIR} = $git_dir;
	chdir $cdup or die "Unable to chdir up to '$cdup'\n";
	$_repository = Git->repository(Repository => $ENV{GIT_DIR});
}

my %opts = %{$cmd{$cmd}->[2]} if (defined $cmd);

read_git_config(\%opts) if $ENV{GIT_DIR};
if ($cmd && ($cmd eq 'log' || $cmd eq 'blame')) {
	Getopt::Long::Configure('pass_through');
}
my $rv = GetOptions(%opts, 'h|H' => \$_help, 'version|V' => \$_version,
		'minimize-connections' => \$Git::SVN::Migration::_minimize,
		'id|i=s' => \$Git::SVN::default_ref_id,
		'svn-remote|remote|R=s' => sub {
			$Git::SVN::no_reuse_existing = 1;
			$Git::SVN::default_repo_id = $_[1] });
exit 1 if (!$rv && $cmd && $cmd ne 'log');

usage(0) if $_help;
version() if $_version;
usage(1) unless defined $cmd;
load_authors() if $_authors;
if (defined $_authors_prog) {
	my $abs_file = File::Spec->rel2abs($_authors_prog);
	$_authors_prog = "'" . $abs_file . "'" if -x $abs_file;
}

unless ($cmd =~ /^(?:clone|init|multi-init|commit-diff)$/) {
	Git::SVN::Migration::migration_check();
}
Git::SVN::init_vars();
eval {
	Git::SVN::verify_remotes_sanity();
	$cmd{$cmd}->[0]->(@ARGV);
	post_fetch_checkout();
};
fatal $@ if $@;
exit 0;

####################### primary functions ######################
sub usage {
	my $exit = shift || 0;
	my $fd = $exit ? \*STDERR : \*STDOUT;
	print $fd <<"";
git-svn - bidirectional operations between a single Subversion tree and git
usage: git svn <command> [options] [arguments]\n

	print $fd "Available commands:\n" unless $cmd;

	foreach (sort keys %cmd) {
		next if $cmd && $cmd ne $_;
		next if /^multi-/; # don't show deprecated commands
		print $fd ' ',pack('A17',$_),$cmd{$_}->[1],"\n";
		foreach (sort keys %{$cmd{$_}->[2]}) {
			# mixed-case options are for .git/config only
			next if /[A-Z]/ && /^[a-z]+$/i;
			# prints out arguments as they should be passed:
			my $x = s#[:=]s$## ? '<arg>' : s#[:=]i$## ? '<num>' : '';
			print $fd ' ' x 21, join(', ', map { length $_ > 1 ?
							"--$_" : "-$_" }
						split /\|/,$_)," $x\n";
		}
	}
	print $fd <<"";
\nGIT_SVN_ID may be set in the environment or via the --id/-i switch to an
arbitrary identifier if you're tracking multiple SVN branches/repositories in
one git repository and want to keep them separate. See git-svn(1) for more
information.

	exit $exit;
}

sub version {
	::_req_svn();
	print "git-svn version $VERSION (svn $SVN::Core::VERSION)\n";
	exit 0;
}

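# Prompt the user and return the reply.  Returns the default (or undef) when
# no usable terminal is available or on EOF, and retries up to 10 times until
# the reply matches valid_re.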
sub ask {
	my ($prompt, %arg) = @_;
	my $valid_re = $arg{valid_re};
	my $default = $arg{default};
	my $resp;
	my $i = 0;
	term_init() unless $term;

	if ( !( defined($term->IN)
	    && defined( fileno($term->IN) )
	    && defined( $term->OUT )
	    && defined( fileno($term->OUT) ) ) ){
		return defined($default) ? $default : undef;
	}

	while ($i++ < 10) {
		$resp = $term->readline($prompt);
		if (!defined $resp) { # EOF
			print "\n";
			return defined $default ? $default : undef;
		}
		if ($resp eq '' and defined $default) {
			return $default;
		}
		if (!defined $valid_re or $resp =~ /$valid_re/) {
			return $resp;
		}
	}
	return undef;
}

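# Create the repository with 'git init' if $GIT_DIR does not exist yet, then
# record the svn-remote configuration implied by the init/clone options.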
sub do_git_init_db {
	unless (-d $ENV{GIT_DIR}) {
		my @init_db = ('init');
		push @init_db, "--template=$_template" if defined $_template;
		if (defined $_shared) {
			if ($_shared =~ /[a-z]/) {
				push @init_db, "--shared=$_shared";
			} else {
				push @init_db, "--shared";
			}
		}
		command_noisy(@init_db);
		$_repository = Git->repository(Repository => ".git");
	}
	my $set;
	my $pfx = "svn-remote.$Git::SVN::default_repo_id";
	foreach my $i (keys %icv) {
		die "'$set' and '$i' cannot both be set\n" if $set;
		next unless defined $icv{$i};
		command_noisy('config', "$pfx.$i", $icv{$i});
		$set = $i;
	}
	my $ignore_paths_regex = \$Git::SVN::Fetcher::_ignore_regex;
	command_noisy('config', "$pfx.ignore-paths", $$ignore_paths_regex)
		if defined $$ignore_paths_regex;
	my $include_paths_regex = \$Git::SVN::Fetcher::_include_regex;
	command_noisy('config', "$pfx.include-paths", $$include_paths_regex)
		if defined $$include_paths_regex;
	my $ignore_refs_regex = \$Git::SVN::Ra::_ignore_refs_regex;
	command_noisy('config', "$pfx.ignore-refs", $$ignore_refs_regex)
		if defined $$ignore_refs_regex;

	if (defined $Git::SVN::Fetcher::_preserve_empty_dirs) {
		my $fname = \$Git::SVN::Fetcher::_placeholder_filename;
		command_noisy('config', "$pfx.preserve-empty-dirs", 'true');
		command_noisy('config', "$pfx.placeholder-filename", $$fname);
	}
	load_object_format();
}

sub init_subdir {
	my $repo_path = shift or return;
	mkpath([$repo_path]) unless -d $repo_path;
	chdir $repo_path or die "Couldn't chdir to $repo_path: $!\n";
	$ENV{GIT_DIR} = '.git';
	$_repository = Git->repository(Repository => $ENV{GIT_DIR});
}

sub cmd_clone {
	my ($url, $path) = @_;
	if (!$url) {
		die "SVN repository location required ",
		    "as a command-line argument\n";
	} elsif (!defined $path &&
	    (defined $_trunk || @_branches || @_tags ||
	     defined $_stdlayout) &&
	    $url !~ m#^[a-z\+]+://#) {
		$path = $url;
	}
	$path = basename($url) if !defined $path || !length $path;
	my $authors_absolute = $_authors ? File::Spec->rel2abs($_authors) : "";
	cmd_init($url, $path);
	command_oneline('config', 'svn.authorsfile', $authors_absolute)
		if $_authors;
	Git::SVN::fetch_all($Git::SVN::default_repo_id);
}

sub cmd_init {
	if (defined $_stdlayout) {
		$_trunk = 'trunk' if (!defined $_trunk);
		@_tags = 'tags' if (! @_tags);
		@_branches = 'branches' if (! @_branches);
	}
	if (defined $_trunk || @_branches || @_tags) {
		return cmd_multi_init(@_);
	}
	my $url = shift or die "SVN repository location required ",
			       "as a command-line argument\n";
	$url = canonicalize_url($url);
	init_subdir(@_);
	do_git_init_db();

	if ($Git::SVN::_minimize_url eq 'unset') {
		$Git::SVN::_minimize_url = 0;
	}

	Git::SVN->init($url);
}

sub cmd_fetch {
	if (grep /^\d+=./, @_) {
		die "'<rev>=<commit>' fetch arguments are ",
		    "no longer supported.\n";
	}
	my ($remote) = @_;
	if (@_ > 1) {
		die "usage: $0 fetch [--all] [--parent] [svn-remote]\n";
	}
	$Git::SVN::no_reuse_existing = undef;
	if ($_fetch_parent) {
		my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
		unless ($gs) {
			die "Unable to determine upstream SVN information from ",
			    "working tree history\n";
		}
		# just fetch, don't checkout.
		$_no_checkout = 'true';
		$_fetch_all ? $gs->fetch_all : $gs->fetch;
	} elsif ($_fetch_all) {
		cmd_multi_fetch();
	} else {
		$remote ||= $Git::SVN::default_repo_id;
		Git::SVN::fetch_all($remote, Git::SVN::read_all_remotes());
	}
}

sub cmd_set_tree {
	my (@commits) = @_;
	if ($_stdin || !@commits) {
		print "Reading from stdin...\n";
		@commits = ();
		while (<STDIN>) {
			if (/\b($oid_short)\b/o) {
				unshift @commits, $1;
			}
		}
	}
	my @revs;
	foreach my $c (@commits) {
		my @tmp = command('rev-parse',$c);
		if (scalar @tmp == 1) {
			push @revs, $tmp[0];
		} elsif (scalar @tmp > 1) {
			push @revs, reverse(command('rev-list',@tmp));
		} else {
			fatal "Failed to rev-parse $c";
		}
	}
	my $gs = Git::SVN->new;
	my ($r_last, $cmt_last) = $gs->last_rev_commit;
	$gs->fetch;
	if (defined $gs->{last_rev} && $r_last != $gs->{last_rev}) {
		fatal "There are new revisions that were fetched ",
		      "and need to be merged (or acknowledged) ",
		      "before committing.\nlast rev: $r_last\n",
		      " current: $gs->{last_rev}";
	}
	$gs->set_tree($_) foreach @revs;
	print "Done committing ",scalar @revs," revisions to SVN\n";
	unlink $gs->{index};
}

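# Parse a single svn:mergeinfo range ("N" or "N-M") into a numeric
# (start, end) pair.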
sub split_merge_info_range {
	my ($range) = @_;
	if ($range =~ /(\d+)-(\d+)/) {
		return (int($1), int($2));
	} else {
		return (int($range), int($range));
	}
}

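# Sort a comma-separated list of svn:mergeinfo revision ranges and coalesce
# adjacent or overlapping ranges into a minimal list.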
sub combine_ranges {
	my ($in) = @_;

	my @fnums = ();
	my @arr = split(/,/, $in);
	for my $element (@arr) {
		my ($start, $end) = split_merge_info_range($element);
		push @fnums, $start;
	}

	my @sorted = @arr [ sort {
		$fnums[$a] <=> $fnums[$b]
	} 0..$#arr ];

	my @return = ();
	my $last = -1;
	my $first = -1;
	for my $element (@sorted) {
		my ($start, $end) = split_merge_info_range($element);

		if ($last == -1) {
			$first = $start;
			$last = $end;
			next;
		}
		if ($start <= $last+1) {
			if ($end > $last) {
				$last = $end;
			}
			next;
		}
		if ($first == $last) {
			push @return, "$first";
		} else {
			push @return, "$first-$last";
		}
		$first = $start;
		$last = $end;
	}

	if ($first != -1) {
		if ($first == $last) {
			push @return, "$first";
		} else {
			push @return, "$first-$last";
		}
	}

	return join(',', @return);
}

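# Fold the "path:ranges" lines of an svn:mergeinfo value into %$hash, keyed
# by branch path, combining the ranges with any already recorded there.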
sub merge_revs_into_hash {
	my ($hash, $minfo) = @_;
	my @lines = split(' ', $minfo);

	for my $line (@lines) {
		my ($branchpath, $revs) = split(/:/, $line);

		if (exists($hash->{$branchpath})) {
			# Merge the two revision sets
			my $combined = "$hash->{$branchpath},$revs";
			$hash->{$branchpath} = combine_ranges($combined);
		} else {
			# Just do range combining for consolidation
			$hash->{$branchpath} = combine_ranges($revs);
		}
	}
}

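# Merge two svn:mergeinfo values into one, dropping the entry for
# $ignore_branch (if given) so the target branch never lists itself.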
sub merge_merge_info {
	my ($mergeinfo_one, $mergeinfo_two, $ignore_branch) = @_;
	my %result_hash = ();

	merge_revs_into_hash(\%result_hash, $mergeinfo_one);
	merge_revs_into_hash(\%result_hash, $mergeinfo_two);

	delete $result_hash{$ignore_branch} if $ignore_branch;

	my $result = '';
	# Sort below is for consistency's sake
	for my $branchname (sort keys(%result_hash)) {
		my $revlist = $result_hash{$branchname};
		$result .= "$branchname:$revlist\n"
	}
	return $result;
}

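# For a merge commit $d, compute the svn:mergeinfo value to set on the target
# branch by combining the parents' existing mergeinfo with the revisions each
# merged parent brings in.  Returns undef for non-merge commits or when the
# required git-svn metadata is incomplete.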
sub populate_merge_info {
	my ($d, $gs, $uuid, $linear_refs, $rewritten_parent) = @_;

	my %parentshash;
	read_commit_parents(\%parentshash, $d);
	my @parents = @{$parentshash{$d}};
	if ($#parents > 0) {
		# Merge commit
		my $all_parents_ok = 1;
		my $aggregate_mergeinfo = '';
		my $rooturl = $gs->repos_root;
		my ($target_branch) = $gs->full_pushurl =~ /^\Q$rooturl\E(.*)/;

		if (defined($rewritten_parent)) {
			# Replace first parent with newly-rewritten version
			shift @parents;
			unshift @parents, $rewritten_parent;
		}

		foreach my $parent (@parents) {
			my ($branchurl, $svnrev, $paruuid) =
				cmt_metadata($parent);

			unless (defined($svnrev)) {
				# Should have been caught by preflight check
				fatal "merge commit $d has ancestor $parent, but that change "
				     ."does not have git-svn metadata!";
			}
			unless ($branchurl =~ /^\Q$rooturl\E(.*)/) {
				fatal "commit $parent git-svn metadata changed mid-run!";
			}
			my $branchpath = $1;

			my $ra = Git::SVN::Ra->new($branchurl);
			my (undef, undef, $props) =
				$ra->get_dir(canonicalize_path("."), $svnrev);
			my $par_mergeinfo = $props->{'svn:mergeinfo'};
			unless (defined $par_mergeinfo) {
				$par_mergeinfo = '';
			}
			# Merge previous mergeinfo values
			$aggregate_mergeinfo =
				merge_merge_info($aggregate_mergeinfo,
						 $par_mergeinfo,
						 $target_branch);

			next if $parent eq $parents[0]; # Skip first parent
			# Add new changes being placed in tree by merge
			my @cmd = (qw/rev-list --reverse/,
				   $parent, qw/--not/);
			foreach my $par (@parents) {
				unless ($par eq $parent) {
					push @cmd, $par;
				}
			}
			my @revsin = ();
			my ($revlist, $ctx) = command_output_pipe(@cmd);
			while (<$revlist>) {
				my $irev = $_;
				chomp $irev;
				my (undef, $csvnrev, undef) =
					cmt_metadata($irev);
				unless (defined $csvnrev) {
					# A child is missing SVN annotations...
					# this might be OK, or might not be.
					warn "W:child $irev is merged into revision "
					     ."$d but does not have git-svn metadata. "
					     ."This means git-svn cannot determine the "
					     ."svn revision numbers to place into the "
					     ."svn:mergeinfo property. You must ensure "
					     ."a branch is entirely committed to "
					     ."SVN before merging it in order for "
					     ."svn:mergeinfo population to function "
					     ."properly";
				}
				push @revsin, $csvnrev;
			}
			command_close_pipe($revlist, $ctx);

			last unless $all_parents_ok;

			# We now have a list of all SVN revnos which are
			# merged by this particular parent. Integrate them.
			next if $#revsin == -1;
			my $newmergeinfo = "$branchpath:" . join(',', @revsin);
			$aggregate_mergeinfo =
				merge_merge_info($aggregate_mergeinfo,
						 $newmergeinfo,
						 $target_branch);
		}
		if ($all_parents_ok and $aggregate_mergeinfo) {
			return $aggregate_mergeinfo;
		}
	}

	return undef;
}

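# After an SVN commit attempt, report any SVN error and, unless --no-rebase
# is in effect, rebase or reset the local branch onto the newly fetched ref.
# Returns the diff-tree output between $current and $fetched_ref.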
sub dcommit_rebase {
	my ($is_last, $current, $fetched_ref, $svn_error) = @_;
	my @diff;

	if ($svn_error) {
		print STDERR "\nERROR from SVN:\n",
			     $svn_error->expanded_message, "\n";
	}
	unless ($_no_rebase) {
		# we always want to rebase against the current HEAD,
		# not any head that was passed to us
		@diff = command('diff-tree', $current,
				$fetched_ref, '--');
		my @finish;
		if (@diff) {
			@finish = rebase_cmd();
			print STDERR "W: $current and ", $fetched_ref,
				     " differ, using @finish:\n",
				     join("\n", @diff), "\n";
		} elsif ($is_last) {
			print "No changes between ", $current, " and ",
			      $fetched_ref,
			      "\nResetting to the latest ",
			      $fetched_ref, "\n";
			@finish = qw/reset --mixed/;
		}
		command_noisy(@finish, $fetched_ref) if @finish;
	}
	if ($svn_error) {
		die "ERROR: Not all changes have been committed into SVN"
			.($_no_rebase ? ".\n" : ", however the committed\n"
			."ones (if any) seem to be successfully integrated "
			."into the working tree.\n")
			."Please see the above messages for details.\n";
	}
	return @diff;
}

sub cmd_dcommit {
	my $head = shift;
	command_noisy(qw/update-index --refresh/);
	git_cmd_try { command_oneline(qw/diff-index --quiet HEAD --/) }
		'Cannot dcommit with a dirty index. Commit your changes first, '
		. "or stash them with `git stash'.\n";
	$head ||= 'HEAD';

	my $old_head;
	if ($head ne 'HEAD') {
		$old_head = eval {
			command_oneline([qw/symbolic-ref -q HEAD/])
		};
		if ($old_head) {
			$old_head =~ s{^refs/heads/}{};
		} else {
			$old_head = eval { command_oneline(qw/rev-parse HEAD/) };
		}
		command(['checkout', $head], STDERR => 0);
	}

	my @refs;
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD', \@refs);
	unless ($gs) {
		die "Unable to determine upstream SVN information from ",
		    "$head history.\nPerhaps the repository is empty.";
	}

	if (defined $_commit_url) {
		$url = $_commit_url;
	} else {
		$url = eval { command_oneline('config', '--get',
			      "svn-remote.$gs->{repo_id}.commiturl") };
		if (!$url) {
			$url = $gs->full_pushurl
		}
	}

	my $last_rev = $_revision if defined $_revision;
	if ($url) {
		print "Committing to $url ...\n";
	}
	my ($linear_refs, $parents) = linearize_history($gs, \@refs);
	if ($_no_rebase && scalar(@$linear_refs) > 1) {
		warn "Attempting to commit more than one change while ",
		     "--no-rebase is enabled.\n",
		     "If these changes depend on each other, re-running ",
		     "without --no-rebase may be required."
	}

	if (defined $_interactive){
		my $ask_default = "y";
		foreach my $d (@$linear_refs){
			my ($fh, $ctx) = command_output_pipe(qw(show --summary), "$d");
			while (<$fh>){
				print $_;
			}
			command_close_pipe($fh, $ctx);
			$_ = ask("Commit this patch to SVN? ([y]es (default)|[n]o|[q]uit|[a]ll): ",
				 valid_re => qr/^(?:yes|y|no|n|quit|q|all|a)/i,
				 default => $ask_default);
			die "Commit this patch reply required" unless defined $_;
			if (/^[nq]/i) {
				exit(0);
			} elsif (/^a/i) {
				last;
			}
		}
	}

	my $expect_url = $url;

	my $push_merge_info = eval {
		command_oneline(qw/config --get svn.pushmergeinfo/)
	};
	if (not defined($push_merge_info)
			or $push_merge_info eq "false"
			or $push_merge_info eq "no"
			or $push_merge_info eq "never") {
		$push_merge_info = 0;
	}

	unless (defined($_merge_info) || ! $push_merge_info) {
		# Preflight check of changes to ensure no issues with mergeinfo
		# This includes check for uncommitted-to-SVN parents
		# (other than the first parent, which we will handle),
		# information from different SVN repos, and paths
		# which are not underneath this repository root.
		my $rooturl = $gs->repos_root;
		Git::SVN::remove_username($rooturl);
		foreach my $d (@$linear_refs) {
			my %parentshash;
			read_commit_parents(\%parentshash, $d);
			my @realparents = @{$parentshash{$d}};
			if ($#realparents > 0) {
				# Merge commit
				shift @realparents; # Remove/ignore first parent
				foreach my $parent (@realparents) {
					my ($branchurl, $svnrev, $paruuid) = cmt_metadata($parent);
					unless (defined $paruuid) {
						# A parent is missing SVN annotations...
						# abort the whole operation.
						fatal "$parent is merged into revision $d, "
						     ."but does not have git-svn metadata. "
						     ."Either dcommit the branch or use a "
						     ."local cherry-pick, FF merge, or rebase "
						     ."instead of an explicit merge commit.";
					}

					unless ($paruuid eq $uuid) {
						# Parent has SVN metadata from different repository
						fatal "merge parent $parent for change $d has "
						     ."git-svn uuid $paruuid, while current change "
						     ."has uuid $uuid!";
					}

					unless ($branchurl =~ /^\Q$rooturl\E(.*)/) {
						# This branch is very strange indeed.
						fatal "merge parent $parent for $d is on branch "
						     ."$branchurl, which is not under the "
						     ."git-svn root $rooturl!";
					}
				}
			}
		}
	}

	my $rewritten_parent;
	my $current_head = command_oneline(qw/rev-parse HEAD/);
	Git::SVN::remove_username($expect_url);
	if (defined($_merge_info)) {
		$_merge_info =~ tr{ }{\n};
	}
	while (1) {
		my $d = shift @$linear_refs or last;
		unless (defined $last_rev) {
			(undef, $last_rev, undef) = cmt_metadata("$d~1");
			unless (defined $last_rev) {
				fatal "Unable to extract revision information ",
				      "from commit $d~1";
			}
		}
		if ($_dry_run) {
			print "diff-tree $d~1 $d\n";
		} else {
			my $cmt_rev;

			unless (defined($_merge_info) || ! $push_merge_info) {
				$_merge_info = populate_merge_info($d, $gs,
							$uuid,
							$linear_refs,
							$rewritten_parent);
			}

			my %ed_opts = ( r => $last_rev,
					log => get_commit_entry($d)->{log},
					ra => Git::SVN::Ra->new($url),
					config => SVN::Core::config_get_config(
						$Git::SVN::Ra::config_dir
					),
					tree_a => "$d~1",
					tree_b => $d,
					editor_cb => sub {
						print "Committed r$_[0]\n";
						$cmt_rev = $_[0];
					},
					mergeinfo => $_merge_info,
					svn_path => '');

			my $err_handler = $SVN::Error::handler;
			$SVN::Error::handler = sub {
				my $err = shift;
				dcommit_rebase(1, $current_head, $gs->refname,
					$err);
			};

			if (!Git::SVN::Editor->new(\%ed_opts)->apply_diff) {
				print "No changes\n$d~1 == $d\n";
			} elsif ($parents->{$d} && @{$parents->{$d}}) {
				$gs->{inject_parents_dcommit}->{$cmt_rev} =
							$parents->{$d};
			}
			$_fetch_all ? $gs->fetch_all : $gs->fetch;
			$SVN::Error::handler = $err_handler;
			$last_rev = $cmt_rev;
			next if $_no_rebase;

			my @diff = dcommit_rebase(@$linear_refs == 0, $d,
						$gs->refname, undef);

			$rewritten_parent = command_oneline(qw/rev-parse/,
						$gs->refname);

			if (@diff) {
				$current_head = command_oneline(qw/rev-parse
					HEAD/);
				@refs = ();
				my ($url_, $rev_, $uuid_, $gs_) =
					working_head_info('HEAD', \@refs);
				my ($linear_refs_, $parents_) =
					linearize_history($gs_, \@refs);
				if (scalar(@$linear_refs) !=
					scalar(@$linear_refs_)) {
					fatal "# of revisions changed ",
					  "\nbefore:\n",
					  join("\n", @$linear_refs),
					  "\n\nafter:\n",
					  join("\n", @$linear_refs_), "\n",
					  'If you are attempting to commit ',
					  "merges, try running:\n\t",
					  'git rebase --interactive',
					  '--rebase-merges ',
					  $gs->refname,
					  "\nBefore dcommitting";
				}
				if ($url_ ne $expect_url) {
					if ($url_ eq $gs->metadata_url) {
						print
						  "Accepting rewritten URL:",
						  " $url_\n";
					} else {
						fatal
						  "URL mismatch after rebase:",
						  " $url_ != $expect_url";
					}
				}
				if ($uuid_ ne $uuid) {
					fatal "uuid mismatch after rebase: ",
					      "$uuid_ != $uuid";
				}
				# remap parents
				my (%p, @l, $i);
				for ($i = 0; $i < scalar @$linear_refs; $i++) {
					my $new = $linear_refs_->[$i] or next;
					$p{$new} =
						$parents->{$linear_refs->[$i]};
					push @l, $new;
				}
				$parents = \%p;
				$linear_refs = \@l;
				undef $last_rev;
			}
		}
	}

	if ($old_head) {
		my $new_head = command_oneline(qw/rev-parse HEAD/);
		my $new_is_symbolic = eval {
			command_oneline(qw/symbolic-ref -q HEAD/);
		};
		if ($new_is_symbolic) {
			print "dcommitted the branch ", $head, "\n";
		} else {
			print "dcommitted on a detached HEAD because you gave ",
			      "a revision argument.\n",
			      "The rewritten commit is: ", $new_head, "\n";
		}
		command(['checkout', $old_head], STDERR => 0);
	}

	unlink $gs->{index};
}

sub cmd_branch {
	my ($branch_name, $head) = @_;

	unless (defined $branch_name && length $branch_name) {
		die(($_tag ? "tag" : "branch") . " name required\n");
	}
	$head ||= 'HEAD';

	my (undef, $rev, undef, $gs) = working_head_info($head);
	my $src = $gs->full_pushurl;

	my $remote = Git::SVN::read_all_remotes()->{$gs->{repo_id}};
	my $allglobs = $remote->{ $_tag ? 'tags' : 'branches' };
	my $glob;
	if ($#{$allglobs} == 0) {
		$glob = $allglobs->[0];
	} else {
		unless(defined $_branch_dest) {
			die "Multiple ",
			    $_tag ? "tag" : "branch",
			    " paths defined for Subversion repository.\n",
			    "You must specify where you want to create the ",
			    $_tag ? "tag" : "branch",
			    " with the --destination argument.\n";
		}
		foreach my $g (@{$allglobs}) {
			my $re = Git::SVN::Editor::glob2pat($g->{path}->{left});
			if ($_branch_dest =~ /$re/) {
				$glob = $g;
				last;
			}
		}
		unless (defined $glob) {
			my $dest_re = qr/\b\Q$_branch_dest\E\b/;
			foreach my $g (@{$allglobs}) {
				$g->{path}->{left} =~ /$dest_re/ or next;
				if (defined $glob) {
					die "Ambiguous destination: ",
					    $_branch_dest, "\nmatches both '",
					    $glob->{path}->{left}, "' and '",
					    $g->{path}->{left}, "'\n";
				}
				$glob = $g;
			}
			unless (defined $glob) {
				die "Unknown ",
				    $_tag ? "tag" : "branch",
				    " destination $_branch_dest\n";
			}
		}
	}
	my ($lft, $rgt) = @{ $glob->{path} }{qw/left right/};
	my $url;
	if (defined $_commit_url) {
		$url = $_commit_url;
	} else {
		$url = eval { command_oneline('config', '--get',
			"svn-remote.$gs->{repo_id}.commiturl") };
		if (!$url) {
			$url = $remote->{pushurl} || $remote->{url};
		}
	}
	my $dst = join '/', $url, $lft, $branch_name, ($rgt || ());
if ($dst =~ /^https:/ && $src =~ /^http:/) {
|
|
|
|
$src=~s/^http:/https:/;
|
|
|
|
}
|
|
|
|
|
2010-02-24 11:13:50 +08:00
|
|
|
::_req_svn();
|
2015-01-15 16:54:22 +08:00
|
|
|
require SVN::Client;
|
2010-02-24 11:13:50 +08:00
|
|
|
|
2017-03-06 13:59:07 +08:00
|
|
|
my ($config, $baton, undef) = Git::SVN::Ra::prepare_config_once();
|
2008-10-05 10:35:17 +08:00
|
|
|
my $ctx = SVN::Client->new(
|
2017-03-06 13:59:07 +08:00
|
|
|
auth => $baton,
|
|
|
|
config => $config,
|
2008-10-05 10:35:17 +08:00
|
|
|
log_msg => sub {
|
|
|
|
${ $_[0] } = defined $_message
|
|
|
|
? $_message
|
|
|
|
: 'Create ' . ($_tag ? 'tag ' : 'branch ' )
|
|
|
|
. $branch_name;
|
|
|
|
},
|
|
|
|
);
|
|
|
|
|
|
|
|
eval {
|
|
|
|
$ctx->ls($dst, 'HEAD', 0);
|
|
|
|
} and die "branch ${branch_name} already exists\n";
|
|
|
|
|
2013-05-16 04:14:43 +08:00
|
|
|
if ($_parents) {
|
|
|
|
mk_parent_dirs($ctx, $dst);
|
|
|
|
}
|
|
|
|
|
2008-10-05 10:35:17 +08:00
|
|
|
print "Copying ${src} at r${rev} to ${dst}...\n";
|
|
|
|
$ctx->copy($src, $rev, $dst)
|
|
|
|
unless $_dry_run;
|
|
|
|
|
2018-01-30 07:11:07 +08:00
|
|
|
# Release resources held by ctx before creating another SVN::Ra
|
|
|
|
# so destruction is orderly. This seems necessary with SVN 1.9.5
|
|
|
|
# to avoid segfaults.
|
|
|
|
$ctx = undef;
|
|
|
|
|
2008-10-05 10:35:17 +08:00
|
|
|
$gs->fetch_all;
|
|
|
|
}
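
# Illustrative example only: with a single branches glob configured, e.g.
#
#   [svn-remote "svn"]
#           branches = branches/*:refs/remotes/origin/*
#
# "git svn branch topic" copies $src at the current revision to
# <commiturl-or-pushurl>/branches/topic via SVN::Client::copy().  When
# several globs are configured, --destination must select exactly one.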

sub mk_parent_dirs {
	my ($ctx, $parent) = @_;
	$parent =~ s{/[^/]*$}{};

	if (!eval{$ctx->ls($parent, 'HEAD', 0)}) {
		mk_parent_dirs($ctx, $parent);
		print "Creating parent folder ${parent} ...\n";
		$ctx->mkdir($parent) unless $_dry_run;
	}
}

sub cmd_find_rev {
	my $revision_or_hash = shift or die "SVN or git revision required ",
					    "as a command-line argument\n";
	my $result;
	if ($revision_or_hash =~ /^r\d+$/) {
		my $head = shift;
		$head ||= 'HEAD';
		my @refs;
		my (undef, undef, $uuid, $gs) = working_head_info($head, \@refs);
		unless ($gs) {
			die "Unable to determine upstream SVN information from ",
			    "$head history\n";
		}
		my $desired_revision = substr($revision_or_hash, 1);
		if ($_before) {
			$result = $gs->find_rev_before($desired_revision, 1);
		} elsif ($_after) {
			$result = $gs->find_rev_after($desired_revision, 1);
		} else {
			$result = $gs->rev_map_get($desired_revision, $uuid);
		}
	} else {
		my (undef, $rev, undef) = cmt_metadata($revision_or_hash);
		$result = $rev;
	}
	print "$result\n" if $result;
}
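
# Illustrative usage only:
#
#   git svn find-rev r1234       # print the git commit for SVN r1234
#   git svn find-rev HEAD~3      # print the SVN revision for a commit
#
# --before/--after relax an exact r<N> lookup to the nearest revision that
# is actually present in the rev_map.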

sub auto_create_empty_directories {
	my ($gs) = @_;
	my $var = eval { command_oneline('config', '--get', '--bool',
					 "svn-remote.$gs->{repo_id}.automkdirs") };
	# By default, create empty directories by consulting the unhandled log,
	# but allow setting it to 'false' to skip it.
	return !($var && $var eq 'false');
}
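
# The knob consulted above is a per-remote boolean, for example:
#
#   git config svn-remote.svn.automkdirs false
#
# ("svn" is just the usual default remote name; the code uses
# $gs->{repo_id}).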

sub cmd_rebase {
	command_noisy(qw/update-index --refresh/);
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	unless ($gs) {
		die "Unable to determine upstream SVN information from ",
		    "working tree history\n";
	}
	if ($_dry_run) {
		print "Remote Branch: " . $gs->refname . "\n";
		print "SVN URL: " . $url . "\n";
		return;
	}
	if (command(qw/diff-index HEAD --/)) {
		print STDERR "Cannot rebase with uncommitted changes:\n";
		command_noisy('status');
		exit 1;
	}
	unless ($_local) {
		# rebase will checkout for us, so no need to do it explicitly
		$_no_checkout = 'true';
		$_fetch_all ? $gs->fetch_all : $gs->fetch;
	}
	command_noisy(rebase_cmd(), $gs->refname);
	if (auto_create_empty_directories($gs)) {
		$gs->mkemptydirs;
	}
}
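
# Illustrative usage only:
#
#   git svn rebase --dry-run    # only print the remote branch and SVN URL
#   git svn rebase --local      # rebase against what was already fetched
#
# Uncommitted changes (as reported by diff-index against HEAD) abort the
# rebase, mirroring "git rebase" itself.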

sub cmd_show_ignore {
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	$gs ||= Git::SVN->new;
	my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
	$gs->prop_walk($gs->path, $r, sub {
		my ($gs, $path, $props) = @_;
		print STDOUT "\n# $path\n";
		my $s = $props->{'svn:ignore'} or return;
		$s =~ s/[\r\n]+/\n/g;
		$s =~ s/^\n+//;
		chomp $s;
		$s =~ s#^#$path#gm;
		print STDOUT "$s\n";
	});
}

sub cmd_show_externals {
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	$gs ||= Git::SVN->new;
	my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
	$gs->prop_walk($gs->path, $r, sub {
		my ($gs, $path, $props) = @_;
		print STDOUT "\n# $path\n";
		my $s = $props->{'svn:externals'} or return;
		$s =~ s/[\r\n]+/\n/g;
		chomp $s;
		$s =~ s#^#$path#gm;
		print STDOUT "$s\n";
	});
}

sub cmd_create_ignore {
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	$gs ||= Git::SVN->new;
	my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
	$gs->prop_walk($gs->path, $r, sub {
		my ($gs, $path, $props) = @_;
		# $path is of the form /path/to/dir/
		$path = '.' . $path;
		# SVN can have attributes on empty directories,
		# which git won't track
		mkpath([$path]) unless -d $path;
		my $ignore = $path . '.gitignore';
		my $s = $props->{'svn:ignore'} or return;
		open(GITIGNORE, '>', $ignore)
		  or fatal("Failed to open `$ignore' for writing: $!");
		$s =~ s/[\r\n]+/\n/g;
		$s =~ s/^\n+//;
		chomp $s;
		# Prefix all patterns so that the ignore doesn't apply
		# to sub-directories.
		$s =~ s#^#/#gm;
		print GITIGNORE "$s\n";
		close(GITIGNORE)
		  or fatal("Failed to close `$ignore': $!");
		command_noisy('add', '-f', $ignore);
	});
}
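
# Example of the prefixing done above (sample values): an svn:ignore
# property containing "*.o" on the directory src/ ends up in ./src/.gitignore
# as
#
#   /*.o
#
# so the pattern stays anchored to that directory instead of also applying
# to its sub-directories.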

sub cmd_mkdirs {
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	$gs ||= Git::SVN->new;
	$gs->mkemptydirs($_revision);
}

# get_svnprops(PATH)
# ------------------
# Helper for cmd_propget and cmd_proplist below.
sub get_svnprops {
	my $path = shift;
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	$gs ||= Git::SVN->new;

	# prefix THE PATH by the sub-directory from which the user
	# invoked us.
	$path = $cmd_dir_prefix . $path;
	fatal("No such file or directory: $path") unless -e $path;
	my $is_dir = -d $path ? 1 : 0;
	$path = join_paths($gs->path, $path);

	# canonicalize the path (otherwise libsvn will abort or fail to
	# find the file)
	$path = canonicalize_path($path);

	my $r = (defined $_revision ? $_revision : $gs->ra->get_latest_revnum);
	my $props;
	if ($is_dir) {
		(undef, undef, $props) = $gs->ra->get_dir($path, $r);
	}
	else {
		(undef, $props) = $gs->ra->get_file($path, $r, undef);
	}
	return $props;
}
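
# The hashref returned above maps SVN property names to values, e.g.
# (sample data only):
#
#   { 'svn:ignore' => "*.o\n*.so\n", 'svn:eol-style' => 'native' }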

# cmd_propget (PROP, PATH)
# ------------------------
# Print the SVN property PROP for PATH.
sub cmd_propget {
	my ($prop, $path) = @_;
	$path = '.' if not defined $path;
	usage(1) if not defined $prop;
	my $props = get_svnprops($path);
	if (not defined $props->{$prop}) {
		fatal("`$path' does not have a `$prop' SVN property.");
	}
	print $props->{$prop} . "\n";
}

# cmd_propset (PROPNAME, PROPVAL, PATH)
# ------------------------
# Adjust the SVN property PROPNAME to PROPVAL for PATH.
sub cmd_propset {
	my ($propname, $propval, $path) = @_;
	$path = '.' if not defined $path;
	$path = $cmd_dir_prefix . $path;
	usage(1) if not defined $propname;
	usage(1) if not defined $propval;
	my $file = basename($path);
	my $dn = dirname($path);
	my $cur_props = Git::SVN::Editor::check_attr( "svn-properties", $path );
	my @new_props;
	if (!$cur_props || $cur_props eq "unset" || $cur_props eq "" || $cur_props eq "set") {
		push @new_props, "$propname=$propval";
	} else {
		# TODO: handle combining properties better
		my @props = split(/;/, $cur_props);
		my $replaced_prop;
		foreach my $prop (@props) {
			# Parse 'name=value' syntax and set the property.
			if ($prop =~ /([^=]+)=(.*)/) {
				my ($n,$v) = ($1,$2);
				if ($n eq $propname) {
					$v = $propval;
					$replaced_prop = 1;
				}
				push @new_props, "$n=$v";
			}
		}
		if (!$replaced_prop) {
			push @new_props, "$propname=$propval";
		}
	}
	my $attrfile = "$dn/.gitattributes";
	open my $attrfh, '>>', $attrfile or die "Can't open $attrfile: $!\n";
	# TODO: don't simply append here if $file already has svn-properties
	my $new_props = join(';', @new_props);
	print $attrfh "$file svn-properties=$new_props\n" or
		die "write to $attrfile: $!\n";
	close $attrfh or die "close $attrfile: $!\n";
}
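
# Properties set this way are not sent to the server immediately; they are
# recorded in .gitattributes as an svn-properties attribute, e.g.
# (hypothetical file name):
#
#   foo.c svn-properties=svn:eol-style=native
#
# which is the same attribute Git::SVN::Editor::check_attr() reads back
# above when the property is updated again later.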

# cmd_proplist (PATH)
# -------------------
# Print the list of SVN properties for PATH.
sub cmd_proplist {
	my $path = shift;
	$path = '.' if not defined $path;
	my $props = get_svnprops($path);
	print "Properties on '$path':\n";
	foreach (sort keys %{$props}) {
		print "  $_\n";
	}
}

sub cmd_multi_init {
	my $url = shift;
	unless (defined $_trunk || @_branches || @_tags) {
		usage(1);
	}

	$_prefix = 'origin/' unless defined $_prefix;
	if (defined $url) {
		$url = canonicalize_url($url);
		init_subdir(@_);
	}
	do_git_init_db();
	if (defined $_trunk) {
		$_trunk =~ s#^/+##;
		my $trunk_ref = 'refs/remotes/' . $_prefix . 'trunk';
		# try both old-style and new-style lookups:
		my $gs_trunk = eval { Git::SVN->new($trunk_ref) };
		unless ($gs_trunk) {
			my ($trunk_url, $trunk_path) =
				complete_svn_url($url, $_trunk);
			$gs_trunk = Git::SVN->init($trunk_url, $trunk_path,
						   undef, $trunk_ref);
		}
	}
	return unless @_branches || @_tags;
	my $ra = $url ? Git::SVN::Ra->new($url) : undef;
	foreach my $path (@_branches) {
		complete_url_ls_init($ra, $path, '--branches/-b', $_prefix);
	}
	foreach my $path (@_tags) {
		complete_url_ls_init($ra, $path, '--tags/-t', $_prefix.'tags/');
	}
}

sub cmd_multi_fetch {
	$Git::SVN::no_reuse_existing = undef;
	my $remotes = Git::SVN::read_all_remotes();
	foreach my $repo_id (sort keys %$remotes) {
		if ($remotes->{$repo_id}->{url}) {
			Git::SVN::fetch_all($repo_id, $remotes);
		}
	}
}

# this command is special because it requires no metadata
sub cmd_commit_diff {
	my ($ta, $tb, $url) = @_;
	my $usage = "usage: $0 commit-diff -r<revision> ".
		    "<tree-ish> <tree-ish> [<URL>]";
	fatal($usage) if (!defined $ta || !defined $tb);
	my $svn_path = '';
	if (!defined $url) {
		my $gs = eval { Git::SVN->new };
		if (!$gs) {
			fatal("Needed URL or usable git-svn --id in ",
			      "the command-line\n", $usage);
		}
		$url = $gs->url;
		$svn_path = $gs->path;
	}
	unless (defined $_revision) {
		fatal("-r|--revision is a required argument\n", $usage);
	}
	if (defined $_message && defined $_file) {
		fatal("Both --message/-m and --file/-F specified ",
		      "for the commit message.\n",
		      "I have no idea what you mean");
	}
	if (defined $_file) {
		$_message = file_to_s($_file);
	} else {
		$_message ||= get_commit_entry($tb)->{log};
	}
	my $ra = Git::SVN::Ra->new($url);
	my $r = $_revision;
	if ($r eq 'HEAD') {
		$r = $ra->get_latest_revnum;
	} elsif ($r !~ /^\d+$/) {
		die "revision argument: $r not understood by git-svn\n";
	}
	my %ed_opts = ( r => $r,
			log => $_message,
			ra => $ra,
			tree_a => $ta,
			tree_b => $tb,
			editor_cb => sub { print "Committed r$_[0]\n" },
			svn_path => $svn_path );
	if (!Git::SVN::Editor->new(\%ed_opts)->apply_diff) {
		print "No changes\n$ta == $tb\n";
	}
}
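
# Illustrative usage only (the URL is a placeholder):
#
#   git svn commit-diff -r HEAD treeA treeB svn://svn.example.com/repo/trunk
#
# commits the difference between two tree-ishes directly to the given URL,
# without needing any git-svn metadata in the local repository.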

sub cmd_info {
	my $path_arg = defined($_[0]) ? $_[0] : '.';
	my $path = $path_arg;
	if (File::Spec->file_name_is_absolute($path)) {
		$path = canonicalize_path($path);

		my $toplevel = eval {
			my @cmd = qw/rev-parse --show-toplevel/;
			command_oneline(\@cmd, STDERR => 0);
		};

		# remove $toplevel from the absolute path:
		my ($vol, $dirs, $file) = File::Spec->splitpath($path);
		my (undef, $tdirs, $tfile) = File::Spec->splitpath($toplevel);
		my @dirs = File::Spec->splitdir($dirs);
		my @tdirs = File::Spec->splitdir($tdirs);
		pop @dirs if $dirs[-1] eq '';
		pop @tdirs if $tdirs[-1] eq '';
		push @dirs, $file;
		push @tdirs, $tfile;
		while (@tdirs && @dirs && $tdirs[0] eq $dirs[0]) {
			shift @dirs;
			shift @tdirs;
		}
		$dirs = File::Spec->catdir(@dirs);
		$path = File::Spec->catpath($vol, $dirs);

		$path = canonicalize_path($path);
	} else {
		$path = canonicalize_path($cmd_dir_prefix . $path);
	}
	if (exists $_[1]) {
		die "Too many arguments specified\n";
	}

	my ($file_type, $diff_status) = find_file_type_and_diff_status($path);

	if (!$file_type && !$diff_status) {
		print STDERR "svn: '$path' is not under version control\n";
		exit 1;
	}

	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	unless ($gs) {
		die "Unable to determine upstream SVN information from ",
		    "working tree history\n";
	}

	# canonicalize_path() will return "" to make libsvn 1.5.x happy,
	$path = "." if $path eq "";

	my $full_url = canonicalize_url( add_path_to_url( $url, $path ) );

	if ($_url) {
		print "$full_url\n";
		return;
	}

	my $result = "Path: $path_arg\n";
	$result .= "Name: " . basename($path) . "\n" if $file_type ne "dir";
	$result .= "URL: $full_url\n";

	eval {
		my $repos_root = $gs->repos_root;
		Git::SVN::remove_username($repos_root);
		$result .= "Repository Root: " . canonicalize_url($repos_root) . "\n";
	};
	if ($@) {
		$result .= "Repository Root: (offline)\n";
	}
	::_req_svn();
	$result .= "Repository UUID: $uuid\n" unless $diff_status eq "A" &&
		(::compare_svn_version('1.5.4') <= 0 || $file_type ne "dir");
	$result .= "Revision: " . ($diff_status eq "A" ? 0 : $rev) . "\n";

	$result .= "Node Kind: " .
		   ($file_type eq "dir" ? "directory" : "file") . "\n";

	my $schedule = $diff_status eq "A"
		       ? "add"
		       : ($diff_status eq "D" ? "delete" : "normal");
	$result .= "Schedule: $schedule\n";

	if ($diff_status eq "A") {
		print $result, "\n";
		return;
	}

	my ($lc_author, $lc_rev, $lc_date_utc);
	my @args = Git::SVN::Log::git_svn_log_cmd($rev, $rev, "--", $path);
	my $log = command_output_pipe(@args);
	my $esc_color = qr/(?:\033\[(?:(?:\d+;)*\d*)?m)*/;
	while (<$log>) {
		if (/^${esc_color}author (.+) <[^>]+> (\d+) ([\-\+]?\d+)$/o) {
			$lc_author = $1;
			$lc_date_utc = Git::SVN::Log::parse_git_date($2, $3);
		} elsif (/^${esc_color}    (git-svn-id:.+)$/o) {
			(undef, $lc_rev, undef) = ::extract_metadata($1);
		}
	}
	close $log;

	Git::SVN::Log::set_local_timezone();

	$result .= "Last Changed Author: $lc_author\n";
	$result .= "Last Changed Rev: $lc_rev\n";
	$result .= "Last Changed Date: " .
		   Git::SVN::Log::format_svn_date($lc_date_utc) . "\n";

	if ($file_type ne "dir") {
		my $text_last_updated_date =
		    ($diff_status eq "D" ? $lc_date_utc : (stat $path)[9]);
		$result .=
		    "Text Last Updated: " .
		    Git::SVN::Log::format_svn_date($text_last_updated_date) .
		    "\n";
		my $checksum;
		if ($diff_status eq "D") {
			my ($fh, $ctx) =
			    command_output_pipe(qw(cat-file blob), "HEAD:$path");
			if ($file_type eq "link") {
				my $file_name = <$fh>;
				$checksum = md5sum("link $file_name");
			} else {
				$checksum = md5sum($fh);
			}
			command_close_pipe($fh, $ctx);
		} elsif ($file_type eq "link") {
			my $file_name =
			    command(qw(cat-file blob), "HEAD:$path");
			$checksum =
			    md5sum("link " . $file_name);
		} else {
			open FILE, "<", $path or die $!;
			$checksum = md5sum(\*FILE);
			close FILE or die $!;
		}
		$result .= "Checksum: " . $checksum . "\n";
	}

	print $result, "\n";
}
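
# The $result assembled above mimics "svn info"; sample output
# (placeholder values):
#
#   Path: .
#   URL: https://svn.example.com/repo/trunk
#   Repository Root: https://svn.example.com/repo
#   Repository UUID: 612f8ebc-0000-0000-0000-000000000000
#   Revision: 1234
#   Node Kind: directory
#   Schedule: normal
#   Last Changed Author: jdoe
#   Last Changed Rev: 1234
#   Last Changed Date: 2020-01-01 00:00:00 +0000 (Wed, 01 Jan 2020)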

sub cmd_reset {
	my $target = shift || $_revision or die "SVN revision required\n";
	$target = $1 if $target =~ /^r(\d+)$/;
	$target =~ /^\d+$/ or die "Numeric SVN revision expected\n";
	my ($url, $rev, $uuid, $gs) = working_head_info('HEAD');
	unless ($gs) {
		die "Unable to determine upstream SVN information from ".
		    "history\n";
	}
	my ($r, $c) = $gs->find_rev_before($target, not $_fetch_parent);
	die "Cannot find SVN revision $target\n" unless defined($c);
	$gs->rev_map_set($r, $c, 'reset', $uuid);
	print "r$r = $c ($gs->{ref_id})\n";
}
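
# Illustrative usage only:
#
#   git svn reset -r 1234        # rewind the rev_map back to r1234
#
# Only git-svn's rev_map is rewound here; moving any local branches back
# is left to a separate "git reset".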

sub cmd_gc {
	require File::Find;
	if (!can_compress()) {
		warn "Compress::Zlib could not be found; unhandled.log " .
		     "files will not be compressed.\n";
	}
	File::Find::find({ wanted => \&gc_directory, no_chdir => 1},
			 Git::SVN::svn_dir());
}

########################### utility functions #########################

sub rebase_cmd {
	my @cmd = qw/rebase/;
	push @cmd, '-v' if $_verbose;
	push @cmd, qw/--merge/ if $_merge;
	push @cmd, "--strategy=$_strategy" if $_strategy;
	push @cmd, "--rebase-merges" if $_rebase_merges;
	@cmd;
}

sub post_fetch_checkout {
	return if $_no_checkout;
	return if verify_ref('HEAD^0');
	my $gs = $Git::SVN::_head or return;

	# look for "trunk" ref if it exists
	my $remote = Git::SVN::read_all_remotes()->{$gs->{repo_id}};
	my $fetch = $remote->{fetch};
	if ($fetch) {
		foreach my $p (keys %$fetch) {
			basename($fetch->{$p}) eq 'trunk' or next;
			$gs = Git::SVN->new($fetch->{$p}, $gs->{repo_id}, $p);
			last;
		}
	}

	command_noisy(qw(update-ref HEAD), $gs->refname);
	return unless verify_ref('HEAD^0');

	return if $ENV{GIT_DIR} !~ m#^(?:.*/)?\.git$#;
	my $index = command_oneline(qw(rev-parse --git-path index));
	return if -f $index;

	return if command_oneline(qw/rev-parse --is-inside-work-tree/) eq 'false';
	return if command_oneline(qw/rev-parse --is-inside-git-dir/) eq 'true';
	command_noisy(qw/read-tree -m -u -v HEAD HEAD/);
	print STDERR "Checked out HEAD:\n  ",
		     $gs->full_url, " r", $gs->last_rev, "\n";
	if (auto_create_empty_directories($gs)) {
		$gs->mkemptydirs($gs->last_rev);
	}
}

sub complete_svn_url {
	my ($url, $path) = @_;

	if ($path =~ m#^[a-z\+]+://#i) { # path is a URL
		$path = canonicalize_url($path);
	} else {
		$path = canonicalize_path($path);
		if (!defined $url || $url !~ m#^[a-z\+]+://#i) {
			fatal("E: '$path' is not a complete URL ",
			      "and a separate URL is not specified");
		}
		return ($url, $path);
	}
	return ($path, '');
}

sub complete_url_ls_init {
	my ($ra, $repo_path, $switch, $pfx) = @_;
	unless ($repo_path) {
		print STDERR "W: $switch not specified\n";
		return;
	}
	if ($repo_path =~ m#^[a-z\+]+://#i) {
		$repo_path = canonicalize_url($repo_path);
		$ra = Git::SVN::Ra->new($repo_path);
		$repo_path = '';
	} else {
		$repo_path = canonicalize_path($repo_path);
		$repo_path =~ s#^/+##;
		unless ($ra) {
			fatal("E: '$repo_path' is not a complete URL ",
			      "and a separate URL is not specified");
		}
	}
	my $url = $ra->url;
	my $gs = Git::SVN->init($url, undef, undef, undef, 1);
	my $k = "svn-remote.$gs->{repo_id}.url";
	my $orig_url = eval { command_oneline(qw/config --get/, $k) };
	if ($orig_url && ($orig_url ne $gs->url)) {
		die "$k already set: $orig_url\n",
		    "wanted to set to: " . $gs->url . "\n";
	}
	command_oneline('config', $k, $gs->url) unless $orig_url;

	my $remote_path = join_paths( $gs->path, $repo_path );
	$remote_path =~ s{%([0-9A-F]{2})}{chr hex($1)}ieg;
	$remote_path =~ s#^/##g;
	$remote_path .= "/*" if $remote_path !~ /\*/;
	my ($n) = ($switch =~ /^--(\w+)/);
	if (length $pfx && $pfx !~ m#/$#) {
		die "--prefix='$pfx' must have a trailing slash '/'\n";
	}
	command_noisy('config',
		      '--add',
		      "svn-remote.$gs->{repo_id}.$n",
		      "$remote_path:refs/remotes/$pfx*" .
		      ('/*' x (($remote_path =~ tr/*/*/) - 1)) );
}
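
# The config entry written by the command_noisy() call above ends up
# looking like (illustrative values):
#
#   [svn-remote "svn"]
#           branches = branches/*:refs/remotes/origin/*
#
# with one extra "/*" appended to the ref side for every additional
# wildcard in the remote path.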

sub verify_ref {
	my ($ref) = @_;
	eval { command_oneline([ 'rev-parse', '--verify', $ref ],
			       { STDERR => 0 }); };
}

sub get_tree_from_treeish {
	my ($treeish) = @_;
	# $treeish can be a symbolic ref, too:
	my $type = command_oneline(qw/cat-file -t/, $treeish);
	my $expected;
	while ($type eq 'tag') {
		($treeish, $type) = command(qw/cat-file tag/, $treeish);
	}
	if ($type eq 'commit') {
		$expected = (grep /^tree /, command(qw/cat-file commit/,
						    $treeish))[0];
		($expected) = ($expected =~ /^tree ($oid)$/o);
		die "Unable to get tree from $treeish\n" unless $expected;
	} elsif ($type eq 'tree') {
		$expected = $treeish;
	} else {
		die "$treeish is a $type, expected tree, tag or commit\n";
	}
	return $expected;
}

sub get_commit_entry {
	my ($treeish) = shift;
	my %log_entry = ( log => '', tree => get_tree_from_treeish($treeish) );
	my @git_path = qw(rev-parse --git-path);
	my $commit_editmsg = command_oneline(@git_path, 'COMMIT_EDITMSG');
	my $commit_msg = command_oneline(@git_path, 'COMMIT_MSG');
	open my $log_fh, '>', $commit_editmsg or croak $!;

	my $type = command_oneline(qw/cat-file -t/, $treeish);
	if ($type eq 'commit' || $type eq 'tag') {
		my ($msg_fh, $ctx) = command_output_pipe('cat-file',
							 $type, $treeish);
		my $in_msg = 0;
		my $author;
		my $saw_from = 0;
		my $msgbuf = "";
		while (<$msg_fh>) {
			if (!$in_msg) {
				$in_msg = 1 if (/^$/);
				$author = $1 if (/^author (.*>)/);
			} elsif (/^git-svn-id: /) {
				# skip this for now, we regenerate the
				# correct one on re-fetch anyways
				# TODO: set *:merge properties or like...
			} else {
				if (/^From:/ || /^Signed-off-by:/) {
					$saw_from = 1;
				}
				$msgbuf .= $_;
			}
		}
		$msgbuf =~ s/\s+$//s;
		$msgbuf =~ s/\r\n/\n/sg; # SVN 1.6+ disallows CRLF
		if ($Git::SVN::_add_author_from && defined($author)
		    && !$saw_from) {
			$msgbuf .= "\n\nFrom: $author";
		}
		print $log_fh $msgbuf or croak $!;
		command_close_pipe($msg_fh, $ctx);
	}
	close $log_fh or croak $!;

	if ($_edit || ($type eq 'tree')) {
		chomp(my $editor = command_oneline(qw(var GIT_EDITOR)));
		system('sh', '-c', $editor.' "$@"', $editor, $commit_editmsg);
	}
	rename $commit_editmsg, $commit_msg or croak $!;
	{
		require Encode;
		# SVN requires messages to be UTF-8 when entering the repo
		open $log_fh, '<', $commit_msg or croak $!;
		binmode $log_fh;
		chomp($log_entry{log} = get_record($log_fh, undef));

		my $enc = Git::config('i18n.commitencoding') || 'UTF-8';
		my $msg = $log_entry{log};

		eval { $msg = Encode::decode($enc, $msg, 1) };
		if ($@) {
			die "Could not decode as $enc:\n", $msg,
			    "\nPerhaps you need to set i18n.commitencoding\n";
		}

		eval { $msg = Encode::encode('UTF-8', $msg, 1) };
		die "Could not encode as UTF-8:\n$msg\n" if $@;

		$log_entry{log} = $msg;

		close $log_fh or croak $!;
	}
	unlink $commit_msg;
	\%log_entry;
}

sub s_to_file {
	my ($str, $file, $mode) = @_;
	open my $fd,'>',$file or croak $!;
	print $fd $str,"\n" or croak $!;
	close $fd or croak $!;
	chmod ($mode &~ umask, $file) if (defined $mode);
}

sub file_to_s {
	my $file = shift;
	open my $fd,'<',$file or croak "$!: file: $file\n";
	local $/;
	my $ret = <$fd>;
	close $fd or croak $!;
	$ret =~ s/\s*$//s;
	return $ret;
}

# '<svn username> = real-name <email address>' mapping based on git-svnimport:
sub load_authors {
	open my $authors, '<', $_authors or die "Can't open $_authors $!\n";
	my $log = $cmd eq 'log';
	while (<$authors>) {
		chomp;
		next unless /^(.+?|\(no author\))\s*=\s*(.+?)\s*<(.*)>\s*$/;
		my ($user, $name, $email) = ($1, $2, $3);
		if ($log) {
			$Git::SVN::Log::rusers{"$name <$email>"} = $user;
		} else {
			$users{$user} = [$name, $email];
		}
	}
	close $authors or croak $!;
}
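
# Lines matched by the regexp above look like (sample values only):
#
#   jdoe = John Doe <jdoe@example.com>
#   (no author) = no author <no-author@example.com>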

# convert GetOpt::Long specs for use by git-config
sub read_git_config {
	my $opts = shift;
	my @config_only;
	foreach my $o (keys %$opts) {
		# if we have mixedCase and a long option-only, then
		# it's a config-only variable that we don't need for
		# the command-line.
		push @config_only, $o if ($o =~ /[A-Z]/ && $o =~ /^[a-z]+$/i);
		my $v = $opts->{$o};
		my ($key) = ($o =~ /^([a-zA-Z\-]+)/);
		$key =~ s/-//g;
		my $arg = 'git config';
		$arg .= ' --int' if ($o =~ /[:=]i$/);
		$arg .= ' --bool' if ($o !~ /[:=][sfi]$/);
		if (ref $v eq 'ARRAY') {
			chomp(my @tmp = `$arg --get-all svn.$key`);
			@$v = @tmp if @tmp;
		} else {
			chomp(my $tmp = `$arg --get svn.$key`);
			if ($tmp && !($arg =~ / --bool/ && $tmp eq 'false')) {
				$$v = $tmp;
			}
		}
	}
	load_object_format();
	delete @$opts{@config_only} if @config_only;
}
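
# Mapping sketch for the loop above (the option specs are only examples):
# a Getopt::Long spec such as 'authors-file|A=s' is stripped to the key
# "authorsfile" and queried as
#
#   git config --get svn.authorsfile
#
# while a flag-style spec with no =s/=i suffix is queried with --bool, e.g.
#
#   git config --bool --get svn.uselogauthor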

sub load_object_format {
	chomp(my $hash = `git config --get extensions.objectformat`);
	$::oid_length = 64 if $hash eq 'sha256';
}

sub extract_metadata {
	my $id = shift or return (undef, undef, undef);
	my ($url, $rev, $uuid) = ($id =~ /^\s*git-svn-id:\s+(.*)\@(\d+)
						\s([a-f\d\-]+)$/ix);
	if (!defined $rev || !$uuid || !$url) {
		# some of the original repositories I made had
		# identifiers like this:
		($rev, $uuid) = ($id =~/^\s*git-svn-id:\s(\d+)\@([a-f\d\-]+)/i);
	}
	return ($url, $rev, $uuid);
}
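
# The metadata line parsed above looks like (sample values only):
#
#   git-svn-id: https://svn.example.com/repo/trunk@1234 0123abcd-4567-89ab-cdef-0123456789ab
#
# and yields ($url, $rev, $uuid) =
# ('https://svn.example.com/repo/trunk', 1234, '0123abcd-4567-89ab-cdef-0123456789ab').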

sub cmt_metadata {
	return extract_metadata((grep(/^git-svn-id: /,
		command(qw/cat-file commit/, shift)))[-1]);
}
2009-04-11 04:32:41 +08:00
|
|
|
sub cmt_sha2rev_batch {
|
|
|
|
my %s2r;
|
|
|
|
my ($pid, $in, $out, $ctx) = command_bidi_pipe(qw/cat-file --batch/);
|
|
|
|
my $list = shift;
|
|
|
|
|
|
|
|
foreach my $sha (@{$list}) {
|
|
|
|
my $first = 1;
|
|
|
|
my $size = 0;
|
|
|
|
print $out $sha, "\n";
|
|
|
|
|
|
|
|
while (my $line = <$in>) {
|
2020-06-23 02:04:14 +08:00
|
|
|
if ($first && $line =~ /^$::oid\smissing$/) {
|
2009-04-11 04:32:41 +08:00
|
|
|
last;
|
|
|
|
} elsif ($first &&
|
2020-06-23 02:04:14 +08:00
|
|
|
$line =~ /^$::oid\scommit\s(\d+)$/) {
|
2009-04-11 04:32:41 +08:00
|
|
|
$first = 0;
|
|
|
|
$size = $1;
|
|
|
|
next;
|
|
|
|
} elsif ($line =~ /^(git-svn-id: )/) {
|
|
|
|
my (undef, $rev, undef) =
|
|
|
|
extract_metadata($line);
|
|
|
|
$s2r{$sha} = $rev;
|
|
|
|
}
|
|
|
|
|
|
|
|
$size -= length($line);
|
|
|
|
last if ($size == 0);
|
|
|
|
}
|
|
|
|
}
|
|
|
|
|
|
|
|
command_close_bidi_pipe($pid, $in, $out, $ctx);
|
|
|
|
|
|
|
|
return \%s2r;
|
|
|
|
}

sub working_head_info {
	my ($head, $refs) = @_;
	my @args = qw/rev-list --first-parent --pretty=medium/;
	my ($fh, $ctx) = command_output_pipe(@args, $head, "--");
	my $hash;
	my %max;
	while (<$fh>) {
		if ( m{^commit ($::oid)$} ) {
			unshift @$refs, $hash if $hash and $refs;
			$hash = $1;
			next;
		}
		next unless s{^\s*(git-svn-id:)}{$1};
		my ($url, $rev, $uuid) = extract_metadata($_);
		if (defined $url && defined $rev) {
			next if $max{$url} and $max{$url} < $rev;
			if (my $gs = Git::SVN->find_by_url($url)) {
				my $c = $gs->rev_map_get($rev, $uuid);
				if ($c && $c eq $hash) {
					close $fh; # break the pipe
					return ($url, $rev, $uuid, $gs);
				} else {
					$max{$url} ||= $gs->rev_map_max;
				}
			}
		}
	}
	command_close_pipe($fh, $ctx);
	(undef, undef, undef, undef);
}

sub read_commit_parents {
	my ($parents, $c) = @_;
	chomp(my $p = command_oneline(qw/rev-list --parents -1/, $c));
	$p =~ s/^($c)\s*// or die "rev-list --parents -1 $c failed!\n";
	@{$parents->{$c}} = split(/ /, $p);
}

sub linearize_history {
	my ($gs, $refs) = @_;
	my %parents;
	foreach my $c (@$refs) {
		read_commit_parents(\%parents, $c);
	}

	my @linear_refs;
	my %skip = ();
	my $last_svn_commit = $gs->last_commit;
	foreach my $c (reverse @$refs) {
		next if $c eq $last_svn_commit;
		last if $skip{$c};

		unshift @linear_refs, $c;
		$skip{$c} = 1;

		# we only want the first parent to diff against for linear
		# history, we save the rest to inject when we finalize the
		# svn commit
		my $fp_a = verify_ref("$c~1");
		my $fp_b = shift @{$parents{$c}} if $parents{$c};
		if (!$fp_a || !$fp_b) {
			die "Commit $c\n",
			    "has no parent commit, and therefore ",
			    "nothing to diff against.\n",
			    "You should be working from a repository ",
			    "originally created by git-svn\n";
		}
		if ($fp_a ne $fp_b) {
			die "$c~1 = $fp_a, however parsing commit $c ",
			    "revealed that:\n$c~1 = $fp_b\nBUG!\n";
		}

		foreach my $p (@{$parents{$c}}) {
			$skip{$p} = 1;
		}
	}
	(\@linear_refs, \%parents);
}
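
# Sketch of what linearize_history() produces (hypothetical commits): for
#
#   A---B---M
#        \ /
#         C        (M merges branch commit C back into the first-parent line)
#
# only the first-parent chain (A, B, M) is kept in @linear_refs; C is
# skipped but remembered in $parents{M} so it can be re-attached as an
# extra parent when the SVN commit replacing M is finalized.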

sub find_file_type_and_diff_status {
	my ($path) = @_;
	return ('dir', '') if $path eq '';

	my $diff_output =
		command_oneline(qw(diff --cached --name-status --), $path) || "";
	my $diff_status = (split(' ', $diff_output))[0] || "";

	my $ls_tree = command_oneline(qw(ls-tree HEAD), $path) || "";

	return (undef, undef) if !$diff_status && !$ls_tree;

	if ($diff_status eq "A") {
		return ("link", $diff_status) if -l $path;
		return ("dir", $diff_status) if -d $path;
		return ("file", $diff_status);
	}

	my $mode = (split(' ', $ls_tree))[0] || "";

	return ("link", $diff_status) if $mode eq "120000";
	return ("dir", $diff_status) if $mode eq "040000";
	return ("file", $diff_status);
}

sub md5sum {
	my $arg = shift;
	my $ref = ref $arg;
	require Digest::MD5;
	my $md5 = Digest::MD5->new();
	if ($ref eq 'GLOB' || $ref eq 'IO::File' || $ref eq 'File::Temp') {
		$md5->addfile($arg) or croak $!;
	} elsif ($ref eq 'SCALAR') {
		$md5->add($$arg) or croak $!;
	} elsif (!$ref) {
		$md5->add($arg) or croak $!;
	} else {
		fatal "Can't provide MD5 hash for unknown ref type: '", $ref, "'";
	}
	return $md5->hexdigest();
}

sub gc_directory {
	if (can_compress() && -f $_ && basename($_) eq "unhandled.log") {
		my $out_filename = $_ . ".gz";
		open my $in_fh, "<", $_ or die "Unable to open $_: $!\n";
		binmode $in_fh;
		my $gz = Compress::Zlib::gzopen($out_filename, "ab") or
			die "Unable to open $out_filename: $!\n";

		my $res;
		while ($res = sysread($in_fh, my $str, 1024)) {
			$gz->gzwrite($str) or
				die "Unable to write: ".$gz->gzerror()."!\n";
		}
		no warnings 'once'; # $File::Find::name would warn
		unlink $_ or die "unlink $File::Find::name: $!\n";
	} elsif (-f $_ && basename($_) eq "index") {
		unlink $_ or die "unlink $_: $!\n";
	}
}
|
|
|
|
|
2006-02-16 17:24:16 +08:00
|
|
|
__END__
|
|
|
|
|
|
|
|
Data structures:
|
|
|
|
|
2007-02-04 05:29:17 +08:00
|
|
|
|
|
|
|
$remotes = { # returned by read_all_remotes()
	'svn' => {
		# svn-remote.svn.url=https://svn.musicpd.org
		url => 'https://svn.musicpd.org',
		# svn-remote.svn.fetch=mpd/trunk:trunk
		fetch => {
			'mpd/trunk' => 'trunk',
		},
		# svn-remote.svn.tags=mpd/tags/*:tags/*
		tags => {
			path => {
				left => 'mpd/tags',
				right => '',
				regex => qr!mpd/tags/([^/]+)$!,
				glob => 'tags/*',
			},
			ref => {
				left => 'tags',
				right => '',
				regex => qr!tags/([^/]+)$!,
				glob => 'tags/*',
			},
		}
	}
};

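A hedged, illustrative walk over this structure (variable names invented),
printing every configured fetch mapping:

	foreach my $repo_id (keys %$remotes) {
		my $fetch = $remotes->{$repo_id}{fetch} || {};
		foreach my $svn_path (keys %$fetch) {
			print "$repo_id: $svn_path => $fetch->{$svn_path}\n";
		}
	}
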
$log_entry hashref as returned by libsvn_log_entry()
{
	log => 'whitespace-formatted log entry
',						# trailing newline is preserved
	revision => '8',			# integer
	date => '2004-02-24T17:01:44.108345Z',	# commit date
	author => 'committer name'
};

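An illustrative way to consume such an entry (the formatting is invented,
not taken from the real callers):

	printf "r%s by %s at %s\n%s",
	       $log_entry->{revision}, $log_entry->{author},
	       $log_entry->{date}, $log_entry->{log};
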
# this is generated by generate_diff();
@mods = array of diff-index line hashes, each element represents one line
	of diff-index output

diff-index line ($m hash)
{
	mode_a => first column of diff-index output, no leading ':',
	mode_b => second column of diff-index output,
	sha1_b => sha1sum of the final blob,
	chg => change type [MCRADT],
	file_a => original file name of a file (iff chg is 'C' or 'R')
	file_b => new/current file name of a file (any chg)
};

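For illustration only (the blob hashes and path are invented), a raw
diff-index line such as

	:100644 100755 aaaaaaaa... bbbbbbbb... M	foo/bar.c

would be carried in @mods as

	{ mode_a => '100644', mode_b => '100755',
	  sha1_b => 'bbbbbbbb...', chg => 'M', file_b => 'foo/bar.c' }
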
# retval of read_url_paths{,_all}();
$l_map = {
	# repository root url
	'https://svn.musicpd.org' => {
		# repository path		# GIT_SVN_ID
		'mpd/trunk'		=> 'trunk',
		'mpd/tags/0.11.5'	=> 'tags/0.11.5',
	},
}

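Lookup is plain nested hash access; using the example values above:

	my $id = $l_map->{'https://svn.musicpd.org'}{'mpd/trunk'}; # 'trunk'
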
Notes:
	I don't trust the each() function unless I created %hash myself
	because the internal iterator may not have started at base.