
Git-fetch-file – Sync files from other repos with commit tracking and safety

Bukhmanizer

I'm imagining some sort of art project where people try to figure out the most complicated software you can make entirely out of files stitched together from different repos.

conception

Or a FUSE filesystem based on it.

andrewmcwatters

Haha! That is a pretty creative idea.

andrewmcwatters

Hi HN, thanks for sending this to the front page.

I'm finding myself needing some resources from other projects in a way that ecosystem-specific dependency management won't help with, or where I'd be pulling in too many files.

Submodules aren't the answer, and some other existing git user-defined commands don't seem to do what I need either.

I want a file from another repository, with the ability to pin it to a commit, track it going forward, or just stay up to date by using the default HEAD commit value set in `.git-remote-files`.

    git fetch-file add https://github.com/octocat/Hello-World README
    git fetch-file pull
This lets me track the README file from the octocat/Hello-World repository and pull down the file. A record of it is then saved in `.git-remote-files`.
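
For context, the manual equivalent with plain git would be something like this (just to show the idea):

    # fetch the remote's HEAD into FETCH_HEAD, then write the file out
    git fetch https://github.com/octocat/Hello-World
    git show FETCH_HEAD:README > README
except that `git fetch-file` also records the repository, path, and commit so the file can be re-pulled or pinned later.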

Let me know if you have any questions!

g4cg54g54

Been wanting to build something very similar, so sharing some notes (before actually getting to test it):

A `--dry-run` option?

A "push" subcommand? (especially in combination with the "overwrite to local repository path" idea mentioned below; for remotes it's rather useless, sure ;))

Also, your README leaves it kinda open what happens with the actual file(s): `.git-remote-files` is mentioned as "should be committed", but what about the file it fetched?

Also a little unclear how `--save` plays into that (since the `.git-remote-files` example shows only a commit, no branch), and when would one ever run it without `--save`?

A CLI arg / secondary `.git-remote-files` file (possibly a `.local.git-remote-files` or such that can also override repository URLs) for local/private repos?

An option (autodetect?) to also write `.gitattributes` entries marking those picked files as binary (which could go into the repo, or local-only into the repo's `.git/` dir...)
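
E.g. a generated entry along these lines, either in `.gitattributes` or in the repo-local `.git/info/attributes` (the path here is just an example):

    # mark a fetched file as binary so git won't diff or merge it as text
    vendor/logo.png binary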

Since it's called `git-fetch-file` and not `.git-remote-files`, an overall comment may be nice as a reference when first generating the file ;)

But by now I'm just rambling; looking forward to actually trying it when I'm home ;) Thanks in advance!

andrewmcwatters

Thank you for your comments!

I'm thinking about what would be some nice output for `--dry-run`. Do you have a desired behavior in mind? Maybe something like this?

    Would fetch:
      src/utils.js from https://github.com/user/library.git (a1b2c3d -> f9e8d7c)
      config/webpack.js from https://github.com/company/tools.git (HEAD -> 1a2b3c4)
    
    Would skip (local changes):
      docs/README.md from https://github.com/org/templates.git (use --force to overwrite)
    
    Up to date:
      package.json from https://github.com/user/library.git (f4e5d6c)
Push seems kinda neat for getting changes back to a remote!

I will try to make the README.md a little clearer about what happens after `pull`, because you're right, it's not specified: files aren't actually committed, just placed in the directory for you to do with as you please.
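
So after a pull, committing the result is up to you, e.g.:

    git fetch-file pull
    git add README .git-remote-files
    git commit -m "Vendor README from octocat/Hello-World"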

I like your ideas! Thank you!

g4cg54g54

For `--dry-run` that looks pretty good, yep! I like the `--force` hint in there too!

For the "push", I think my idea was mostly about local remotes; think "I have both cloned locally, with both IDEs open, going back and forth".

One distinction there would be `../someupstream/file` vs. `../someupstream/.git/refs/HEAD:file`... aka "pick the file as is" (potentially marked as "${HEAD}-dirty") vs. "only committed things are truth" (even if just so one doesn't need an extra `cp` command ;))
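
Roughly the difference between doing these two by hand:

    # "pick the file as is": whatever is in the working tree, possibly dirty
    cp ../someupstream/file .

    # "only committed things are truth": take the file from HEAD
    git -C ../someupstream show HEAD:file > file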

"Just placed in the directory for you to do as you please" could open the door to an `--auto-commit` option, based on a template similar to the dry-run output? (ideally overridable in `.git-remote-files`)
