[Neurodebian-upstream] [Nipy-devel] Standard dataset

Yaroslav Halchenko debian at onerussian.com
Wed Sep 22 01:24:56 UTC 2010


On Tue, 21 Sep 2010, Matthew Brett wrote:
> I confess I did not completely understand what LKCL was trying to do.
and I confess that I only think I understand what LKCL was trying
to do ;) and I haven't tried his code myself, but thanks to you I will

> Do I understand correctly, that, in your git repository, you checkout
> the commit that you want to export, then export that commit as a
> packfile, to a torrent.
I think that's correct... the only thing that doesn't sit right with me
about packs is that, afaik, the same objects could end up in differently
laid-out packs across people, thus demolishing the deduplication
efficiency of the system.
What I thought he does is feed the loose objects themselves into the
torrent, so you could simply fetch them by their sha sums from the web
of torrents.
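Just to illustrate why loose objects would deduplicate so nicely (a
minimal sketch of git's content-addressing, not LKCL's actual code): a
git object's id is the SHA-1 of a small type/size header plus the
content, so identical content hashes to the identical id for everyone,
no matter which repository or pack it came from -- whereas the byte
layout of a pack depends on how each repo happened to be repacked.

```python
import hashlib

def git_blob_sha1(content: bytes) -> str:
    """Compute the object id git assigns to a blob with this content."""
    # git hashes "blob <size>\0" followed by the raw content
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Same content -> same object id, for every participant in the swarm,
# so a torrent keyed on object ids dedupes across all repositories:
print(git_blob_sha1(b"hello\n"))
# ce013625030ba8dba906f756967f9e9ca394464a
```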

> Thence you can use the LKCL commands to unpack the torrent again as a
> git commit.
right - you would just collect all the necessary objects and be ready to
checkout ;)

-- 
                                  .-.
=------------------------------   /v\  ----------------------------=
Keep in touch                    // \\     (yoh@|www.)onerussian.com
Yaroslav Halchenko              /(   )\               ICQ#: 60653192
                   Linux User    ^^-^^    [175555]
