[Apt-zip-devel] offline.debian.net method

Eddy Petrișor eddy.petrisor at gmail.com
Tue Mar 18 10:47:26 UTC 2008


Giacomo A. Catenazzi wrote:
> Eddy Petrișor wrote:
>> Giacomo Catenazzi wrote:
>>
>> I just finished importing and tagging the missing versions into subversion,
>> so now the svn version should be the same as the 0.18 release and should
>> contain all the history from the archive (0.16, 0.17 and 0.18).
> 
> thanks :-)
> What method do you use?

For these imports I just got the old releases, copied the files over the trunk checkout of 
apt-zip, committed, tagged with 'svn-buildpackage --svn-only-tag', and repeated with the next release.

> I see in wiki.d.o that a lot of team have a lot of different
> methods. We have simple life because we are upstream and we
> have only one package.

Yes, that is true; native packages are well supported, since svn-buildpackage itself was the first 
package to be maintained through svn-buildpackage ;-) . Also, they are easier to maintain.


For the previous releases the flow was:
- change
- svn-buildpackage --svn-ignore-new
- test (bug still present)
- change
- svn-buildpackage --svn-ignore-new
- test (bug fixed)
- dch -e "bla bla"
- svn ci <modifications>
- svn-buildpackage
- test (release is ready)
- svn-buildpackage --svn-only-tag
- ask for an upload

> Anyway do you use plain svn or svn-buildpackage ?

I think one ends up using both, since svn-buildpackage doesn't provide a way (for a native package) to 
add new files, but plain svn does.

I use both, but svn-buildpackage automates some things: building plus lintian checks, tagging, 
adding a new changelog entry after tagging so you don't keep working on a released version (although 
dch has an option to detect whether a version was released, 'dch --release-heuristic log'), and 
complaining when you try to build a package whose modifications haven't been committed to svn (there 
is also the --svn-ignore-new option that allows building without committing).

Generally, I tend to use svn-buildpackage instead of dpkg-buildpackage, since I already have it in my 
fingers and because I am an svn-buildpackage co-maintainer.



Also, I am open to any questions or suggestions about svn-bp usage, be they related to apt-zip 
maintenance or not.

>>> As further step we can add:
>>> - dependencies search,
>>> - split big tar to an user defined max size (i.e. floppy)
>>> - user could save the package list
>>>   - easier to get new security or stable-update
>>>   - notified via email
>>> - ...
>> Yes, this sounds good. Still, the bottleneck here seems to be the server 
>> itself. This would definitely have to be account (or some other 
>> authentication) based in order to prevent DoS attacks.
> 
> debian mirror are not authenticated.

I was talking about the .d.n server that does the assembly and calculat.... (light bulb on)

OH, you mean the server will only do calculations and will leave downloading to the connected 
machine? That would mean we would still need some way to download stuff. Is that the javascript 
stuff you were talking about?

So, let me get this straight:
- the disconnected machine (DM) uses a new fetch method, let's call it 'offline.d.n', which creates a 
script/set of files that allows the connected machine (CM) to export the state of the DM to 
offline.debian.net
- the CM accesses offline.d.n and uploads the DM state
- offline.d.n calculates the dependencies and generates some javascript that will get the necessary 
files to bring the DM into the desired state; offline.d.n redirects (somehow) the CM to the 
page that holds the JS
- the JS script runs on the CM and, as a result, starts downloading the necessary files and 
placing them in a local archive
- the files are downloaded and ready to be brought back to the DM
- on the DM, apt-zip-inst is run
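The dependency calculation offline.d.n would perform on the uploaded DM state can be sketched 
roughly like this (a toy resolver over a made-up package index; the package names, the index format 
and the function name are all my invention, and real apt version constraints, Provides, conflicts 
and the actual Packages-file parsing are glossed over):

```python
# Toy sketch: given the DM's installed set and a package index, compute
# the transitive closure of dependencies for the wanted packages and
# emit the list of .deb files the CM would have to fetch.  Everything
# below is illustrative; a real service would parse apt's Packages files.

INDEX = {
    # name: (filename, [dependencies])
    "apt-zip": ("apt-zip_0.18_all.deb", ["awk", "perl"]),
    "awk": ("awk_1.0_i386.deb", []),
    "perl": ("perl_5.8_i386.deb", ["perl-base"]),
    "perl-base": ("perl-base_5.8_i386.deb", []),
}

def files_to_fetch(wanted, installed):
    """Return the .deb files needed to install 'wanted' on a machine
    that already has 'installed', dependencies included."""
    needed, stack = [], list(wanted)
    seen = set(installed)
    while stack:
        pkg = stack.pop()
        if pkg in seen:
            continue  # already installed or already scheduled
        seen.add(pkg)
        filename, deps = INDEX[pkg]
        needed.append(filename)
        stack.extend(deps)
    return sorted(needed)

# awk is already on the DM, so only apt-zip, perl and perl-base are fetched
print(files_to_fetch(["apt-zip"], installed=["awk"]))
```

The point of the sketch is only that this computation needs nothing but the DM's state and the 
package index, which is why it can live on the server while the actual downloading stays on the CM.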

> And in my experience, authentication is much more resource intensive
> (captcha, confirmation via email [to the wrong people], ...).

I was saying that there should be some authentication on offline.d.n to prevent it from becoming a 
target of DoS attacks via automatic uploads of machine states.

> I think it would be simpler to limit bandwidth or number of downloads
> (apache and lighttpd have such options).

Now *I* am confused. Does offline.d.n do the downloads itself, with the CM then fetching the whole 
archive, or is offline.d.n only a dependency calculator?

>> Also, apt-zip would be only responsible of creating a "local machine 
>> status" snapshot (states of the packages+sources.list{,.d/*}).
>>
>> Still, I think this should *not* replace the current apt-zip-list and 
>> apt-zip-inst scripts until we have something really functional.
> 
> but also the fetch scripts should remain. They are very good for
> automated job.
> 
> I see the web version as an additional fetch method.

ok.

>> It depends on which packages are downloaded. We could impose a soft 
>> limit of (say) 50MB for an apt-zip-server upgrade and request direct 
>> usage of apt-zip-list update + apt-zip-list upgrade ... for bigger 
>> downloads.
> 
> Let's see. I really don't know the usage pattern and how much people
> will use it.
> But if the method becomes popular, we would easily find mirror
> machines.
> Probably we should also talk with Ubuntu, to share resources.

Initially I thought offline.d.n would do the downloads itself (hosting the service on a mirror would 
probably make this step fast), but that would ignore the apt sources from the DM's 
sources.list{,.d/*.sources.list} files and would make the CM-offline.d.n connection a bottleneck, or 
even trigger a DoS on offline.d.n.

That's why I think it is better to have all the downloading done by the CM, while the dependency 
calculation is passed to offline.d.n.
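On the apt-zip side, the exported DM state would then need to carry little more than the dpkg 
selections plus the apt sources. A minimal sketch, assuming a simple JSON container (the field 
names and the whole format are my invention, not anything apt-zip currently produces):

```python
import json

def export_dm_state(selections, sources):
    """Bundle the disconnected machine's package selections and apt
    sources into one blob the CM could upload to offline.d.n.
    'selections' mimics 'dpkg --get-selections' output pairs and
    'sources' mimics sources.list lines; purely illustrative."""
    return json.dumps({
        "format": 1,                     # hypothetical format version
        "selections": dict(selections),  # package -> install/deinstall
        "sources": list(sources),
    }, sort_keys=True)

state = export_dm_state(
    [("apt-zip", "install"), ("awk", "install")],
    ["deb http://ftp.debian.org/debian etch main"],
)
print(state)
```

Keeping the sources in the blob is what lets the server honour the DM's own mirrors when it 
generates the download list, instead of hard-wiring its own.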


-- 
Regards,
EddyP
=============================================
"Imagination is more important than knowledge" A.Einstein



