[Debconf6-data-commit] r490 - / procedings procedings/25-piuparts procedings/25-piuparts/orig procedings/27-Packaging-shared-libraries procedings/27-Packaging-shared-libraries/orig procedings/29-codes-of-value procedings/29-codes-of-value/orig procedings/37-ligtning-talks procedings/37-ligtning-talks/orig

Alexander Schmehl tolimar at costa.debian.org
Mon Apr 10 20:30:19 UTC 2006


Author: tolimar
Date: 2006-04-10 20:30:09 +0000 (Mon, 10 Apr 2006)
New Revision: 490

Added:
   procedings/
   procedings/25-piuparts/
   procedings/25-piuparts/orig/
   procedings/25-piuparts/orig/piuparts-debconf-paper.tex
   procedings/25-piuparts/paper.tex
   procedings/27-Packaging-shared-libraries/
   procedings/27-Packaging-shared-libraries/orig/
   procedings/27-Packaging-shared-libraries/orig/libraries.tex
   procedings/27-Packaging-shared-libraries/paper.tex
   procedings/29-codes-of-value/
   procedings/29-codes-of-value/orig/
   procedings/29-codes-of-value/orig/Coleman-Codes-Value.txt
   procedings/29-codes-of-value/paper.tex
   procedings/37-ligtning-talks/
   procedings/37-ligtning-talks/orig/
   procedings/37-ligtning-talks/orig/lightning.tex
   procedings/37-ligtning-talks/paper.tex
   procedings/all.dvi
   procedings/all.tex
Log:
First snapshot of procedings

Added: procedings/25-piuparts/orig/piuparts-debconf-paper.tex
===================================================================
--- procedings/25-piuparts/orig/piuparts-debconf-paper.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/25-piuparts/orig/piuparts-debconf-paper.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,452 @@
+%
+% finnish-inquisition.tex
+%
+% Copyright 2006 Lars Wirzenius.
+% No rights reserved. Use, modify, copy as you wish in whatever form and
+% format you wish.
+
+
+\documentclass[11pt,a4paper]{article}
+
+\title{Nobody expects the Finnish inquisition \\
+{\small Confessions of a Debian package torturer}}
+\author{Lars Wirzenius (liw at iki.fi)}
+
+\begin{document}
+
+\maketitle
+
+\begin{abstract}
+
+                This article gives an overview of piuparts, a tool for
+                testing that Debian packages install, upgrade, and purge
+                without problems. We cover what it does, how it can be
+                used by a package maintainer, and what has been done to
+                test all packages in Debian with it.
+
+\end{abstract}
+
+\section{Introduction}
+
+                Debian is building a high quality free operating system.
+                Our approach to testing is primarily to attract users to
+                it. Unless bugs are reported, we assume everything works.
+                
+                This works fairly well, at least for popular software.
+                For less popular software, it doesn't work so well. If
+                no-one removes a package before a Debian release, we
+                won't know about the bug in the post-removal script that
+                accidentally deletes the entire filesystem. The bug
+                lurks until someone is curious, tries the software, then
+                removes it.
+                
+                This is an extreme scenario, and fairly unlikely. Other
+                bugs are common, ranging from failures to install
+                properly due to missing dependencies, to failures to
+                remove properly, due to various reasons.
+                
+                Many of the more popular packages in Debian also have
+                these kinds of bugs. It would seem that relying on users
+                reporting bugs does not work well for finding many
+                non-destructive, but irritating bugs. Automatic testing
+                tools are better at consistent nit-picking.
+                
+                The piuparts program was written specifically to test
+                installation, upgrading, and removal of packages. It was
+                first written in the early summer of 2005, and has been
+                run against all packages in the Debian archive since
+                the late summer of 2005. Hundreds of bugs have been filed
+                based on this work.
+                
+\section{Piuparts principles}
+
+                Piuparts, which is short for ``package installation,
+                upgrading, and removal testing suite'', a name suggested
+                by Tollef Fog Heen, does three types of tests. In all
+                tests, piuparts first creates a chroot with a fairly
+                minimal Debian installation using debootstrap, and
+                installs, removes, and otherwise handles packages
+                inside the chroot. It then compares snapshots taken at
+                opportune moments to see that everything is fine.
+                
+                In the first kind of test, the basic install/purge test,
+                piuparts installs and then removes and purges a package.
+                It compares snapshots taken before installation and
+                after purging, and reports any changes: any files that
+                exist after purging that didn't exist before installation,
+                or vice versa, or files that have been changed.
+                
+                The guiding principle is that purging a package should
+                restore the system into the same state as it was before
+                it was installed. There are some exceptions, but they are
+                fairly rare and piuparts can be taught about them. For
+                the vast majority of packages the principle holds, and
+                when it doesn't, it is a bug in the package.
+                
+                The second kind of test checks that upgrades from the
+                previous version of the package work. The previous version
+                is the one currently in the Debian archive and the new
+                version is the one that has been built locally. This is
+                useful for package maintainers to check before uploading.
+                
+                The third kind of test checks upgrades between Debian
+                releases. For example, one might check that a package
+                can be installed in sarge, then upgraded (with the rest
+                of the system) to etch, and further to sid, and finally
+                removed and purged.
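+
+                A hedged illustration of such a test for a hypothetical
+                package {\tt foo} (the exact options should be checked
+                against the piuparts manual page; {\tt -a} makes the
+                arguments package names installed with apt rather than
+                {\tt .deb} files, and a repeated {\tt -d} names each
+                distribution in the upgrade path):
+
+\begin{quote}\begin{verbatim}
+piuparts -a -d sarge -d etch -d sid foo
+\end{verbatim}\end{quote}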
+                
+\section{Piuparts usage}
+
+                The most common use case for piuparts is for a package
+                maintainer to test their new package version before uploading
+                it to Debian. This is done with the following command
+                line:
+
+\begin{quote}\begin{verbatim}
+piuparts *.deb
+\end{verbatim}\end{quote}
+                %
+                The command line syntax has been optimized for this 
+                commonest case. It uses {\tt /etc/apt/sources.list} to find
+                the preferred mirror, then runs the first and second
+                kind of test described above.
+                
+                The command produces a lot of output. If there are any
+                errors, they are reported at the end, but the preceding
+                output is sometimes useful for debugging.
+                
+\begin{figure}[htb]
+\small
+\begin{quote}\begin{verbatim}
+0m0.0s DEBUG: Setting up minimal chroot for sid at 
+/tmp/tmpfWhyUZ.
+0m0.0s DEBUG: Starting command: debootstrap --resolve-deps 
+sid /tmp/tmpfWhyUZ http://liw.iki.fi/debian/
+0m0.1s DUMP:   I: Retrieving Release
+...
+1m1.9s DUMP:   I: Base system installed successfully.
+...
+1m2.2s DEBUG: Created policy-rc.d and chmodded it.
+1m2.2s DEBUG: NOT minimizing chroot because of dpkg bug
+...
+1m3.4s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg -i tmp/liwc\_1.20-2\_i386.deb
+...
+1m3.5s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+apt-get -yf --no-remove install
+...
+1m3.9s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg --remove liwc
+...
+1m3.9s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg --remove --pending
+...
+1m4.0s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg --purge liwc
+...
+1m4.7s INFO: PASS: Installation and purging test.
+...
+1m6.5s INFO: PASS: Installation, upgrade, and purging test.
+\end{verbatim}\end{quote}
+\caption{Interesting parts of piuparts output.}
+\label{piupartssampleoutput}
+\end{figure}
+
+                Some interesting parts of a sample test run are shown in
+                figure~\ref{piupartssampleoutput}. Some notes about the
+                output:
+                
+                \begin{itemize}
+                
+                    \item Time stamps are relative to the start of the
+                    run. This makes it easy to see how long different
+                    operations take. For example, debootstrap
+                    takes just over one minute.
+                    
+                    \item Each line starts with an indication of the
+                    importance of the message: DUMP, DEBUG, INFO, or
+                    ERROR.
+                    
+                    \item {\tt /tmp/tmpfWhyUZ} is the name of the chroot, a
+                    random, temporary directory.
+                    
+                    \item Piuparts reports all external commands it
+                    runs, what they output to stdout or stderr, and what
+                    their exit code is. A non-zero exit code is
+                    interpreted as an error that stops the testing,
+                    except for circumstances where it is acceptable
+                    for the command to fail.
+                    
+                    \item Piuparts creates a {\tt /usr/sbin/policy-rc.d} to
+                    prevent any services from being started up. This
+                    doesn't work perfectly, because not all packages use
+                    {\tt invoke-rc.d} yet. By a lucky coincidence, however,
+                    piuparts does not mount {\tt /proc} inside the
+                    chroot, so {\tt start-stop-daemon} doesn't work either,
+                    preventing most services from starting. This is a
+                    temporary situation.
+                    
+                    \item The other line at 1m2.2s is a reminder that
+                    piuparts does not minimize the chroot, that is, it
+                    does not remove all non-essential packages from it.
+                    This is because of bug \#318825 in dpkg.
+                    
+                    \item At 1m3.4s piuparts installs the package file
+                    it has been instructed to test. This may fail
+                    because of missing dependencies; these are then
+                    installed by the {\tt apt-get} command at 1m3.5s.
+                    
+                    \item Removal of a package is done in two steps: first
+                    removal, then purging. This seems to expose more
+                    bugs in packages than purging directly.
+                    
+                    \item If all goes well, the ``PASS'' line will be
+                    printed, as here at 1m4.7s.
+                    
+                    \item Next, the second test is done, which looks
+                    pretty similar in the log file (and is therefore not
+                    repeated here), and finally its ``PASS'' line is also
+                    printed.
+                
+                \end{itemize}
+                %
+                Sometimes a bug is found. For example, the command to
+                install a package may fail due to a bug in the
+                post-installation script:
+
+\begin{quote}\begin{verbatim}
+0m52.9s ERROR: Command failed (status=25600): 'chroot 
+/tmp/tmpM1bBtd apt-get -y install aspell-lt'
+...
+  Setting up aspell-lt (1.1-4) ...
+  dpkg: error processing aspell-lt (--configure):
+   subprocess post-installation script returned error exit 
+   status 1
+  Errors were encountered while processing:
+   aspell-lt
+  E: Sub-process /usr/bin/dpkg returned an error code (1)
+\end{verbatim}\end{quote}
+                %
+                Here we see the command that failed, the exit code, and
+                the output (stdout and stderr) of the command. Piuparts
+                does not try to analyze what went wrong with the
+                command; this needs to be done manually.
+                
+                When piuparts itself notices an error by comparing chroot
+                snapshots, it prints out errors like this:
+
+\begin{quote}\begin{verbatim}
+0m7.8s ERROR: Package purging left files on system:
+  /usr/lib/aspell
+    owned by: libaspell15, aspell-no
+  /usr/lib/aspell/no.dat
+  /usr/lib/aspell/no.multi
+  /usr/lib/aspell/no\_phonet.dat
+\end{verbatim}\end{quote}
+                %
+                This reports that after the package was purged, there were
+                extra files remaining on the system, and that of these
+                files, {\tt /usr/lib/aspell} was owned by the packages
+                {\tt libaspell15} and {\tt aspell-no}. The other files were not 
+                owned by any package, as far as dpkg knows. Indeed,
+                investigating the package source shows that these
+                files are created by the post-installation script. The
+                bug, then, is that the pre- or post-removal scripts don't
+                remove the files as they should.
+
+\section{Typical problems in packages}
+
+                The top five most common or important problems in packages
+                found by piuparts are:
+     
+                \begin{enumerate}
+                
+                    \item {\tt postinst} creates a file (e.g., a
+                       configuration or log file), but {\tt postrm} does not
+                       remove it.
+                       
+                    \item The package handles alternatives badly, e.g., by
+                       using different names for the alternative in
+                       {\tt postinst} and {\tt postrm}.
+                       
+                    \item {\tt postrm} uses {\tt ucf} or another command from a
+                       non-essential package during purge.
+                     
+                    \item Something else goes wrong and the maintainer
+                       script uses ``\verb|>/dev/null 2>&1|'' to hide it. This
+                       can make sysadmins livid.
+                       
+                    \item Packages don't use {\tt invoke-rc.d} if it's there,
+                       thus starting (or trying to start) services when they
+                       shouldn't.
+                       
+                \end{enumerate}
+                %
+                There are, of course, many other kinds of problems in
+                packages, but these have been among the most common ones
+                as found by piuparts. For a complete list, query the
+                Debian bug tracking system for user tag {\tt found-by-piuparts}
+                by user {\tt piuparts at qa.debian.org}.
+
+\section{Making piuparts run faster}
+
+                Creating a chroot can take quite a while, and piuparts
+                can thus take several minutes to run even on a fast
+                modern PC. There are several things that can be done
+                to make it go faster.
+                
+                \begin{itemize}
+                
+                    \item Tell piuparts to use the pbuilder chroot,
+                    which needs to be kept up to date anyway. Use
+                    option~\verb|-p|.
+                    
+                    \item Alternatively, tell piuparts to save its own
+                    chroot, and use that in later runs. Options~\verb|-s|
+                    and~\verb|-b|.
+                    
+                    \item Have a local mirror, or an HTTP cache that keeps
+                    Debian packages.
+                    
+                \end{itemize}
+                %
+                With these settings, a full piuparts run can take as little
+                as a few seconds.
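+
+                For example, a sketch of reusing a saved chroot with the
+                options mentioned above ({\tt -s} saves the chroot as a
+                tarball, {\tt -b} uses such a tarball as the base; the
+                file name is arbitrary):
+
+\begin{quote}\begin{verbatim}
+piuparts -s sid-base.tgz first.deb
+piuparts -b sid-base.tgz second.deb
+\end{verbatim}\end{quote}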
+
+\section{Testing all packages}
+
+                Piuparts has been running against as many packages as
+                possible in the Debian archive since August, 2005. The
+                first and third kinds of tests have been run: a basic
+                install/purge test, and then an upgrade test with
+                installation in sarge, upgrading to etch, then to sid,
+                then purging.
+                
+                At the end of March, as this is being written, 310 bugs
+                have been filed, based on manual analysis of failed
+                piuparts tests. Of these, 148 or 48~\% have been fixed,
+                and one is pending upload. A fix rate of 48~\% is not too
+                bad, but it should be much higher.
+                
+                To run piuparts on the entire archive, a distributed
+                system has been developed to share the load on many
+                computers, consisting of piuparts-master and
+                piuparts-slave. The master keeps track of which packages
+                and versions have been tested, and what the result was
+                (and the corresponding log files), and instructs the
+                slaves as to which packages they should be testing next.
+                
+                The slaves get lists of packages and versions to test,
+                and test those and report results back to the master.
+                There can only be one master, but any number of slaves.
+                
+                Not all packages are being tested. The master will only
+                mark a package for testing if its dependencies have already
+                passed testing; otherwise the test is certain to fail, and
+                manual analysis will have to figure out whether it was
+                the package itself, or one of its dependencies that failed.
+                This is tedious, boring work, which is better avoided.
+                
+                In addition, the master won't allow packages that are
+                part of a dependency loop to be tested. Dependency loops
+                must be broken by dpkg, and this makes things
+                non-deterministic. This seems to indicate that dependency
+                loops mean packages are inherently buggy, even if they
+                often work well enough in practice.
+                
+                Also, some packages are impossible to test successfully
+                with piuparts, either due to limitations in piuparts, 
+                or due to the nature of the packages. Piuparts is not
+                able to test packages that are already in the chroot
+                it initially creates, or packages that replace packages
+                in the chroot. Some packages are already broken
+                in sarge, so the upgrade test will fail because of that.
+                Some packages cannot be installed on a normal system
+                at all.
+                
+                A breakdown of packages into the various classes of testability
+                at the end of March is given in table~\ref{piupartsstats}.
+                
+\begin{table}[htb]
+\caption{Classes of packages, end of March 2006}
+\label{piupartsstats}
+\begin{center}
+\begin{tabular}{l r}
+
+\hline
+Package class                       & count \\
+\hline
+successfully-tested                 & 5140 \\
+failed-testing                      & 454 \\
+fix-not-yet-tested                  & 4 \\
+cannot-be-tested                    & 95 \\
+essential-required-important        & 149 \\
+waiting-to-be-tested                & 1377 \\
+waiting-for-dependency-to-be-tested & 2069 \\
+dependency-failed-testing           & 4418 \\
+dependency-cannot-be-tested         & 742 \\
+dependency-does-not-exist           & 520 \\
+dependency-fix-not-yet-tested       & 9 \\
+circular-dependency                 & 3995 \\
+unknown                             & 0 \\
+\hline
+Total                               & 18972 \\
+\hline
+
+\end{tabular}
+\end{center}
+\end{table}
+
+
+
+\section{Some plans for the future}
+
+                Some plans for the immediate future of piuparts:
+                
+                \begin{itemize}
+                
+                    \item Make piuparts check for processes inside the
+                    chroot, and fail if there are any. After this is
+                    done, mount {\tt /proc} inside the chroot to catch
+                    packages that start services when they shouldn't.
+                    
+                    \item Run piuparts on multiple architectures. At
+                    the moment, only i386 is being tested. The goal is
+                    to test packages on the faster architectures first,
+                    and only if that is successful, test them on slower
+                    ones.
+                    
+                    \item Re-test all packages regularly, especially after
+                    significant changes to piuparts. Sometimes piuparts
+                    adds new logic to find errors, or changes in a package's
+                    dependencies cause a previously passing version
+                    to fail.
+                    
+                    \item Deal with packages that replace parts of the
+                    chroot.
+                    
+                    \item Deal with dependency loops and other reasons why
+                    only part of the archive is currently testable by
+                    piuparts-master and -slave.
+                    
+                \end{itemize}
+                %
+                Of course, other suggestions are welcome.
+
+
+\section{See also}
+
+                The following locations may be of interest:
+                %
+                \begin{itemize}
+                
+                    \item {\tt http://liw.iki.fi/liw/bzr/piuparts/} is
+                    the public bzr branch for piuparts.
+                    
+                    \item {\tt http://liw.iki.fi/liw/bzr/finnish-inquisition/}
+                    is the public bzr branch for this paper.
+                    
+                \end{itemize}
+
+
+\end{document}

Added: procedings/25-piuparts/paper.tex
===================================================================
--- procedings/25-piuparts/paper.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/25-piuparts/paper.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,436 @@
+%
+% finnish-inquisition.tex
+%
+% Copyright 2006 Lars Wirzenius.
+% No rights reserved. Use, modify, copy as you wish in whatever form and
+% format you wish.
+
+
+\section{Abstract}
+                This article gives an overview of piuparts, a tool for
+                testing that Debian packages install, upgrade, and purge
+                without problems. We cover what it does, how it can be
+                used by a package maintainer, and what has been done to
+                test all packages in Debian with it.
+
+\section{Introduction}
+
+                Debian is building a high quality free operating system.
+                Our approach to testing is primarily to attract users to
+                it. Unless bugs are reported, we assume everything works.
+                
+                This works fairly well, at least for popular software.
+                For less popular software, it doesn't work so well. If
+                no-one removes a package before a Debian release, we
+                won't know about the bug in the post-removal script that
+                accidentally deletes the entire filesystem. The bug
+                lurks until someone is curious, tries the software, then
+                removes it.
+                
+                This is an extreme scenario, and fairly unlikely. Other
+                bugs are common, ranging from failures to install
+                properly due to missing dependencies, to failures to
+                remove properly, due to various reasons.
+                
+                Many of the more popular packages in Debian also have
+                these kinds of bugs. It would seem that relying on users
+                reporting bugs does not work well for finding many
+                non-destructive, but irritating bugs. Automatic testing
+                tools are better at consistent nit-picking.
+                
+                The piuparts program was written specifically to test
+                installation, upgrading, and removal of packages. It was
+                first written in the early summer of 2005, and has been
+                run against all packages in the Debian archive since
+                the late summer of 2005. Hundreds of bugs have been filed
+                based on this work.
+                
+\section{Piuparts principles}
+
+                Piuparts, which is short for ``package installation,
+                upgrading, and removal testing suite'', a name suggested
+                by Tollef Fog Heen, does three types of tests. In all
+                tests, piuparts first creates a chroot with a fairly
+                minimal Debian installation using debootstrap, and
+                installs, removes, and otherwise handles packages
+                inside the chroot. It then compares snapshots taken at
+                opportune moments to see that everything is fine.
+                
+                In the first kind of test, the basic install/purge test,
+                piuparts installs and then removes and purges a package.
+                It compares snapshots taken before installation and
+                after purging, and reports any changes: any files that
+                exist after purging that didn't exist before installation,
+                or vice versa, or files that have been changed.
+                
+                The guiding principle is that purging a package should
+                restore the system into the same state as it was before
+                it was installed. There are some exceptions, but they are
+                fairly rare and piuparts can be taught about them. For
+                the vast majority of packages the principle holds, and
+                when it doesn't, it is a bug in the package.
+                
+                The second kind of test checks that upgrades from the
+                previous version of the package work. The previous version
+                is the one currently in the Debian archive and the new
+                version is the one that has been built locally. This is
+                useful for package maintainers to check before uploading.
+                
+                The third kind of test checks upgrades between Debian
+                releases. For example, one might check that a package
+                can be installed in sarge, then upgraded (with the rest
+                of the system) to etch, and further to sid, and finally
+                removed and purged.
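+
+                A hedged illustration of such a test for a hypothetical
+                package {\tt foo} (the exact options should be checked
+                against the piuparts manual page; {\tt -a} makes the
+                arguments package names installed with apt rather than
+                {\tt .deb} files, and a repeated {\tt -d} names each
+                distribution in the upgrade path):
+
+\begin{quote}\begin{verbatim}
+piuparts -a -d sarge -d etch -d sid foo
+\end{verbatim}\end{quote}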
+                
+\section{Piuparts usage}
+
+                The most common use case for piuparts is for a package
+                maintainer to test their new package version before uploading
+                it to Debian. This is done with the following command
+                line:
+
+\begin{quote}\begin{verbatim}
+piuparts *.deb
+\end{verbatim}\end{quote}
+                %
+                The command line syntax has been optimized for this 
+                commonest case. It uses {\tt /etc/apt/sources.list} to find
+                the preferred mirror, then runs the first and second
+                kind of test described above.
+                
+                The command produces a lot of output. If there are any
+                errors, they are reported at the end, but the preceding
+                output is sometimes useful for debugging.
+                
+\begin{figure}[htb]
+\small
+\begin{quote}\begin{verbatim}
+0m0.0s DEBUG: Setting up minimal chroot for sid at 
+/tmp/tmpfWhyUZ.
+0m0.0s DEBUG: Starting command: debootstrap --resolve-deps 
+sid /tmp/tmpfWhyUZ http://liw.iki.fi/debian/
+0m0.1s DUMP:   I: Retrieving Release
+...
+1m1.9s DUMP:   I: Base system installed successfully.
+...
+1m2.2s DEBUG: Created policy-rc.d and chmodded it.
+1m2.2s DEBUG: NOT minimizing chroot because of dpkg bug
+...
+1m3.4s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg -i tmp/liwc\_1.20-2\_i386.deb
+...
+1m3.5s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+apt-get -yf --no-remove install
+...
+1m3.9s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg --remove liwc
+...
+1m3.9s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg --remove --pending
+...
+1m4.0s DEBUG: Starting command: chroot /tmp/tmpfWhyUZ 
+dpkg --purge liwc
+...
+1m4.7s INFO: PASS: Installation and purging test.
+...
+1m6.5s INFO: PASS: Installation, upgrade, and purging test.
+\end{verbatim}\end{quote}
+\caption{Interesting parts of piuparts output.}
+\label{piupartssampleoutput}
+\end{figure}
+
+                Some interesting parts of a sample test run are shown in
+                figure~\ref{piupartssampleoutput}. Some notes about the
+                output:
+                
+                \begin{itemize}
+                
+                    \item Time stamps are relative to the start of the
+                    run. This makes it easy to see how long different
+                    operations take. For example, debootstrap
+                    takes just over one minute.
+                    
+                    \item Each line starts with an indication of the
+                    importance of the message: DUMP, DEBUG, INFO, or
+                    ERROR.
+                    
+                    \item {\tt /tmp/tmpfWhyUZ} is the name of the chroot, a
+                    random, temporary directory.
+                    
+                    \item Piuparts reports all external commands it
+                    runs, what they output to stdout or stderr, and what
+                    their exit code is. A non-zero exit code is
+                    interpreted as an error that stops the testing,
+                    except for circumstances where it is acceptable
+                    for the command to fail.
+                    
+                    \item Piuparts creates a {\tt /usr/sbin/policy-rc.d} to
+                    prevent any services from being started up. This
+                    doesn't work perfectly, because not all packages use
+                    {\tt invoke-rc.d} yet. By a lucky coincidence, however,
+                    piuparts does not mount {\tt /proc} inside the
+                    chroot, so {\tt start-stop-daemon} doesn't work either,
+                    preventing most services from starting. This is a
+                    temporary situation.
+                    
+                    \item The other line at 1m2.2s is a reminder that
+                    piuparts does not minimize the chroot, that is, it
+                    does not remove all non-essential packages from it.
+                    This is because of bug \#318825 in dpkg.
+                    
+                    \item At 1m3.4s piuparts installs the package file
+                    it has been instructed to test. This may fail
+                    because of missing dependencies; these are then
+                    installed by the {\tt apt-get} command at 1m3.5s.
+                    
+                    \item Removal of a package is done in two steps: first
+                    removal, then purging. This seems to expose more
+                    bugs in packages than purging directly.
+                    
+                    \item If all goes well, the ``PASS'' line will be
+                    printed, as here at 1m4.7s.
+                    
+                    \item Next, the second test is done, which looks
+                    pretty similar in the log file (and is therefore not
+                    repeated here), and finally its ``PASS'' line is also
+                    printed.
+                
+                \end{itemize}
+                %
+                Sometimes a bug is found. For example, the command to
+                install a package may fail due to a bug in the
+                post-installation script:
+
+\begin{quote}\begin{verbatim}
+0m52.9s ERROR: Command failed (status=25600): 'chroot 
+/tmp/tmpM1bBtd apt-get -y install aspell-lt'
+...
+  Setting up aspell-lt (1.1-4) ...
+  dpkg: error processing aspell-lt (--configure):
+   subprocess post-installation script returned error exit 
+   status 1
+  Errors were encountered while processing:
+   aspell-lt
+  E: Sub-process /usr/bin/dpkg returned an error code (1)
+\end{verbatim}\end{quote}
+                %
+                Here we see the command that failed, the exit code, and
+                the output (stdout and stderr) of the command. Piuparts
+                does not try to analyze what went wrong with the
+                command; this needs to be done manually.
+                
+                When piuparts itself notices an error by comparing chroot
+                snapshots, it prints out errors like this:
+
+\begin{quote}\begin{verbatim}
+0m7.8s ERROR: Package purging left files on system:
+  /usr/lib/aspell
+    owned by: libaspell15, aspell-no
+  /usr/lib/aspell/no.dat
+  /usr/lib/aspell/no.multi
+  /usr/lib/aspell/no\_phonet.dat
+\end{verbatim}\end{quote}
+                %
+                This reports that after the package was purged, there were
+                extra files remaining on the system, and that of these
+                files, {\tt /usr/lib/aspell} was owned by the packages
+                {\tt libaspell15} and {\tt aspell-no}. The other files were not 
+                owned by any package, as far as dpkg knows. Indeed,
+                investigating the package source shows that these
+                files are created by the post-installation script. The
+                bug, then, is that the pre- or post-removal scripts don't
+                remove the files as they should.
+
+\section{Typical problems in packages}
+
+                The top five most common or important problems in packages
+                found by piuparts are:
+     
+                \begin{enumerate}
+                
+                    \item {\tt postinst} creates a file (e.g., a
+                       configuration or log file), but {\tt postrm} does not
+                       remove it.
+                       
+                    \item The package handles alternatives badly, e.g., by
+                       using different names for the alternative in
+                       {\tt postinst} and {\tt postrm}.
+                       
+                    \item {\tt postrm} uses {\tt ucf} or another command from a
+                       non-essential package during purge.
+                     
+                    \item Something else goes wrong and the maintainer
+                       script uses ``\verb|>/dev/null 2>&1|'' to hide it. This
+                       can make sysadmins livid.
+                       
+                    \item Packages don't use {\tt invoke-rc.d} if it's there,
+                       thus starting (or trying to start) services when they
+                       shouldn't.
+                       
+                \end{enumerate}
+                %
+                There are, of course, many other kinds of problems in
+                packages, but these have been among the most common ones
+                as found by piuparts. For a complete list, query the
+                Debian bug tracking system for user tag {\tt found-by-piuparts}
+                by user {\tt piuparts at qa.debian.org}.
+
+\section{Making piuparts run faster}
+
+                Creating a chroot can take quite a while, and piuparts
+                can thus take several minutes to run even on a fast
+                modern PC. There are several things that can be done
+                to make it go faster.
+                
+                \begin{itemize}
+                
+                    \item Tell piuparts to use the pbuilder chroot,
+                    which needs to be kept up to date anyway. Use
+                    option~\verb|-p|.
+                    
+                    \item Alternatively, tell piuparts to save its own
+                    chroot, and use that in later runs. Options~\verb|-s|
+                    and~\verb|-b|.
+                    
+                    \item Have a local mirror, or an HTTP cache that keeps
+                    Debian packages.
+                    
+                \end{itemize}
+                %
+                With these settings, a full piuparts run can take as little
+                as a few seconds.
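+
+                For example, a sketch of reusing a saved chroot with the
+                options mentioned above ({\tt -s} saves the chroot as a
+                tarball, {\tt -b} uses such a tarball as the base; the
+                file name is arbitrary):
+
+\begin{quote}\begin{verbatim}
+piuparts -s sid-base.tgz first.deb
+piuparts -b sid-base.tgz second.deb
+\end{verbatim}\end{quote}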
+
+\section{Testing all packages}
+
+                Piuparts has been running against as many packages as
+                possible in the Debian archive since August, 2005. The
+                first and third kinds of tests have been run: a basic
+                install/purge test, and then an upgrade test with
+                installation in sarge, upgrading to etch, then to sid,
+                then purging.
+                
+                At the end of March, as this is being written, 310 bugs
+                have been filed, based on manual analysis of failed
+                piuparts tests. Of these, 148 or 48~\% have been fixed,
+                and one is pending upload. A fix rate of 48~\% is not too
+                bad, but it should be much higher.
+                
+                To run piuparts on the entire archive, a distributed
+                system has been developed to share the load on many
+                computers, consisting of piuparts-master and
+                piuparts-slave. The master keeps track of which packages
+                and versions have been tested, and what the result was
+                (and the corresponding log files), and instructs the
+                slaves as to which packages they should be testing next.
+                
+                The slaves get lists of packages and versions to test,
+                and test those and report results back to the master.
+                There can only be one master, but any number of slaves.
+                
+                Not all packages are being tested. The master will only
+                mark a package for testing if its dependencies have already
+                passed testing; otherwise the test is certain to fail, and
+                manual analysis will have to figure out whether it was
+                the package itself, or one of its dependencies that failed.
+                This is tedious, boring work, which is better avoided.
+                
+                In addition, the master won't allow packages that are
+                part of a dependency loop to be tested. Dependency loops
+                must be broken by dpkg, and this makes things
+                non-deterministic. This seems to indicate that dependency
+                loops mean packages are inherently buggy, even if they
+                often work well enough in practice.
+                
+                Also, some packages are impossible to test successfully
+                with piuparts, either due to limitations in piuparts, 
+                or due to the nature of the packages. Piuparts is not
+                able to test packages that are already in the chroot
+                it initially creates, or packages that replace packages
+                in the chroot. Some packages are already broken
+                in sarge, so the upgrade test will fail because of that.
+                Some packages cannot be installed on a normal system
+                at all.
+                
+                A breakdown of packages into the various classes of testability
+                at the end of March is given in table~\ref{piupartsstats}.
+                
+\begin{table}[htb]
+\caption{Classes of packages, end of March 2006}
+\label{piupartsstats}
+\begin{center}
+\begin{tabular}{l r}
+
+\hline
+Package class                       & count \\
+\hline
+successfully-tested                 & 5140 \\
+failed-testing                      & 454 \\
+fix-not-yet-tested                  & 4 \\
+cannot-be-tested                    & 95 \\
+essential-required-important        & 149 \\
+waiting-to-be-tested                & 1377 \\
+waiting-for-dependency-to-be-tested & 2069 \\
+dependency-failed-testing           & 4418 \\
+dependency-cannot-be-tested         & 742 \\
+dependency-does-not-exist           & 520 \\
+dependency-fix-not-yet-tested       & 9 \\
+circular-dependency                 & 3995 \\
+unknown                             & 0 \\
+\hline
+Total                               & 18972 \\
+\hline
+
+\end{tabular}
+\end{center}
+\end{table}
+
+
+
+\section{Some plans for the future}
+
+                Some plans for the immediate future of piuparts:
+                
+                \begin{itemize}
+                
+                    \item Make piuparts check for processes inside the
+                    chroot, and fail if there are any. After this is
+                    done, mount {\tt /proc} inside the chroot to catch
+                    packages that start services when they shouldn't.
+                    
+                    \item Run piuparts on multiple architectures. At
+                    the moment, only i386 is being tested. The goal is
+                    to test packages on the faster architectures first,
+                    and only if that is successful, test them on slower
+                    ones.
+                    
+                    \item Re-test all packages regularly, especially after
+                    significant changes to piuparts. Sometimes piuparts
+                    adds new logic to find errors, or changes in a package's
+                    dependencies cause a previously passing version
+                    to fail.
+                    
+                    \item Deal with packages that replace parts of the
+                    chroot.
+                    
+                    \item Deal with dependency loops and other reasons why
+                    only part of the archive is currently testable by
+                    piuparts-master and -slave.
+                    
+                \end{itemize}
+                %
+                Of course, other suggestions are welcome.
+
+
+\section{See also}
+
+                The following locations may be of interest:
+                %
+                \begin{itemize}
+                
+                    \item \url{http://liw.iki.fi/liw/bzr/piuparts/} is
+                    the public bzr branch for piuparts.
+                    
+                    \item \url{http://liw.iki.fi/liw/bzr/finnish-inquisition/}
+                    is the public bzr branch for this paper.
+                    
+                \end{itemize}

Added: procedings/27-Packaging-shared-libraries/orig/libraries.tex
===================================================================
--- procedings/27-Packaging-shared-libraries/orig/libraries.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/27-Packaging-shared-libraries/orig/libraries.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,339 @@
+\documentclass[a4paper,10pt]{article}
+
+\usepackage[scale=0.8]{geometry}
+\usepackage[english]{babel}
+\usepackage[utf8]{inputenc}
+\usepackage{url,ae}
+
+\begin{document}
+\title{Packaging shared libraries}
+\author{Josselin \textsc{Mouette} --- CS Systèmes d'information}
+\maketitle
+
+
+\section{Introducing shared libraries}
+
+\subsection{Basic concepts}
+
+A library is a piece of code that can be used in several binaries, split out for factorization reasons. In the old days, libraries were all \textit{statically linked}; that is, they were included directly in the resulting binary. Modern operating systems use shared libraries, which are compiled in separate files and loaded together with the binary at startup time. Shared libraries are the most widespread use of \textit{shared objects}, files containing code that can be loaded at runtime, generally with the \texttt{.so} extension.
+
+\subsection{A bit of terminology}
+
+\paragraph{API} The \textit{Application Programming Interface} of a library describes how it can be used by the programmer. It generally consists of a list of structures and functions and their associated behavior. Changing the behavior of a function or the type of arguments it requires \textit{breaks} the API: programs that used to compile with an older version of the library will stop building.
+
+\paragraph{ABI} The \textit{Application Binary Interface} defines the low-level interface between a shared library and the binary using it. It is specific to the architecture and the operating system, and consists of a list of \textit{symbols} and their associated type and behavior. A binary linked to a shared library will be able to run with another library, or another version of that library, provided that it implements the same ABI. Adding elements to a structure or turning a function into a macro \textit{breaks} the ABI: binaries that used to run with an older version of the library will stop loading. Most of the time, breaking the API also breaks the ABI.
+
+\paragraph{SONAME} The ``SONAME'' is the canonical name of a shared library, defining an ABI for a given operating system and architecture. It is defined when building the library. The convention for SONAMEs is to use \texttt{libfoo.so.N} and to increment N whenever the ABI is changed. This way, ABI-incompatible versions of the library and binaries using them can coexist on the same system.
+
+\subsection{Linking and using libraries}
+
+A simple example of building a library using gcc:
+\begin{verbatim}
+gcc -fPIC -c -o foo-init.o foo-init.c
+[ ... ]
+gcc -shared -Wl,-soname,libfoo.so.3 -o libfoo.so.3 foo-init.o foo-client.o [...]
+ln -s libfoo.so.3 libfoo.so
+\end{verbatim}
+
+As the command line shows, the SONAME is defined at that time. The symbolic link is needed for compilation of programs using the library. Supposing it has been installed in a standard location, you can link a binary --- which can be another shared library --- using it with \texttt{-lfoo}. The linker looks for \texttt{libfoo.so}, and stores the SONAME found (\texttt{libfoo.so.3}) in the binary's ELF\footnote{\textit{Executable and Linking Format}: the binary format for binaries and shared objects on most UNIX systems.} header. 
+
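+For instance, linking a hypothetical program \texttt{bar.c} against the library, assuming it is installed in a standard location (a minimal sketch; the \texttt{-lfoo} flag is the only point of interest):
+\begin{verbatim}
+gcc -o bar bar.c -lfoo
+\end{verbatim}
+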
+The output of the \texttt{objdump -p} command shows the headers of an ELF object. For the library, the output contains:
+\begin{verbatim}
+  SONAME      libfoo.so.3
+\end{verbatim}
+For the binary, it contains:
+\begin{verbatim}
+  NEEDED      libfoo.so.3
+\end{verbatim}
+
+The symbols provided by the library remain undefined in the binary at that time. In the dynamic symbol table shown by \texttt{objdump -T}, the library contains the symbol:
+\begin{verbatim}
+0807c8e0 g    DF .text  0000007d  Base        foo_init
+\end{verbatim}
+while in the binary it remains undefined:
+\begin{verbatim}
+00000000      DF *UND*  0000001c              foo_init
+\end{verbatim}
+
+When the binary is started, the GNU \textit{dynamic linker}\footnote{Other linkers can use a different scheme, especially when it comes to filename lookup.} looks for the NEEDED sections and loads the libraries listed there, using the SONAME as a file name. It then maps the undefined symbols to the ones found in the libraries.
+
+\subsection{Libtool}
+\label{libtool}
+
+Libtool is a tool designed to simplify the build process of libraries. It is full of features that make the developers' life easier, and full of bugs that bring added complexity for system administrators and especially distribution maintainers. Its paradigm is to build an extra file, named \texttt{libfoo.la}, which contains some metadata about the library; most importantly, the list of library dependencies for the library itself. Together with this file, it can build the shared version \texttt{libfoo.so} and the static version \texttt{libfoo.a} of the library.
+
+It integrates easily with autoconf and automake. You can put the following in \texttt{configure.ac}\footnote{The version information is given for libtool's versioning scheme. You can read more about it in the libtool manual.}:
+\begin{verbatim}
+AM_PROG_LIBTOOL
+VERSION_INFO=3:1:0
+AC_SUBST(VERSION_INFO)
+\end{verbatim}
+and in the \texttt{Makefile.am}:
+\begin{verbatim}
+libfoo_la_SOURCES = foo-init.c foo-client.c foo.h [...]
+libfoo_la_LDFLAGS = -version-info @VERSION_INFO@
+libfoo_HEADERS = foo.h
+\end{verbatim}
+
+\subsection{Pkgconfig}
+
+Pkgconfig is a tool to replace the variety of \texttt{libfoo-config} scripts in a standard way that integrates with autoconf. Here is a sample file, \texttt{libnautilus-burn.pc}:
+\begin{verbatim}
+prefix=/usr
+exec_prefix=${prefix}
+libdir=${exec_prefix}/lib
+includedir=${prefix}/include/libnautilus-burn
+
+Name: libnautilus-burn
+Description: Nautilus Burn Library
+Version: 2.12.3
+Requires: glib-2.0 gtk+-2.0
+Libs: -L${libdir} -lnautilus-burn
+Cflags: -I${includedir}
+\end{verbatim}%$
+The \texttt{Cflags:} and \texttt{Libs:} fields provide the list of \texttt{CFLAGS} and \texttt{LDFLAGS} to use for linking with that library. The \texttt{Requires:} field provides some dependencies that a binary using that library should also link with. In this case, pkgconfig will also look for \texttt{glib-2.0.pc} and \texttt{gtk+-2.0.pc}.
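+
+These fields can also be queried directly with the \texttt{pkg-config} command, for example when compiling a hypothetical program by hand:
+\begin{verbatim}
+gcc -o burner burner.c $(pkg-config --cflags --libs libnautilus-burn)
+\end{verbatim}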
+
+\medskip
+
+Integration with autoconf is provided. Here is an example \texttt{configure.ac} test for a package requiring the GTK+ library:
+\begin{verbatim}
+PKG_CHECK_MODULES(GTK, gtk+-2.0 >= 2.6.0,,
+                  AC_MSG_ERROR([GTK+-2.0 is required]))
+\end{verbatim}
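+The macro defines \texttt{GTK\_CFLAGS} and \texttt{GTK\_LIBS} substitution variables, which can then be used in \texttt{Makefile.am}, for instance for a hypothetical program \texttt{bar}:
+\begin{verbatim}
+bin_PROGRAMS = bar
+bar_CFLAGS = $(GTK_CFLAGS)
+bar_LDADD = $(GTK_LIBS)
+\end{verbatim}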
+
+\section{Debian packaging of a shared library}
+
+\subsection{Simple case -- what the policy mandates}
+
+Packaging a simple library for Debian is not much different from packaging any other piece of software. In all cases there should be at least two packages:
+\begin{itemize}
+\item \texttt{libfoo3}, containing the \texttt{/usr/lib/*.so.*} files, so that you get \texttt{libfoo.so.3}. The \texttt{postinst} script of this package should contain a call to the \texttt{ldconfig} command, and it has to be registered in dpkg's \textit{shlibs} database. This can be achieved by a call to \texttt{dh\_makeshlibs}.
+\item \texttt{libfoo-dev} or \texttt{libfoo3-dev}, containing the headers in \texttt{/usr/include}, and other files in \texttt{/usr/lib}: the \texttt{libfoo.so} symbolic link, the \texttt{libfoo.a} static library, and if relevant \texttt{libfoo.la} (in \texttt{/usr/lib}) and \texttt{libfoo.pc} (in \texttt{/usr/share/pkgconfig}\footnote{Pkgconfig has started moving its \texttt{.pc} files from \texttt{/usr/lib/pkgconfig} and this should be encouraged.}). It should depend on \texttt{libfoo3 (= \${Source-Version})}.
+\end{itemize}
+
+The \textit{shlibs} system provides a mapping of library SONAMEs to package names and minimal versions for the ABIs of the libraries a package is built against.
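+
+A minimal sketch of this split using debhelper: a hypothetical \texttt{debian/libfoo3.install} could contain
+\begin{verbatim}
+usr/lib/lib*.so.*
+\end{verbatim}
+while \texttt{debian/libfoo-dev.install} could contain
+\begin{verbatim}
+usr/include/*
+usr/lib/lib*.so
+usr/lib/lib*.a
+usr/lib/lib*.la
+usr/share/pkgconfig/*
+\end{verbatim}
+with \texttt{dh\_makeshlibs} called from \texttt{debian/rules} to register the \textit{shlibs} entry and add the \texttt{ldconfig} calls.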
+
+\subsection{Updating the package}
+
+As for anything providing an interface, shared libraries have to be treated carefully when it comes to updating the package.
+\begin{itemize}
+\item If the ABI has not changed at all, no changes are required to the package.
+\item The most common case is the ABI being changed in a backwards-compatible way, by adding symbols. In this case, the \textit{shlibs} system should be informed of the minimum version required. This is achieved by changing the \texttt{rules} file to call:
+\begin{verbatim}
+        dh_makeshlibs -V'libfoo3 (>= 3.1.0)'
+\end{verbatim}
+The version referenced is the latest one in which the ABI was changed.
+\item When some symbols are removed or their meaning is changed, the ABI is broken and the SONAME should have changed. The shared library package name has to be changed to reflect this new SONAME: \texttt{libfoo3} becomes \texttt{libfoo4}.
+\item If the API changes, some packages using the library may stop building. If the change is small, it may only require fixing of a handful of packages. If it's a broad change, the simplest course of action is to change the development package name: \texttt{libfoo3-dev} becomes \texttt{libfoo4-dev}.
+\end{itemize}
+
+\subsection{Library transitions}
+
+Whenever the ABI is broken, a library transition starts. Before anything like this happens, the release team should be asked for approval, so that they know the transition will happen. If possible, two transitions involving the same packages should be avoided, as they would have to complete together.
+
+All packages using the library have to be rebuilt in the \textit{unstable} distribution so that they can go to \textit{testing} together. Depending on the library, the optimal course of action may vary.
+\begin{itemize}
+\item If there is a small enough number of reverse dependencies, things can go fast: an upload right to \textit{unstable}, asking the release team to trigger a set of binary NMUs for all depending packages.
+\item More complex cases, especially if some reverse dependencies can fail to build, should be started in \textit{experimental}.
+\item For some nightmare libraries, several source versions are present at once, even in stable releases. The examples of gnutls and libpng come to mind.
+\end{itemize}
+
+\subsection{Providing a debugging version}
+
+If the library is known to cause crashes or is under development, the availability of debugging symbols is quite helpful. Fortunately, debhelper can do all of this automatically. After defining an empty \texttt{libfoo3-dbg} package, the magic command is:
+\begin{verbatim}
+        dh_strip --dbg-package=libfoo3-dbg
+\end{verbatim}
+This will move the debugging symbols into \texttt{/usr/lib/debug} in this package; debuggers like \texttt{gdb} can use them automatically.
+
+\subsection{More complex cases -- how to avoid circular dependencies}
+
+With large and complex libraries, other kinds of issues appear. Considering the example of \texttt{gconf2}, the upstream distribution contains:
+\begin{itemize}
+\item a library used by applications,
+\item a per-user daemon,
+\item chunks of data, mostly localization files,
+\item configuration files,
+\item documentation,
+\item support binaries using the library.
+\end{itemize}
+
+To avoid having in \texttt{libgconf2-4} any files outside versioned directories, the configuration and data were moved to a \texttt{gconf2-common} package. Documentation was put in \texttt{libgconf2-dev}, where it is useful, and as mandated by policy, support binaries were put in a separate package, named \texttt{gconf2}.
+
+\medskip
+
+The tricky part is the daemon. When it is not running for the user, it is started by the application using the GConf library, which means the library should depend on the daemon. Still, the daemon is linked with the library. Until 2005, the daemon was in the \texttt{gconf2} package, meaning a \textit{circular dependency} between \texttt{gconf2} and \texttt{libgconf2-4}.
+
+Circular dependencies lead to various issues:
+\begin{itemize}
+\item APT randomly fails to upgrade such packages in large-scale upgrades;
+\item the \texttt{postinst} scripts are executed in a random order;
+\item worst of all, the \texttt{prerm} script of a package can be executed after the packages it depends on have already been removed. This issue turned out to be a release-critical bug for \texttt{gconf2}, seriously breaking the build daemons' environment.
+\end{itemize}
+
+The solution to circular dependencies is to put files depending on each other in a single package: if they cannot live without each other, there is no reason to put them in separate packages. Thus, the daemon was put in the \texttt{libgconf2-4} package. To avoid including non-versioned files in the library package, which can be an issue in case of a SONAME change and which will become an issue for the multiarch project, the packaging was modified to use \texttt{/usr/lib/libgconf2-4} as its \textit{libexecdir}, putting the daemon in this directory.
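+
+In practice this kind of change amounts to passing the directory at build time through the standard autoconf flag, along the lines of:
+\begin{verbatim}
+./configure --libexecdir=/usr/lib/libgconf2-4 [...]
+\end{verbatim}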
+
+Despite having been tested in \textit{experimental}, no fewer than 6 new RC bugs were reported against the new package. If anything, it means such changes have to be done with extreme care, thinking of all upgrade scenarios; \textit{unstable} users can imagine unforeseen ways to torture APT and will install any package combination that is allowed.
+
+\section{Common developer mistakes}
+
+A widespread game among upstream library developers is keeping Debian developers busy. Here are some common ways for them to achieve this goal.
+
+\subsection{Non-PIC code}
+
+As a shared library can be loaded at any position in the address space, its compiled code cannot contain anything that depends on that absolute position. The compiler has to be instructed to build \textit{Position Independent Code} with the \texttt{-fPIC} option. Usually, this means building two versions of each code object, one with \texttt{-fPIC} and one without. Libtool will do this automatically.
+
+However, some developers using their own build system will forget this flag. Most of the time, they only work on \texttt{i386}, on which non-PIC shared libraries still work. Furthermore, PIC code is slower on this architecture, which lacks PC-relative data addressing, leading some performance fanatics to knowingly drop the flag.
+
+Non-PIC code can also arise from inline assembly code, if it was not written with position independence in mind. In all cases, lintian will emit an error when finding non-PIC code, which shows up as a \texttt{TEXTREL} entry in the dynamic section of the \texttt{objdump -p} output.
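+
+For a quick manual check, assuming the freshly built library is \texttt{libfoo.so.3}, the dynamic section can be inspected directly; a \texttt{TEXTREL} entry indicates non-PIC code, and no output means none was found:
+\begin{verbatim}
+objdump -p libfoo.so.3 | grep TEXTREL
+\end{verbatim}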
+
+\subsection{Unstable ABI without SONAME changes}
+
+Sometimes, an ABI change is noticed in a released library without a SONAME change. Removal or change of generally unused symbols is the most common case. In such cases, upstream developers will generally not change the SONAME of the library and distributors have to deal with it. The solution is to change the package name, \texttt{libfoo3} becoming \texttt{libfoo3a}. The new package has to conflict with the old one and all depending packages have to be rebuilt.
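+
+A minimal sketch of the corresponding \texttt{debian/control} stanza, trimmed to the fields relevant here, might look like this:
+\begin{verbatim}
+Package: libfoo3a
+Conflicts: libfoo3
+Depends: ${shlibs:Depends}
+Description: foo shared library (corrected ABI)
+\end{verbatim}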
+
+Some upstream library developers go even further, not having a clue about what an ABI is. They treat the shared library just like the static version, and the ABI can change with each release. Examples include \texttt{hdf5} or the Mozilla suite. Faced with such an unstable ABI, a simple course of action is to ship only a static version of the library. However, this makes the security team's work a nightmare, as every package using the library has to be rebuilt after a security update.
+
+A more clever solution to such breakage is to give a Debian-specific SONAME to the library and to change it whenever needed. This work has been done for the Mozilla suite in the \texttt{xulrunner} package. When the breakage is systematic, as in \texttt{hdf5}, the change can be automated with libtool, as this sample from the diff file shows:
+\begin{verbatim}
+-LT_LINK_LIB=$(LT) --mode=link $(CC) -rpath $(libdir) $(DYNAMIC_DIRS)
++LT_LINK_LIB=$(LT) --mode=link $(CC) -rpath $(libdir) -release $(H5_VERSION) -version-info 0
+\end{verbatim}
+The \texttt{-release} flag for libtool gives a string to add to the library name. Thus, the \texttt{libhdf5.so.0} library becomes \texttt{libhdf5-1.6.5.so.0}.
+
+As for the build process, the library package name has to be changed for each new upstream version: here it becomes \texttt{libhdf5-1.6.5-0}. Automated \texttt{debian/control} generation helps make updates as easy as with other packages --- apart from the fact that they have to go through the \textit{NEW} queue at every upstream release.
+
+\medskip
+
+It should be noted that a clever library design can eliminate most causes of ABI breakage. An example of such a design can be found in the GNOME libraries: all data structures are hidden in private structures that cannot be found in public headers, and they are only accessible through helper functions that always address a functional need. Most GNOME libraries haven't changed their SONAMEs for several years despite major architectural changes.
+
+\subsection{Exporting private symbols}
+
+At link time, all functions and global variables that were not declared as \texttt{static} in the source code become exported symbols in the generated library. That includes functions that do not appear in public header files, and which as such should not be used as part of the API.
+
+Some application developers make use of this small hole. They define the prototypes of these private functions in their own headers and make use of them at link time. Such an application is seriously buggy, as it will break when the library developers decide to change their private functions. To detect these applications reliably and to prevent them from running at all, the list of exported symbols should be restricted. This also helps avoid symbol name conflicts between libraries.
+
+This can be achieved using a simple version script (see p.~\pageref{verscript}). There is also a libtool feature which automates this process. Here is a sample taken from the SDL\_mixer \texttt{Makefile.am} file:
+\begin{verbatim}
+libSDL_mixer_la_LDFLAGS =       \
+[...]
+        -export-symbols-regex Mix_.*
+\end{verbatim}
+
+This way, only symbols being part of the SDL\_mixer namespace, those beginning with \texttt{Mix\_}, are exported.
+
+\medskip
+
+Namespace conflicts can also occur between symbols from the library itself and functions belonging to a program linking to it. The ELF format allows a program to override function definitions from a shared library. The symbols can be protected against this kind of override by using the \texttt{-Wl,-Bsymbolic} argument at link time. It should be used for libraries exporting overly generic function names, and it should be systematically applied to library plugins, \textit{e.g.} GTK+ input methods or theme engines. Such plugins can have their code intermixed with any kind of application that has not been tested with them, and namespace conflicts should be avoided in this case.
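+
+As an illustration, a plugin could be linked this way (\texttt{libmytheme.so} and \texttt{engine.o} are hypothetical names):
+\begin{verbatim}
+gcc -shared -fPIC -Wl,-Bsymbolic -o libmytheme.so engine.o
+\end{verbatim}
+With this flag, references from the plugin to its own global symbols are bound to the plugin's definitions instead of being resolved against the application.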
+
+
+\section{Going further -- reducing the release team's hair loss}
+
+\subsection{Versioning the symbols}
+
+\subsubsection{The problem}
+
+Let's consider the following simple scenario: a picture viewer written using GTK+. The software makes use of libgtk for its graphical interface, and of libpng to load PNG images. However, libgtk by itself already depends on libpng. When the ABI of libpng changed, and \texttt{libpng.so.2} became \texttt{libpng.so.3}, both GTK+ and the application had to be rebuilt. In this kind of case, if only the picture viewer is rebuilt, it will end up depending indirectly on both \texttt{libpng.so.2} and \texttt{libpng.so.3}.
+
+Here, the software is faced with a design flaw in the dynamic linker: when resolving library dependencies, all symbols found in all dependencies, direct or indirect, are loaded into a global symbol table. Once this is done, there is no way to distinguish a symbol that comes from \texttt{libpng.so.2} from one with the same name coming from \texttt{libpng.so.3}. This way, GTK+ can call functions that belong to \texttt{libpng.so.3} while using the ABI from \texttt{libpng.so.2}, causing crashes.
+
+\subsubsection{The solution}
+
+Such issues can be solved by introducing \textit{versioned symbols} in the libraries. Another option has to be passed at link time:
+\label{verscript}
+\begin{verbatim}
+  libpng12_la_LDFLAGS += -Wl,--version-script=libpng.vers
+\end{verbatim}
+
+The \textit{version script} referenced here can be a simple script to give the same version to all symbols:
+\begin{verbatim}
+PNG12_0 {
+*; };
+\end{verbatim}
+
+The 1.2.x version (\texttt{libpng.so.3}) is given the \texttt{PNG12\_0} version, while the 1.0.x version is given \texttt{PNG10\_0}. Let's have a look at the symbols in the libraries using the \texttt{objdump -T} command. For the 1.0.x version we have:
+\begin{verbatim}
+00006260 g    DF .text  00000011  PNG10_0     png_init_io
+\end{verbatim}
+and for the 1.2.x version:
+\begin{verbatim}
+000067a0 g    DF .text  00000011  PNG12_0     png_init_io
+\end{verbatim}
+
+Now, when a binary is linked against this new version, it still marks the symbols from libpng as undefined, but with a symbol version:
+\begin{verbatim}
+00000000      DF *UND*  00000011  PNG12_0     png_init_io
+\end{verbatim}
+When two symbols with the same name are available in the global symbol table, the dynamic linker will know which one to use.
+
+\subsubsection{Caveats}
+
+To benefit from versioned symbols, all packages using the library have to be rebuilt. Once this is done, it is possible to migrate transparently from one library version to another providing the same symbols. For a library as widely used as libpng, this was a very slow transition mechanism. Before the \textit{sarge} release, all packages using libpng were rebuilt against these versioned symbols, whether they used version 1.0.x or 1.2.x. After the release, the 1.0.x version was entirely removed, and packages using 1.0.x migrated to 1.2.x without major issues. Waiting for a stable release makes it possible to be sure that upgrades across stable releases go smoothly.
+
+It is of critical importance to forward such changes to upstream developers and to make sure they are adopted widely. Otherwise, if upstream developers or another distributor chooses to introduce a \textit{different} version for these symbols, the two versions of the library become incompatible. A recent example is found with \texttt{libmysqlclient}: the patch was accepted by upstream developers, but they chose to change the symbol version, without knowing it would render the binary library incompatible with the one Debian had been shipping.
+
+
+\subsubsection{Improving the version script}
+
+In the case of libpng, it is also beneficial to restrict the list of exported symbols. All of this can be done in a single version script which is automatically generated from the headers:
+\begin{verbatim}
+PNG12_0 { global:
+png_init_io;
+png_read_image;
+[...]
+local: *; };
+\end{verbatim}
+
+
+\subsection{Restricting the list of dependencies}
+
+\subsubsection{Relibtoolizing packages}
+
+As explained on p.~\pageref{libtool}, libtool stores the list of dependencies of a library in the \texttt{libfoo.la} file. While these are only needed for static linking (as the \texttt{libfoo.a} file does not store its dependencies), libtool also uses them for dynamic linking. When the dependencies also use libtool, it will recurse through their \texttt{.la} files looking for all dependencies.
+
+As a result, binaries end up being directly linked with many libraries they do not actually require. While this is harmless on a stable platform, it can cause major issues on a system under continuous development like Debian, as dependencies are continuously evolving, being added, removed or migrated to new versions. These unneeded dependencies result in unneeded rebuilds during library transitions and added complexity for migration to the \textit{testing} distribution.
+
+The Debian \texttt{libtool} package contains a patch that corrects this behavior. However, as libtool only produces scripts that get included with the upstream package, the package acted upon has to include as a patch the result of a \textit{relibtoolization} using the Debian version of libtool:
+\begin{verbatim}
+libtoolize --force --copy ; aclocal ; automake --force-missing --add-missing --foreign --copy ;
+autoconf ; rm -rf autom4te.cache
+\end{verbatim}
+
+This has the drawback of adding a continuous burden on the Debian maintainer, as it needs to be done for each new upstream release. Furthermore, it is generally not enough, as indirect dependencies can be added by other sources in a complex build process.
+
+\medskip
+
+When recursing through dependencies, libtool also adds them to the list of dependencies of the library it is building. For example, when building \texttt{libfoo}, which requires \texttt{libbar}, which in turn depends on \texttt{libbaz}, it will add a reference to \texttt{libbaz} in \texttt{libfoo.la}. If the dependency on \texttt{libbaz} is later removed, packages depending on \texttt{libfoo} will fail to build, as they will look for a library that does not exist anymore.
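+
+To illustrate, here is what the relevant part of a hypothetical \texttt{libfoo.la} might look like, trimmed to the fields discussed here:
+\begin{verbatim}
+# libfoo.la - a libtool library file
+dlname='libfoo.so.3'
+library_names='libfoo.so.3.0.1 libfoo.so.3 libfoo.so'
+# Libraries that this one depends upon.
+dependency_libs=' -lbar -lbaz'
+\end{verbatim}
+If \texttt{libbaz} later disappears, any build that trusts this \texttt{.la} file will still try to link against it.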
+
+\subsubsection{Pkgconfig}
+
+Another widespread source of indirect dependencies is pkgconfig. As it also handles dependencies through \texttt{Requires:} fields, it will link the binary with several indirect dependencies. Furthermore, developers often add some indirect dependencies in \texttt{Libs:} fields.
+
+Recent changes in pkgconfig allow the use of \texttt{Requires.private:} and \texttt{Libs.private} fields. These libraries and dependencies will be linked in only when using static linking. Here is an example in \texttt{cairo.pc}:
+\begin{verbatim}
+Requires.private: freetype2 >= 8.0.2 fontconfig xrender libpng12
+\end{verbatim}
+
+Unlike the relibtoolization, these changes have to be made in the packages that are depended upon, not in the package that hits the problem. Furthermore, it has been argued that libraries that have their headers automatically included (like glib when using GTK+) should be linked in by default nevertheless.
+
+\subsubsection{GNU linker magic}
+
+The GNU linker has an option that can make all indirect dependencies go away: \texttt{-\null-as-needed}. For example, it can be passed to the configure script:
+\begin{verbatim}
+LDFLAGS="-Wl,--as-needed" ./configure --prefix=/usr [...]
+\end{verbatim}
+
+When passed this option, the linker does not necessarily make the binary it is linking depend on the shared libraries passed with \texttt{-lfoo} arguments. First, it checks whether the binary actually uses some symbols from the library, skipping the library if it is not needed. This mechanism dramatically reduces the list of unneeded dependencies, including the ones upstream developers may have explicitly added.
+
+This option should not be used blindly. In some specific cases, the library should be linked in even when none of its symbols are used. Support for it is still young, and it should not be considered 100~\% reliable. Furthermore, it does not solve the issue of libtool recursing in \texttt{.la} files and searching for removed libraries.
+
+\medskip
+
+To make things worse, a recent change in libtool introduced argument reordering at link time, which turns the \texttt{-\null-as-needed} option into a dummy one. This only happens when building libraries, not applications. A workaround was developed, as a patch for \texttt{ltmain.sh}, for the \texttt{libgnome} package where it is of large importance. It is currently waiting for a cleanup before being submitted as another Debian-specific libtool change\footnote{The upstream libtool developers have stated it may be fixed in the future, but not even in libtool 2.0.}.
+
+\section*{Conclusion}
+
+Apart from treating each update with care, there is no general rule for packaging shared libraries. There are many solutions and workarounds for known problems, but each of them adds complexity to the packaging and should be considered on a case-by-case basis. As the long list of problems shows, being release manager is not an easy task, and library package maintainers should do their best to keep the release team's task feasible.
+
+There is a huge number of software libraries distributed in the wild, and almost two thousand of them are shipped in the Debian distribution. Many developers of these libraries are not aware of the issues specific to shared libraries. The Debian maintainer's job is more than working around these issues: it is to help upstream developers understand them and fix their packages. As such, forwarding and explaining patches is a crucial task.
+
+\end{document}

Added: procedings/27-Packaging-shared-libraries/paper.tex
===================================================================
--- procedings/27-Packaging-shared-libraries/paper.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/27-Packaging-shared-libraries/paper.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,326 @@
+%\usepackage[scale=0.8]{geometry}
+
+\section{Introducing shared libraries}
+
+\subsection{Basic concepts}
+
+A library is a piece of code that can be used in several binaries, split out for factorization reasons. In the old days, libraries were all \textit{statically linked}; that is, they were included directly in the resulting binary. Modern operating systems use shared libraries, which are compiled in separate files and loaded together with the binary at startup time. Shared libraries are the most widespread use of \textit{shared objects}, files containing code that can be loaded at runtime, generally with the \texttt{.so} extension.
+
+\subsection{A bit of terminology}
+
+\paragraph{API} The \textit{Application Programming Interface} of a library describes how it can be used by the programmer. It generally consists of a list of structures and functions and their associated behavior. Changing the behavior of a function or the type of arguments it requires \textit{breaks} the API: programs that used to compile with an older version of the library will stop building.
+
+\paragraph{ABI} The \textit{Application Binary Interface} defines the low-level interface between a shared library and the binary using it. It is specific to the architecture and the operating system, and consists of a list of \textit{symbols} and their associated types and behavior. A binary linked to a shared library will be able to run with another library, or another version of that library, provided that it implements the same ABI. Adding elements to a structure or turning a function into a macro \textit{breaks} the ABI: binaries that used to run with an older version of the library will stop loading. Most of the time, breaking the API also breaks the ABI.
+
+\paragraph{SONAME} The "SONAME" is the canonical name of a shared library, defining an ABI for a given operating system and architecture. It is defined when building the library. The convention for SONAMEs is to use \texttt{libfoo.so.N} and to increment N whenever the ABI is changed. This way, ABI--incompatible versions of the library and binaries using them can coexist on the same system.
+
+\subsection{Linking and using libraries}
+
+A simple example of building a library using gcc:
+\begin{verbatim}
+gcc -fPIC -c -o foo-init.o foo-init.c
+[ ... ]
+gcc -shared -Wl,-soname,libfoo.so.3 -o libfoo.so.3 foo-init.o foo-client.o [...]
+ln -s libfoo.so.3 libfoo.so
+\end{verbatim}
+
+As the command line shows, the SONAME is defined at that time. The symbolic link is needed for compilation of programs using the library. Supposing it has been installed in a standard location, you can link a binary --- which can be another shared library --- using it with \texttt{-lfoo}. The linker looks for \texttt{libfoo.so}, and stores the SONAME found (\texttt{libfoo.so.3}) in the binary's ELF\footnote{\textit{Executable and Linking Format}: the binary format for binaries and shared objects on most UNIX systems.} header. 
+
+The output of the \texttt{objdump -p} command shows the headers of an ELF object. For the library, the output contains:
+\begin{verbatim}
+  SONAME      libfoo.so.3
+\end{verbatim}
+For the binary, it contains:
+\begin{verbatim}
+  NEEDED      libfoo.so.3
+\end{verbatim}
+
+The symbols provided by the library remain undefined in the binary at that time. In the dynamic symbol table shown by \texttt{objdump -T}, the library contains the symbol:
+\begin{verbatim}
+0807c8e0 g    DF .text  0000007d  Base        foo_init
+\end{verbatim}
+while in the binary it remains undefined:
+\begin{verbatim}
+00000000      DF *UND*  0000001c              foo_init
+\end{verbatim}
+
+When the binary is started, the GNU \textit{dynamic linker}\footnote{Other linkers can use a different scheme, especially when it comes to filename lookup.} looks for the NEEDED sections and loads the libraries listed there, using the SONAME as a file name. It then maps the undefined symbols to the ones found in the libraries.
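+
+The result can be observed with the \texttt{ldd} command; for a hypothetical binary linked against \texttt{libfoo}, the output would look something like this (load addresses vary):
+\begin{verbatim}
+ldd ./viewer
+        libfoo.so.3 => /usr/lib/libfoo.so.3 (0xb7f20000)
+        libc.so.6 => /lib/libc.so.6 (0xb7dd0000)
+\end{verbatim}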
+
+\subsection{Libtool}
+\label{libtool}
+
+Libtool is a tool designed to simplify the build process of libraries. It is full of features that make the developers' life easier, and full of bugs that bring added complexity for system administrators and especially distribution maintainers. Its paradigm is to build an extra file, named \texttt{libfoo.la}, which contains some metadata about the library; most importantly, the list of library dependencies for the library itself. Together with this file, it can build the shared version \texttt{libfoo.so} and the static version \texttt{libfoo.a} of the library.
+
+It integrates easily with autoconf and automake. You can put in the \texttt{configure.ac}\footnote{The version information is given for libtool's versioning scheme. You can read more about it in the libtool manual.}:
+\begin{verbatim}
+AM_PROG_LIBTOOL
+VERSION_INFO=3:1:0
+AC_SUBST(VERSION_INFO)
+\end{verbatim}
+and in the \texttt{Makefile.am}:
+\begin{verbatim}
+libfoo_la_SOURCES = foo-init.c foo-client.c foo.h [...]
+libfoo_la_LDFLAGS = -version-info @VERSION_INFO@
+libfoo_HEADERS = foo.h
+\end{verbatim}
+
+\subsection{Pkgconfig}
+
+Pkgconfig is a tool to replace the variety of \texttt{libfoo-config} scripts in a standard way that integrates with autoconf. Here is a sample file, \texttt{libnautilus-burn.pc}:
+\begin{verbatim}
+prefix=/usr
+exec_prefix=${prefix}
+libdir=${exec_prefix}/lib
+includedir=${prefix}/include/libnautilus-burn
+
+Name: libnautilus-burn
+Description: Nautilus Burn Library
+Version: 2.12.3
+Requires: glib-2.0 gtk+-2.0
+Libs: -L${libdir} -lnautilus-burn
+Cflags: -I${includedir}
+\end{verbatim}%$
+The \texttt{Cflags:} and \texttt{Libs:} fields provide the list of \texttt{CFLAGS} and \texttt{LDFLAGS} to use for linking with that library. The \texttt{Requires:} field provides some dependencies that a binary using that library should also link with. In this case, pkgconfig will also look for \texttt{glib-2.0.pc} and \texttt{gtk+-2.0.pc}.
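+
+For instance, a build system would typically query the flags like this (the exact output depends on the installed versions and on the \texttt{Requires:} chain):
+\begin{verbatim}
+pkg-config --cflags libnautilus-burn
+pkg-config --libs libnautilus-burn
+\end{verbatim}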
+
+\medskip
+
+Integration with autoconf is provided. Here is an example \texttt{configure.ac} test for a package requiring the GTK+ library:
+\begin{verbatim}
+PKG_CHECK_MODULES(GTK, gtk+-2.0 >= 2.6.0,,
+                  AC_MSG_ERROR([GTK+-2.0 is required]))
+\end{verbatim}
+
+\section{Debian packaging of a shared library}
+
+\subsection{Simple case -- what the policy mandates}
+
+Packaging a simple library for Debian is not much different from packaging any other piece of software. In all cases there should be at least two packages:
+\begin{itemize}
+\item \texttt{libfoo3}, containing the \texttt{/usr/lib/*.so.*} files, so that you get \texttt{libfoo.so.3}. The \texttt{postinst} script of this package should contain a call to the \texttt{ldconfig} command, and it has to be registered in dpkg's \textit{shlibs} database. This can be achieved by a call to \texttt{dh\_makeshlibs}.
+\item \texttt{libfoo-dev} or \texttt{libfoo3-dev}, containing the headers in \texttt{/usr/include}, and other files in \texttt{/usr/lib}: the \texttt{libfoo.so} symbolic link, the \texttt{libfoo.a} static library, and if relevant \texttt{libfoo.la} (in \texttt{/usr/lib}) and \texttt{libfoo.pc} (in \texttt{/usr/share/pkgconfig}\footnote{Pkgconfig has started moving its \texttt{.pc} files from \texttt{/usr/lib/pkgconfig} and this should be encouraged.}). It should depend on \texttt{libfoo3 (= \${Source-Version})}.
+\end{itemize}
+
+The \textit{shlibs} system provides a mapping from library SONAMEs to the package names and minimal versions providing the ABIs of the libraries a package is built against.
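+
+For reference, an entry in an \textit{shlibs} file has the form \textit{library-name soname-version dependency}; for the hypothetical \texttt{libfoo} above it could read:
+\begin{verbatim}
+libfoo 3 libfoo3 (>= 3.1.0)
+\end{verbatim}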
+
+\subsection{Updating the package}
+
+As for anything providing an interface, shared libraries have to be treated carefully when it comes to updating the package.
+\begin{itemize}
+\item If the ABI has not changed at all, no changes are required to the package.
+\item The most common case is the ABI being changed in a backwards-compatible way, by adding symbols. In this case, the \textit{shlibs} system should be informed of the minimum version required. This is achieved by changing the \texttt{rules} file to call:
+\begin{verbatim}
+        dh_makeshlibs -V'libfoo3 (>= 3.1.0)'
+\end{verbatim}
+The referenced version is that of the latest release in which the ABI was changed.
+\item When some symbols are removed or their meaning is changed, the ABI is broken and the SONAME should have changed. The shared library package name has to be changed to reflect this new SONAME: \texttt{libfoo3} becomes \texttt{libfoo4}.
+\item If the API changes, some packages using the library may stop building. If the change is small, it may only require fixing of a handful of packages. If it's a broad change, the simplest course of action is to change the development package name: \texttt{libfoo3-dev} becomes \texttt{libfoo4-dev}.
+\end{itemize}
+
+\subsection{Library transitions}
+
+Whenever the ABI is broken, a library transition starts. Before anything like this happens, the release team should be asked for approval, so that they know the transition will happen. If possible, two transitions involving the same packages should be avoided, as they would have to complete together.
+
+All packages using the library have to be rebuilt in the \textit{unstable} distribution so that they can go to \textit{testing} together. Depending on the library, the optimal course of action may vary.
+\begin{itemize}
+\item If there is a small enough number of reverse dependencies, things can go fast: an upload right to \textit{unstable}, asking the release team to trigger a set of binary NMUs for all depending packages.
+\item More complex cases, especially if some reverse dependencies can fail to build, should be started in \textit{experimental}.
+\item For some nightmare libraries, several source versions are present at once, even in stable releases. The examples of gnutls and libpng come to mind.
+\end{itemize}
+
+\subsection{Providing a debugging version}
+
+If the library is known to cause crashes or is under development, the availability of debugging symbols is quite helpful. Fortunately, debhelper can do all of this automatically. After defining an empty \texttt{libfoo3-dbg} package, the magic command is:
+\begin{verbatim}
+        dh_strip --dbg-package=libfoo3-dbg
+\end{verbatim}
+This will move debugging symbols in \texttt{/usr/lib/debug} in this package; debuggers like \texttt{gdb} can use them automatically.
+
+\subsection{More complex cases -- how to avoid circular dependencies}
+
+With large and complex libraries, other kinds of issues appear. Considering the example of \texttt{gconf2}, the upstream distribution contains:
+\begin{itemize}
+\item a library used by applications,
+\item a per-user daemon,
+\item chunks of data, mostly localization files,
+\item configuration files,
+\item documentation,
+\item support binaries using the library.
+\end{itemize}
+
+To avoid having in \texttt{libgconf2-4} any files outside versioned directories, the configuration and data were moved to a \texttt{gconf2-common} package. Documentation was put in \texttt{libgconf2-dev}, where it is useful, and as mandated by policy, support binaries were put in a separate package, named \texttt{gconf2}.
+
+\medskip
+
+The tricky part is the daemon. When it is not running for the user, it is started by the application using the GConf library, which means the library should depend on the daemon. Still, the daemon is linked with the library. Until 2005, the daemon was in the \texttt{gconf2} package, meaning a \textit{circular dependency} between \texttt{gconf2} and \texttt{libgconf2-4}.
+
+Circular dependencies lead to various issues:
+\begin{itemize}
+\item APT randomly fails to upgrade such packages in large-scale upgrades;
+\item the \texttt{postinst} scripts are executed in a random order;
+\item worst of all, the \texttt{prerm} script of a package can be executed after packages it depends on have already been removed. This issue turned out to be a release-critical bug for \texttt{gconf2}, seriously breaking the build daemons' environment.
+\end{itemize}
+
+The solution to circular dependencies is to put files depending on each other in a single package: if they cannot live without each other, there is no reason to put them in separate packages. Thus, the daemon was put in the \texttt{libgconf2-4} package. To avoid including non-versioned files in the library package, which can be an issue in case of a SONAME change and which will become an issue for the multiarch project, the packaging was modified to use \texttt{/usr/lib/libgconf2-4} as its \textit{libexecdir}, putting the daemon in this directory.
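+
+As a rough sketch (the exact \texttt{gconf2} build invocation may differ), such a versioned \textit{libexecdir} can be requested with the standard configure option, for instance from \texttt{debian/rules}:
+\begin{verbatim}
+./configure --prefix=/usr --libexecdir=/usr/lib/libgconf2-4 [...]
+\end{verbatim}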
+
+Despite having been tested in \textit{experimental}, no less than 6 new RC bugs were reported against the new package. If anything, it means such changes have to be done with extreme care, thinking of all upgrade scenarios; \textit{unstable} users can imagine unsought ways to torture APT and will install any package combination that is allowed.
+
+\section{Common developer mistakes}
+
+A widespread game among upstream library developers is keeping Debian developers busy. Here are some common ways for them to achieve this goal.
+
+\subsection{Non-PIC code}
+
+As a shared library can be loaded at any position in the address space, its compiled code cannot contain anything that depends on that absolute position. The compiler has to be instructed to build \textit{Position Independent Code} with the \texttt{-fPIC} option. Usually, this means building two versions of each code object, one with \texttt{-fPIC} and one without. Libtool will do this automatically.
+
+However, some developers using their own build system will forget this flag. Most of the time, they only work on \texttt{i386}, on which non-PIC shared libraries still work. Furthermore, PIC code is slower on this architecture, which lacks PC-relative data addressing, leading some performance fanatics to knowingly drop the flag.
+
+Non-PIC code can also arise from inline assembly code, if it was not written with position independence in mind. In all cases, lintian will emit an error when finding non-PIC code, which shows up as a \texttt{TEXTREL} entry in the dynamic section of the \texttt{objdump -p} output.
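+
+For a quick manual check, assuming the freshly built library is \texttt{libfoo.so.3}, the dynamic section can be inspected directly; a \texttt{TEXTREL} entry indicates non-PIC code, and no output means none was found:
+\begin{verbatim}
+objdump -p libfoo.so.3 | grep TEXTREL
+\end{verbatim}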
+
+\subsection{Unstable ABI without SONAME changes}
+
+Sometimes, an ABI change is noticed in a released library without a SONAME change. Removal or change of generally unused symbols is the most common case. In such cases, upstream developers will generally not change the SONAME of the library and distributors have to deal with it. The solution is to change the package name, \texttt{libfoo3} becoming \texttt{libfoo3a}. The new package has to conflict with the old one and all depending packages have to be rebuilt.
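+
+A minimal sketch of the corresponding \texttt{debian/control} stanza, trimmed to the fields relevant here, might look like this:
+\begin{verbatim}
+Package: libfoo3a
+Conflicts: libfoo3
+Depends: ${shlibs:Depends}
+Description: foo shared library (corrected ABI)
+\end{verbatim}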
+
+Some upstream library developers go even further, not having a clue about what an ABI is. They treat the shared library just like the static version, and the ABI can change with each release. Examples include \texttt{hdf5} or the Mozilla suite. Faced with such an unstable ABI, a simple course of action is to ship only a static version of the library. However, this makes the security team's work a nightmare, as every package using the library has to be rebuilt after a security update.
+
+A more clever solution to such breakage is to give a Debian-specific SONAME to the library and to change it whenever needed. This work has been done for the Mozilla suite in the \texttt{xulrunner} package. When the breakage is systematic, as in \texttt{hdf5}, the change can be automated with libtool, as this sample from the diff file shows:
+\begin{verbatim}
+-LT_LINK_LIB=$(LT) --mode=link $(CC) -rpath $(libdir) $(DYNAMIC_DIRS)
++LT_LINK_LIB=$(LT) --mode=link $(CC) -rpath $(libdir) -release $(H5_VERSION) -version-info 0
+\end{verbatim}
+The \texttt{-release} flag for libtool gives a string to add to the library name. Thus, the \texttt{libhdf5.so.0} library becomes \texttt{libhdf5-1.6.5.so.0}.
+
+As for the build process, the library package name has to be changed for each new upstream version: here it becomes \texttt{libhdf5-1.6.5-0}. Automated \texttt{debian/control} generation helps make updates as easy as with other packages --- apart from the fact that they have to go through the \textit{NEW} queue at every upstream release.
+
+\medskip
+
+It should be noted that a clever library design can eliminate most causes of ABI breakage. An example of such a design can be found in the GNOME libraries: all data structures are hidden in private structures that cannot be found in public headers, and they are only accessible through helper functions that always address a functional need. Most GNOME libraries haven't changed their SONAMEs for several years despite major architectural changes.
+
+\subsection{Exporting private symbols}
+
+At link time, all functions and global variables that were not declared as \texttt{static} in the source code become exported symbols in the generated library. That includes functions that do not appear in public header files, and which as such should not be used as part of the API.
+
+Some application developers make use of this small hole. They define the prototypes of these private functions in their own headers and make use of them at link time. Such an application is seriously buggy, as it will break when the library developers decide to change their private functions. To detect these applications reliably and to prevent them from running at all, the list of exported symbols should be restricted. This also helps avoid symbol name conflicts between libraries.
+
+This can be achieved using a simple version script (see p.~\pageref{verscript}). There is also a libtool feature which automates this process. Here is a sample taken from the SDL\_mixer \texttt{Makefile.am} file:
+\begin{verbatim}
+libSDL_mixer_la_LDFLAGS =       \
+[...]
+        -export-symbols-regex Mix_.*
+\end{verbatim}
+
+This way, only symbols being part of the SDL\_mixer namespace, those beginning with \texttt{Mix\_}, are exported.
+
+\medskip
+
+Namespace conflicts can also occur between symbols from the library itself and functions belonging to a program linking to it. The ELF format allows a program to override function definitions from a shared library. The symbols can be protected against this kind of override by using the \texttt{-Wl,-Bsymbolic} argument at link time. It should be used for libraries exporting overly generic function names, and it should be systematically applied to library plugins, \textit{e.g.} GTK+ input methods or theme engines. Such plugins can have their code intermixed with any kind of application that has not been tested with them, and namespace conflicts should be avoided in this case.
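+
+As an illustration, a plugin could be linked this way (\texttt{libmytheme.so} and \texttt{engine.o} are hypothetical names):
+\begin{verbatim}
+gcc -shared -fPIC -Wl,-Bsymbolic -o libmytheme.so engine.o
+\end{verbatim}
+With this flag, references from the plugin to its own global symbols are bound to the plugin's definitions instead of being resolved against the application.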
+
+
+\section{Going further -- reducing the release team's hair loss}
+
+\subsection{Versioning the symbols}
+
+\subsubsection{The problem}
+
+Let's consider the following simple scenario: a picture viewer written using GTK+. The software makes use of libgtk for its graphical interface, and of libpng to load PNG images. However, libgtk by itself already depends on libpng. When the ABI of libpng changed, and \texttt{libpng.so.2} became \texttt{libpng.so.3}, both GTK+ and the application had to be rebuilt. In this kind of case, if only the picture viewer is rebuilt, it will end up depending indirectly on both \texttt{libpng.so.2} and \texttt{libpng.so.3}.
+
+Here, the software is faced with a design flaw in the dynamic linker: when resolving library dependencies, all symbols found in all dependencies, direct or indirect, are loaded into a global symbol table. Once this is done, there is no way to distinguish a symbol that comes from \texttt{libpng.so.2} from one with the same name coming from \texttt{libpng.so.3}. This way, GTK+ can call functions that belong to \texttt{libpng.so.3} while using the ABI from \texttt{libpng.so.2}, causing crashes.
+
+\subsubsection{The solution}
+
+Such issues can be solved by introducing \textit{versioned symbols} in the libraries. Another option has to be passed at link time:
+\label{verscript}
+\begin{verbatim}
+  libpng12_la_LDFLAGS += -Wl,--version-script=libpng.vers
+\end{verbatim}
+
+The \textit{version script} referenced here can be a simple script to give the same version to all symbols:
+\begin{verbatim}
+PNG12_0 {
+*; };
+\end{verbatim}
+
+The 1.2.x version (\texttt{libpng.so.3}) is given the \texttt{PNG12\_0} version, while the 1.0.x version is given \texttt{PNG10\_0}. Let's have a look at the symbols in the libraries using the \texttt{objdump -T} command. For the 1.0.x version we have:
+\begin{verbatim}
+00006260 g    DF .text  00000011  PNG10_0     png_init_io
+\end{verbatim}
+and for the 1.2.x version:
+\begin{verbatim}
+000067a0 g    DF .text  00000011  PNG12_0     png_init_io
+\end{verbatim}
+
+Now, when a binary is linked against this new version, it still marks the symbols from libpng as undefined, but with a symbol version:
+\begin{verbatim}
+00000000      DF *UND*  00000011  PNG12_0     png_init_io
+\end{verbatim}
+When two symbols with the same name are available in the global symbol table, the dynamic linker will know which one to use.
+
+\subsubsection{Caveats}
+
+To benefit from versioned symbols, all packages using the library have to be rebuilt. Once this is done, it is possible to migrate transparently from one library version to another providing the same symbols. For a library as widely used as libpng, this was a very slow transition mechanism. Before the \textit{sarge} release, all packages using libpng were rebuilt against these versioned symbols, whether they used version 1.0.x or 1.2.x. After the release, the 1.0.x version was entirely removed, and packages using 1.0.x migrated to 1.2.x without major issues. Waiting for a stable release makes it possible to be sure that upgrades across stable releases go smoothly.
+
+It is of critical importance to forward such changes to upstream developers and to make sure they are adopted widely. Otherwise, if upstream developers or another distributor chooses to introduce a \textit{different} version for these symbols, the two versions of the library become incompatible. A recent example is found with \texttt{libmysqlclient}: the patch was accepted by upstream developers, but they chose to change the symbol version, without knowing it would render the binary library incompatible with the one Debian had been shipping.
+
+
+\subsubsection{Improving the version script}
+
+In the case of libpng, it is also beneficial to restrict the list of exported symbols. All of this can be done in a single version script which is automatically generated from the headers:
+\begin{verbatim}
+PNG12_0 { global:
+png_init_io;
+png_read_image;
+[...]
+local: *; };
+\end{verbatim}
+
+
+\subsection{Restricting the list of dependencies}
+
+\subsubsection{Relibtoolizing packages}
+
+As explained on p.~\pageref{libtool}, libtool stores the list of dependencies of a library in the \texttt{libfoo.la} file. While these are only needed for static linking (as the \texttt{libfoo.a} file does not store its dependencies), libtool also uses them for dynamic linking. When the dependencies also use libtool, it will recurse through their \texttt{.la} files looking for all dependencies.
+
+As a result, binaries end up being directly linked with many libraries they do not actually require. While this is harmless on a stable platform, it can cause major issues on a system under continuous development like Debian, as dependencies are continuously evolving, being added, removed or migrated to new versions. These unneeded dependencies result in unneeded rebuilds during library transitions and added complexity for migration to the \textit{testing} distribution.
+
+The Debian \texttt{libtool} package contains a patch that corrects this behavior. However, as libtool only produces scripts that get included with the upstream package, the package acted upon has to include as a patch the result of a \textit{relibtoolization} using the Debian version of libtool:
+\begin{verbatim}
+libtoolize --force --copy ; aclocal ; automake --force-missing --add-missing --foreign --copy ;
+autoconf ; rm -rf autom4te.cache
+\end{verbatim}
+
+This has the drawback of adding a continuous burden on the Debian maintainer, as it needs to be done for each new upstream release. Furthermore, it is generally not enough, as indirect dependencies can be added by other sources in a complex build process.
+
+\medskip
+
+When recursing through dependencies, libtool also adds them to the list of dependencies of the library it is building. For example, when building \texttt{libfoo}, which requires \texttt{libbar}, which in turn depends on \texttt{libbaz}, it will add a reference to \texttt{libbaz} in \texttt{libfoo.la}. If the dependency on \texttt{libbaz} is later removed, packages depending on \texttt{libfoo} will fail to build, as they will look for a library that does not exist anymore.
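+
+To illustrate, here is what the relevant part of a hypothetical \texttt{libfoo.la} might look like, trimmed to the fields discussed here:
+\begin{verbatim}
+# libfoo.la - a libtool library file
+dlname='libfoo.so.3'
+library_names='libfoo.so.3.0.1 libfoo.so.3 libfoo.so'
+# Libraries that this one depends upon.
+dependency_libs=' -lbar -lbaz'
+\end{verbatim}
+If \texttt{libbaz} later disappears, any build that trusts this \texttt{.la} file will still try to link against it.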
+
+\subsubsection{Pkgconfig}
+
+Another widespread source of indirect dependencies is pkgconfig. As it also handles dependencies through \texttt{Requires:} fields, it will link the binary with several indirect dependencies. Furthermore, developers often add some indirect dependencies in \texttt{Libs:} fields.
+
+Recent changes in pkgconfig allow the use of \texttt{Requires.private:} and \texttt{Libs.private} fields. These libraries and dependencies will be linked in only when using static linking. Here is an example in \texttt{cairo.pc}:
+\begin{verbatim}
+Requires.private: freetype2 >= 8.0.2 fontconfig xrender libpng12
+\end{verbatim}
+
+Unlike the relibtoolization, these changes have to be made in the packages that are depended upon, not in the package that hits the problem. Furthermore, it has been argued that libraries that have their headers automatically included (like glib when using GTK+) should be linked in by default nevertheless.
+
+\subsubsection{GNU linker magic}
+
+The GNU linker has an option that can make all indirect dependencies go away: \texttt{-\null-as-needed}. For example, it can be passed to the configure script:
+\begin{verbatim}
+LDFLAGS="-Wl,--as-needed" ./configure --prefix=/usr [...]
+\end{verbatim}
+
+When passed this option, the linker does not necessarily make the binary it is linking depend on the shared libraries passed with \texttt{-lfoo} arguments. First, it checks whether the binary actually uses some symbols from the library, skipping the library if it is not needed. This mechanism dramatically reduces the list of unneeded dependencies, including the ones upstream developers may have explicitly added.
+
+This option should not be used blindly. In some specific cases, the library should be linked in even when none of its symbols are used. Support for it is still young, and it should not be considered 100~\% reliable. Furthermore, it does not solve the issue of libtool recursing in \texttt{.la} files and searching for removed libraries.
+
+\medskip
+
+To make things worse, a recent change in libtool introduced argument reordering at link time, which turns the \texttt{-\null-as-needed} option into a dummy one. This only happens when building libraries, not applications. A workaround was developed, as a patch for \texttt{ltmain.sh}, for the \texttt{libgnome} package where it is of large importance. It is currently waiting for a cleanup before being submitted as another Debian-specific libtool change\footnote{The upstream libtool developers have stated it may be fixed in the future, but not even in libtool 2.0.}.
+
+\section*{Conclusion}
+
+Apart from treating each update with care, there is no general rule for packaging shared libraries. There are many solutions and workarounds for known problems, but each of them adds complexity to the packaging and should be considered on a case-by-case basis. As the long list of problems shows, being release manager is not an easy task, and library package maintainers should do their best to keep the release team's task feasible.
+
+There is a huge number of software libraries distributed in the wild, and almost two thousand of them are shipped in the Debian distribution. Many developers of these libraries are not aware of the issues specific to shared libraries. The Debian maintainer's job is more than working around these issues: it is to help upstream developers understand them and fix their packages. As such, forwarding and explaining patches is a crucial task.

Added: procedings/29-codes-of-value/orig/Coleman-Codes-Value.txt
===================================================================
--- procedings/29-codes-of-value/orig/Coleman-Codes-Value.txt	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/29-codes-of-value/orig/Coleman-Codes-Value.txt	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,178 @@
+
+Codes of Value: Hacker Pragmatics, Poetics, and Selfhood
+Gabriella Coleman, Postdoctoral Fellow, Center for Cultural Analysis, Rutgers University
+
+ 
+
+
+I have nothing to declare but my genius 
+Oscar Wilde 
+
+#count the number of stars in the sky
+ $cnt = $sky =~ tr/*/*/;
+
+
+Paper Overview
+	
+This line of Perl denotes a hacker homage to cleverness as a double entendre of both semantic ingenuity and technical cleverness. To fully appreciate the semantic humor presented here, we must look at the finer points of a particular segment of the developer population, the Perl hacker. These hackers have developed a computer scripting language, Perl, in which terse but technically powerful expressions can be formed (in comparison to other programming languages). The Perl community takes special pride in cleverly condensing long segments of code into very short and sometimes "obfuscated" one-liners. If the above line of code were expanded into something more traditional and accessible to Perl novices, it might read something like:
+
+   $cnt = 0;
+   $i = 0;
+   $skylen = length($sky);
+   while ($i < $skylen) {
+       # check whether the character at position $i is a star
+       if (substr($sky, $i, 1) eq '*') {
+           $cnt++;
+       }
+       $i++;
+   }
+
+We see that this enterprising Perl programmer has taken several lines of code and reduced them to one by taking advantage of certain side effects found in the constructs of the Perl language. With this transformation of "prose" into terse "poetry," the developer displays a mastery of the computer language. This mastery is sealed on the semantic level by the joke of "counting the number of stars in the sky," which plays on the name of the variable $sky and the word play between the asterisk and the star. Since the counting function literally counts every appearance of the asterisk symbol, a star (this is what the program does), the programmer displays his craftiness by choosing the variable name $sky, and hence the description of the function: "count the number of stars in the sky."
+ 	
+This snippet of code is a useful object to present here because it is a potent example of hacker value in a dual capacity. Free and open source (F/OSS) hackers have come to deem accessible, open code, such as the example above and the Perl language it is written in, as valuable. With access to code, hackers argue, they can learn new skills and improve technology. But in the above minuscule line of code, we can glean another trace of value. Because it is a particularly tricky way to solve a problem, and contains a nugget of non-technical cleverness, this code reveals the value hackers place on the performance of wit. This tendency to perform wit is all-pervasive in the hacker habitat (Fischer 1999; Raymond 1998; Thomas 2002). Blossoming from the prosaic world of hacker technical and social praxis, the clever performance of technology and humor might be termed the "semantic ground" through which hackers "construct and represent themselves and others" (Comaroff and Comaroff 1991: 21).
+	
+Judging from this Perl example alone, it is not surprising that in much of the literature, hackers are treated as quintessentially individualistic (Turkle 1984, 1995; Levy 1984; Sterling 1992; Castells 2001; Borsook 2000; Davis 1998; Nissenbaum 2004). "The hacker," Sherry Turkle writes, "is the defender of idiosyncrasy, individuality, genius and the cult of individual" (1984: 229). Hackers do seem perpetually keen on asserting their individuality through acts of ingenuity, and thus this statement is unmistakably correct. However, in most accounts of hackers, the meaning of individualism is treated either as an ideological cloak or as uninteresting, and thus is often left underspecified. In this piece, through an ethnographic examination of hacker coding practices, social interaction, and humor, I specify how hackers conceive of and perform a form of liberal individuality that departs from another version of liberal selfhood. 
+ 
+Hacker notions of creativity and individuality, I argue, extend and re-vision an existing cultural trope of individualism that diverges from the predominant reading of the liberal self as that of the consumer or "possessive individual" (Macpherson 1962; cf. Graeber 1997). Their enactment of individualism is a re-creation of the type of liberal person envisioned in the works of John Stuart Mill in his critique of utilitarianism (1857), more recently addressed in the works of other liberal thinkers like John Dewey (1935), and practically articulated in ideals of meritocracy, institutions of education, and free speech jurisprudence. As Wendy Donner explains, the Millian conception of selfhood sits at odds with a Lockean sensibility "wedded to possessive individualism," for Mill formulates "individualism as flowing from the development and use of the higher human powers, which is antagonistic to a desire to control others" (1991: 130). For hackers, selfhood is foremost a form of self-determination that arises out of the ability to think, speak, and create freely and independently. As Chris Kelty (2005) has persuasively argued by drawing on the work of Charles Taylor (2004), hackers and other net advocates have crafted a liberal "social imaginary" in which the right to build technology free from encumbrance is seen "as essential to freedom and public participation as is speaking to a public about such activities" (2005: 187). And indeed, the commonplace hacker assertion that free software is about "free speech," not "free beer," signals how hackers have reformulated liberal freedom into their own technical vernacular. Over the last decade, by specifically integrating free speech discourse into the sinews of F/OSS moral philosophy, hackers have gone further than simply resonating with the type of liberal theory exemplified by John S. Mill. They have literally remade this liberal legacy as their own. 
+	
+Clearly there are culturally pervasive ideals and practical articulations of the Millian paradigm upon which hackers can easily draw. But the more interesting question is: why does this liberal legacy of the free-thinking individual capture the hacker cultural imaginary? In this piece I seek to make this question intelligible by portraying how hackers create value and notions of individuality through such routine everyday practices as coding, humor, and interactions with other programmers. It is the symbiosis between their praxis and prevalent liberal tropes of individuality and meritocracy that forms the groundwork of the hacker liberal self-fashioning I discuss here.
+Central to their construction of selfhood is a faith in the power of the human imagination that demands of hackers constant acts of self-development through which they cultivate their skills, improve technology, and prove their worth to other hackers. To become a hacker is to embody a form of individualism that shuns what they designate as mediocrity in favor of a virtuous display of wit, technical ability and intelligence. Hackers consistently and constantly enact this in a repertoire of micropractices, including humor, agonistic yet playful taunting, and the clever composition and display of code.   
+	
+Since the designation of superior code, cleverness, intelligence, or even a good joke can only be affirmed by other hackers, personal technical development requires both a demanding life of continual performativity and the affirmative judgment of others who are similarly engaged in this form of technical self-fashioning. This raises a subtle paradox that textures their modes of sociality and interpersonal interactions: hackers are bound together in an elite fraternal order of judgment that requires of them constant performance of a set of character traits that are often directed at confirming their mental and creative independence from each other. 
+
+This paradox alone is not necessarily generative of social tensions. This is, after all, how academics organize much of their labor. However, given that so much of hacker production derives from a collective and common enterprise, a fact that hackers themselves more openly acknowledge and theorize in the very ethical philosophy of 
+F/OSS, their affirmation of independence is potentially subverted by the reality of, and the desire to recognize, collective invention. As I discuss below, the use of humor and technical cleverness both reveals and attenuates the hacker ambivalence between the forms of individualism and collectivism, elitism and populism, exclusivity and inclusivity that have come to mark their lifeworld. 
+
+This piece now continues with a brief discussion of Mill's conception of individuality, self-cultivation, and judgment. This will help ground the second part of the article, which takes a close look at hacker pragmatics and poetics. I open with a discussion of hacker pragmatics, as this will help clarify how hackers use cleverness to establish definitions of selfhood. Though this piece is not about humor per se, in this second half I also draw heavily on examples of everyday hacker humor, treating it as iconic of the wider range of their "signifying practices" (Hebdige 1979) through which they define, clarify, and realize the cultural meanings of creativity, individuality, and authorship. The third and final section provides a closer look at the relation between the hacker self and the liberal self, and ends with a discussion of the hacker ideal of authorship and meritocracy that has grown from their commitment to Millian individualism. 
+ 
+[1] Here is a little more information about the code. The "tr" in this code is a function that translates all occurrences of the characters in the search list into the corresponding characters in the replacement list. In this case, the lists are delimited by the slash character: the search list is the asterisk character and the replacement list is also the asterisk character, so overall it replaces an asterisk with an asterisk. The side effect of this code is that the 'tr' function returns the number of replacements performed, so by replacing all the asterisks in the variable $sky with asterisks, the variable $cnt gets assigned the number of replacements that happened, resulting in a count of the number of stars in $sky. What follows the # symbol is a comment, a non-functional notation found in most programs, theoretically supposed to explain what the code does.
+
+Works Cited
+
+Ackerman, Bruce 
+	1980. 	Social Justice in the Liberal State. New Haven: Yale University Press. 
+Allen, Robert
+	1983 	"Collective Invention" Journal of Economic Behavior and Organization 4:1-24.
+Arrow, Kenneth, Samuel Bowles, and Steven Durlauf (eds.)
+	2002	Meritocracy and Economic Inequality. Princeton: Princeton University Press. 
+Bollinger, Lee and Geoffrey Stone (eds.)
+	 2002	Eternally Vigilant: Free Speech in the Modern Era. Chicago: The University of Chicago Press
+Borsook, Paulina
+	2000	Cyberselfish: A Critical Romp through the Terribly Libertarian Culture of High Tech. New York: Public Affair.
+Bourdieu, Pierre
+	1977	Outline of a Theory of Practice. Cambridge: Cambridge University Press. 
+	1984	Distinction: A Social Critique of the Judgment of Taste. Cambridge: Harvard University Press. 
+Boyle, James
+	1996	 Shamans, Software, and Spleens. Cambridge: Harvard University Press. 
+Brown, Bill 
+	1998	"How to Do Things with Things (A Toy Story)" Critical Inquiry.  (24)4: 935-964.
+	2001	"Thing Theory" Critical Inquiry. 28(1):1-22
+Castells, Manuel 
+	2001 	The Internet Galaxy: Reflections on the Internet, Business, and Society. Cambridge: Oxford University Press.
+Comaroff, Jean and John Comaroff
+	1991	Of Revelation and Revolution: Christianity, Colonialism, and Consciousness in South Africa. Volume One. Chicago: University of Chicago Press. 
+Davis, Erik
+	1998	TechGnosis: Myth, Magic, and Mysticism in the Age of Information. New York: Three Rivers Press. 
+ Dewey, John 
+	1935	Liberalism and Social Action. New York: G. P. Putnam's Sons.
+ Donner, Wendy
+	1991	The Liberal Self: John Stuart Mill's Moral and Political Philosophy. Ithaca: 	Cornell University Press.
+Douglas, Mary
+	1975	 Implicit Meanings: Essays in Anthropology. London: Routledge. 
+Downey, Greg
+	1998	The Machine in Me. New York and London: Routledge.
+Drahos, Peter and John Braithwaite
+	2003	Information Feudalism. NY, NY: W.W. Norton and Company.
+Edwards, Paul
+	1996	The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge and London: The MIT Press. 
+Fischer, Claude
+	1991	America Calling: A Social History of the Telephone to 1940. Berkeley: University of California Press.  
+Fischer, Michael J.
+	1999	 "Worlding Cyberspace: Towards a Crucial Ethnography in Time, Space, Theory" in Critical Anthropology Now: Unexpected Context, Shifting Constituencies, Changing Agendas. George Marcus (ed.) Santa Fe: Sar Press.
+Galison, Peter
+	1997 	Image and Logic: A Material Culture of Microphysics. Chicago: University of Chicago Press. 
+Galison, Peter and Caroline Jones (eds.)
+	1998	Picturing Science Producing Art. New York and London: Routledge. 
+Galloway, Alexander R.
+	2004	Protocol: How Control Exists after Decentralization. Cambridge and London: The MIT Press. 
+Gilroy, Paul
+	1993	The Black Atlantic: Modernity and Double Consciousness. Cambridge: Harvard University Press.
+Goffman, Erving
+	1963	Interaction Ritual. New York: Anchor Books.
+Graeber, David 
+	1997	"Manners, Deference, and Private Property" Comparative Studies in Society and History. 39(4): 694-726.
+Halliday, Richard J.
+	1976	John Stuart Mill. New York: Barnes and Noble.
+Haraway, Donna
+	2000	"A Cyborg Manifesto: Science, Technology and Socialist-Feminism in the Late Twentieth Century" in The Cybercultures Reader. Bell, D. and B. Kennedy (eds.). London and New York: Routledge [1985]. 
+Hayles, Katherine
+	1999	How We Became Post-Human. Chicago: University of Chicago Press. 
+Hebdige, Dick
+	1987	Cut 'n' Mix. New York and London: Routledge. 
+	1997	"Subculture: The Meaning of Style" in The Subcultures Reader. New York and London: Routledge [1979].
+Himanen, Pekka 
+	2001	The Hacker Ethic and the Spirit of the Information Age. New York: Random House.
+Jaszi, Peter
+	1992	"On the Authorship Effect: Contemporary Copyright and Collective Creativity." 10 Cardozo Arts and Entertainment Law Journal 274.
+Kelty, Chris M. 
+ 	2005 	"Geeks, Social Imaginaries, and Recursive Publics" Cultural Anthropology. Vol(20)2.
+Knorr Cetina, Karin
+	1999	Epistemic Cultures: How the Sciences Make Knowledge. Cambridge: Harvard University Press.
+Leach, James	
+	2005	"Modes of Creativity and the Register of Ownership." In Code: Collaborative Ownership and the Digital Economy. Rishab Aiyer Ghosh (ed.). Cambridge, MA: MIT Press.
+Lessig, Lawrence
+ 	1999	Code and Other Laws of Cyberspace. New York: Perseus Books. 
+Levy, Steven 
+	1984	Hackers: Heroes of the Computer Revolution. New York: Delta. 
+Macpherson, C. B. 
+	1962	The Political Theory of Possessive Individualism: Hobbes to Locke. Oxford: Clarendon Press.
+Mill, John S
+	1969	 Autobiography. Jack Stillinger (ed) Boston: Houghton Mifflin Company, [1874]
+	1991	On Liberty. H.B. Acton (ed.). London: Dent [1857].
+Nelson, Ted
+	1974 	Computer Lib/Dream Machines. Personal Publications.   
+Nissen, Jorgen 
+	2001	"Hackers: Masters of Modernity and Modern Technology" in Digital Diversions: Youth Culture in the Age of Multimedia. Julian Sefton-Green (ed.). London: University College London. 
+Nissenbaum, Helen
+	2004	 "Hackers and the Contested Ontology of Cyberspace" New Media and Society (6)2.
+Passavant, Paul A. 
+	2002 	No Escape: Freedom of Speech and the Paradox of Rights. New York and London: New York University Press.
+Raymond, Eric (ed)
+	1998	 The New Hacker's Dictionary. (Third Edition) Cambridge: MIT Press. 
+Ricoeur, Paul 
+	1996	"Reflections on a New Ethos for Europe" in Richard Kearney (ed.) Paul Ricoeur: The Hermeneutics of Action. London: Sage.
+Rose, Mark 
+	1993	Authors and Owners: The Invention of Copyright. Cambridge, MA: Harvard University Press.
+Salin, Phil
+	1991	Freedom of Speech in Software. Available at http://www.philsalin.com/patents.html.
+Smart, Paul
+	1991	Mill and Marx: Individual liberty and the roads to freedom. Manchester and New York: Manchester University Press. 
+Soros, George
+	1998	The Crisis of Global Capitalism [Open Society Endangered]. New York: Public Affairs. 
+Star, Susan Leigh and James R. Griesemer 
+	1998	"Institutional Ecology, 'Translations,' and Boundary Objects: Amateurs and Professionals in Berkeley's Museum of Vertebrate Zoology" in The Science Studies Reader. Mario Biagioli (ed.) New York: Routledge. 
+Sterling, Bruce 
+	1992	The Hacker Crackdown: Law and Disorder on the Electronic Frontier. New York: Bantam.
+Taylor, Charles 
+	2004	Modern Social Imaginaries. Durham: Duke University Press.
+Thomas, Douglas 
+	2002	Hacker Culture. Minneapolis: University of Minnesota Press. 
+Tien, Lee
+	2000	"Publishing Software as a Speech Act" 15 Berkeley Technology Law Journal. http://www.law.berkeley.edu/journals/btlj/articles/vol15/.
+Turkle, Sherry 
+	1984	The Second Self: Computers and the Human Spirit. New York: Simon and Schuster. 
+	1995	Life on the Screen: Identity in the Age of the Internet. New York: Simon and Schuster.
+Ullman, Ellen
+	1997	Close to the Machine: Technophilia and its Discontents. San Francisco: City Lights. 
+Vaidhyanathan, Siva
+	2001	Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity. New York: NYU Press. 
+Weber, Steven
+	2004	The Success of Open Source. Cambridge: Harvard University Press.
+Winner, Langdon
+	1986	The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press. 
+Woodmansee, Martha 
+	1994	The Author, Art, and the Market: Rereading the History of Aesthetics. New York: Columbia University Press.
+Young, Iris
+	1990	Justice and the Politics of Difference. Princeton: Princeton University Press.  
+ 
+

Added: procedings/29-codes-of-value/paper.tex
===================================================================
--- procedings/29-codes-of-value/paper.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/29-codes-of-value/paper.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,295 @@
+\begin{quote}
+I have nothing to declare but my genius\\
+-- Oscar Wilde
+\end{quote}
+
+\begin{verbatim}
+#count the number of stars in the sky
+ $cnt = $sky =~ tr/*/*/;
+\end{verbatim}
+
+\section{Paper Overview}
+
+	
+This line\footnote{Here is a little more information about the code. The \texttt{tr}
+operator in this code translates all occurrences of the characters in the search
+list into the corresponding characters in the replacement list. In this case, the
+lists are delimited by the slash character, so the search list consists of the
+asterisk character and the replacement list of a second asterisk character;
+overall, it replaces each asterisk with an asterisk. The useful side effect is
+that \texttt{tr} returns the number of replacements performed, so by replacing
+all the asterisks in the variable \texttt{\$sky} with asterisks, the variable
+\texttt{\$cnt} gets assigned the number of replacements made, resulting in a
+count of the number of stars in \texttt{\$sky}. What follows the \texttt{\#}
+symbol is a comment, a non-functional annotation found in most programming
+languages, theoretically supposed to explain what the code does.}
+of Perl denotes a hacker homage to cleverness as a double entendre of
+both semantic ingenuity and technical cleverness. To fully appreciate the
+semantic humor presented here, we must look at the finer points of a particular
+set of the developer population, the Perl hacker. These hackers have developed
+a computer scripting language, Perl, in which terse but technically powerful
+expressions can be formed (in comparison to other programming languages). The
+Perl community takes special pride in cleverly condensing long segments of code
+into very short and sometimes "obfuscated" one-liners. If the above line of
+code were to be expanded into something more traditional and accessible to Perl
+novices, it might read something like:
+
+\begin{verbatim}
+   # expanded version: replace each star with a star, counting as we go
+   $cnt = 0;
+   $i = 0;
+   $skylen = length($sky);
+   while ($i < $skylen) {
+       if (substr($sky, $i, 1) eq '*') {
+           $sky = substr($sky, 0, $i) . '*' . substr($sky, $i + 1);
+           $cnt++;
+       }
+       $i++;
+   }
+\end{verbatim}
+
+We see that this enterprising Perl programmer has taken several lines of code
+and reduced them to a single line by taking advantage of certain side effects
+found in the constructs of the Perl computer language. With this transformation
+of "prose" into terse "poetry," the developer displays a mastery of the computer
+language. This mastery is sealed on the semantic level by the joke of "counting
+the number of stars in the sky" due to the naming of the variable \texttt{\$sky}
+and the word play of the asterisk or star. Since the counting function literally
+counts every appearance of the asterisk symbol, a star (this is what the program
+does), the programmer decided to display his craftiness by choosing the variable
+name \texttt{\$sky} and hence the description of the function "count the number of stars
+in the sky."
+ 	
+This snippet of code is a useful object to present here because it is a potent
+example of hacker value in a dual capacity. Free and open source (F/OSS)
+hackers have come to deem accessible, open code, such as the example above and
+the Perl language it is written in, to be valuable. With access to code, hackers
+argue they can learn new skills and improve technology. But in the above
+minuscule line of code, we can glean another trace of value. Because it is a
+particularly tricky way to solve a problem, and contains a nugget of
+non-technical cleverness, this code reveals the value hackers place on the
+performance of wit. This tendency to perform wit is all-pervasive in the hacker
+habitat (Fischer 1999; Raymond 1998; Thomas 2002). Blossoming from the prosaic
+world of hacker technical and social praxis, the clever performance of
+technology and humor might be termed the "semantic ground" through which
+hackers "construct and represent themselves and others" (Comaroff and Comaroff
+1991: 21).
+	
+Judging from this Perl example alone, it is not surprising that in much of the
+literature, hackers are treated as quintessentially individualistic (Turkle
+1984, 1995; Levy 1984; Sterling 1992; Castells 2001; Borsook 2000; Davis 1998;
+Nissenbaum 2004). "The hacker," Sherry Turkle writes, "is the defender of
+idiosyncrasy, individuality, genius and the cult of individual" (1984: 229).
+Hackers do seem perpetually keen on asserting their individuality through acts
+of ingenuity, and thus this statement is unmistakably correct. However, in most
+accounts of hackers, the meaning of individualism is treated either as an
+ideological cloak or as uninteresting, and thus is often left underspecified. In
+this piece, through an ethnographic examination of hacker coding practices,
+social interaction, and humor, I specify how hackers conceive of and perform a
+form of liberal individuality that departs from another version of liberal
+selfhood. 
+ 
+Hacker notions of creativity and individuality, I argue, extend and re-vision
+an existing cultural trope of individualism that diverges from the predominant
+reading of the liberal self as that of the consumer or "possessive individual"
+(Macpherson 1962; cf. Graeber 1997). Their enactment of individualism is a
+re-creation of the type of liberal person envisioned in the works of John
+Stuart Mill in his critique of utilitarianism (1857), more recently addressed
+in the works of other liberal thinkers like John Dewey (1935), and practically
+articulated in ideals of meritocracy, institutions of education, and free
+speech jurisprudence. As Wendy Donner explains, the Millian conception of
+selfhood sits at odds with a Lockean sensibility "wedded to possessive
+individualism" for Mill formulates "individualism as flowing from the
+development and use of the higher human powers, which is antagonistic to a
+desire to control others" (1991: 130). For hackers, selfhood is foremost a form
+of self-determination that arises out of the ability to think, speak, and
+create freely and independently. As Chris Kelty (2005) has persuasively argued
+by drawing on the work of Charles Taylor (2004), hackers and other net
+advocates have crafted a liberal "social imaginary" in which the right to build
+technology free from encumbrance is seen "as essential to freedom and public
+participation as is speaking to a public about such activities" (2005: 187). And
+indeed, the commonplace hacker assertion that free software is about "free
+speech" not "free beer," signals how hackers have reformulated liberal freedom
+into their own technical vernacular. Over the last decade, by specifically
+integrating free speech discourse into the sinews of F/OSS moral philosophy,
+hackers have gone further than simply resonating with the type of liberal
+theory exemplified by John S. Mill. They have literally remade this liberal
+legacy as their own. 
+	
+Clearly there are culturally pervasive ideals and practical articulations of
+the Millian paradigm upon which hackers can easily draw. But the more
+interesting question is: why does this liberal legacy of the free-thinking
+individual capture the hacker cultural imaginary? In this piece I seek to make
+this question intelligible by portraying how hackers create value and notions
+of individuality through routine everyday practices such as coding, humor, and
+interactions with other programmers. It is the symbiosis between their praxis
+and prevalent liberal tropes of individuality and meritocracy that forms the
+groundwork of hacker liberal self-fashioning, as I discuss here. Central to
+their construction of selfhood is a faith in the power of the human imagination
+that demands of hackers constant acts of self-development through which they
+cultivate their skills, improve technology, and prove their worth to other
+hackers. To become a hacker is to embody a form of individualism that shuns
+what they designate as mediocrity in favor of a virtuous display of wit,
+technical ability and intelligence. Hackers consistently and constantly enact
+this in a repertoire of micropractices, including humor, agonistic yet playful
+taunting, and the clever composition and display of code.   
+	
+Since the designation of superior code, cleverness, intelligence, or even a
+good joke can only be affirmed by other hackers, personal technical development
+requires both a demanding life of continual performativity and the
+affirmative judgment of others who are similarly engaged in this form of
+technical self-fashioning. This raises a subtle paradox that textures their
+modes of sociality and interpersonal interactions: hackers are bound together
+in an elite fraternal order of judgment that requires of them the constant
+performance of a set of character traits often directed at confirming
+their mental and creative independence from one another. 
+
+This paradox alone is not necessarily generative of social tensions. This is,
+after all, how academics organize much of their labor. However, given that so
+much of hacker production derives from a collective and common enterprise, a
+fact that hackers themselves more openly acknowledge and theorize in the very
+ethical philosophy of F/OSS, their affirmation of independence is potentially
+subverted by the reality of, and the desire to recognize, collective invention. As I
+discuss below, the use of humor and technical cleverness reveals as well as
+attenuates the hacker ambivalence between the forms of individualism and
+collectivism, elitism and populism, exclusivity and inclusivity that have come
+to mark their lifeworld. 
+
+This piece now continues with a brief discussion of Mill's conception of
+individuality, self-cultivation, and judgment. This will help ground the second
+part of the article, which takes a close look at hacker pragmatics and poetics.
+I open with a discussion on hacker pragmatics as this will help clarify how
+hackers use cleverness to establish definitions of selfhood. Though this is not
+on humor per se, in this second half, I also draw heavily on examples of
+everyday hacker humor, treating it as iconic of the wider range of their
+"signifying practices" (Hebdige 1979) through which they define, clarify, and
+realize the cultural meanings of creativity, individuality, and authorship. The
+third and final section provides a closer look at the relation between the
+hacker self and the liberal self, and ends with a discussion of the hacker
+ideal of authorship and meritocracy that has grown from their commitment to
+Millian individualism. 
+ 
+\section{Works Cited}
+
+Ackerman, Bruce 
+	1980	Social Justice in the Liberal State. New Haven: Yale University Press. 
+Allen, Robert
+	1983	"Collective Invention" Journal of Economic Behavior and Organization 4: 1-24.
+Arrow, Kenneth, Samuel Bowles, and Steven Durlauf (eds.)
+	2002	Meritocracy and Economic Inequality. Princeton: Princeton University Press. 
+Bollinger, Lee and Geoffrey Stone (eds.)
+	2002	Eternally Vigilant: Free Speech in the Modern Era. Chicago: The University of Chicago Press.
+Borsook, Paulina
+	2000	Cyberselfish: A Critical Romp through the Terribly Libertarian Culture of High Tech. New York: PublicAffairs.
+Bourdieu, Pierre
+	1977	Outline of a Theory of Practice. Cambridge: Cambridge University Press. 
+	1984	Distinction: A Social Critique of the Judgment of Taste. Cambridge: Harvard University Press. 
+Boyle, James
+	1996	 Shamans, Software, and Spleens. Cambridge: Harvard University Press. 
+Brown, Bill 
+	1998	"How to Do Things with Things (A Toy Story)" Critical Inquiry.  (24)4: 935-964.
+	2001	"Thing Theory" Critical Inquiry. 28(1):1-22
+Castells, Manuel 
+	2001 	The Internet Galaxy: Reflections on the Internet, Business, and Society. Cambridge: Oxford University Press.
+Comaroff, Jean and John Comaroff
+	1991	Of Revelation and Revolution: Christianity, Colonialism, and Consciousness in South Africa. Volume One. Chicago: University of Chicago Press. 
+Davis, Erik
+	1998	TechGnosis: Myth, Magic, and Mysticism in the Age of Information. New York: Three Rivers Press. 
+ Dewey, John 
+	1935	Liberalism and Social Action. New York: G. P. Putnam's Sons.
+ Donner, Wendy
+	1991	The Liberal Self: John Stuart Mill's Moral and Political Philosophy. Ithaca: 	Cornell University Press.
+Douglas, Mary
+	1975	 Implicit Meanings: Essays in Anthropology. London: Routledge. 
+Downey, Greg
+	1998	The Machine in Me. New York and London: Routledge.
+Drahos, Peters and John Braithwaite
+	2003	Information Feudalism. NY, NY: W.W. Norton and Company.
+Edwards, Paul
+	1996	The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge and London: The MIT Press. 
+Fischer, Claude
+	1991	America Calling: A Social History of the Telephone to 1940. Berkeley: University of California Press.  
+Fischer, Michael J.
+	1999	 "Worlding Cyberspace: Towards a Crucial Ethnography in Time, Space, Theory" in Critical Anthropology Now: Unexpected Context, Shifting Constituencies, Changing Agendas. George Marcus (ed.) Santa Fe: Sar Press.
+Galison, Peter
+	1997 	Image and Logic: A Material Culture of Microphysics. Chicago: University of Chicago Press. 
+Galison, Peter and Caroline Jones (eds.)
+	1998	Picturing Science Producing Art. New York and London: Routledge. 
+Galloway, Alexander R.
+	2004	Protocol: How Control Exists after Decentralization. Cambridge and London: The MIT Press. 
+Gilroy, Paul
+	1993	The Black Atlantic: Modernity and Double Consciousness. Cambridge: Harvard University Press.
+Goffman, Erving
+	1963	Interaction Ritual. New York: Anchor Books.
+Graeber, David 
+	1997	"Manners, Deference, and Private Property" Comparative Studies in Society and History. 39(4): 694-726.
+Halliday, Richard J.
+	1976	John Stuart Mill. New York: Barnes and Noble.
+Haraway, Donna
+	2000	"A Cyborg Manifesto: Science, Technology and Socialist-Feminism in the Late Twentieth Century" in The Cybercultures Reader. Bell, D. and B. Kennedy (eds.). London and New York: Routledge [1985]. 
+Hayles, Katherine
+	1999	How We Became Post-Human. Chicago: University of Chicago Press. 
+Hebdige, Dick
+	1987	Cut 'n' Mix. New York and London: Routledge. 
+	1997	"Subculture: The Meaning of Style" in The Subcultures Reader. New York and London: Routledge [1979].
+Himanen, Pekka 
+	2001	The Hacker Ethic and the Spirit of the Information Age. New York: Random House.
+Jaszi, Peter
+	1992	"On the Authorship Effect: Contemporary Copyright and Collective Creativity." 10 Cardozo Arts and Entertainment Law Journal 274.
+Kelty, Chris M. 
+ 	2005 	"Geeks, Social Imaginaries, and Recursive Publics" Cultural Anthropology. Vol(20)2.
+Knorr Cetina, Karin
+	1999	Epistemic Cultures: How the Sciences Make Knowledge. Cambridge: Harvard University Press.
+Leach, James	
+	2005	"Modes of Creativity and the Register of Ownership." In Code: Collaborative Ownership and the Digital Economy. Rishab Aiyer Ghosh (ed.). Cambridge, MA: MIT Press.
+Lessig, Lawrence
+ 	1999	Code and Other Laws of Cyberspace. New York: Perseus Books. 
+Levy, Steven 
+	1984	Hackers: Heroes of the Computer Revolution. New York: Delta. 
+Macpherson, C. B. 
+	1962	The Political Theory of Possessive Individualism: Hobbes to Locke. Oxford: Clarendon Press.
+Mill, John S
+	1969	 Autobiography. Jack Stillinger (ed) Boston: Houghton Mifflin Company, [1874]
+	1991	On Liberty. H.B. Acton (ed.). London: Dent [1857].
+Nelson, Ted
+	1974 	Computer Lib/Dream Machines. Personal Publications.   
+Nissen, Jorgen 
+	2001	"Hackers: Masters of Modernity and Modern Technology" in Digital Diversions: Youth Culture in the Age of Multimedia. Julian Sefton-Green (ed.). London: University College London. 
+Nissenbaum, Helen
+	2004	 "Hackers and the Contested Ontology of Cyberspace" New Media and Society (6)2.
+Passavant, Paul A. 
+	2002 	No Escape: Freedom of Speech and the Paradox of Rights. New York and London: New York University Press.
+Raymond, Eric (ed)
+	1998	 The New Hacker's Dictionary. (Third Edition) Cambridge: MIT Press. 
+Ricoeur, Paul 
+	1996	"Reflections on a New Ethos for Europe" in Richard Kearney (ed.) Paul Ricoeur: The Hermeneutics of Action. London: Sage.
+Rose, Mark 
+	1993	Authors and Owners: The Invention of Copyright. Cambridge, MA: Harvard University Press.
+Salin, Phil
+	1991	Freedom of Speech in Software. Available at http://www.philsalin.com/patents.html.
+Smart, Paul
+	1991	Mill and Marx: Individual liberty and the roads to freedom. Manchester and New York: Manchester University Press. 
+Soros, George
+	1998	The Crisis of Global Capitalism [Open Society Endangered]. New York: Public Affairs. 
+Star, Susan Leigh and James R. Griesemer 
+	1998	"Institutional Ecology, 'Translations,' and Boundary Objects: Amateurs and Professionals in Berkeley's Museum of Vertebrate Zoology" in The Science Studies Reader. Mario Biagioli (ed.) New York: Routledge. 
+Sterling, Bruce 
+	1992	The Hacker Crackdown: Law and Disorder on the Electronic Frontier. New York: Bantam.
+Taylor, Charles 
+	2004	Modern Social Imaginaries. Durham: Duke University Press.
+Thomas, Douglas 
+	2002	Hacker Culture. Minneapolis: University of Minnesota Press. 
+Tien, Lee
+	2000	"Publishing Software as a Speech Act" 15 Berkeley Technology Law Journal. http://www.law.berkeley.edu/journals/btlj/articles/vol15/.
+Turkle, Sherry 
+	1984	The Second Self: Computers and the Human Spirit. New York: Simon and Schuster. 
+	1995	Life on the Screen: Identity in the Age of the Internet. New York: Simon and Schuster.
+Ullman, Ellen
+	1997	Close to the Machine: Technophilia and its Discontents. San Francisco: City Lights. 
+Vaidhyanathan, Siva
+	2001	Copyrights and Copywrongs: The Rise of Intellectual Property and How It Threatens Creativity. New York: NYU Press. 
+Weber, Steven
+	2004	The Success of Open Source. Cambridge: Harvard University Press.
+Winner, Langdon
+	1986	The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press. 
+Woodmansee, Martha 
+	1994	The Author, Art, and the Market: Rereading the History of Aesthetics. New York: Columbia University Press.
+Young, Iris
+	1990	Justice and the Politics of Difference. Princeton: Princeton University Press.  
+ 
+

Added: procedings/37-ligtning-talks/orig/lightning.tex
===================================================================
--- procedings/37-ligtning-talks/orig/lightning.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/37-ligtning-talks/orig/lightning.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,244 @@
+\documentclass[english]{article}
+\usepackage[T1]{fontenc}
+\usepackage[latin1]{inputenc}
+
+\makeatletter
+
+\providecommand{\tabularnewline}{\\}
+
+\usepackage{babel}
+\makeatother
+\begin{document}
+
+\vfill{}
+\title{DebConf 6: Lightning Talks}
+\vfill{}
+
+
+\date{April 2006}
+
+
+\author{Joey Hess}
+
+\maketitle
+\begin{abstract}
+Lightning talks are a way to let a variety of people speak on a variety
+of subjects, without a lot of formal conference overhead. Each talk
+is limited to 5 minutes, and the talks are presented back-to-back
+throughout the 45-minute session. Just as a one-liner can be interesting
+and useful despite its short length, the five minutes of a lightning
+talk is just enough time to discuss the core of an idea, technique,
+interesting piece of software, etc. And of course, if one of the talks
+isn't interesting, another will be along in just five minutes. 
+\end{abstract}
+
+\section{Tentative lightning talk schedule}
+
+\begin{tabular}{|c|c|c|}
+\hline 
+time&
+speaker&
+title\tabularnewline
+\hline
+\hline 
+2 minutes&
+Joey Hess&
+introduction\tabularnewline
+\hline
+5 minutes&
+Jeroen van Wolffelaar&
+Actively discovering bugs/issues with packages\tabularnewline
+\hline 
+5 minutes&
+Jose Parrella&
+Walkthrough: Make your Country love Debian\tabularnewline
+\hline 
+5 minutes&
+Andreas Schuldei&
+Debian in the greater Linux ecosystem\tabularnewline
+\hline 
+5 minutes&
+David Moreno Garza&
+WNPP: Automatizing the unautomatizable\tabularnewline
+\hline 
+5 minutes&
+Raphael Hertzog&
+How far can we go with a collaborative maintenance infrastructure\tabularnewline
+\hline 
+5 minutes&
+Matt Taggart&
+How to get debian-admin to help you\tabularnewline
+\hline 
+5 minutes&
+Joey Hess&
+Significant Choices\tabularnewline
+\hline 
+2 minutes&
+Jeroen van Wolffelaar&
+Tracking MIA developers\tabularnewline
+\hline 
+3 minutes&
+Jeroen van Wolffelaar&
+Datamining on Debian packages metadata\tabularnewline
+\hline
+\end{tabular}
+
+
+\subsection{Alternates}
+
+\begin{tabular}{|c|c|c|}
+\hline 
+time&
+speaker&
+title\tabularnewline
+\hline
+\hline 
+5 minutes&
+Joey Hess&
+Debian in 4 MB or less\tabularnewline
+\hline
+\hline 
+2+ minutes&
+Jeroen van Wolffelaar&
+How to pronounce Jeroen van Wolffelaar, and other names\tabularnewline
+\hline
+\end{tabular}
+
+
+\section{Talk summaries}
+
+
+\subsection{Introduction}
+
+\begin{quotation}
+(Presenters, line up!) 
+
+Just explaining what a lightning talk is and how things will work.
+
+Basically, here is a fixed microphone into which you will start by
+giving your name and the talk title, here is a video hookup, here
+is a noisemaker that I will sound after your 5 minutes are up at which
+point you are DONE and it's the next person's turn. Have fun!
+\end{quotation}
+Presented by Joey Hess
+
+
+\subsection{Actively discovering bugs/issues with packages}
+
+\begin{quotation}
+There are several tools to check a package automatically: lintian,
+piuparts, building with pbuilder. How to execute those continuously,
+and especially, how to make the results immediately available to all and
+have issues reported directly to the maintainers.
+\end{quotation}
+Presented by Jeroen van Wolffelaar
+
+
+\subsection{Walkthrough: Make your Country love Debian}
+
+\begin{quotation}
+A short tale about the Venezuelan experience in the Free Software
+Migration, and how each day it is turning more strongly towards Debian
+GNU/Linux. This includes government migration towards Debian, people using
+Debian as a free, democratic Internet access platform, and Debian community
+activities. 
+\end{quotation}
+Presented by Jose Parrella
+
+
+\subsection{Debian in the greater Linux ecosystem}
+
+\begin{quotation}
+I would like to pass on what I hear from sponsors/supporters of Debian
+when talking to them. This includes e.g. Debian's role in the LSB
+landscape, its perception by some service selling companies, and directions
+it could take when placing itself in the market in the future.
+\end{quotation}
+Presented by Andreas Schuldei
+
+
+\subsection{WNPP: Automatizing the unautomatizable}
+
+\begin{quotation}
+Let's talk about what's been done on the WNPP field by all the people
+involved on it. What can be improved or what can be implemented to
+make things easier for WNPP maintainers.
+\end{quotation}
+Presented by David Moreno Garza 
+
+
+\subsection{How far can we go with a collaborative maintenance infrastructure}
+
+\begin{quotation}
+Short presentation of the Collaborative Maintenance proposal and possible
+implications that it can have on NM, QA, and our way to maintain the
+packages. Explain how that infrastructure fits with the PTS and everything
+else.
+\end{quotation}
+Presented by Raphael Hertzog 
+
+
+\subsection{How to get debian-admin to help you}
+
+\begin{quotation}
+The way in which you craft a request to debian-admin has a great effect
+on how quickly they can help you. This talk will help you determine
+what things debian-admin will be able to help you with and what information
+to include in the request in order to get help quickly without the
+need for extra questions. 
+\end{quotation}
+Presented by Matt Taggart
+
+
+\subsection{Significant Choices}
+
+\begin{quotation}
+A rant on how some decisions are made in Debian in less than ideal
+ways and the surprising consequences. 
+\end{quotation}
+Presented by Joey Hess
+
+
+\subsection{Tracking MIA developers }
+
+\begin{quotation}
+A brief rundown of the infrastructure, but more importantly the procedures
+and customs, used to find and take action on people who are MIA, or,
+more accurately put, people who are suspected of having overcommitted
+themselves to Debian work. 
+\end{quotation}
+Presented by: Jeroen van Wolffelaar
+
+
+\subsection{Datamining on Debian packages metadata}
+
+\begin{quotation}
+There is a lot of data available about Debian packages: more than 10G of bug
+data, Packages files, upload logs, who sponsors whom, and dozens of
+other sources. They are hard to present and correlate though: an
+identity isn't clearly defined (people can have multiple gpg keys,
+email addresses, and names, and some of those can even clash). This is
+where carnivore, a new QA tool, comes in to assist here. Also discussing
+other techniques and applications. 
+\end{quotation}
+Presented by Jeroen van Wolffelaar 
+
+
+\subsection{Debian in 4 MB or less}
+
+\begin{quotation}
+Explaining how the ADS root builder can create embedded systems based
+on Debian and how this can tie in with projects like Debonaras. 
+\end{quotation}
+Presented by Joey Hess
+
+
+\subsection{How to pronounce Jeroen van Wolffelaar, and other names}
+
+\begin{quotation}
+\char`\"{}Random-J\char`\"{}, \char`\"{}Jeroen van Wifflepuck\char`\"{},
+none of them correct... A short introduction to Dutch sounds not found in
+English or most other languages.
+\end{quotation}
+Presented by Jeroen van Wolffelaaaaar
+\end{document}

Added: procedings/37-ligtning-talks/paper.tex
===================================================================
--- procedings/37-ligtning-talks/paper.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/37-ligtning-talks/paper.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,221 @@
+\providecommand{\tabularnewline}{\\}
+
+\section{Abstract}
+Lightning talks are a way to let a variety of people speak on a variety
+of subjects, without a lot of formal conference overhead. Each talk
+is limited to 5 minutes, and the talks are presented back-to-back
+throughout the 45-minute session. Just as a one-liner can be interesting
+and useful despite its short length, the five minutes of a lightning
+talk is just enough time to discuss the core of an idea, technique,
+interesting piece of software, etc. And of course, if one of the talks
+isn't interesting, another will be along in just five minutes. 
+
+\section{Tentative lightning talk schedule}
+
+\begin{tabular}{|c|c|c|}
+\hline 
+time&
+speaker&
+title\tabularnewline
+\hline
+\hline 
+2 minutes&
+Joey Hess&
+introduction\tabularnewline
+\hline
+5 minutes&
+Jeroen van Wolffelaar&
+Actively discovering bugs/issues with packages\tabularnewline
+\hline 
+5 minutes&
+Jose Parrella&
+Walkthrough: Make your Country love Debian\tabularnewline
+\hline 
+5 minutes&
+Andreas Schuldei&
+Debian in the greater Linux ecosystem\tabularnewline
+\hline 
+5 minutes&
+David Moreno Garza&
+WNPP: Automatizing the unautomatizable\tabularnewline
+\hline 
+5 minutes&
+Raphael Hertzog&
+How far can we go with a collaborative maintenance infrastructure\tabularnewline
+\hline 
+5 minutes&
+Matt Taggart&
+How to get debian-admin to help you\tabularnewline
+\hline 
+5 minutes&
+Joey Hess&
+Significant Choices\tabularnewline
+\hline 
+2 minutes&
+Jeroen van Wolffelaar&
+Tracking MIA developers\tabularnewline
+\hline 
+3 minutes&
+Jeroen van Wolffelaar&
+Datamining on Debian packages metadata\tabularnewline
+\hline
+\end{tabular}
+
+
+\subsection{Alternates}
+
+\begin{tabular}{|c|c|c|}
+\hline 
+time&
+speaker&
+title\tabularnewline
+\hline
+\hline 
+5 minutes&
+Joey Hess&
+Debian in 4 MB or less\tabularnewline
+\hline
+\hline 
+2+ minutes&
+Jeroen van Wolffelaar&
+How to pronounce Jeroen van Wolffelaar, and other names\tabularnewline
+\hline
+\end{tabular}
+
+
+\section{Talk summaries}
+
+
+\subsection{Introduction}
+
+\begin{quotation}
+(Presenters, line up!) 
+
+Just explaining what a lightning talk is and how things will work.
+
+Basically, here is a fixed microphone into which you will start by
+giving your name and the talk title, here is a video hookup, here
+is a noisemaker that I will sound after your 5 minutes are up at which
+point you are DONE and it's the next person's turn. Have fun!
+\end{quotation}
+Presented by Joey Hess
+
+
+\subsection{Actively discovering bugs/issues with packages}
+
+\begin{quotation}
+There are several tools to check a package automatically: lintian,
+piuparts, building with pbuilder. How to execute those continuously,
+and especially, how to make the results immediately available to all and
+have issues reported directly to the maintainers.
+\end{quotation}
+Presented by Jeroen van Wolffelaar
+
+
+\subsection{Walkthrough: Make your Country love Debian}
+
+\begin{quotation}
+A short tale about the Venezuelan experience in the Free Software
+Migration, and how each day it is turning more strongly towards Debian
+GNU/Linux. This includes government migration towards Debian, people using
+Debian as a free, democratic Internet access platform, and Debian community
+activities. 
+\end{quotation}
+Presented by Jose Parrella
+
+
+\subsection{Debian in the greater Linux ecosystem}
+
+\begin{quotation}
+I would like to pass on what I hear from sponsors/supporters of Debian
+when talking to them. This includes e.g. Debian's role in the LSB
+landscape, its perception by some service selling companies, and directions
+it could take when placing itself in the market in the future.
+\end{quotation}
+Presented by Andreas Schuldei
+
+
+\subsection{WNPP: Automatizing the unautomatizable}
+
+\begin{quotation}
+Let's talk about what's been done on the WNPP field by all the people
+involved on it. What can be improved or what can be implemented to
+make things easier for WNPP maintainers.
+\end{quotation}
+Presented by David Moreno Garza 
+
+
+\subsection{How far can we go with a collaborative maintenance infrastructure}
+
+\begin{quotation}
+Short presentation of the Collaborative Maintenance proposal and possible
+implications that it can have on NM, QA, and our way to maintain the
+packages. Explain how that infrastructure fits with the PTS and everything
+else.
+\end{quotation}
+Presented by Raphael Hertzog 
+
+
+\subsection{How to get debian-admin to help you}
+
+\begin{quotation}
+The way in which you craft a request to debian-admin has a great effect
+on how quickly they can help you. This talk will help you determine
+what things debian-admin will be able to help you with and what information
+to include in the request in order to get help quickly without the
+need for extra questions. 
+\end{quotation}
+Presented by Matt Taggart
+
+
+\subsection{Significant Choices}
+
+\begin{quotation}
+A rant on how some decisions are made in Debian in less than ideal
+ways and the surprising consequences. 
+\end{quotation}
+Presented by Joey Hess
+
+
+\subsection{Tracking MIA developers }
+
+\begin{quotation}
+A brief rundown of the infrastructure, but more importantly the procedures
+and customs, used to find and take action on people who are MIA, or,
+more accurately put, people who are suspected of having overcommitted
+themselves to Debian work. 
+\end{quotation}
+Presented by: Jeroen van Wolffelaar
+
+
+\subsection{Datamining on Debian packages metadata}
+
+\begin{quotation}
+There is a lot of data available about Debian packages: more than 10G of bug
+data, Packages files, upload logs, who sponsors whom, and dozens of
+other sources. They are hard to present and correlate though: an
+identity isn't clearly defined (people can have multiple gpg keys,
+email addresses, and names, and some of those can even clash). This is
+where carnivore, a new QA tool, comes in to assist here. Also discussing
+other techniques and applications. 
+\end{quotation}
+Presented by Jeroen van Wolffelaar 
+
+
+\subsection{Debian in 4 MB or less}
+
+\begin{quotation}
+Explaining how the ADS root builder can create embedded systems based
+on Debian and how this can tie in with projects like Debonaras. 
+\end{quotation}
+Presented by Joey Hess
+
+
+\subsection{How to pronounce Jeroen van Wolffelaar, and other names}
+
+\begin{quotation}
+\char`\"{}Random-J\char`\"{}, \char`\"{}Jeroen van Wifflepuck\char`\"{},
+none of them correct... A short introduction to Dutch sounds not found in
+English or most other languages.
+\end{quotation}
+Presented by Jeroen van Wolffelaaaaar

Added: procedings/all.dvi
===================================================================
(Binary files differ)


Property changes on: procedings/all.dvi
___________________________________________________________________
Name: svn:mime-type
   + application/octet-stream

Added: procedings/all.tex
===================================================================
--- procedings/all.tex	2006-04-10 11:52:01 UTC (rev 489)
+++ procedings/all.tex	2006-04-10 20:30:09 UTC (rev 490)
@@ -0,0 +1,48 @@
+\documentclass[a4paper,10pt,twoside,notitlepage,nochapterprefix,noappendixprefix]{scrreprt}
+%DIV22,BCOR1.5cm,10pt,twoside,twocolumn,headnosepline,footnosepline]{scrartcl}
+%\usepackage{graphicx}
+%\usepackage{moreverb}
+\usepackage[colorlinks=true,urlcolor=blue,breaklinks=false]{hyperref}
+%\usepackage[colorlinks=true,urlcolor=black,linkcolor=black,breaklinks=false]{hyperref}
+
+\begin{document}
+
+\setcounter{secnumdepth}{-1}
+\setcounter{tocdepth}{0}
+
+\tableofcontents
+\vfill
+\sloppy
+
+\part{Talks}
+
+\chapter[Nobody expects the Finnish inquisition]
+  {Nobody expects the Finnish inquisition: Confessions of a Debian package torturer\\
+  \small{by Lars Wirzenius (liw at iki.fi)}}
+\include{25-piuparts/paper}
+
+\chapter[Packaging shared libraries]
+  {Packaging shared libraries\\
+  \small{Josselin Mouette}}
+\include{27-Packaging-shared-libraries/paper}
+
+\chapter[Codes of Value: Hacker Pragmatics, Poetics, and Selfhood]
+  {Codes of Value: Hacker Pragmatics, Poetics, and Selfhood\\
+  \small{by Gabriella Coleman, Postdoctoral Fellow, Center for Cultural Analysis, Rutgers University}}
+\include{29-codes-of-value/paper} 
+
+\chapter[Lightning Talks]
+  {Lightning Talks\\
+  \small{Joey Hess et al}}
+\include{37-ligtning-talks/paper}
+
+\part{Workshops}
+
+\part{Round tables}
+
+\part{(incomplete) List of Birds of a Feather Sessions}
+
+\appendix
+\part{Copyright notice}
+
+\end{document}




More information about the Debconf6-data-commit mailing list