[SCM] ardour/master: Imported debian/ from ardour3 packaging repository

umlaeute at users.alioth.debian.org umlaeute at users.alioth.debian.org
Thu Sep 10 21:38:26 UTC 2015


The following commit has been merged in the master branch:
commit 687787779e26cad1df560cbb0197d2ccb482d03b
Author: IOhannes m zmölnig <zmoelnig at umlautQ.umlaeute.mur.at>
Date:   Thu Sep 10 21:02:47 2015 +0200

    Imported debian/ from ardour3 packaging repository

diff --git a/debian/NEWS b/debian/NEWS
deleted file mode 100644
index 0427603..0000000
--- a/debian/NEWS
+++ /dev/null
@@ -1,32 +0,0 @@
-ardour (0.9beta10.2-1) unstable; urgency=low
-  
-     This release disables optimization for i386 because otherwise the
-     binaries fail on k6 and such. Read
-     <http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=234371> and
-     <http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=219672>.
-     If you want to build an optimized version, read, edit and uncomment 
-     the part in debian/rules and rebuild.
-   
- -- Robert Jordens <jordens at debian.org>  Tue, 24 Feb 2008 22:59:33 +0200
-
-
-ardour (0.9beta9+3-2) unstable; urgency=low
-
-     The Manual (formerly contained in the package ardour-doc) has been
-     removed. It contained the skeleton and a few bits of information that were
-     already quite outdated.
-     .
-     The Ardour developers (mainly Paul Davis) have decided to develop an
-     extensive manual that will be sold and will not be freely distributable.
-     The discussion about this can be found in the Ardour mailing lists (such
-     as <http://boudicca.tux.org/hypermail/ardour-dev/>).
-     .
-     A scrambled version (making reproduction and/or use harder) may be
-     available from the project's homepage <http://ardour.sourceforge.net>.
-     The homepage also contains information about the version of the manual
-     that will be sold.
-     .
-     This scrambled manual contains much more information than the version
-     formerly contained in ardour-doc.
-
- -- Robert Jordens <jordens at debian.org>  Wed, 28 Jan 2004 22:31:13 +0200
diff --git a/debian/README.Debian b/debian/README.Debian
deleted file mode 100644
index c04e6ad..0000000
--- a/debian/README.Debian
+++ /dev/null
@@ -1,81 +0,0 @@
-ardour for Debian
------------------
-
-*       The session_exchange.py script has been renamed to
-        ardour2-session_exchange.py, so that there is no file conflict between
-        ardour v2 and the original ardour package.
-
- -- Luke Yelavich <themuso at ubuntu.com>  Thu,  5 Apr 2007 12:35:37 +1000
-
-
-*	These Debian packages for ardour modify its build system slightly to
-        comply with Debian packaging policy. Please mention problems that 
-        seem to be caused by C++ dynamic linkage flags, libraries or compiler 
-        versions in bug reports to the Debian bug tracking system:
-	
-	http://www.debian.org/Bugs/Reporting
-	http://bugs.debian.org/src:ardour
-	
-	You can help debug and fix ardour in such cases by compiling 
-	it unoptimized and unstripped. That's done with
-		$ sudo apt-get --target-release unstable build-dep ardour
-		$ export DEB_BUILD_OPTIONS="noopt nostrip"
-		$ apt-get --target-release unstable --compile source ardour
-	Install the resulting packages.
-	/usr/share/doc/ardour/FAQ.gz (section 1.8) contains information
-	about debugging ardour with gdb. 1.8.A doesn't apply to Debian. Use
-		$ gdb /usr/bin/ardour2
-		gdb> run
-		...crash it or make it get stuck...
-		gdb> thread apply all bt
-	Send the 
-		+ output of gdb and ardour together with
-		+ /usr/share/doc/ardour/buildinfo.gz and detailed
-		  information about
-		+ your hardware (graphics, sound, processor, harddisk) and
-		+ your kernel (version, patches, lowlatency)
-	to the Debian Bug Tracking System or ardour-dev at ardour.org.
-	Thanks.
-
-*	You have to configure and start JACK before starting ardour.
-	Setting up a working JACK is not always easy. Please try these before
-	filing bugs about ardour being too slow or being kicked by JACK.
-	Messages like: "JACK has been shut down or it disconnected
-	ardour..." or "subgraph starting at ardour timed out..." are a sure
-	sign of the latter.
-	
-		+ set up a working .asoundrc (see Jack Documentation)
-		+ that implies: don't use the default "plug"-layer of JACK-alsa
-		+ try with a large period-size (--period 4096)
-		+ maybe try --nperiods 4 (or 2)
-		+ maybe try with or without --asio
-		+ try --realtime-priority if you are running as root or with 
-                  capabilities
-		+ see http://jackit.sourceforge.net/docs/faq.php about improving
-		  your setup
-		
-	Otherwise your system might really be too slow. Sorry. Try it on a 
-        faster one.
-
-*	You have to run jackd and ardour as the same user. To get away
-	from having to run everything (JACK and ardour) as root use
-	jackstart and modify your kernel a little, thus allowing programs to
-	give away certain "capabilities":
-	http://jackit.sourceforge.net/docs/faq.php#a5
-
-
-Further information and documentation:
-
-http://ardour.org/
-http://jackit.sourceforge.net/
-
-User mailing list archives:
-http://boudicca.tux.org/hypermail/ardour-users/
-
-Developers mailing list archives:
-http://boudicca.tux.org/hypermail/ardour-dev/
-
-JACK developers mailing list archives:
-http://boudicca.tux.org/hypermail/jackit-devel/
-
- -- Robert Jordens <jordens at debian.org>, $LastChangedDate: 2004-03-06 00:49:17 +0100 (Sat, 06 Mar 2004) $
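
Pulling the JACK tuning hints from the README above into one place: a conservative jackd invocation along those lines would look roughly like the sketch below. The hw:0 device name is an assumption; substitute the real soundcard and check jackd's ALSA driver help for the exact option spellings.

	$ jackd -R -d alsa -d hw:0 -p 4096 -n 4
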
diff --git a/debian/TODO.Debian b/debian/TODO.Debian
deleted file mode 100644
index 9d947b8..0000000
--- a/debian/TODO.Debian
+++ /dev/null
@@ -1,13 +0,0 @@
-TODO
-
-Robert Jordens <jordens at debian.org>
-
-This software may be used and distributed according to the terms
-of the GNU General Public License, incorporated herein by reference.
-
- - you can't build it with a previous (older) version installed
-   Build-Conflicts (still true?)
- - libs/ardour/cycles.h, libs/pbd/pbd/atomic.h: look for better
-   implementations (cycles: arm, m68k, sparc; atomic: arm, hppa, sparc)
-   JACK has a nicer sysdeps-like solution
-
diff --git a/debian/ardour-altivec.desktop b/debian/ardour-altivec.desktop
deleted file mode 100644
index c82f24c..0000000
--- a/debian/ardour-altivec.desktop
+++ /dev/null
@@ -1,10 +0,0 @@
-[Desktop Entry]
-Version=1.0
-Name=Ardour Digital Audio Workstation (AltiVec)
-Comment=Record, mix and master multi-track audio
-Exec=/usr/bin/ardour2 %U
-Terminal=false
-Type=Application
-Icon=/usr/share/ardour2/icons/ardour_icon_22px.png
-Categories=AudioVideo;Audio;
-MimeType=application/x-ardour;
diff --git a/debian/ardour-altivec.docs b/debian/ardour-altivec.docs
deleted file mode 100644
index d703f62..0000000
--- a/debian/ardour-altivec.docs
+++ /dev/null
@@ -1,2 +0,0 @@
-debian/TODO.Debian
-debian/README.Debian
diff --git a/debian/ardour-altivec.examples b/debian/ardour-altivec.examples
deleted file mode 100644
index 8b7d5a7..0000000
--- a/debian/ardour-altivec.examples
+++ /dev/null
@@ -1 +0,0 @@
-build-altivec/ardour_system.rc
diff --git a/debian/ardour-altivec.install b/debian/ardour-altivec.install
deleted file mode 100644
index eee08c3..0000000
--- a/debian/ardour-altivec.install
+++ /dev/null
@@ -1,7 +0,0 @@
-debian/tmp/altivec/usr/bin/ardour2 usr/bin
-debian/tmp/altivec/usr/lib*	usr/
-debian/tmp/altivec/usr/share/ardour2 usr/share/
-debian/tmp/altivec/usr/share/locale usr/share/
-debian/tmp/altivec/etc/ardour2* etc/
-debian/ardour.desktop usr/share/applications
-debian/ardour2-session_exchange.py usr/bin
diff --git a/debian/ardour-altivec.manpages b/debian/ardour-altivec.manpages
deleted file mode 100644
index 4928454..0000000
--- a/debian/ardour-altivec.manpages
+++ /dev/null
@@ -1 +0,0 @@
-DOCUMENTATION/ardour.1*
diff --git a/debian/ardour-altivec.menu b/debian/ardour-altivec.menu
deleted file mode 100644
index 7021a25..0000000
--- a/debian/ardour-altivec.menu
+++ /dev/null
@@ -1,4 +0,0 @@
-?package(ardour-altivec):needs="X11" section="Applications/Sound" \
-  hints="Professional,Featureful,WAV,GTK,MIDI,Music Editor,DAW,Multitrack,JACK,LADSPA" \
-  title="Ardour Digital Audio Workstation (AltiVec)" command="/usr/bin/ardour2"
-
diff --git a/debian/ardour-altivec.sharedmimeinfo b/debian/ardour-altivec.sharedmimeinfo
deleted file mode 100644
index c93411b..0000000
--- a/debian/ardour-altivec.sharedmimeinfo
+++ /dev/null
@@ -1,8 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<mime-info xmlns="http://www.freedesktop.org/standards/shared-mime-info">
-  <mime-type type="application/x-ardour">
-  <comment>Ardour session file</comment>
-  <glob pattern="*.ardour"/>
-  <generic-icon name="application-x-ardour"/>
-  </mime-type>
-</mime-info>
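
The debian/*.sharedmimeinfo files are the stock debhelper route for shared MIME data: dh_installmime places them under /usr/share/mime/packages/, and the MIME database is then refreshed, which on an installed system amounts to roughly the command below (the path is the usual default; maintainer scripts normally trigger this automatically).

	$ update-mime-database /usr/share/mime
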
diff --git a/debian/ardour-i686.desktop b/debian/ardour-i686.desktop
deleted file mode 100644
index 9256567..0000000
--- a/debian/ardour-i686.desktop
+++ /dev/null
@@ -1,10 +0,0 @@
-[Desktop Entry]
-Version=1.0
-Name=Ardour Digital Audio Workstation (i686)
-Comment=Record, mix and master multi-track audio
-Exec=/usr/bin/ardour2 %U
-Terminal=false
-Type=Application
-Icon=/usr/share/ardour2/icons/ardour_icon_22px.png
-Categories=AudioVideo;Audio;
-MimeType=application/x-ardour;
diff --git a/debian/ardour-i686.docs b/debian/ardour-i686.docs
deleted file mode 100644
index d703f62..0000000
--- a/debian/ardour-i686.docs
+++ /dev/null
@@ -1,2 +0,0 @@
-debian/TODO.Debian
-debian/README.Debian
diff --git a/debian/ardour-i686.examples b/debian/ardour-i686.examples
deleted file mode 100644
index 76c4214..0000000
--- a/debian/ardour-i686.examples
+++ /dev/null
@@ -1 +0,0 @@
-build-i686/ardour_system.rc
diff --git a/debian/ardour-i686.install b/debian/ardour-i686.install
deleted file mode 100644
index 30c1a3b..0000000
--- a/debian/ardour-i686.install
+++ /dev/null
@@ -1,7 +0,0 @@
-debian/tmp/i686/usr/bin/ardour2 usr/bin
-debian/tmp/i686/usr/lib*	usr/
-debian/tmp/i686/usr/share/ardour2 usr/share/
-debian/tmp/i686/usr/share/locale usr/share/
-debian/tmp/i686/etc/ardour2* etc/
-debian/ardour.desktop usr/share/applications
-debian/ardour2-session_exchange.py usr/bin
diff --git a/debian/ardour-i686.manpages b/debian/ardour-i686.manpages
deleted file mode 100644
index 4928454..0000000
--- a/debian/ardour-i686.manpages
+++ /dev/null
@@ -1 +0,0 @@
-DOCUMENTATION/ardour.1*
diff --git a/debian/ardour-i686.menu b/debian/ardour-i686.menu
deleted file mode 100644
index 82e1f48..0000000
--- a/debian/ardour-i686.menu
+++ /dev/null
@@ -1,4 +0,0 @@
-?package(ardour-i686):needs="X11" section="Applications/Sound" \
-  hints="Professional,Featureful,WAV,GTK,MIDI,Music Editor,DAW,Multitrack,JACK,LADSPA" \
-  title="Ardour Digital Audio Workstation (i686)" command="/usr/bin/ardour2"
-
diff --git a/debian/ardour-i686.sharedmimeinfo b/debian/ardour-i686.sharedmimeinfo
deleted file mode 100644
index c93411b..0000000
--- a/debian/ardour-i686.sharedmimeinfo
+++ /dev/null
@@ -1,8 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<mime-info xmlns="http://www.freedesktop.org/standards/shared-mime-info">
-  <mime-type type="application/x-ardour">
-  <comment>Ardour session file</comment>
-  <glob pattern="*.ardour"/>
-  <generic-icon name="application-x-ardour"/>
-  </mime-type>
-</mime-info>
diff --git a/debian/ardour-opt.desktop b/debian/ardour-opt.desktop
deleted file mode 100644
index 5a771e3..0000000
--- a/debian/ardour-opt.desktop
+++ /dev/null
@@ -1,10 +0,0 @@
-[Desktop Entry]
-Version=1.0
-Name=Ardour Digital Audio Workstation (@optarch@)
-Comment=Record, mix and master multi-track audio
-Exec=/usr/bin/ardour2 %U
-Terminal=false
-Type=Application
-Icon=/usr/share/ardour2/icons/ardour_icon_22px.png
-Categories=AudioVideo;Audio;
-MimeType=application/x-ardour;
diff --git a/debian/ardour-opt.docs b/debian/ardour-opt.docs
deleted file mode 100644
index d703f62..0000000
--- a/debian/ardour-opt.docs
+++ /dev/null
@@ -1,2 +0,0 @@
-debian/TODO.Debian
-debian/README.Debian
diff --git a/debian/ardour-opt.examples b/debian/ardour-opt.examples
deleted file mode 100644
index 9e8267d..0000000
--- a/debian/ardour-opt.examples
+++ /dev/null
@@ -1 +0,0 @@
-ardour_system.rc
diff --git a/debian/ardour-opt.install b/debian/ardour-opt.install
deleted file mode 100644
index 71138ea..0000000
--- a/debian/ardour-opt.install
+++ /dev/null
@@ -1,7 +0,0 @@
-debian/tmp/@optarch@/usr/bin/ardour2 usr/bin
-debian/tmp/@optarch@/usr/lib*	usr/
-debian/tmp/@optarch@/usr/share/ardour2 usr/share/
-debian/tmp/@optarch@/usr/share/locale usr/share/
-debian/tmp/@optarch@/etc/ardour2* etc/
-debian/ardour.desktop usr/share/applications
-debian/ardour2-session_exchange.py usr/bin
diff --git a/debian/ardour-opt.manpages b/debian/ardour-opt.manpages
deleted file mode 100644
index 4928454..0000000
--- a/debian/ardour-opt.manpages
+++ /dev/null
@@ -1 +0,0 @@
-DOCUMENTATION/ardour.1*
diff --git a/debian/ardour-opt.menu b/debian/ardour-opt.menu
deleted file mode 100644
index 5dba9a7..0000000
--- a/debian/ardour-opt.menu
+++ /dev/null
@@ -1,4 +0,0 @@
-?package(ardour-@optarch@):needs="X11" section="Applications/Sound" \
-  hints="Professional,Featureful,WAV,GTK,MIDI,Music Editor,DAW,Multitrack,JACK,LADSPA" \
-  title="Ardour Digital Audio Workstation (@optarch@)" command="/usr/bin/ardour2"
-
diff --git a/debian/ardour.desktop b/debian/ardour.desktop
index e12350a..94780c5 100644
--- a/debian/ardour.desktop
+++ b/debian/ardour.desktop
@@ -1,11 +1,11 @@
 [Desktop Entry]
 Version=1.0
-Name=Ardour Digital Audio Workstation
+Name=Ardour4
+GenericName=Ardour Digital Audio Workstation 4
 Comment=Record, mix and master multi-track audio
-Keywords=audio;sound;jackd;DAW;multitrack;ladspa;lv2;
-Exec=/usr/bin/ardour2 %U
+Keywords=audio;sound;jackd,DAW,multitrack,ladspa,lv2,midi
+Exec=/usr/bin/ardour4
+Icon=ardour
 Terminal=false
 Type=Application
-Icon=/usr/share/ardour2/icons/ardour_icon_22px.png
 Categories=AudioVideo;Audio;
-MimeType=application/x-ardour;
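
A quick way to sanity-check the reworked desktop entry is desktop-file-validate from the desktop-file-utils package, run against the file in the packaging tree, for example:

	$ desktop-file-validate debian/ardour.desktop

The Desktop Entry specification defines Keywords as a semicolon-separated list, so the comma-separated values in the new Keywords line above are worth a second look.
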
diff --git a/debian/ardour.docs b/debian/ardour.docs
deleted file mode 100644
index d703f62..0000000
--- a/debian/ardour.docs
+++ /dev/null
@@ -1,2 +0,0 @@
-debian/TODO.Debian
-debian/README.Debian
diff --git a/debian/ardour.examples b/debian/ardour.examples
deleted file mode 100644
index 8f5377b..0000000
--- a/debian/ardour.examples
+++ /dev/null
@@ -1,2 +0,0 @@
-build-generic/ardour_system.rc
-
diff --git a/debian/ardour.install b/debian/ardour.install
index 1491236..e6abf35 100644
--- a/debian/ardour.install
+++ b/debian/ardour.install
@@ -1,8 +1,4 @@
-debian/tmp/generic/usr/bin/ardour2 usr/bin
-debian/tmp/generic/usr/lib*	usr/
-debian/tmp/generic/usr/share/ardour2 usr/share/
-debian/tmp/generic/usr/share/locale usr/share/
-debian/tmp/generic/etc/ardour2* etc/
+debian/tmp/*
+debian/*.xpm usr/share/pixmaps
+debian/*.svg usr/share/pixmaps
 debian/ardour.desktop usr/share/applications
-debian/ardour2-session_exchange.py usr/bin
-debian/ardourino.template	usr/share/ardour2/templates/
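
With the install list now relying on a broad debian/tmp/* glob, it can help to have debhelper report anything the globs still miss. With the debhelper tooling of this era that was typically done by passing --list-missing to dh_install, e.g. from debian/rules or the build tree:

	$ dh_install --list-missing
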
diff --git a/debian/ardour.manpages b/debian/ardour.manpages
index 8878a3e..0e39ea4 100644
--- a/debian/ardour.manpages
+++ b/debian/ardour.manpages
@@ -1 +1 @@
-debian/ardour2.1
+debian/ardour4.1
diff --git a/debian/ardour.menu b/debian/ardour.menu
index 98a322e..a8f86c2 100644
--- a/debian/ardour.menu
+++ b/debian/ardour.menu
@@ -1,4 +1,6 @@
-?package(ardour):needs="X11" section="Applications/Sound" \
-  hints="Professional,Featureful,WAV,GTK,MIDI,Music Editor,DAW,Multitrack,JACK,LADSPA" \
-  title="Ardour Digital Audio Workstation" command="/usr/bin/ardour2"
-
+?package(ardour):needs="X11" \
+ section="Applications/Sound" \
+ icon="/usr/share/pixmaps/ardour4.xpm" \
+ title="ARDOUR4" \
+ command="/usr/bin/ardour4" \
+ hints="Professional,Featureful,WAV,GTK,MIDI,Music Editor,DAW,Multitrack,JACK,LADSPA"
diff --git a/debian/ardour.svg b/debian/ardour.svg
new file mode 100644
index 0000000..d5a807e
--- /dev/null
+++ b/debian/ardour.svg
@@ -0,0 +1,1520 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!-- Created with Inkscape (http://www.inkscape.org/) -->
+
+<svg
+   xmlns:dc="http://purl.org/dc/elements/1.1/"
+   xmlns:cc="http://creativecommons.org/ns#"
+   xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
+   xmlns:svg="http://www.w3.org/2000/svg"
+   xmlns="http://www.w3.org/2000/svg"
+   xmlns:xlink="http://www.w3.org/1999/xlink"
+   version="1.1"
+   width="110.67193"
+   height="96.196815"
+   id="svg8439">
+  <defs
+     id="defs8441">
+    <linearGradient
+       id="linearGradient9412">
+      <stop
+         id="stop9414"
+         style="stop-color:#fc909d;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop9416"
+         style="stop-color:#e6384d;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient9389">
+      <stop
+         id="stop9391"
+         style="stop-color:#ffffff;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop9393"
+         style="stop-color:#000000;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient9367">
+      <stop
+         id="stop9369"
+         style="stop-color:#ffffff;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop9385"
+         style="stop-color:#ffffff;stop-opacity:0.57647061"
+         offset="0.13162723" />
+      <stop
+         id="stop9379"
+         style="stop-color:#ffffff;stop-opacity:0.87692308"
+         offset="0.26325446" />
+      <stop
+         id="stop9377"
+         style="stop-color:#ffffff;stop-opacity:0.14615385"
+         offset="0.3965418" />
+      <stop
+         id="stop9375"
+         style="stop-color:#ffffff;stop-opacity:0.92307693"
+         offset="0.48540002" />
+      <stop
+         id="stop9381"
+         style="stop-color:#ffffff;stop-opacity:0.23846154"
+         offset="0.63126868" />
+      <stop
+         id="stop9383"
+         style="stop-color:#ffffff;stop-opacity:0.89999998"
+         offset="0.80882281" />
+      <stop
+         id="stop9371"
+         style="stop-color:#ffffff;stop-opacity:0"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient9322">
+      <stop
+         id="stop9324"
+         style="stop-color:#c4435d;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop9326"
+         style="stop-color:#9d3a4e;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient9314">
+      <stop
+         id="stop9316"
+         style="stop-color:#ffffff;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop4257"
+         style="stop-color:#fcc7cb;stop-opacity:1"
+         offset="0.04719589" />
+      <stop
+         id="stop9318"
+         style="stop-color:#f56d7d;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient9293">
+      <stop
+         id="stop9295"
+         style="stop-color:#fe6f80;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop9418"
+         style="stop-color:#f23150;stop-opacity:1"
+         offset="0.49315068" />
+      <stop
+         id="stop9297"
+         style="stop-color:#dc1c3c;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <filter
+       color-interpolation-filters="sRGB"
+       id="filter9280">
+      <feGaussianBlur
+         id="feGaussianBlur9282"
+         stdDeviation="2.3513211" />
+    </filter>
+    <linearGradient
+       x1="754.24707"
+       y1="513.5202"
+       x2="754.24707"
+       y2="731.7702"
+       id="linearGradient9328"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(0,1)" />
+    <linearGradient
+       x1="780.35992"
+       y1="513.5202"
+       x2="826.12207"
+       y2="731.7702"
+       id="linearGradient9351"
+       xlink:href="#linearGradient9293"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(0,1)" />
+    <linearGradient
+       x1="782.03992"
+       y1="629.48621"
+       x2="803.99365"
+       y2="591.46118"
+       id="linearGradient9373"
+       xlink:href="#linearGradient9367"
+       gradientUnits="userSpaceOnUse"
+       spreadMethod="reflect" />
+    <linearGradient
+       x1="893.86932"
+       y1="514.45074"
+       x2="893.86932"
+       y2="671.54669"
+       id="linearGradient9401"
+       xlink:href="#linearGradient9389"
+       gradientUnits="userSpaceOnUse" />
+    <mask
+       id="mask9397">
+      <path
+         d="M 826.10748,513.50597 700.12208,731.7702 c 9.25847,-8e-5 7.80015,-6.57273 12.18552,-6.57273 4.63672,0.0138 3.77357,3.28637 8.1431,3.28637 2.2464,0 4.04532,-2.22468 4.04252,-7.73593 0,-7.34724 1.82516,-11.08039 4.07156,-11.08039 2.24637,0 4.07155,3.27232 4.07155,6.54351 0,4.11809 1.82516,7.38697 4.07155,7.38697 2.24637,0 4.07156,-5.56758 4.07156,-12.44732 l -0.029,-7.88131 c 0,-9.51106 1.82517,-17.21685 4.07154,-17.21685 2.24637,0 4.07528,3.32956 4.07156,14.10496 0,8.56423 1.82517,17.10052 4.07155,17.10052 2.24637,0 4.07434,-9.93625 4.07155,-25.88342 0,-18.42407 1.79614,-30.33302 4.04241,-30.33302 2.24638,0 4.07445,11.91684 4.07157,23.67305 0,13.72891 1.82525,27.25037 4.07154,27.25037 2.24637,0 4.07445,-13.52442 4.07156,-32.28157 0,-20.14009 1.79612,-34.7827 4.04252,-34.7827 2.24638,0 4.07434,14.64336 4.07155,31.46729 0,17.65378 1.82517,32.97955 4.07156,32.97955 2.24626,0 4.07151,-15.31451 4.07151,-34.92803 0,-19.61373 1.79617,-35.21897 4.04245,-35.21897 2.24629,0 4.07156,15.6061 4.07156,34.89901 0,19.29281 1.82526,34.86998 4.07154,34.86998 2.24629,0 4.07156,-15.58693 4.07156,-34.20103 0,-18.21066 1.79613,-33.47401 4.0425,-33.47401 2.24628,0 4.07155,15.2546 4.07155,35.1607 0,19.17845 1.82515,33.85203 4.07155,33.85203 2.24633,9e-5 4.07435,-14.67453 4.07153,-31.26365 0,-15.58916 1.79615,-29.43149 4.04245,-29.43149 2.24637,0 4.07444,13.86887 4.07151,32.77593 0,17.69316 1.82529,30.53673 4.07159,30.53673 2.24636,-8e-5 4.07441,-12.84806 4.07151,-26.6687 0,-12.45075 1.79617,-24.22578 4.04253,-24.22578 2.2464,0 4.07443,11.76731 4.07144,28.47183 0,15.23586 1.82538,25.79622 4.07158,25.79622 2.24649,-8e-5 4.07164,-10.55393 4.07164,-21.31751 0,-9.24789 1.79605,-18.58368 4.04251,-18.58368 2.24627,0 4.07154,9.32806 4.07154,23.12055 0,12.27551 1.82508,20.38685 4.07155,20.38685 2.24629,0 4.07155,-7.13427 4.07155,-14.91934 0,-8.03247 1.82527,-14.19229 4.07155,-14.19229 2.24638,0 4.04243,6.89877 4.04243,17.53672 0,9.22647 1.82518,15.00657 4.07153,15.00657 2.24649,1.1e-4 4.07156,-4.78097 4.07156,-9.59712 0,-4.96701 1.82519,-9.82998 4.07155,-9.82998 2.2464,8e-5 4.04254,4.69926 4.04254,12.30194 0,6.39443 1.82515,10.1207 4.07155,10.1207 2.24618,0 4.07155,-2.59601 4.07155,-5.61287 0,-3.96768 1.82515,-5.78746 4.07155,-5.78746 2.24629,0 4.04249,2.72344 4.04249,7.76507 0,3.97383 1.82511,6.04915 4.07156,6.04915 4.24691,0 3.67352,-5.28863 8.14307,-5.32213 4.89126,0 3.45654,7.09615 8.11401,7.09615 3.3574,0 5.95645,-1.29391 8.14304,-1.30869 3.63402,0 3.88622,2.55925 12.18558,2.55925 L 826.10763,513.50597 z"
+         id="path9399"
+         style="fill:url(#linearGradient9401);fill-opacity:1;stroke:none;display:inline" />
+    </mask>
+    <linearGradient
+       x1="817.23798"
+       y1="518.19482"
+       x2="696.69739"
+       y2="725.07654"
+       id="linearGradient9410"
+       xlink:href="#linearGradient9314"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(0,1)" />
+    <filter
+       color-interpolation-filters="sRGB"
+       id="filter5193">
+      <feGaussianBlur
+         id="feGaussianBlur5195"
+         stdDeviation="0.42921875" />
+    </filter>
+    <linearGradient
+       x1="986.59003"
+       y1="500.81711"
+       x2="991.57782"
+       y2="539.79639"
+       id="linearGradient4376"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(2.000024,1)" />
+    <linearGradient
+       x1="986.59003"
+       y1="500.81711"
+       x2="991.57782"
+       y2="539.79639"
+       id="linearGradient4406"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(2.000024,1)" />
+    <linearGradient
+       x1="984.36688"
+       y1="507.79288"
+       x2="1005.7471"
+       y2="539.59833"
+       id="linearGradient4417"
+       xlink:href="#linearGradient9293"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(2.000024,1)" />
+    <linearGradient
+       x1="634.13293"
+       y1="229.35312"
+       x2="613.25647"
+       y2="264.00134"
+       id="linearGradient4435"
+       xlink:href="#linearGradient9314"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,272.7702)" />
+    <linearGradient
+       x1="635"
+       y1="247.39062"
+       x2="656.03125"
+       y2="247.39062"
+       id="linearGradient4443"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,272.7702)" />
+    <linearGradient
+       x1="971.12207"
+       y1="518.84833"
+       x2="1017.1221"
+       y2="518.84833"
+       id="linearGradient4464"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(2.000024,1)" />
+    <linearGradient
+       x1="738.98798"
+       y1="520.94482"
+       x2="834.44739"
+       y2="747.32654"
+       id="linearGradient4472"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(0,1)" />
+    <linearGradient
+       x1="623.55115"
+       y1="320.2645"
+       x2="629"
+       y2="346.875"
+       id="linearGradient4567"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="625.45148"
+       y1="319.25"
+       x2="611"
+       y2="344.00238"
+       id="linearGradient4525"
+       xlink:href="#linearGradient9314"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,273.80145)" />
+    <linearGradient
+       x1="623.55115"
+       y1="320.2645"
+       x2="629"
+       y2="346.875"
+       id="linearGradient4527"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,273.80145)" />
+    <linearGradient
+       x1="978.12207"
+       y1="599.0202"
+       x2="987.12207"
+       y2="619.2702"
+       id="linearGradient4529"
+       xlink:href="#linearGradient9293"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(2.000024,0)" />
+    <linearGradient
+       x1="618.64557"
+       y1="397"
+       x2="622.39337"
+       y2="414.67593"
+       id="linearGradient4586"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="621.11475"
+       y1="399.14731"
+       x2="611.74695"
+       y2="415.72012"
+       id="linearGradient4594"
+       xlink:href="#linearGradient9314"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="622"
+       y1="407.5"
+       x2="631.8125"
+       y2="407.5"
+       id="linearGradient4602"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="617.80591"
+       y1="399.51733"
+       x2="626.59186"
+       y2="415.02948"
+       id="linearGradient4610"
+       xlink:href="#linearGradient9293"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="617.40814"
+       y1="465.59119"
+       x2="621.60748"
+       y2="478.64096"
+       id="linearGradient4682"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="617.8584"
+       y1="468.12088"
+       x2="612.46295"
+       y2="476.46606"
+       id="linearGradient4705"
+       xlink:href="#linearGradient9314"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="619"
+       y1="473.09375"
+       x2="625.84375"
+       y2="473.09375"
+       id="linearGradient4713"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(361.12209,271.7702)" />
+    <linearGradient
+       x1="974.46582"
+       y1="741.3576"
+       x2="979.1377"
+       y2="750.84314"
+       id="linearGradient4721"
+       xlink:href="#linearGradient9293"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(2.000024,0)" />
+    <clipPath
+       id="clipPath3370-4">
+      <path
+         d="m 517.47193,500.02475 -6.83309,11.69808 0.11827,0 c 0.23385,-0.005 0.41368,-0.4418 0.53484,-0.4418 0.1884,0 0.23748,0.0822 0.3993,0.0822 0.18846,0 0.10465,-0.77714 0.328,-0.77714 0.20445,0 0.16353,0.51701 0.45137,0.51701 0.30678,0 0.14443,-1.83629 0.54039,-1.85503 0.36417,0 0.16401,1.62747 0.4888,1.62747 0.32582,0 0.0902,-3.04096 0.54131,-3.04096 0.32931,0 0.10101,2.94373 0.44903,2.94373 0.36497,0 0.18029,-3.69548 0.59053,-3.69548 0.53382,0 0.12696,3.66791 0.61346,3.66791 0.48127,0 0.0596,-3.83569 0.64163,-3.83569 0.60737,0 0.16385,3.86469 0.66689,3.86469 0.54239,0 0.15844,-3.30531 0.7143,-3.30531 0.53696,0 0.0815,3.41601 0.57746,3.41601 0.47964,0 0.12132,-2.4917 0.6098,-2.4917 0.49784,0 0.12552,2.63629 0.57481,2.63629 0.4264,0 0.3035,-1.41439 0.65504,-1.41439 0.37691,0 0.15031,1.61424 0.52638,1.61424 0.35071,0 0.25648,-0.86103 0.58015,-0.86103 0.32367,0 0.0795,1.06341 0.44936,1.06341 0.39523,0 0.15794,-0.38245 0.48539,-0.38245 0.33557,0 0.17631,0.56876 0.44362,0.56876 0.21651,0 0.35,-0.13643 0.58071,-0.13643 0.21559,0 0.47245,0.23574 0.92399,0.23574 l 0.18135,0 -6.83309,-11.69808 z"
+         id="path3372-1"
+         style="color:#000000;fill:none;stroke:#220000;stroke-width:0.98935091;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-opacity:1;stroke-dasharray:none;stroke-dashoffset:0;marker:none;visibility:visible;display:inline;overflow:visible" />
+    </clipPath>
+    <linearGradient
+       id="linearGradient2909-4">
+      <stop
+         id="stop2911-9"
+         style="stop-color:#ef2929;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop2913-0"
+         style="stop-color:#a40000;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient2948-9">
+      <stop
+         id="stop2950-7"
+         style="stop-color:#ffffff;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop2952-4"
+         style="stop-color:#ffffff;stop-opacity:0"
+         offset="1" />
+    </linearGradient>
+    <clipPath
+       id="clipPath3358-5">
+      <path
+         d="m 520.47197,454.77761 -9.86643,16.97854 0.17078,0 c 0.33766,-0.007 0.59732,-0.64123 0.77226,-0.64123 0.27203,0 0.3429,0.11923 0.57655,0.11923 0.27213,0 0.15112,-1.12793 0.47361,-1.12793 0.29521,0 0.23613,0.75039 0.65174,0.75039 0.44297,0 0.20855,-2.66518 0.78029,-2.69238 0.52583,0 0.2368,2.3621 0.70578,2.3621 0.47046,0 0.13031,-4.41363 0.7816,-4.41363 0.47551,0 0.14586,4.27251 0.64838,4.27251 0.52698,0 0.26031,-5.36361 0.85267,-5.36361 0.77079,0 0.18332,5.32359 0.88579,5.32359 0.69491,0 0.0861,-5.5671 0.92646,-5.5671 0.87699,0 0.23659,5.60919 0.96293,5.60919 0.78317,0 0.22877,-4.79731 1.03139,-4.79731 0.77533,0 0.11776,4.95798 0.8338,4.95798 0.69257,0 0.17518,-3.61644 0.8805,-3.61644 0.71884,0 0.18125,3.8263 0.82999,3.8263 0.61568,0 0.43822,-2.05284 0.94583,-2.05284 0.54421,0 0.21703,2.3429 0.76004,2.3429 0.5064,0 0.37033,-1.2497 0.83769,-1.2497 0.46736,0 0.11477,1.54343 0.64884,1.54343 0.57068,0 0.22806,-0.55509 0.70087,-0.55509 0.48453,0 0.25458,0.8255 0.64054,0.8255 0.31262,0 0.50538,-0.19801 0.83851,-0.19801 0.31129,0 0.68218,0.34215 1.33415,0.34215 l 0.26187,0 -9.86643,-16.97854 z"
+         id="path3360-3"
+         style="color:#000000;fill:none;stroke:#220000;stroke-width:0.98935014;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-opacity:1;stroke-dasharray:none;stroke-dashoffset:0;marker:none;visibility:visible;display:inline;overflow:visible" />
+    </clipPath>
+    <clipPath
+       id="clipPath3326-4">
+      <path
+         d="m 525.06072,406.75023 -14.50688,25.05667 0.2511,0 c 0.49647,-0.0108 0.87825,-0.94633 1.13548,-0.94633 0.39997,0 0.50417,0.17596 0.84771,0.17596 0.40013,0 0.2222,-1.66458 0.69637,-1.66458 0.43404,0 0.34718,1.10741 0.95827,1.10741 0.6513,0 0.30663,-3.93324 1.14727,-3.97337 0.77315,0 0.34819,3.48594 1.03774,3.48594 0.69173,0 0.1916,-6.51356 1.14921,-6.51356 0.69915,0 0.21446,6.30529 0.95332,6.30529 0.77483,0 0.38274,-7.91551 1.25371,-7.91551 1.13331,0 0.26954,7.85646 1.30241,7.85646 1.02174,0 0.12653,-8.21584 1.36219,-8.21584 1.28946,0 0.34786,8.27795 1.41582,8.27795 1.15151,0 0.33637,-7.07979 1.5165,-7.07979 1.13997,0 0.17313,7.31691 1.22594,7.31691 1.01831,0 0.25758,-5.33708 1.29463,-5.33708 1.05694,0 0.2665,5.64679 1.22035,5.64679 0.90525,0 0.64434,-3.02956 1.39068,-3.02956 0.80018,0 0.31911,3.45763 1.11752,3.45763 0.74458,0 0.5445,-1.84429 1.23168,-1.84429 0.68716,0 0.16875,2.27776 0.954,2.27776 0.83909,0 0.33533,-0.81918 1.03051,-0.81918 0.71242,0 0.37431,1.21825 0.9418,1.21825 0.45966,0 0.74308,-0.29222 1.23288,-0.29222 0.4577,0 1.00303,0.50496 1.96165,0.50496 l 0.38503,0 -14.50689,-25.05667 z"
+         id="path3328-6"
+         style="color:#000000;fill:none;stroke:#220000;stroke-width:0.98934937;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-opacity:1;stroke-dasharray:none;stroke-dashoffset:0;marker:none;visibility:visible;display:inline;overflow:visible" />
+    </clipPath>
+    <filter
+       color-interpolation-filters="sRGB"
+       id="filter3580-4">
+      <feGaussianBlur
+         id="feGaussianBlur3582-7"
+         stdDeviation="0.57069585" />
+    </filter>
+    <filter
+       color-interpolation-filters="sRGB"
+       id="filter3576-4">
+      <feGaussianBlur
+         id="feGaussianBlur3578-5"
+         stdDeviation="0.734228" />
+    </filter>
+    <filter
+       color-interpolation-filters="sRGB"
+       id="filter3564-1">
+      <feGaussianBlur
+         id="feGaussianBlur3566-0"
+         stdDeviation="1.0814088" />
+    </filter>
+    <filter
+       color-interpolation-filters="sRGB"
+       id="filter3552-0">
+      <feGaussianBlur
+         id="feGaussianBlur3554-8"
+         stdDeviation="1.4698453" />
+    </filter>
+    <linearGradient
+       x1="782.03992"
+       y1="629.48621"
+       x2="803.99365"
+       y2="591.46118"
+       id="linearGradient5635"
+       xlink:href="#linearGradient9367"
+       gradientUnits="userSpaceOnUse"
+       spreadMethod="reflect" />
+    <linearGradient
+       x1="780.35992"
+       y1="513.5202"
+       x2="826.12207"
+       y2="731.7702"
+       id="linearGradient5639"
+       xlink:href="#linearGradient9293"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.50793651,0,0,0.50793651,342.50449,539.07739)" />
+    <linearGradient
+       x1="796.71729"
+       y1="522.22095"
+       x2="826.12207"
+       y2="731.7702"
+       id="linearGradient5679"
+       xlink:href="#linearGradient9412"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.50793651,0,0,0.50793651,342.50449,539.07739)" />
+    <linearGradient
+       x1="815.85889"
+       y1="515.6084"
+       x2="695.95917"
+       y2="727.94189"
+       id="linearGradient5681"
+       xlink:href="#linearGradient9314"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.50793651,0,0,0.50793651,342.50449,539.07739)" />
+    <linearGradient
+       x1="777.57568"
+       y1="520.8288"
+       x2="819.1615"
+       y2="730.37811"
+       id="linearGradient5797"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.50793651,0,0,0.50793651,342.50449,539.07739)" />
+    <linearGradient
+       x1="777.57568"
+       y1="520.8288"
+       x2="819.1615"
+       y2="730.37811"
+       id="linearGradient5802"
+       xlink:href="#linearGradient9322"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.50793651,0,0,0.50793651,342.50449,539.07739)" />
+    <linearGradient
+       id="linearGradient3302">
+      <stop
+         id="stop3304"
+         style="stop-color:#ff5c4e;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop3306"
+         style="stop-color:#b60e00;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient3272">
+      <stop
+         id="stop3274"
+         style="stop-color:#ffffff;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop3276"
+         style="stop-color:#ffffff;stop-opacity:0"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient3286">
+      <stop
+         id="stop3288"
+         style="stop-color:#ffdedb;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop3290"
+         style="stop-color:#ffb0a7;stop-opacity:0"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       id="linearGradient3280">
+      <stop
+         id="stop3282"
+         style="stop-color:#f54f2a;stop-opacity:1"
+         offset="0" />
+      <stop
+         id="stop3284"
+         style="stop-color:#f00000;stop-opacity:1"
+         offset="1" />
+    </linearGradient>
+    <linearGradient
+       x1="267.43826"
+       y1="472.84845"
+       x2="267.43826"
+       y2="367.69006"
+       id="linearGradient4992"
+       xlink:href="#linearGradient3280"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.976517,0,0,0.976481,156.7804,-373.24438)" />
+    <linearGradient
+       x1="265.87448"
+       y1="505.55688"
+       x2="265.87448"
+       y2="613.28198"
+       id="linearGradient4994"
+       xlink:href="#linearGradient3286"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(152,-524.36218)" />
+    <linearGradient
+       x1="102.5"
+       y1="520.39093"
+       x2="110.35352"
+       y2="557.77966"
+       id="linearGradient4996"
+       xlink:href="#linearGradient3272"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(312,-524.36218)" />
+    <linearGradient
+       x1="265.93857"
+       y1="504.10068"
+       x2="265.93857"
+       y2="613.34515"
+       id="linearGradient4998"
+       xlink:href="#linearGradient3302"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="translate(152,-524.36218)" />
+    <linearGradient
+       x1="533.21936"
+       y1="352.49026"
+       x2="533.21936"
+       y2="391.89038"
+       id="linearGradient5084"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(1.0248432,0,0,1.0248432,-291.66926,-133.15238)" />
+    <linearGradient
+       x1="510.4718"
+       y1="370.71924"
+       x2="555.99078"
+       y2="370.71924"
+       id="linearGradient5086"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(1.0248432,0,0,1.0248432,-291.66926,-133.15238)" />
+    <linearGradient
+       x1="464.53885"
+       y1="370.74988"
+       x2="474.86389"
+       y2="376.89917"
+       id="linearGradient5088"
+       xlink:href="#linearGradient2948-9"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(1.0358753,0,0,1.0358753,-235.14286,-137.48131)"
+       spreadMethod="pad" />
+    <linearGradient
+       x1="533.21936"
+       y1="352.49026"
+       x2="533.21936"
+       y2="391.89038"
+       id="linearGradient5090"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.637733,0,0,0.635955,185.0093,182.5827)" />
+    <linearGradient
+       x1="510.4718"
+       y1="370.71924"
+       x2="555.99078"
+       y2="370.71924"
+       id="linearGradient5092"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.637733,0,0,0.635955,185.0093,182.5827)" />
+    <linearGradient
+       x1="464.53885"
+       y1="370.74988"
+       x2="474.86389"
+       y2="376.89917"
+       id="linearGradient5094"
+       xlink:href="#linearGradient2948-9"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.644597,0,0,0.6428,220.1842,179.8964)"
+       spreadMethod="pad" />
+    <linearGradient
+       x1="533.21936"
+       y1="352.49026"
+       x2="533.21936"
+       y2="391.89038"
+       id="linearGradient5096"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.433735,0,0,0.430926,289.196,302.8803)" />
+    <linearGradient
+       x1="510.4718"
+       y1="370.71924"
+       x2="555.99078"
+       y2="370.71924"
+       id="linearGradient5098"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.433735,0,0,0.430926,289.196,302.8803)" />
+    <linearGradient
+       x1="464.53885"
+       y1="370.74988"
+       x2="474.86389"
+       y2="376.89917"
+       id="linearGradient5100"
+       xlink:href="#linearGradient2948-9"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.438404,0,0,0.435565,313.1192,301.0601)"
+       spreadMethod="pad" />
+    <linearGradient
+       x1="533.21936"
+       y1="352.49026"
+       x2="533.21936"
+       y2="391.89038"
+       id="linearGradient5102"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.300387,0,0,0.296905,357.2996,395.3686)" />
+    <linearGradient
+       x1="510.4718"
+       y1="370.71924"
+       x2="555.99078"
+       y2="370.71924"
+       id="linearGradient5104"
+       xlink:href="#linearGradient2909-4"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.300387,0,0,0.296905,357.2996,395.3686)" />
+    <linearGradient
+       x1="464.53885"
+       y1="370.74988"
+       x2="474.86389"
+       y2="376.89917"
+       id="linearGradient5106"
+       xlink:href="#linearGradient2948-9"
+       gradientUnits="userSpaceOnUse"
+       gradientTransform="matrix(0.303621,0,0,0.300101,373.8678,394.1145)"
+       spreadMethod="pad" />
+  </defs>
+  <metadata
+     id="metadata8444">
+    <rdf:RDF>
+      <cc:Work
+         rdf:about="">
+        <dc:format>image/svg+xml</dc:format>
+        <dc:type
+           rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
+        <dc:title></dc:title>
+      </cc:Work>
+    </rdf:RDF>
+  </metadata>
+  <g
+     transform="translate(-183.664,-49.44248)"
+     id="layer2"
+     style="display:none">
+    <rect
+       width="256"
+       height="256"
+       x="339"
+       y="224"
+       id="ardour-app-icon_tango_256px"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+    <rect
+       width="48"
+       height="48"
+       x="613"
+       y="224"
+       id="ardour-app-icon_tango_048px"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+    <rect
+       width="22"
+       height="22"
+       x="613"
+       y="396"
+       id="ardour-app-icon_tango_022px"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+    <rect
+       width="16"
+       height="16"
+       x="613"
+       y="464"
+       id="ardour-app-icon_tango_016px"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+    <rect
+       width="32"
+       height="32"
+       x="613"
+       y="318"
+       id="ardour-app-icon_tango_032px"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+    <rect
+       width="128"
+       height="128"
+       x="339"
+       y="519"
+       id="ardour-app-icon_osx"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+    <rect
+       width="128"
+       height="128"
+       x="485"
+       y="519"
+       id="ardour-app-icon_osx_mask"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none" />
+  </g>
+  <g
+     transform="translate(-542.78607,-321.21268)"
+     id="layer1"
+     style="display:inline">
+    <g
+       transform="matrix(0.3373212,0,0,0.3373212,699.81724,348.542)"
+       id="g6666"
+       style="fill:#212a30;fill-opacity:1;display:inline" />
+    <g
+       transform="matrix(0.3373212,0,0,0.3373212,689.81729,348.542)"
+       id="g6704"
+       style="fill:#212a30;fill-opacity:1;display:inline" />
+  </g>
+  <g
+     transform="translate(-183.664,-49.44248)"
+     id="layer4"
+     style="display:none">
+    <rect
+       width="48"
+       height="48"
+       x="675"
+       y="224"
+       id="rect3566"
+       style="fill:#e4e4e4;fill-opacity:1;stroke:none;display:inline" />
+    <g
+       transform="translate(-324.53698,-271.7702)"
+       id="g6751">
+      <path
+         d="m 1260.7176,361.22437 -45.6733,79.11527 91.3482,3e-5 z"
+         id="path3453"
+         style="fill:#ff2121;fill-opacity:0.5205479;stroke:none;display:inline" />
+      <g
+         transform="matrix(0.33733958,0,0,0.33733358,1034.2763,24.71293)"
+         id="g3455"
+         style="display:inline">
+        <path
+           d="m 535.87037,1232.0937 84.61624,-87.9482 84.61624,-87.9483"
+           id="path3457"
+           style="fill:none;stroke:#000000;stroke-width:2.96439838px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+        <path
+           d="M 603.55986,1114.8285 806.6649,1232.0941"
+           id="path3459"
+           style="fill:none;stroke:#000000;stroke-width:2.96439838px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1" />
+      </g>
+      <g
+         transform="matrix(0.29349342,0,0,-0.08895244,1084.1881,606.86694)"
+         id="g3497"
+         style="display:inline">
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="445.87036"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3499"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="465.96362"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3502"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="486.05688"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3504"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="506.15009"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3506"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="526.24335"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3508"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="546.33661"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3510"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="566.42981"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3512"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="586.52307"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3514"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="606.61633"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3516"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="626.70959"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3518"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="646.80292"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3520"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="666.896"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3522"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="686.78412"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3524"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="706.8772"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3526"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="726.97034"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3528"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+        <rect
+           width="10.046631"
+           height="81.801888"
+           x="747.06348"
+           y="-1872.0938"
+           transform="scale(1,-1)"
+           id="rect3530"
+           style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      </g>
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.07118067,890.77426,149.32586)"
+         id="path4456"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.14731012,896.66747,-161.92038)"
+         id="path4458"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.2278683,902.56068,-491.27292)"
+         id="path4460"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.27983723,908.4539,-703.7417)"
+         id="path4462"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.29824058,914.34712,-778.98163)"
+         id="path4464"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.29141763,920.24036,-751.08677)"
+         id="path4466"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.02116867,884.88106,353.79406)"
+         id="path4468"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.22463596,932.02681,-478.0579)"
+         id="path4470"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.17841394,937.92005,-289.08461)"
+         id="path4472"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.13200038,943.81329,-99.328271)"
+         id="path4474"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.08970776,949.70651,73.58005)"
+         id="path4476"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.05420002,955.59974,218.74929)"
+         id="path4478"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.02686885,961.49297,330.48949)"
+         id="path4480"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.00825132,967.3862,406.60507)"
+         id="path4482"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.26465723,926.13359,-641.68008)"
+         id="path4484"
+         style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.00571076,881.93446,416.99194)"
+         id="path4486"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.10655593,893.72087,4.6983548)"
+         id="path4488"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.18950674,899.61408,-334.43621)"
+         id="path4490"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.25841829,905.5073,-616.173)"
+         id="path4492"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.29269198,911.40051,-756.29684)"
+         id="path4494"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.29766629,917.29374,-776.63371)"
+         id="path4496"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.28014287,923.18697,-704.99125)"
+         id="path4498"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.04274316,887.82767,265.58924)"
+         id="path4500"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.20188797,934.97343,-385.05544)"
+         id="path4502"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.15492003,940.86666,-193.03258)"
+         id="path4504"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.11013762,946.75988,-9.9449918)"
+         id="path4507"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.0709939,952.65312,150.08944)"
+         id="path4509"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.03946312,958.54636,278.99937)"
+         id="path4511"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.01646103,964.43958,373.04066)"
+         id="path4513"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.00295832,970.33282,428.32713)"
+         id="path4515"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+         transform="matrix(0.35741642,0,0,0.2458517,929.0802,-564.79584)"
+         id="path4517"
+         style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+      <path
+         d="m 1215.0481,440.33962 c 16.7294,-6.50973 19.182,-33.61081 35.0864,-33.61081 15.9041,0 30.4576,29.79772 56.256,33.60746"
+         id="path4554"
+         style="fill:none;stroke:#ffdd1b;stroke-width:1.00000012px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:0.56164383;display:inline" />
+      <path
+         d="m 1215.0481,440.33962 c 16.7294,-1.6066 19.182,-8.29501 35.0864,-8.29501 15.9041,0 30.4576,7.35395 56.256,8.29416"
+         id="path4628"
+         style="fill:none;stroke:#ffdd1b;stroke-width:0.99999994px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:0.56164383;display:inline" />
+    </g>
+    <path
+       d="M 698.99773,227.15752 676.00003,267 l 45.9999,0 -23.0025,-39.84248 z"
+       id="path3568"
+       style="fill:#e02047;fill-opacity:1;stroke:none;display:inline" />
+    <g
+       transform="matrix(0.15422196,0,0,-0.06112323,606.23701,390.42843)"
+       id="g3570"
+       style="display:inline">
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="445.87036"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3572"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="465.96362"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3574"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="486.05688"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3576"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="506.15009"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3578"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="526.24335"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3580"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="546.33661"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3582"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="566.42981"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3584"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="586.52307"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3586"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="606.61633"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3588"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="626.70959"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3590"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="646.80292"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3592"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="666.896"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3594"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="686.78412"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3596"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="706.8772"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3598"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="726.97034"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3600"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="747.06348"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3602"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+    </g>
+    <path
+       d="m 676.00003,267 c 8.2417,-3.18777 9.95,-16.45899 17.7853,-16.45899 7.8352,0 15.505,14.59338 28.2147,16.45899"
+       id="path3604"
+       style="fill:none;stroke:#ffdd1b;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:0.56164383;display:inline" />
+    <path
+       d="m 676.00003,267 c 8.4248,-0.80245 9.66,-4.24306 17.6694,-4.24306 8.0093,0 15.3384,3.77345 28.3306,4.24306"
+       id="path3606"
+       style="fill:none;stroke:#ffdd1b;stroke-width:0.99999988px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:0.56164383;display:inline" />
+    <path
+       d="m 688.00003,272 0,4.00001 2,0 0,-4.00001 z m 4,0 0,4.00001 2,0 0,-4.00001 z m 4,0 0,4.00001 2,0 0,-4.00001 z m 4,0 0,4.00001 2,0 0,-4.00001 z m -16,0 0,4.00001 2,0 0,-4.00001 z m 20,0 0,4.00001 2,0 0,-4.00001 z m 4,0 0,4.00001 2,0 0,-4.00001 z m -28,0 0,4.00001 2,0 0,-4.00001 z m 32,0 0,4.00001 2,0 0,-4.00001 z m 4,0 0,4.00001 2,0 0,-4.00001 z m -40,0 0,4.00001 2,0 0,-4.00001 z m 44,0 0,4.00001 2,0 0,-4.00001 z"
+       id="path3608"
+       style="fill:#21ff76;fill-opacity:0.5205479;stroke:none" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.13844079,464.06035,-299.25313)"
+       id="path3610"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.14362106,919.94473,-320.42253)"
+       id="path3612"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.1262963,915.94463,-249.60044)"
+       id="path3614"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.10680066,460.06035,-169.86448)"
+       id="path3616"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.08176837,911.94473,-67.45596)"
+       id="path3618"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.0556619,456.06035,39.35278)"
+       id="path3620"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.03295421,907.94463,132.24105)"
+       id="path3622"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.01571627,452.06025,202.73913)"
+       id="path3624"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.00400362,903.94463,250.6314)"
+       id="path3626"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.13769855,923.94473,-296.18226)"
+       id="path3628"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.14320273,468.06035,-318.70006)"
+       id="path3630"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.12820382,472.06035,-257.34404)"
+       id="path3632"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.11582757,927.94473,-206.71926)"
+       id="path3634"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.10163802,476.06035,-148.67711)"
+       id="path3636"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.0714824,480.06038,-25.32942)"
+       id="path3640"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.05696361,935.94473,34.05425)"
+       id="path3642"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.04352198,484.06038,89.02922)"
+       id="path3644"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.03151447,939.94473,138.13578)"
+       id="path3646"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.02118668,488.06038,180.37016)"
+       id="path3648"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.012698,943.94473,215.0815)"
+       id="path3650"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(0.24242454,0,0,0.00613946,492.0604,241.89852)"
+       id="path3652"
+       style="fill:#51cd65;fill-opacity:0.71689501;stroke:none;display:inline" />
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.00155198,947.94473,260.65493)"
+       id="path3654"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+    <g
+       transform="matrix(0.10281472,0,0,-0.04049549,569.15796,429.26251)"
+       id="g3745"
+       style="display:inline">
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="445.87036"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3747"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="465.96362"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3749"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="486.05688"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3751"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="506.15009"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3753"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="526.24335"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3755"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="546.33661"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3757"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="566.42981"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3759"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="586.52307"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3761"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="606.61633"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3763"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="626.70959"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3765"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="646.80292"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3767"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="666.896"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3769"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="686.78412"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3771"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="706.8772"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3773"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="726.97034"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3775"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+      <rect
+         width="10.046631"
+         height="81.801888"
+         x="747.06348"
+         y="-1872.0938"
+         transform="scale(1,-1)"
+         id="rect3777"
+         style="fill:#2139ff;fill-opacity:0.5205479;stroke:none" />
+    </g>
+    <path
+       d="m 940.25,4018.2373 a 4.125,42.375 0 1 1 -8.25,0 4.125,42.375 0 1 1 8.25,0 z"
+       transform="matrix(-0.24242991,0,0,0.08658828,931.94473,-87.11714)"
+       id="path4537"
+       style="fill:#5195cd;fill-opacity:0.71689507;stroke:none;display:inline" />
+  </g>
+  <g
+     transform="translate(-183.664,-49.44248)"
+     id="layer5"
+     style="display:inline" />
+  <g
+     transform="translate(-183.664,-49.44248)"
+     id="layer3"
+     style="display:inline">
+    <g
+       transform="translate(-16.000034,0)"
+       id="g5004">
+      <g
+         transform="matrix(0.3373212,0,0,0.3373212,187.08622,36.7718)"
+         id="g1915"
+         style="fill:#212a30;fill-opacity:1;display:inline">
+        <path
+           d="m 176.58984,287.95266 c -4.4103,-2e-5 -12.12188,0 -12.12188,0 l 0,26.08961 12.12188,0 c 4.3917,2e-5 7.39131,-1.40425 9.64258,-3.63738 2.13424,-2.11706 3.89541,-5.27258 3.89543,-9.41799 -2e-5,-4.12678 -1.81,-7.26836 -3.99251,-9.42473 -2.21436,-2.18781 -5.1352,-3.60949 -9.5455,-3.60951 z m -22.86855,33.89131 0,-41.63052 22.51894,0 c 7.48435,0 12.62796,-0.0795 18.87867,6.00177 3.60173,3.5053 6.09058,9.07662 6.09058,14.77168 0,5.69505 -2.04116,11.0094 -5.86648,14.82943 -5.70324,5.69535 -11.55274,6.02764 -19.10277,6.02764 l -22.51894,0 z"
+           id="path1917"
+           style="font-size:9.055686px;font-style:normal;font-weight:normal;line-height:125%;fill:#212a30;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans" />
+        <path
+           d="m 126.97432,297.67254 c 2.59561,0 3.64258,-0.35574 4.61029,-1.19227 0.98624,-0.83649 1.69817,-2.0131 1.69815,-3.67544 -2e-5,-1.66232 -0.86816,-2.92319 -1.8544,-3.74115 -1.18646,-0.91163 -2.20237,-1.07059 -4.45404,-1.07064 l -14.3345,0 0,9.6795 14.3345,0 m 1.62252,7.76085 -15.95702,0 0,16.41058 -10.7467,0 0,-41.63052 28.47566,0 c 6.48475,0 13.53566,4.07906 13.53566,12.48815 0.0809,5.0346 -2.32365,8.20329 -5.0209,10.34803 l 10.848,18.79434 -11.70938,0 -9.42532,-16.41058 z"
+           id="path1919"
+           style="font-size:9.055686px;font-style:normal;font-weight:normal;line-height:125%;fill:#212a30;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans" />
+        <path
+           d="m 81.25583,313.1958 -27.31361,0 -4.9224,8.64817 -11.73246,0 23.88134,-41.63052 12.81227,0 23.89079,41.63052 -11.59805,0 -5.01788,-8.64817 z m -13.66703,-23.70915 -9.20858,15.98847 18.46056,0 -9.25198,-15.98847 z"
+           id="path1921"
+           style="font-size:9.055686px;font-style:normal;font-weight:normal;line-height:125%;fill:#212a30;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans" />
+        <path
+           d="m 226.49427,279.56802 c -11.98471,0.59407 -21.75,9.91765 -21.75,21.44777 0,11.53013 9.79018,20.79698 21.75,21.51562 4.22101,0.25363 7.99759,0.23803 12.125,0 11.96508,-0.69004 21.75,-9.98549 21.75,-21.51562 0,-11.53012 -9.77016,-20.82438 -21.75,-21.44777 -4.03884,-0.2283 -8.08575,-0.18207 -12.125,0 z m 9.71875,7.57276 c 7.23503,0.40144 13.125,6.45 13.125,13.87501 0,7.42501 -5.8989,13.39939 -13.125,13.84375 -2.49266,0.15329 -4.79454,0.16333 -7.34375,0 -7.22294,-0.46277 -13.125,-6.41874 -13.125,-13.84375 0,-7.42501 5.88705,-13.49244 13.125,-13.87501 2.44658,-0.14198 4.89703,-0.10411 7.34375,0 z"
+           id="path1923"
+           style="fill:#212a30;fill-opacity:1;stroke:none" />
+        <path
+           d="m 264.45322,280.20719 0,25.23047 c 0,6.93393 4.16429,10.29387 5.90625,11.75 4.29314,3.58868 11.07624,5.58352 17.86206,5.55335 6.78575,-0.0302 14.25702,-2.48029 17.85669,-5.55335 3.75213,-3.20322 5.87496,-6.49272 5.875,-11.75 l 0,-25.23047 -10.6875,0 0,24.09437 c 0,4.46995 -1.68688,6.59318 -3.71875,7.93945 -2.26002,1.49744 -5.47235,2.69882 -9.33671,2.67834 -3.86435,-0.0205 -7.48003,-1.39915 -9.3151,-2.67834 -2.26851,-1.58137 -3.75379,-3.44012 -3.75379,-7.93945 l 0,-24.09437 -10.68815,0 z"
+           id="path1925"
+           style="font-size:9.055686px;font-style:normal;font-weight:normal;line-height:125%;fill:#212a30;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans" />
+        <path
+           d="m 342.62077,297.67254 c 2.59561,0 3.64258,-0.35574 4.61029,-1.19227 0.98624,-0.83649 1.69817,-2.0131 1.69815,-3.67544 -2e-5,-1.66232 -0.86816,-2.92319 -1.8544,-3.74115 -1.18646,-0.91163 -2.20237,-1.07059 -4.45404,-1.07064 l -14.3345,0 0,9.6795 14.3345,0 m 1.62252,7.76085 -15.95702,0 0,16.41058 -10.7467,0 0,-41.63052 28.47566,0 c 6.48475,0 13.53566,4.07906 13.53566,12.48815 0.0809,5.0346 -2.32365,8.20329 -5.0209,10.34803 l 10.848,18.79434 -11.70938,0 -9.42532,-16.41058 z"
+           id="path1927"
+           style="font-size:9.055686px;font-style:normal;font-weight:normal;line-height:125%;fill:#212a30;fill-opacity:1;stroke:none;display:inline;font-family:Bitstream Vera Sans" />
+      </g>
+      <path
+         d="m 254.97805,49.44248 -45.67542,79.11236 0.79061,0 c 1.56315,-0.0342 2.18389,-2.59315 2.99374,-2.59315 0.53441,0 1.3091,0.40057 1.8447,0.40057 1.01436,0 1.30975,-5.36851 2.16519,-5.35061 0.85548,0.0178 0.69418,4.58683 2.30433,4.64434 1.66966,0.0596 0.56996,-10.53075 2.20314,-10.53075 1.42063,0 0.70743,9.03544 2.64945,9.09508 1.88243,0.0596 0.42495,-15.65069 2.30431,-15.80295 2.18891,-0.17735 0.80225,14.62113 3.06659,14.62113 1.9767,0 0.67093,-20.77962 2.42028,-20.77962 1.74728,0 1.21894,20.07661 3.00154,19.90794 1.71626,0 0.0587,-24.21638 2.0378,-24.21638 2.53432,0 1.23709,24.03664 2.97305,24.02995 1.95629,-0.008 0.36178,-26.38815 2.63916,-26.38815 2.36118,0 0.75166,26.64522 2.72057,26.57458 2.07415,-0.0745 -0.10269,-24.96445 2.18206,-24.96445 2.20038,0 0.70314,25.31232 2.62476,25.31232 2.03017,0 -0.14512,-24.98474 2.51183,-24.81058 2.57296,0.16866 0.8854,25.53855 2.72491,25.53855 1.75522,0 0.7072,-22.1642 2.80028,-22.04483 2.45368,0.13993 1.02266,22.73063 2.71694,22.73063 1.57037,0 0.85771,-18.03346 2.93737,-18.03346 2.19818,0 0.54393,18.71472 2.56634,18.63038 2.02065,-0.0843 1.11889,-14.27021 3.08561,-14.27021 1.88242,0 0.32463,15.4713 2.45219,15.4713 2.21141,0 1.09286,-10.8319 3.04436,-10.8319 2.01083,0 0.74463,11.83477 2.28353,11.83477 1.49673,0 1.05962,-7.4562 2.53716,-7.4562 1.47885,0 0.78596,7.89026 2.03115,7.89026 1.28728,0 0.76073,-4.55384 2.12478,-4.55384 1.36582,0 0.87821,4.91766 1.92578,4.97548 0.93015,0.0514 0.81098,-3.10076 1.66818,-3.10076 0.87212,0 0.50473,3.50418 1.70532,3.50165 1.1584,-0.003 0.80233,-1.93992 1.62727,-1.93992 0.81008,0 0.83511,2.27749 1.79202,2.27749 0.68204,-0.0131 1.40553,-1.04115 2.27237,-1.04115 1.14027,0 1.48965,1.27249 4.40991,1.27249 l 1.21223,0 -45.67539,-79.11236 z"
+         id="path1929"
+         style="fill:#e4214e;fill-opacity:1;fill-rule:evenodd;stroke:none;display:inline" />
+    </g>
+  </g>
+</svg>
diff --git a/debian/ardour2-session_exchange.py b/debian/ardour2-session_exchange.py
deleted file mode 100644
index 292196d..0000000
--- a/debian/ardour2-session_exchange.py
+++ /dev/null
@@ -1,834 +0,0 @@
-#! /usr/bin/python
-
-# Session Exchange
-# By Taybin Rutkin
-# Copyright 2004, under the GPL
-
-VERSION='0.1.1'
-
-#twisted libraries
-from twisted.internet import gtk2reactor
-gtk2reactor.install()
-from twisted.internet import reactor, protocol
-import twisted.internet.error
-
-#pygtk libraries
-import gobject
-import gtk
-
-#standard python2.2 libraries
-import getopt
-import os
-import os.path
-import re
-import shelve
-import string
-import sys
-import xml.dom.pulldom
-
-def get_header_size(filename):
-	size = 0
-	file = open(filename, 'r')
-	while True:
-		chunk = file.read(4)
-		size += 4
-		if chunk == "data":
-			file.close()
-			return size + 4	#include the size chunk after "data"
-		if not chunk:
-			file.close()
-			return None
-
-def append_empty_data(self, filename, size):
-	file = open(filename, 'a')
-	file.seek(size-1)
-	file.write('\x00')
-	file.close()
-	
-def get_sound_list(snapshot):
-	doc = xml.dom.pulldom.parse(snapshot)
-	seen = {}
-	soundlist = []
-	for event, node in doc:
-		if event=='START_ELEMENT' and node.nodeName=='Source':
-			soundlist.append(str(node.getAttribute('name')))
-	return soundlist
-
-def raise_error(string, parent):
-	dialog = gtk.MessageDialog(parent, gtk.DIALOG_MODAL | gtk.DIALOG_DESTROY_WITH_PARENT,
-	gtk.MESSAGE_WARNING, gtk.BUTTONS_OK, string)
-		
-	dialog.run()
-	dialog.destroy()
-
-class Data(object):
-	def delete_snap(self, session, collab, snap):
-		sessions = self._data['sessions']
-		sessions[session]['collabs'][collab]['snaps'].remove(snap)
-		self._data['sessions'] = sessions
-	
-	def delete_collab(self,session, collab):
-		sessions = self._data['sessions']
-		del sessions[session]['collabs'][collab]
-		self._data['sessions'] = sessions
-	
-	def delete_session(self, session):
-		sessions = self._data['sessions']
-		del sessions[session]
-		self._data['sessions'] = sessions
-	
-	def add_snap(self, session_name, collab_name, snap_name):
-		sessions = self._data['sessions']
-		sessions[session_name]['collabs'][collab_name]['snaps'].append(snap_name)
-		sessions[session_name]['collabs'][collab_name]['snaps'].sort()
-		self._data['sessions'] = sessions
-		
-		g_display.update_snap_view()
-	
-	def add_collab(self, session_name, collab_name, ip_address, port):
-		sessions = self._data['sessions']
-		sessions[session_name]['collabs'][collab_name] = {}
-		sessions[session_name]['collabs'][collab_name]['snaps'] = []
-		sessions[session_name]['collabs'][collab_name]['sounds'] = []
-		sessions[session_name]['collabs'][collab_name]['ip'] = ip_address
-		sessions[session_name]['collabs'][collab_name]['port'] = port
-		self._data['sessions'] = sessions
-		
-		client = ExchangeClientFactory(session_name, collab_name, None, self.debug_mode)
-		reactor.connectTCP(ip_address, port, client)
-		g_display.show_status("connecting")
-		
-		g_display.update_collab_view()
-	
-	def add_session(self, session_path):
-		sessions = self._data['sessions']
-		
-		session_name = session_path[session_path.rfind('/', 0, len(session_path)-2)+1: -1]
-		sessions[session_name] = {}
-		sessions[session_name]['path'] = session_path 
-		sessions[session_name]['collabs'] = {}
-		sessions[session_name]['collabs'][self._data['user']] = {}
-		sessions[session_name]['collabs'][self._data['user']]['snaps'] = []
-		sessions[session_name]['collabs'][self._data['user']]['sounds'] = []
-		
-		self._data['sessions'] = sessions
-		
-		self.rescan_session(session_name)
-
-	def rescan_session(self, session_name):
-		sessions = self._data['sessions']
-		
-		session_path = sessions[session_name]['path']
-		sessions[session_name]['collabs'][self._data['user']]['snaps'] = self._scan_snapshots(session_path)
-		sessions[session_name]['collabs'][self._data['user']]['sounds'] = self._scan_sounds(session_path)
-		
-		self._data['sessions'] = sessions
-		
-		g_display.update_snap_view()
-		
-		print self._data['sessions']
-	
-	def create_session(self, session_path):
-		try:
-			os.mkdir(session_path)
-			os.mkdir(session_path+"/sounds")
-		except OSError:
-			raise_error("Could not create session directory", g_display.window)
-			return
-		
-		sessions = self._data['sessions']
-		
-		session_name = session_path[session_path.rfind('/', 0, len(session_path)-2)+1: ]
-		sessions[session_name] = {}
-		sessions[session_name]['path'] = session_path
-		sessions[session_name]['collabs'] = {}
-		sessions[session_name]['collabs'][self._data['user']] = {}
-		sessions[session_name]['collabs'][self._data['user']]['snaps'] = []
-		sessions[session_name]['collabs'][self._data['user']]['sounds'] = []
-		
-		self._data['sessions'] = sessions
-		print self._data['sessions']
-	
-	def get_session_path(self, session):
-		sessions = self._data['sessions']
-		return sessions[session]['path']
-	
-	def get_user(self):
-		return self._data['user']
-	
-	def set_user(self, username):
-		self._data['user'] = username
-	
-	def get_collab_ip(self, session, collab):
-		sessions = self._data['sessions']
-		return sessions[session]['collabs'][collab]['ip']
-	
-	def close(self):
-		self._data.close()
-	
-	def get_sessions(self):
-		sessions = self._data['sessions']
-		sess = sessions.keys()
-		sess.sort()
-		return sess
-	
-	def get_collabs(self, session):
-		if session:
-			sessions = self._data['sessions']
-			collabs = sessions[session]['collabs'].keys()
-			collabs.sort()
-			return collabs
-		else:
-			return []
-	
-	def get_snaps(self, session, collab):
-		if session and collab:
-			sessions = self._data['sessions']
-			snaps = sessions[session]['collabs'][collab]['snaps']
-			snaps.sort()
-			return snaps
-		else:
-			return []
-	
-	def get_sounds(self, session, collab):
-		if session and collab:
-			sessions = self._data['sessions']
-			sounds = sessions[session]['collabs'][self._data['user']]['sounds']
-			sounds.sort()
-			return sounds
-		else:
-			return []
-		
-	def _scan_snapshots(self, session):
-		snaps = []
-		files = os.listdir(session)
-		pattern = re.compile(r'\.ardour$')
-		for file in files:
-			if pattern.search(file):
-				snaps.append(file[0:-7])
-				print file[0:-7]
-		return snaps
-	
-	def _scan_sounds(self, session):
-		sounds = []
-		files = os.listdir(session+'/sounds')
-		pattern = re.compile(r'\.peak$')
-		for file in files:
-			if not pattern.search(file):
-				sounds.append(file)
-		return sounds
-	
-	def __init__(self, *args):
-		self._data = shelve.open(os.path.expanduser('~/.session_exchange'), 'c')
-		self.port = 8970
-		self.debug_mode = False
-		if len(self._data.keys()) < 1:
-			self._data['sessions'] = {}
-			self._data['user'] = ''
-		
-		self._collabs = {}
-
-from twisted.protocols.basic import FileSender
-class FileSenderLimited(FileSender):
-	def beginFileTransfer(self, file, consumer, limit, transform = None):
-		self.file = file
-		self.consumer = consumer
-		self.CHUNK_SIZE = limit
-		self.transform = transform
-		
-		self.consumer.registerProducer(self, False)
-		self.deferred = defer.Deferred()
-		return self.deferred
-	
-	def resumeProducing(self):
-		chunk = ''
-		chunk = self.file.read(self.CHUNK_SIZE)
-		
-		if self.transform:
-			chunk = self.transform(chunk)
-
-		self.consumer.write(chunk)
-		self.lastSent = chunk[-1]
-		self.file = None
-		self.consumer.unregisterProducer()
-		self.deferred.callback(self.lastSent)
-		self.deferred = None
-
-from twisted.protocols.basic import LineReceiver
-class ExchangeServer (LineReceiver):
-	def __init__(self):
-		self.state = "IDLE"
-	
-	def error(self, message):
-		self.sendLine("ERROR")
-		self.sendLine(message)
-		self.transport.loseConnection()
-	
-	def connectionLost(self, reason):
-		print "server: connection lost: ", reason
-	
-	def connectionMade(self):
-		print "server: connection made"
-	
-	def lineReceived(self, data):
-		print "server: ", data
-		
-		if self.state == "SESSION":
-			if g_data.get_sessions().count(data):
-				self.session_name = data
-				self.state = "IDLE"
-				self.sendLine("OK")
-			else:
-				self.error(data + " doesn't exist on server")
-		elif self.state == "SNAPSHOT":
-			if g_data.get_snaps(self.session_name, g_data.get_user()).count(data):
-				filename = g_data.get_session_path(self.session_name)+data+'.ardour'
-				print filename
-				self.sendLine(str(os.stat(filename).st_size))
-				self.sendLine("OK")
-				self.file = open(filename, 'r')
-				file_sender = FileSender()
-				cb = file_sender.beginFileTransfer(self.file, self.transport)
-				cb.addCallback(self.file_done)
-			else:
-				self.error("snapshot: " + data + " doesn't exist on server")
-		elif self.state == "SOUNDFILE" or self.state == "SOUNDFILE_HEADER":
-			if g_data.get_sounds(self.session_name, g_data.get_user()).count(data):
-				filename = g_data.get_session_path(self.session_name)+"/sounds/"+data
-				print filename
-				if self.state == "SOUNDFILE":
-					self.sendLine(str(os.stat(filename).st_size))
-				else:	#SOUNDFILE_HEADER
-					header_size = get_header_size(filename)
-					if header_size:
-						self.sendLine(str(header_size))
-					else:
-						self.error('soundfile: ' + data + 'doesn\'t have "data" chunk')
-				self.sendLine("OK")
-				self.file = open(filename, 'r')
-				if self.state == "SOUNDFILE":
-					file_sender = FileSender()
-					cb = file_sender.beginFileTransfer(self.file, self.transport)
-				else:	# SOUNDFILE_HEADER
-					file_sender = FileSenderLimited()
-					cb = file_sender.beginFileTransfer(self.file, self.transport, header_size)
-				cb.addCallback(self.file_done)
-			else:
-				self.error("soundfile: " + data + "doesn't exist on server")
-		elif self.state == "SOUNDFILE_SIZE":
-			if g_data.get_sounds(self.session_name, g_data.get_user()).count(data):
-				filename = g_data.get_session_path(self.session_name)+"/sounds/"+data
-				print filename
-				self.sendLine(str(os.stat(filename).st_size))
-				self.state = "IDLE"
-		elif data == "SESSION":
-			self.state = "SESSION"
-		elif data == "SNAPS":
-			self.state = "SNAPS"
-			for snap in g_data.get_snaps(self.session_name, g_data.get_user()):
-				self.sendLine(snap)
-			self.sendLine("OK")
-			self.state = "IDLE"
-		elif data == "SNAPSHOT":
-			self.state = "SNAPSHOT"
-		elif data == "SOUNDFILE":
-			self.state = "SOUNDFILE"
-		elif data == "SOUNDFILE_HEADER":
-			self.state = "SOUNDFILE_HEADER"
-		elif data == "SOUNDFILE_SIZE":
-			self.state = "SOUNDFILE_SIZE"
-	
-	def file_done(self, data):
-		print "server: file done"
-		self.file.close()
-		self.state = "IDLE"
-	
-class ExchangeServerFactory(protocol.ServerFactory):
-	protocol = ExchangeServer
-	
-	def __init__(self):
-		pass
-
-class ExchangeClient (LineReceiver):
-	def __init__(self, session_name, collab_name, snap_name, debug_mode):
-		self.session_name = session_name
-		self.collab_name = collab_name
-		self.snap_name = snap_name
-		self.debug_mode = debug_mode
-		self.state = "IDLE"
-	
-	def connectionLost(self, reason):
-		g_display.show_status("Connection lost")
-	
-	def connectionMade(self):
-		g_display.show_status("Connection made")
-		self.state = "SESSION"
-		self.sendLine("SESSION")
-		self.sendLine(self.session_name)
-	
-	def rawDataReceived(self, data):
-		self.file.write(data)
-		self.received += len(data)
-		print self.received, self.filesize
-		if self.received >= self.filesize:
-			self.setLineMode()
-			self.file.close()
-			g_data.rescan_session(self.session_name)
-			if self.state == "SNAPSHOT":
-				self.sounds = get_sound_list(self.filename)
-				if len(self.sounds):
-					self.sound_index = 0
-					if self.debug_mode:
-						self.state = "SOUNDFILE_HEADER"
-						self.sendLine("SOUNDFILE_HEADER")
-					else:
-						self.state = "SOUNDFILE"
-						self.sendLine("SOUNDFILE")
-					self.sendLine(self.sounds[self.sound_index])
-				else:
-					self.transport.loseConnection()
-			elif self.state == "SOUNDFILE":
-				self.sound_index += 1
-				if self.sound_index > len(self.sounds)-1:
-					self.transport.loseConnection()
-				else:
-					self.sendLine("SOUNDFILE")
-					self.sendLine(self.sounds[self.sound_index])
-			elif self.state == "SOUNDFILE_HEADER":
-				self.state = "SOUNDFILE_SIZE"
-				self.sendLine("SOUNDFILE_SIZE")
-				self.sendLine(self.sounds[self.sound_index])
-	
-	def lineReceived(self, data):
-		print "client: ", data
-		
-		if data == "ERROR":
-			self.state = "ERROR"
-		elif data == "OK":
-			if self.state == "SESSION":
-				if self.snap_name:
-					self.state = "SNAPSHOT"
-					self.sendLine("SNAPSHOT")
-					self.sendLine(self.snap_name)
-				else:
-					self.state = "SNAPS"
-					self.sendLine("SNAPS")
-			elif self.state == "SNAPS":
-				self.transport.loseConnection()
-			elif self.state == "SNAPSHOT":
-				self.setRawMode()
-				self.filename = g_data.get_session_path(self.session_name)+'/'+self.snap_name+'.ardour'
-				self.file = open(self.filename, 'w')
-				self.received = 0
-			elif self.state == "SOUNDFILE" or self.state == "SOUNDFILE_HEADER":
-				self.setRawMode()
-				self.filename = g_data.get_session_path(self.session_name)+'/sounds/'+self.sounds[self.sound_index]
-				self.file = open(self.filename, 'w')
-				self.received = 0
-		elif self.state == "ERROR":
-			raise_error(data, g_display.window)
-		elif self.state == "SNAPS":
-			g_data.add_snap(self.session_name, self.collab_name, data)
-		elif self.state == "SNAPSHOT":
-			self.filesize = int(data)
-		elif self.state == "SOUNDFILE":
-			self.filesize = int(data)
-		elif self.state == "SOUNDFILE_HEADER":
-			self.filesize = int(data)
-		elif self.state == "SOUNDFILE_SIZE":
-			append_empty_data(self.filename, int(data))
-			self.sound_index += 1
-			if self.sound_index > len(self.sounds)-1:
-				self.transport.loseConnection()
-			else:
-				self.state = "SOUNDFILE_HEADER"
-				self.sendLine("SOUNDFILE_HEADER")
-				self.sendLine(self.sounds[self.sound_index])
-
-class ExchangeClientFactory(protocol.ClientFactory):
-	def buildProtocol(self, addr):
-		return ExchangeClient(self.session_name, self.collab_name, self.snap_name, self.debug_mode)
-	
-	def clientConnectionFailed(self, connector, reason):
-		raise_error('Connection failed: ' + reason.getErrorMessage(), g_display.window)
-		g_display.show_status('Connection failed')
-	
-	def __init__(self, session_name, collab_name, snap_name, debug_mode):
-		self.session_name = session_name
-		self.collab_name = collab_name
-		self.snap_name = snap_name
-		self.debug_mode = debug_mode
-
-class HelperWin(object):
-	def delete_me(self, window):
-		self = 0
-
-class Preferences(HelperWin):
-	def __init__(self):
-		self.window = gtk.Window(gtk.WINDOW_TOPLEVEL)
-		self.window.set_title('Preferences')
-		self.window.connect('destroy', self.delete_me)
-		
-		main_box = gtk.VBox()
-		self.window.add(main_box)
-		
-		hbox1 = gtk.HBox()
-		label1 = gtk.Label("User")
-		self.user = gtk.Entry()
-		self.user.set_text(g_data.get_user())
-		hbox1.pack_start(label1)
-		hbox1.pack_start(self.user)
-		main_box.pack_start(hbox1)
-		
-		ok_btn = gtk.Button("Ok")
-		ok_btn.connect('clicked', self.ok_clicked)
-		main_box.pack_start(ok_btn)
-		
-		self.window.show_all()
-		
-	def ok_clicked(self, btn):
-		g_data.set_user(self.user.get_text())
-		self.window.hide_all()
-		
-	def show_all(self):
-		self.window.show_all()
-
-class AddCollaborator(HelperWin):
-	def __init__(self, session):
-		self.session_name = session
-		
-		self.window = gtk.Window(gtk.WINDOW_TOPLEVEL)
-		self.window.set_title('Fetch Session')
-		self.window.connect('destroy', self.delete_me)
-		
-		main_box = gtk.VBox()
-		self.window.add(main_box)
-		
-		hbox0 = gtk.HBox()
-		label0 = gtk.Label("Collaborator")
-		self.collab = gtk.Entry()
-		self.collab.connect('key-release-event', self.key_press)
-		hbox0.pack_start(label0)
-		hbox0.pack_start(self.collab)
-		main_box.pack_start(hbox0)
-		
-		hbox1 = gtk.HBox()
-		label1 = gtk.Label("IP Address")
-		self.address = gtk.Entry()
-		self.address.connect('key-release-event', self.key_press)
-		hbox1.pack_start(label1)
-		hbox1.pack_start(self.address)
-		main_box.pack_start(hbox1)
-		
-		hbox2 = gtk.HBox()
-		label2 = gtk.Label("Port Number")
-		self.port = gtk.Entry()
-		self.port.connect('key-release-event', self.key_press)
-		self.port.set_text(str(g_data.port))
-		hbox2.pack_start(label2)
-		hbox2.pack_start(self.port)
-		main_box.pack_start(hbox2)
-		
-		hbox3 = gtk.HBox()
-		label3 = gtk.Label("Username")
-		label3.set_sensitive(False)
-		self.username = gtk.Entry()
-		self.username.set_sensitive(False)
-		hbox3.pack_start(label3)
-		hbox3.pack_start(self.username)
-		main_box.pack_start(hbox3)
-		
-		hbox4 = gtk.HBox()
-		label4 = gtk.Label("Password")
-		label4.set_sensitive(False)
-		self.password = gtk.Entry()
-		self.password.set_sensitive(False)
-		hbox4.pack_start(label4)
-		hbox4.pack_start(self.password)
-		main_box.pack_start(hbox4)
-		
-		self.ok_btn = gtk.Button(gtk.STOCK_OK)
-		self.ok_btn.set_use_stock(True)
-		self.ok_btn.connect('clicked', self.ok_clicked)
-		self.ok_btn.set_sensitive(False)
-		main_box.pack_start(self.ok_btn)
-		
-		self.window.show_all()
-	
-	def key_press(self, event, data):
-		if self.collab.get_text() and self.address.get_text() and self.port.get_text():
-			self.ok_btn.set_sensitive(True)
-		else:
-			self.ok_btn.set_sensitive(False)
-		return True
-	
-	def ok_clicked(self, btn):
-		self.window.hide_all()
-		g_data.add_collab(self.session_name, self.collab.get_text(), self.address.get_text(), int(self.port.get_text()))
-		self.collab.set_text('')
-		self.address.set_text('')
-		self.port.set_text('')
-		self.username.set_text('')
-		self.password.set_text('')
-	
-	def show_all(self):
-		self.window.show_all()
-
-class ArdourShareWindow(object):
-	def menuitem_cb(self, window, action, widget):
-		print self, window, action, widget
-	
-	def add_collaborator_cb(self, window, action, widget):
-		if self.session:
-			self.add_session = AddCollaborator(self.session)
-	
-	def fetch_snapshot_cb(self, window, action, widget):
-		if self.session and self.collab and self.collab != g_data.get_user():
-			client = ExchangeClientFactory(self.session, self.collab, self.snap, g_data.debug_mode)
-			reactor.connectTCP(g_data.get_collab_ip(self.session, self.collab), g_data.port, client)
-	
-	def preferences_cb(self, window, action, widget):
-		self.preferences = Preferences()
-	
-	def add_session_ok_file_btn_clicked(self, w):
-		filename = self.file_sel.get_filename()
-		if filename.endswith(".ardour"):
-			g_data.add_session(filename[0:filename.rfind("/")+1])
-			self.update_session_view()
-		else:
-			raise_error("Not an Ardour session", self.window)
-		self.file_sel.destroy()
-	
-	def add_session_cb(self, window, action, widget):
-		if g_data.get_user():
-			self.file_sel = gtk.FileSelection("Add Session...")
-			self.file_sel.ok_button.connect("clicked", self.add_session_ok_file_btn_clicked)
-			self.file_sel.cancel_button.connect("clicked", lambda w: self.file_sel.destroy())
-			self.file_sel.connect("destroy", lambda w: self.file_sel.destroy())
-			self.file_sel.show()
-		else:
-			raise_error("Set the user name in the preferences first", self.window)
-	
-	def create_session_cb(self, window, action, widget):
-		if g_data.get_user():
-			self.file_sel = gtk.FileSelection("Create Session...")
-			self.file_sel.ok_button.connect("clicked", self.create_file_ok_btn_clicked)
-			self.file_sel.cancel_button.connect("clicked", lambda w: self.file_sel.destroy())
-			self.file_sel.connect("destroy", lambda w: self.file_sel.destroy())
-			self.file_sel.show()
-		else:
-			raise_error("Set the user name in the preferences first", self.window)
-	
-	def create_file_ok_btn_clicked(self, w):
-		filename = self.file_sel.get_filename()
-		if len(filename) > 0:
-			g_data.create_session(filename)
-			self.update_session_view()
-		else:
-			raise_error("Not an Ardour session", self.window)
-		self.file_sel.destroy()
-	
-	def update_session_view(self):
-		self.session_model.clear()
-		for session in g_data.get_sessions():
-			self.session_model.set(self.session_model.append(), 0, session)
-	
-	def update_collab_view(self):
-		self.collab_model.clear()
-		for collab in g_data.get_collabs(self.session):
-			self.collab_model.set(self.collab_model.append(), 0, collab)
-	
-	def update_snap_view(self):
-		self.snap_model.clear()
-		for snap in g_data.get_snaps(self.session, self.collab):
-			self.snap_model.set(self.snap_model.append(), 0, snap)
-	
-	def cb_session_selection_changed(self, selection_object):
-		selected = []
-		selection_object.selected_foreach(lambda model, path, iter, sel = selected: sel.append(path))
-		for x in selected:
-			self.session = self.session_model[x][0]
-		self.selected_type = "session"
-		self.update_collab_view()
-	
-	def cb_collab_selection_changed(self, selection_object):
-		selected = []
-		selection_object.selected_foreach(lambda model, path, iter, sel = selected: sel.append(path))
-		for x in selected:
-			self.collab = self.collab_model[x][0]
-		self.selected_type = "collab"
-		self.update_snap_view()
-	
-	def cb_snap_selection_changed(self, selection_object):
-		selected = []
-		selection_object.selected_foreach(lambda model, path, iter, sel = selected: sel.append(path))
-		for x in selected:
-			self.snap = self.snap_model[x][0]
-		self.selected_type = "snap"
-	
-	def delete_cb(self, window, action, widget):
-		if self.selected_type == "session":
-			g_data.delete_session(self.session)
-			self.session = ""
-			self.collab = ""
-			self.snap = ""
-		elif self.selected_type == "collab":
-			g_data.delete_collab(self.session, self.collab)
-			self.collab = ""
-			self.snap = ""
-		elif self.selected_type == "snap":
-			g_data.delete_snap(self.session, self.collab, self.snap)
-			self.snap = ""
-		
-		self.update_session_view()
-		self.update_collab_view()
-		self.update_snap_view()
-		self.selected_type = ""
-		
-	def show_status(self, text):
-		mid = self.status_bar.push(self._status_cid, text)
-		if self._status_mid:
-			self.status_bar.remove(self._status_cid, self._status_mid)
-		self._status_mid = mid
-	
-	def __init__(self):
-		self.selected_type = ""
-		self.session = ""
-		self.collab = g_data.get_user()
-		self.snap = ""
-		
-		self.preferences = 0
-		self.add_collab = 0
-		self.add_session = 0
-		
-		self.window = gtk.Window(gtk.WINDOW_TOPLEVEL)
-		self.window.set_title('Session Exchange')
-		self.window.set_size_request(400, 200)
-		self.window.connect('destroy', lambda win: gtk.main_quit())
-		
-		accel_group = gtk.AccelGroup()
-		self.window.add_accel_group(accel_group)
-		
-		main_box = gtk.VBox()
-		self.window.add(main_box)
-		
-		menu_items = (
-			('/_File',            None,         None,             0, '<Branch>'),
-			('/File/_Add Session...','<control>A', self.add_session_cb, 0, ''),
-			('/File/Create _Session...', '<control>S', self.create_session_cb, 0, ''),
-			('/File/sep1',        None,         None,             0, '<Separator>'),
-			('/File/_Quit',       '<control>Q', gtk.main_quit,     0, '<StockItem>', gtk.STOCK_QUIT),
-			('/_Edit',            None,         None,             0, '<Branch>' ),
-			('/Edit/Cu_t',        '<control>X', self.menuitem_cb, 0, '<StockItem>', gtk.STOCK_CUT),
-			('/Edit/_Copy',       '<control>C', self.menuitem_cb, 0, '<StockItem>', gtk.STOCK_COPY),
-			('/Edit/_Paste',      '<control>V', self.menuitem_cb, 0, '<StockItem>', gtk.STOCK_PASTE),
-			('/Edit/_Delete',     None,         self.delete_cb, 0, '<StockItem>', gtk.STOCK_DELETE),
-			('/Edit/sep1',        None,         None,             0, '<Separator>'),
-			('/Edit/Add Colla_borator...','<control>B', self.add_collaborator_cb,0,''),
-			('/Edit/_Fetch Snapshot','<control>F', self.fetch_snapshot_cb,0,''),
-			('/Edit/sep1',        None,         None,             0, '<Separator>'),
-			('/Edit/_Preferences...','<control>P', self.preferences_cb, 0, '')
-		)
-		
-		#need to hold a reference to the item_factory or the menubar will disappear.
-		self.item_factory = gtk.ItemFactory(gtk.MenuBar, '<main>', accel_group)
-		self.item_factory.create_items(menu_items, self.window)
-		main_box.pack_start(self.item_factory.get_widget('<main>'), gtk.FALSE)
-		
-		pane1 = gtk.HPaned()
-		pane2 = gtk.HPaned()
-		pane1.pack2(pane2, gtk.TRUE, gtk.FALSE)
-		
-		scroll1 = gtk.ScrolledWindow()
-		scroll1.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
-		pane1.pack1(scroll1, gtk.TRUE, gtk.FALSE)
-		scroll2 = gtk.ScrolledWindow()
-		scroll2.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
-		pane2.pack1(scroll2, gtk.TRUE, gtk.FALSE)
-		scroll3 = gtk.ScrolledWindow()
-		scroll3.set_policy(gtk.POLICY_AUTOMATIC, gtk.POLICY_AUTOMATIC)
-		pane2.pack2(scroll3, gtk.TRUE, gtk.FALSE)
-		
-		self.session_model = gtk.ListStore(gobject.TYPE_STRING)
-		view1 = gtk.TreeView(self.session_model)
-		column1 = gtk.TreeViewColumn('Sessions', gtk.CellRendererText(), text=0)
-		view1.append_column(column1)
-		self.session_selection = view1.get_selection()
-		self.session_selection.connect("changed", self.cb_session_selection_changed)
-		scroll1.add(view1)
-		
-		self.update_session_view()
-		
-		self.collab_model = gtk.ListStore(gobject.TYPE_STRING)
-		view2 = gtk.TreeView(self.collab_model)
-		column2 = gtk.TreeViewColumn('Collaborators', gtk.CellRendererText(), text=0)
-		view2.append_column(column2)
-		self.collab_selection = view2.get_selection()
-		self.collab_selection.connect("changed", self.cb_collab_selection_changed)
-		scroll2.add(view2)
-		
-		self.snap_model = gtk.ListStore(gobject.TYPE_STRING)
-		view3 = gtk.TreeView(self.snap_model)
-		column3 = gtk.TreeViewColumn('Snapshots', gtk.CellRendererText(), text=0)
-		view3.append_column(column3)
-		self.snap_selection = view3.get_selection()
-		self.snap_selection.connect("changed", self.cb_snap_selection_changed)
-		scroll3.add(view3)
-		
-		main_box.pack_start(pane1, gtk.TRUE, gtk.TRUE)
-		
-		self.status_bar = gtk.Statusbar()
-		main_box.pack_start(self.status_bar, gtk.FALSE)
-		self._status_cid = self.status_bar.get_context_id('display')
-		self._status_mid = ''
-		
-		self.window.show_all()
-
-def print_help():
-	print """
-	-h, --help
-	-n, --no-server          Only act as a client
-	-p, --port <port number> Defaults to 8970
-	-d, --debug              Infers audio files.  For debugging Ardour.
-	-v, --version            Version
-	"""
-	sys.exit(2)
-
-def main():
-	try:
-		opts, args = getopt.getopt(sys.argv[1:], "hp:ndv", ["help", "port=", "no-server", "debug", "version"])
-	except getopt.GetoptError:
-		print_help()
-	
-	server = True
-	for o, a in opts:
-		if o in ("-h", "--help"):
-			print_help()
-		if o in ("-d", "--debug"):
-			g_display.window.set_title('Session Exchange: Debug Mode')
-			g_data.debug_mode = True
-		if o in ("-p", "--port"):
-			g_data.port = int(a)
-		if o in ("-n", "--no-server"):
-			server = False
-		if o in ("-v", "--version"):
-			print VERSION
-			sys.exit(2)
-	
-	if (server):
-		try:
-			reactor.listenTCP(g_data.port, ExchangeServerFactory())
-		except twisted.internet.error.CannotListenError:
-			print "Can not listen on a port number under 1024 unless run as root"
-			sys.exit(2)
-
-	reactor.run()
-
-	g_data.close()
-
-# global objects
-g_data = Data()
-g_display = ArdourShareWindow()
-
-if __name__ == '__main__':
-	main()
diff --git a/debian/ardour2.1 b/debian/ardour4.1
similarity index 100%
rename from debian/ardour2.1
rename to debian/ardour4.1
diff --git a/debian/ardour4.xpm b/debian/ardour4.xpm
new file mode 100644
index 0000000..c9bdb7b
--- /dev/null
+++ b/debian/ardour4.xpm
@@ -0,0 +1,139 @@
+/* XPM */
+static char *ardour4[] = {
+/* columns rows colors chars-per-pixel */
+"32 32 101 2 ",
+"   c #240000",
+".  c #2D0202",
+"X  c #330505",
+"o  c #3B0505",
+"O  c #350808",
+"+  c #3C0A0A",
+"@  c #430606",
+"#  c #490707",
+"$  c #440A0A",
+"%  c #4D0A0A",
+"&  c #530B0B",
+"*  c #5C0C0C",
+"=  c #471111",
+"-  c #4B1414",
+";  c #501717",
+":  c #571C1C",
+">  c #5A1B1B",
+",  c #640D0D",
+"<  c #6D0F0F",
+"1  c #720F0F",
+"2  c #661313",
+"3  c #6D1111",
+"4  c #671B1B",
+"5  c #751111",
+"6  c #7B1212",
+"7  c #692727",
+"8  c #6D2929",
+"9  c #7C2020",
+"0  c #7D3333",
+"q  c #841414",
+"w  c #8B1515",
+"e  c #951717",
+"r  c #991717",
+"t  c #9B1818",
+"y  c #B40909",
+"u  c #BB0D0D",
+"i  c #A71A1A",
+"p  c #AA1C1C",
+"a  c #BA1D1D",
+"s  c #892020",
+"d  c #8E2A2A",
+"f  c #863737",
+"g  c #8B3E3E",
+"h  c #923838",
+"j  c #983939",
+"k  c #B12F2F",
+"l  c #A83D3D",
+"z  c #C00F0F",
+"x  c #C41212",
+"c  c #CD1616",
+"v  c #C01F1F",
+"b  c #C91C1C",
+"n  c #D71C1C",
+"m  c #D91C1C",
+"M  c #CA2222",
+"N  c #CD2B2B",
+"B  c #D22222",
+"V  c #DE2222",
+"C  c #D32C2C",
+"Z  c #CF3030",
+"A  c #D23535",
+"S  c #D43D3D",
+"D  c #E22222",
+"F  c #E82525",
+"G  c #EA2F2F",
+"H  c #E33636",
+"J  c #904040",
+"K  c #9D4747",
+"L  c #9D4949",
+"P  c #A54646",
+"I  c #A04B4B",
+"U  c #BE4C4C",
+"Y  c #B25656",
+"T  c #B95454",
+"R  c #B45858",
+"E  c #B95858",
+"W  c #D74444",
+"Q  c #D94747",
+"!  c #D94C4C",
+"~  c #C05E5E",
+"^  c #DB5353",
+"/  c #DE5C5C",
+"(  c #E14242",
+")  c #EA4646",
+"_  c #E74F4F",
+"`  c #E85757",
+"'  c #EA5C5C",
+"]  c #C46161",
+"[  c #D36666",
+"{  c #DE6262",
+"}  c #D06868",
+"|  c #DC6E6E",
+" . c #DE7070",
+".. c #E16464",
+"X. c #EA6060",
+"o. c #E36B6B",
+"O. c #EC6C6C",
+"+. c #E67373",
+"@. c #EA7373",
+"#. c #EE7A7A",
+"$. c None",
+/* pixels */
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.5 5 $.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.O O 6 $.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.& L P & $.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.w = #. at .= 5 $.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.+ ~ @...U + + $.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.5 7 #.o.{ ' 4 3 $.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.X  .o.../ ^ Q . * $.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$., g @.o./ ^ Q ( s * $.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.e + #.o./ / ! S A H o 5 $.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$ Y +.{ / ^ W S Z C p @ $.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.q : #.o.{ ^ Q S A B b D & 1 $.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.O } o.{ / ! W A Z b c c M . % $.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.3 0 #.{ / ^ W S Z M b x x m 6 , $.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.O +.o.{ ^ ! S Z N b x x u x V . < $.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.& I +.O.X.` _ ) N M c x x x z c t & $.$.$.$.$.$.$.$.",
+"$.$.$.$.$.w - #. at .j l !   k H G n x u x z u D @ 5 $.$.$.$.$.$.$.",
+"$.$.$.$.$.+ ] +.O.. + d @ , C o M c u u u u x a o + $.$.$.$.$.$.",
+"$.$.$.$.5 8 +.. h X   9 , & q . w F D x u u y m , < $.$.$.$.$.$.",
+"$.$.$.$.O  .T   > @ X 2 5 @ 1 & 6 < , m u u y u B X * $.$.$.$.$.",
+"$.$.$., J [ P . = & o % q O * 3 < . . D x u y y c w * $.$.$.$.$.",
+"$.$.t + Y   ; @   * @ + w . @ 6 & . o B i V u y y V o 3 $.$.$.$.",
+"$.$.$ R f O   *   , #   w   o q o @ # w   q D n y u p @ $.$.$.$.",
+"$.q : E > 6   5   1 ,   6 @ X w . & & # # , w r n u m & 1 $.$.$.",
+"$.O g     t & & * + 5 q $.w w & & q 5   q . X O a b m M O % $.$.",
+"&     w q $.$.$.$.$.$.$.$.$.$.$.$.$.$.w $.1 e @ o O w q # , $.$.",
+"O , $.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$., < < < & & $.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.",
+"$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$.$."
+};
diff --git a/debian/ardourino.template b/debian/ardourino.template
deleted file mode 100644
index 9dcca4d..0000000
--- a/debian/ardourino.template
+++ /dev/null
@@ -1,73 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<Session version="2.0.0" id-counter="136">
-  <Config>
-    <Option name="output-auto-connect" value="2"/>
-    <Option name="input-auto-connect" value="1"/>
-    <Option name="mtc-port-name" value="control"/>
-    <Option name="mmc-port-name" value="control"/>
-    <Option name="midi-port-name" value="control"/>
-    <Option name="meter-falloff" value="32"/>
-    <end-marker-is-free val="no"/>
-  </Config>
-  <Sources/>
-  <Regions/>
-  <DiskStreams/>
-  <Locations>
-    <Location id="137" name="start" start="0" end="0" flags="IsMark,IsStart"/>
-    <Location id="138" name="end" start="13230000" end="13230000" flags="IsMark,IsEnd"/>
-  </Locations>
-  <Connections/>
-  <Routes>
-    <Route flags="MasterOut" default-type="audio" active="yes" muted="no" soloed="no" phase-invert="no" denormal-protection="no" mute-affects-pre-fader="yes" mute-affects-post-fader="yes" mute-affects-control-outs="yes" mute-affects-main-outs="yes" order-keys="editor=0:signal=0">
-      <IO name="master" id="18" inputs="{}{}" outputs="{alsa_pcm:playback_1}{alsa_pcm:playback_2}" gain="1.000000000000" iolimits="-1,2,-1,2">
-        <Panner linked="no" link_direction="SameDirection" bypassed="no">
-          <Output x="0" y="0"/>
-          <Output x="1" y="0"/>
-          <StreamPanner x="0" type="Equal Power Stereo" muted="no">
-            <Automation>
-              <AutomationList id="47" default="0" min_yval="0" max_yval="1" max_xval="0" state="Off" style="Absolute"/>
-            </Automation>
-            <controllable name="panner" id="46"/>
-          </StreamPanner>
-          <StreamPanner x="1" type="Equal Power Stereo" muted="no">
-            <Automation>
-              <AutomationList id="50" default="1" min_yval="0" max_yval="1" max_xval="0" state="Off" style="Absolute"/>
-            </Automation>
-            <controllable name="panner" id="49"/>
-          </StreamPanner>
-        </Panner>
-        <controllable name="gaincontrol" id="19"/>
-      </IO>
-      <controllable name="solo" id="22"/>
-      <controllable name="mute" id="23"/>
-      <remote_control id="1"/>
-      <extra>
-        <GUI color="22846:21313:45729" track_height="normal" shown_editor="no" shown_mixer="yes">
-          <gain track_height="normal" shown="no"/>
-          <pan track_height="normal" shown="no"/>
-        </GUI>
-      </extra>
-    </Route>
-  </Routes>
-  <EditGroups/>
-  <MixGroups/>
-  <Playlists/>
-  <UnusedPlaylists/>
-  <Click>
-    <IO name="click" id="24" inputs="" outputs="{alsa_pcm:playback_1}" gain="1.000000000000" iolimits="0,0,-1,-1">
-      <Panner linked="no" link_direction="SameDirection" bypassed="no"/>
-      <controllable name="gaincontrol" id="25"/>
-    </IO>
-  </Click>
-  <TempoMap>
-    <Tempo start="1|1|0" beats-per-minute="120.000000" movable="no"/>
-    <Meter start="1|1|0" note-type="4.000000" beats-per-bar="4.000000" movable="no"/>
-  </TempoMap>
-  <ControlProtocols>
-    <Protocol name="Mackie" active="no"/>
-    <Protocol name="Generic MIDI" active="no"/>
-  </ControlProtocols>
-  <extra>
-    <RulerVisibility smpte="no" bbt="no" frames="no" minsec="no" tempo="no" meter="no" marker="no" rangemarker="no" transportmarker="yes"/>
-  </extra>
-</Session>
diff --git a/debian/control b/debian/control
index a55fae7..5925ad1 100644
--- a/debian/control
+++ b/debian/control
@@ -2,48 +2,56 @@ Source: ardour
 Section: sound
 Priority: optional
 Maintainer: Debian Multimedia Maintainers <pkg-multimedia-maintainers at lists.alioth.debian.org>
-Uploaders: Adrian Knoth <adi at drcomp.erfurt.thur.de>,
- Free Ekanayaka <freee at debian.org>,
- Jaromír Mikeš <mira.mikes at seznam.cz>,
- Jonas Smedegaard <dr at jones.dk>
-Build-Depends: cdbs,
+Uploaders:
+ Adrian Knoth <adi at drcomp.erfurt.thur.de>,
+ Jaromír Mikeš <mira.mikes at seznam.cz>
+Build-Depends:
+ cdbs,
  autotools-dev,
  devscripts,
- debhelper (>= 9),
+ debhelper (>= 9~),
  dh-buildinfo,
  gettext,
  intltool,
- scons,
- libboost-dev,
- libcurl4-gnutls-dev,
- libfftw3-dev,
- libraptor1-dev (>= 1.4.21-5),
- liblrdf0-dev (>= 0.4.0-4),
- libsigc++-2.0-dev,
+ libboost-dev (>= 1.49.0),
+ libcurl4-gnutls-dev (>= 7.25.0),
+ libfftw3-dev (>= 3.3.1),
+ libraptor2-dev (>= 2.0.9),
+ librdf0-dev (>= 1.0.15),
+ liblrdf0-dev (>= 0.4.0),
+ libserd-dev (>= 0.18.2~),
+ libsord-dev (>= 0.12.0~),
+ libsuil-dev (>= 0.6.10~),
+ liblilv-dev,
+ libsratom-dev (>= 0.4.2~),
+ libsigc++-2.0-dev (>= 2.2.10),
  libusb-dev,
+ uuid-dev,
  libxml2-dev (>= 2.5.7),
- librasqal3-dev | librasqal2-dev (>= 0.9.14),
- libcairomm-1.0-dev (>= 1.2.4),
- libglade2-dev,
- libglademm-2.4-dev,
- libglib2.0-dev,
- libgnomecanvas2-dev,
- libgnomecanvasmm-2.6-dev,
- libgtkmm-2.4-dev,
- libpango1.0-dev,
+ librasqal3-dev (>= 0.9.28),
+ libcwiid-dev,
+ libcairomm-1.0-dev (>= 1.10.0),
+ libgnomecanvas2-dev (>= 2.30.3),
+ libgnomecanvasmm-2.6-dev (>= 2.26.0),
+ libgtkmm-2.4-dev (>= 2.24.2),
+ libpangomm-1.4-dev (>= 2.28.4),
  ladspa-sdk (>= 1.1-2),
- libasound2-dev (>= 0.9.4) [linux-any],
- liboss-salsa-dev [!linux-any],
- libaubio-dev (>= 0.4.0),
+ libasound2-dev (>= 0.9.4),
+ libaubio-dev (>= 0.3.2),
  libjack-dev,
- liblo-dev,
- libsuil-dev,
- libsamplerate0-dev,
- libsndfile1-dev,
+ liblo-dev (>= 0.26~),
+ libltc-dev,
+ librubberband-dev,
+ libsamplerate0-dev (>= 0.1.8),
+ libsndfile1-dev (>= 1.0.25),
  libsoundtouch-dev (>= 1.5.0),
- lv2-dev,
- liblilv-dev,
- vamp-plugin-sdk (>=2.1)
+ libtagc0-dev,
+ lv2-dev (>= 1.2.0),
+ vamp-plugin-sdk (>=2.1),
+ python-setuptools,
+ python-isodate,
+ libpcre3-dev,
+ python-rdflib
 Standards-Version: 3.9.6
 Homepage: http://www.ardour.org/
 Vcs-Git: git://anonscm.debian.org/pkg-multimedia/ardour.git
@@ -51,18 +59,23 @@ Vcs-Browser: http://anonscm.debian.org/gitweb/?p=pkg-multimedia/ardour.git
 
 Package: ardour
 Architecture: any
-Depends: ${shlibs:Depends},
- ${python:Depends},
+Depends:
+ ${cdbs:Depends},
  ${misc:Depends},
- ${cdbs:Depends}
-Recommends: ${cdbs:Recommends}
-Suggests: ${shlibs:Suggests}
-Conflicts: ${cdbs:Conflicts}
-Replaces: ${cdbs:Replaces}
+ ${python:Depends},
+ ${shlibs:Depends}
+Recommends:
+ ${cdbs:Recommends}
+Suggests:
+ ${shlibs:Suggests}
+Conflicts:
+ ${cdbs:Conflicts}
+Replaces:
+ ${cdbs:Replaces}
 Description: digital audio workstation (graphical gtk2 interface)
  Ardour is a multichannel hard disk recorder (HDR) and digital audio
  workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. 
+ complex audio setups.
  .
  Ardour supports pro-audio interfaces
  through the ALSA project, which provides high quality, well designed
@@ -113,65 +126,3 @@ Description: digital audio workstation (graphical gtk2 interface)
  .
  Further information can be
  found at <http://ardour.org/>.
-
-Package: ardour-dbg
-Section: debug
-Priority: extra
-Architecture: any
-Depends: ardour (= ${binary:Version}) | ardour-altivec (= ${binary:Version}) | ardour-i686 (= ${binary:Version}),
- ${shlibs:Depends},
- ${misc:Depends},
-Suggests: ${shlibs:Suggests}
-Description: digital audio workstation (debug)
- Ardour is a multichannel hard disk recorder (HDR) and digital audio
- workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. 
- .
- This package contains the debugging symbols.
-
-Package: ardour-altivec
-Architecture: powerpc
-Depends: ${shlibs:Depends},
- ${python:Depends},
- ${misc:Depends},
- ${cdbs:Depends}
-Recommends: ${cdbs:Recommends}
-Suggests: ${shlibs:Suggests}
-Provides: ${cdbs:Provides}
-Conflicts: ${cdbs:Conflicts}
-Replaces: ${cdbs:Replaces}
-Description: digital audio workstation (graphical gtk2 interface) [altivec]
- Ardour is a multichannel hard disk recorder (HDR) and digital audio
- workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. For more information see the description
- of the ardour package or <http://ardour.org/>.
- .
- This package is optimized for altivec and will not run on
- subarchitectures that don't support features enabled in altivec.
- It might fail with weird error SIGILLs and other non-obvious failures.
- Please refrain from filling bugs to the upstream author about this package
- that are not reproducible in the non-optimized package.
-
-Package: ardour-i686
-Architecture: i386
-Depends: ${shlibs:Depends},
- ${python:Depends},
- ${misc:Depends},
- ${cdbs:Depends}
-Recommends: ${cdbs:Recommends}
-Suggests: ${shlibs:Suggests}
-Provides: ${cdbs:Provides}
-Conflicts: ${cdbs:Conflicts}
-Replaces: ${cdbs:Replaces}
-Description: digital audio workstation (graphical gtk2 interface) [i686]
- Ardour is a multichannel hard disk recorder (HDR) and digital audio
- workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. For more information see the description
- of the ardour package or <http://ardour.org/>.
- .
- This package is optimized for i686 and will not run on
- subarchitectures that don't support features enabled in i686.
- It might fail with weird error SIGILLs and other non-obvious failures.
- Please refrain from filling bugs to the upstream author about this package
- that are not reproducible in the non-optimized package.
-
diff --git a/debian/control.in b/debian/control.in
index 7fd00a3..ca601cb 100644
--- a/debian/control.in
+++ b/debian/control.in
@@ -2,30 +2,35 @@ Source: ardour
 Section: sound
 Priority: optional
 Maintainer: Debian Multimedia Maintainers <pkg-multimedia-maintainers at lists.alioth.debian.org>
-Uploaders: Adrian Knoth <adi at drcomp.erfurt.thur.de>,
- Free Ekanayaka <freee at debian.org>,
- Jaromír Mikeš <mira.mikes at seznam.cz>,
- Jonas Smedegaard <dr at jones.dk>
-Build-Depends: @cdbs@
-Standards-Version: 3.9.4
+Uploaders:
+ Adrian Knoth <adi at drcomp.erfurt.thur.de>,
+ Jaromír Mikeš <mira.mikes at seznam.cz>
+Build-Depends:
+ @cdbs@
+Standards-Version: 3.9.6
 Homepage: http://www.ardour.org/
 Vcs-Git: git://anonscm.debian.org/pkg-multimedia/ardour.git
 Vcs-Browser: http://anonscm.debian.org/gitweb/?p=pkg-multimedia/ardour.git
 
 Package: ardour
 Architecture: any
-Depends: ${shlibs:Depends},
- ${python:Depends},
+Depends:
+ ${cdbs:Depends},
  ${misc:Depends},
- ${cdbs:Depends}
-Recommends: ${cdbs:Recommends}
-Suggests: ${shlibs:Suggests}
-Conflicts: ${cdbs:Conflicts}
-Replaces: ${cdbs:Replaces}
+ ${python:Depends},
+ ${shlibs:Depends}
+Recommends:
+ ${cdbs:Recommends}
+Suggests:
+ ${shlibs:Suggests}
+Conflicts:
+ ${cdbs:Conflicts}
+Replaces:
+ ${cdbs:Replaces}
 Description: digital audio workstation (graphical gtk2 interface)
  Ardour is a multichannel hard disk recorder (HDR) and digital audio
  workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. 
+ complex audio setups.
  .
  Ardour supports pro-audio interfaces
  through the ALSA project, which provides high quality, well designed
@@ -76,50 +81,3 @@ Description: digital audio workstation (graphical gtk2 interface)
  .
  Further information can be
  found at <http://ardour.org/>.
-
-Package: ardour-altivec
-Architecture: powerpc
-Depends: ${shlibs:Depends},
- ${python:Depends},
- ${misc:Depends},
- ${cdbs:Depends}
-Recommends: ${cdbs:Recommends}
-Suggests: ${shlibs:Suggests}
-Provides: ${cdbs:Provides}
-Conflicts: ${cdbs:Conflicts}
-Replaces: ${cdbs:Replaces}
-Description: digital audio workstation (graphical gtk2 interface) [altivec]
- Ardour is a multichannel hard disk recorder (HDR) and digital audio
- workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. For more information see the description
- of the ardour package or <http://ardour.org/>.
- .
- This package is optimized for altivec and will not run on
- subarchitectures that don't support features enabled in altivec.
- It might fail with weird error SIGILLs and other non-obvious failures.
- Please refrain from filling bugs to the upstream author about this package
- that are not reproducible in the non-optimized package.
-
-Package: ardour-i686
-Architecture: i386
-Depends: ${shlibs:Depends},
- ${python:Depends},
- ${misc:Depends},
- ${cdbs:Depends}
-Recommends: ${cdbs:Recommends}
-Suggests: ${shlibs:Suggests}
-Provides: ${cdbs:Provides}
-Conflicts: ${cdbs:Conflicts}
-Replaces: ${cdbs:Replaces}
-Description: digital audio workstation (graphical gtk2 interface) [i686]
- Ardour is a multichannel hard disk recorder (HDR) and digital audio
- workstation (DAW).  It can be used to control, record, edit and run
- complex audio setups. For more information see the description
- of the ardour package or <http://ardour.org/>.
- .
- This package is optimized for i686 and will not run on
- subarchitectures that don't support features enabled in i686.
- It might fail with weird error SIGILLs and other non-obvious failures.
- Please refrain from filling bugs to the upstream author about this package
- that are not reproducible in the non-optimized package.
-
diff --git a/debian/copyright b/debian/copyright
index 7862a79..59ba8c1 100644
--- a/debian/copyright
+++ b/debian/copyright
@@ -5,472 +5,888 @@ Source: http://ardour.org/download
 Copyright: 1998-2010, Paul Davis
 
 Files: *
-Copyright: 1998-2010, Paul Davis
+Copyright:
+ 1998-2010 Paul Davis
 License: GPL-2+
 
 Files: debian/*
 Copyright:
- 2006-2009 Free Ekanayaka <freee at debian.org>
- 2009-2014 Adrian Knoth <adi at drcomp.erfurt.thur.de>
- 2010-2014 Jonas Smedegaard <dr at jones.dk>
- 2011-2014 Jaromír Mikeš <mira.mikes at seznam.cz>
+ 2013-2015 Adrian Knoth <adi at drcomp.erfurt.thur.de>
+ 2013-2015 Jaromír Mikeš <mira.mikes at seznam.cz>
 License: GPL-2+
 
-Files: ./libs/glibmm2/*
-	./libs/gtkmm2/*
-Copyright: 1998-2004, The gtkmm Development Team
-License: LGPL-2+
-
-Files: ./libs/vamp-*
-Copyright: 2000-2008, Chris Cannam
-	2006-2008, Queen Mary, University of London
+Files: libs/vamp-*
+Copyright:
+ 2000-2008 Chris Cannam
+ 2006-2008 Queen Mary, University of London
 License: Expat and other-nopromo-Chris
 
-Files: ./libs/rubberband/*
-	./libs/vamp-plugins/Onset.*
-Copyright: 2006-2008, Chris Cannam
+Files:
+ libs/vamp-plugins/Onset.*
+Copyright:
+ 2006-2008 Chris Cannam
 License: GPL-2+
 
-Files: ./libs/appleutility/*
-Copyright: 2005, Apple Computer, Inc.
+Files:
+ libs/appleutility/*
+Copyright:
+ 2005 Apple Computer, Inc.
 License: other-Apple
 
-Files: ./libs/sigc++2/*
-Copyright: 2002-2003, 2005, The libsigc++ Development Team
-License: LGPL-2.1+
+Files: libs/surfaces/tranzport/*
+Copyright:
+ 2006 Paul Davis <pbd at op.net>
+ 2007 Michael Taht
+License: GPL-2+
 
-Files: ./libs/surfaces/tranzport/*
-Copyright: 2006, Paul Davis <pbd at op.net>
-	2007, Michael Taht
+Files: libs/surfaces/mackie/*
+Copyright:
+ 2006-2008 John Anderson
 License: GPL-2+
 
-Files: ./libs/soundtouch/*
-Copyright: Olli Parviainen
-License: LGPL-2.1+
+Files:
+ libs/surfaces/mackie/mcp_buttons.cc
+ libs/surfaces/mackie/led.cc
+ libs/surfaces/mackie/led.h
+ libs/surfaces/mackie/controls.h
+ libs/surfaces/mackie/jog.cc
+ libs/surfaces/mackie/meter.cc
+ libs/surfaces/mackie/button.h
+ libs/surfaces/mackie/fader.cc
+ libs/surfaces/mackie/meter.h
+ libs/surfaces/mackie/device_profile.cc
+ libs/surfaces/mackie/device_info.h
+ libs/surfaces/mackie/pot.cc
+ libs/surfaces/mackie/device_info.cc
+ libs/surfaces/mackie/button.cc
+ libs/surfaces/mackie/strip.cc
+ libs/surfaces/mackie/jog.h
+ libs/surfaces/mackie/pot.h
+ libs/surfaces/mackie/mackie_control_protocol.cc
+Copyright:
+ 2006-2008 John Anderson
+ 2012 Paul Davis
+License: GPL-2+
 
-Files: ./libs/surfaces/mackie/*
-Copyright: 2006-2008, John Anderson
+Files:
+ libs/ardour/ardour/ticker.h
+ libs/ardour/ardour/midi_patch_manager.h
+ libs/ardour/ticker.cc
+ libs/ardour/midi_patch_manager.cc
+ libs/midi++2/midnam_patch.cc
+Copyright:
+ 2008 Hans Baier
 License: GPL-2+
 
-Files: ./libs/surfaces/control_protocol/*
-Copyright: 2006, Paul Davis <pbd at op.net>
-License: LGPL-2+
+Files:
+ libs/backends/alsa/zita-alsa-pcmi.h
+ libs/backends/alsa/zita-alsa-pcmi.cc
+Copyright:
+ 2006-2012 Fons Adriaensen <fons at linuxaudio.org>
+License: GPL-3+
+
+Files:
+ libs/ardour/vumeterdsp.cc
+ libs/ardour/kmeterdsp.cc
+ libs/ardour/iec2ppmdsp.cc
+ libs/ardour/ardour/iec1ppmdsp.h
+ libs/ardour/ardour/kmeterdsp.h
+ libs/ardour/ardour/vumeterdsp.h
+ libs/ardour/ardour/iec2ppmdsp.h
+Copyright:
+ 2008-2012 Fons Adriaensen <fons at linuxaudio.org>
+License: GPL-2+
 
-Files: ./libs/gtkmm2ext/auto_spin.cc
-	./libs/gtkmm2ext/choice.cc
-	./libs/gtkmm2ext/click_box.cc
-	./libs/gtkmm2ext/gtk_ui.cc
-	./libs/gtkmm2ext/gtkmm2ext/auto_spin.h
-	./libs/gtkmm2ext/gtkmm2ext/click_box.h
-	./libs/gtkmm2ext/gtkmm2ext/gtk_ui.h
-	./libs/gtkmm2ext/gtkmm2ext/gtkutils.h
-	./libs/gtkmm2ext/gtkmm2ext/popup.h
-	./libs/gtkmm2ext/gtkmm2ext/prompter.h
-	./libs/gtkmm2ext/gtkmm2ext/selector.h
-	./libs/gtkmm2ext/gtkmm2ext/tearoff.h
-	./libs/gtkmm2ext/gtkmm2ext/textviewer.h
-	./libs/gtkmm2ext/gtkmm2ext/utils.h
-	./libs/gtkmm2ext/popup.cc
-	./libs/gtkmm2ext/prompter.cc
-	./libs/gtkmm2ext/selector.cc
-	./libs/gtkmm2ext/tearoff.cc
-	./libs/gtkmm2ext/textviewer.cc
-	./libs/gtkmm2ext/utils.cc
-	./libs/midi++2/fd_midiport.cc
-	./libs/midi++2/fifomidi.cc
-	./libs/midi++2/midi++/alsa_rawmidi.h
-	./libs/midi++2/midi++/channel.h
-	./libs/midi++2/midi++/factory.h
-	./libs/midi++2/midi++/fd_midiport.h
-	./libs/midi++2/midi++/fifomidi.h
-	./libs/midi++2/midi++/manager.h
-	./libs/midi++2/midi++/mmc.h
-	./libs/midi++2/midi++/nullmidi.h
-	./libs/midi++2/midi++/parser.h
-	./libs/midi++2/midi++/port.h
-	./libs/midi++2/midi.cc
-	./libs/midi++2/midichannel.cc
-	./libs/midi++2/midifactory.cc
-	./libs/midi++2/midimanager.cc
-	./libs/midi++2/midiparser.cc
-	./libs/midi++2/midiport.cc
-	./libs/midi++2/mmc.cc
-	./libs/midi++2/mtc.cc
-	./libs/pbd/pathscanner.cc
-	./libs/pbd/pbd/abstract_ui.h
-	./libs/pbd/pbd/mountpoint.h
-	./libs/pbd/pbd/pool.h
-	./libs/pbd/pbd/receiver.h
-	./libs/pbd/pbd/selectable.h
-	./libs/pbd/pbd/stl_delete.h
-	./libs/pbd/pbd/stl_functors.h
-	./libs/pbd/pbd/textreceiver.h
-	./libs/pbd/pbd/thrown_error.h
-	./libs/pbd/pbd/touchable.h
-	./libs/pbd/pbd/transmitter.h
-	./libs/pbd/pool.cc
-	./libs/pbd/receiver.cc
-	./libs/pbd/textreceiver.cc
-	./libs/pbd/transmitter.cc
-Copyright: 1998-2005, Paul Barton-Davis
-License: GPL-2+
-
-Files: ./gtk2_ardour/imageframe.cc
-	./gtk2_ardour/simpleline.cc
-	./gtk2_ardour/simplerect.cc
-	./gtk2_ardour/waveview.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/canvas.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/ellipse.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/group.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/item.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/line.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/rect-ellipse.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/rect.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/text.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/widget.cc
-Copyright: 1998, EMC Capital Management Inc
+Files:
+ libs/surfaces/control_protocol/*
+ gtk2_ardour/canvas-waveview.h
+Copyright:
+ 2006 Paul Davis <pbd at op.net>
 License: LGPL-2+
 
-Files: ./gtk2_ardour/imageframe.h
-	./gtk2_ardour/simpleline.h
-	./gtk2_ardour/simplerect.h
-	./gtk2_ardour/waveview.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/canvas.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/ellipse.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/group.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/item.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/line.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/rect-ellipse.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/rect.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/text.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/widget.h
-Copyright: 1998, EMC Capital Management Inc
-	1999, The Gtk-- Development Team
-License: LGPL-2+
+Files:
+ libs/ardour/midiport_manager.cc
+ libs/ardour/async_midi_port.cc
+ libs/ardour/ardour/midiport_manager.h
+ libs/ardour/ardour/async_midi_port.h
+ libs/gtkmm2ext/auto_spin.cc
+ libs/gtkmm2ext/choice.cc
+ libs/gtkmm2ext/click_box.cc
+ libs/gtkmm2ext/gtk_ui.cc
+ libs/gtkmm2ext/gtkmm2ext/auto_spin.h
+ libs/gtkmm2ext/gtkmm2ext/click_box.h
+ libs/gtkmm2ext/gtkmm2ext/gtk_ui.h
+ libs/gtkmm2ext/gtkmm2ext/gtkutils.h
+ libs/gtkmm2ext/gtkmm2ext/popup.h
+ libs/gtkmm2ext/gtkmm2ext/prompter.h
+ libs/gtkmm2ext/gtkmm2ext/selector.h
+ libs/gtkmm2ext/gtkmm2ext/tearoff.h
+ libs/gtkmm2ext/gtkmm2ext/textviewer.h
+ libs/gtkmm2ext/gtkmm2ext/utils.h
+ libs/gtkmm2ext/popup.cc
+ libs/gtkmm2ext/prompter.cc
+ libs/gtkmm2ext/selector.cc
+ libs/gtkmm2ext/tearoff.cc
+ libs/gtkmm2ext/textviewer.cc
+ libs/gtkmm2ext/utils.cc
+ libs/midi++2/midi++/channel.h
+ libs/midi++2/midi++/mmc.h
+ libs/midi++2/midi++/parser.h
+ libs/midi++2/midi++/port.h
+ libs/midi++2/midi.cc
+ libs/midi++2/mmc.cc
+ libs/midi++2/mtc.cc
+ libs/midi++2/midi++/ipmidi_port.h
+ libs/midi++2/port.cc
+ libs/midi++2/parser.cc
+ libs/midi++2/channel.cc
+ libs/pbd/pbd/abstract_ui.h
+ libs/pbd/pbd/mountpoint.h
+ libs/pbd/pbd/pool.h
+ libs/pbd/pbd/receiver.h
+ libs/pbd/pbd/selectable.h
+ libs/pbd/pbd/stl_delete.h
+ libs/pbd/pbd/stl_functors.h
+ libs/pbd/pbd/textreceiver.h
+ libs/pbd/pbd/thrown_error.h
+ libs/pbd/pbd/touchable.h
+ libs/pbd/pbd/transmitter.h
+ libs/pbd/pool.cc
+ libs/pbd/receiver.cc
+ libs/pbd/textreceiver.cc
+ libs/pbd/transmitter.cc
+ libs/pbd/receiver.cc
+Copyright:
+ 1998-2005 Paul Barton-Davis
+License: GPL-2+
 
-Files: ./libs/libgnomecanvasmm/libgnomecanvasmm/bpath.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/bpath.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/path-def.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/path-def.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/pixbuf.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/pixbuf.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/rich-text.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/rich-text.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/shape.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/shape.h
-Copyright: 2002, The libgnomecanvasmm Development Team
-License: LGPL-2+
+Files:
+ gtk2_ardour/bundle_env_msvc.cc
+ libs/backends/wavesaudio/wavesapi/MiscUtils/pthread_utils.h
+ libs/ardour/msvc/msvc_libardour.cc
+ libs/ardour/ardour/msvc_libardour.h
+ libs/pbd/msvc/msvc_pbd.cc
+ libs/pbd/msvc/msvc_poll.cc
+ libs/pbd/pbd/msvc_pbd.h
+ libs/pbd/pbd/windows_special_dirs.h
+ libs/pbd/windows_special_dirs.cc
+Copyright:
+ 2008-2014 John Emmas
+License: GPL-2+
 
-Files: ./libs/gtkmm2ext/application.cc
-	./libs/gtkmm2ext/gtkapplication.c
-	./libs/gtkmm2ext/gtkapplication_quartz.mm
-	./libs/gtkmm2ext/gtkapplication_x11.c
-	./libs/gtkmm2ext/gtkmm2ext/gtkapplication-private.h
-	./libs/gtkmm2ext/gtkmm2ext/gtkapplication.h
-Copyright: 2007, Imendio AB
-	2007, Pioneer Research Center USA, Inc
-	2009, Paul Davis <pbd at op.net>
-License: LGPL-2.1
+Files:
+ gtk2_ardour/nsmclient.cc
+ gtk2_ardour/nsm.h
+ gtk2_ardour/nsmclient.h
+ gtk2_ardour/nsm.cc
+Copyright:
+ 2012 Jonathan Moore Liles
+License: GPL-2+
 
-Files: ./libs/gtkmm2/pango/pangomm/attriter.cc
-	./libs/gtkmm2/pango/pangomm/attrlist.cc
-	./libs/gtkmm2/pango/pangomm/context.cc
-	./libs/gtkmm2/pango/pangomm/coverage.cc
-	./libs/gtkmm2/pango/pangomm/font.cc
-	./libs/gtkmm2/pango/pangomm/fontmetrics.cc
-Copyright: 1998-1999, The Gtk-- Development Team
-	2001, Free Software Foundation
-License: LGPL-2+
+Files:
+ libs/gtkmm2ext/paths_dialog.cc
+ libs/gtkmm2ext/gtkmm2ext/paths_dialog.h
+ libs/plugins/reasonablesynth.lv2/rsynth.c
+ libs/plugins/reasonablesynth.lv2/lv2.c
+ libs/ardouralsautil/request_device.c
+ libs/backends/jack/weak_libjack.c
+ libs/backends/asio/rt_thread.h
+ libs/backends/alsa/alsa_midi.cc
+ libs/backends/alsa/alsa_midi.h
+ libs/backends/alsa/alsa_sequencer.cc
+ libs/backends/alsa/rt_thread.h
+ libs/backends/alsa/select_sleep.h
+ libs/backends/dummy/dummy_midi_seq.h
+ libs/backends/coreaudio/coremidi_io.cc
+ libs/backends/coreaudio/coreaudio_pcmio.h
+ libs/backends/coreaudio/coremidi_io.h
+ libs/backends/coreaudio/rt_thread.h
+ libs/backends/coreaudio/coreaudio_pcmio.cc
+ libs/backends/portaudio/portaudio_io.cc
+ libs/backends/portaudio/portaudio_io.h
+ libs/backends/portaudio/rt_thread.h
+ libs/backends/alsa/alsa_sequencer.h
+ libs/backends/alsa/alsa_rawmidi.h
+ libs/ardour/ardour/mididm.h
+ libs/ardour/mididm.cc
+ libs/vfork/exec_wrapper.c
+Copyright:
+ 2013-2015 Robin Gareus <robin at gareus.org>
+License: GPL-2+
 
-Files: ./gtk2_ardour/canvas-curve.h
-	./gtk2_ardour/canvas-ruler.h
-	./gtk2_ardour/canvas-simpleline.h
-	./gtk2_ardour/canvas-simplerect.h
-	./gtk2_ardour/canvas-waveview.h
-Copyright: 2001, 2003, Paul Davis <pbd at op.net>
-License: LGPL-2+
+Files:
+ libs/canvas/xfade_curve.cc
+ libs/canvas/canvas/xfade_curve.h
+ libs/ardouralsautil/devicelist.cc
+ libs/ardouralsautil/ardouralsautil/devicelist.h
+ libs/backends/asio/asio_backend.h
+ libs/backends/asio/asio_backend.cc
+ libs/backends/alsa/alsa_audiobackend.h
+ libs/backends/alsa/alsa_audiobackend.cc
+ libs/backends/dummy/dummy_audiobackend.cc
+ libs/backends/dummy/dummy_audiobackend.h
+ libs/backends/coreaudio/coreaudio_backend.cc
+ libs/backends/coreaudio/coreaudio_backend.h
+ libs/backends/portaudio/portaudio_backend.cc
+ libs/backends/portaudio/portaudio_backend.h
+ libs/ardour/ardour/system_exec.h
+ libs/ardour/ardour/delayline.h
+ libs/ardour/delayline.cc
+ libs/pbd/pbd/system_exec.h
+Copyright:
+ 2013-2015 Robin Gareus <robin at gareus.org>
+ 2006-2013 Paul Davis
+License: GPL-2+
 
-Files: ./libs/clearlooks-newer/clearlooks_rc_style.c
-	./libs/clearlooks-newer/clearlooks_rc_style.h
-	./libs/clearlooks-older/clearlooks_rc_style.c
-	./libs/clearlooks-older/clearlooks_rc_style.h
-	./libs/clearlooks-older/clearlooks_style.h
-Copyright: 2005, Richard Stellingwerff
-License: LGPL-2+
+Files:
+ libs/gtkmm2ext/emscale.cc
+ libs/gtkmm2ext/gtkmm2ext/emscale.h
+Copyright:
+ 2014 Paul Davis, Robin Gareus
+License: GPL-2+
 
-Files: ./libs/ardour/ardour/gdither.h
-	./libs/ardour/ardour/gdither_types.h
-	./libs/ardour/ardour/gdither_types_internal.h
-	./libs/ardour/gdither.cc
-Copyright: 2002, Steve Harris <steve at plugin.org.uk>
+Files:
+ libs/gtkmm2ext/gtkmm2ext/rgb_macros.h
+Copyright:
+ 2000 EMC Capital Management, Inc
 License: GPL-2+
 
-Files: ./gtk2_ardour/gettext.h
-	./libs/ardour/gettext.h
-	./libs/gtkmm2ext/gettext.h
-	./libs/pbd/gettext.h
-Copyright: 1995-1998, 2000-2002, Free Software Foundation, Inc
-License: LGPL-2+
+Files:
+ gtk2_ardour/timers.cc
+ gtk2_ardour/timers.h
+ libs/backends/jack/jack_utils.h
+ libs/ardour/session_state_utils.cc
+ libs/ardour/ardour/session_state_utils.h
+ libs/ardour/ardour/jack_utils.h
+ libs/ardour/ardour/session_directory.h
+ libs/ardour/ardour/tape_file_matcher.h
+ libs/ardour/ardour/filesystem_paths.h
+ libs/ardour/test/test_ui.h
+ libs/ardour/test/test_ui.cc
+ libs/ardour/session_directory.cc
+ libs/ardour/filesystem_paths.cc
+ libs/ardour/test/test_util.cc
+ libs/pbd/search_path.cc
+ libs/pbd/timer.cc
+ libs/pbd/resource.cc
+ libs/pbd/ffs.cc
+ libs/pbd/glib_semaphore.cc
+ libs/pbd/test/test_common.h
+ libs/pbd/test/test_common.cc
+ libs/pbd/pbd/timing.h
+ libs/pbd/pbd/file_utils.h
+ libs/pbd/pbd/timer.h
+ libs/pbd/pbd/glib_semaphore.h
+ libs/pbd/pbd/ffs.h
+ libs/pbd/pbd/resource.h
+ libs/pbd/pbd/search_path.h
+ libs/pbd/pbd/atomic_counter.h
+ libs/pbd/timing.cc
+Copyright:
+ 2007-2015 Tim Mayberry
+License: GPL-2+
 
-Files: ./libs/gtkmm2/pango/pangomm.h
-	./libs/gtkmm2/pango/pangomm/attributes.cc
-	./libs/gtkmm2/pango/pangomm/glyph.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm.h
-Copyright: 1999-2002, Free Software Foundation
-License: LGPL-2+
+Files:
+ libs/backends/jack/jack_utils.cc
+ libs/ardour/ardour/search_paths.h
+ libs/ardour/test/test_util.h
+ libs/ardour/test/test_receiver.h
+ libs/pbd/file_utils.cc
+Copyright:
+ 2007-2014 Tim Mayberry
+ 1998-2014 Paul Davis
+License: GPL-2+
 
-Files: ./gtk2_ardour/gtk-custom-hruler.c
-	./gtk2_ardour/gtk-custom-hruler.h
-	./gtk2_ardour/gtk-custom-ruler.c
-	./gtk2_ardour/gtk-custom-ruler.h
-Copyright: 1995-1997, Peter Mattis, Spencer Kimball and Josh MacDonald
-License: LGPL-2+
+Files:
+ libs/backends/wavesaudio/waves_audioport.h
+ libs/backends/wavesaudio/waves_midi_device.cc
+ libs/backends/wavesaudio/waves_midi_buffer.cc
+ libs/backends/wavesaudio/waves_midi_event.h
+ libs/backends/wavesaudio/waves_midiport.cc
+ libs/backends/wavesaudio/wavesapi/refmanager/WCRefManager.h
+ libs/backends/wavesaudio/wavesapi/BasicTypes/WUTypes.h
+ libs/backends/wavesaudio/wavesapi/BasicTypes/WUDefines.h
+ libs/backends/wavesaudio/wavesapi/BasicTypes/WUComPtr.h
+ libs/backends/wavesaudio/wavesapi/BasicTypes/WUMathConsts.h
+ libs/backends/wavesaudio/wavesapi/BasicTypes/WTByteOrder.h
+ libs/backends/wavesaudio/wavesapi/BasicTypes/WCFourCC.h
+ libs/backends/wavesaudio/wavesapi/WavesPublicAPI/1.0/WavesPublicAPI_Defines.h
+ libs/backends/wavesaudio/wavesapi/WavesPublicAPI/wstdint.h
+ libs/backends/wavesaudio/wavesapi/WavesPublicAPI/WTErr.h
+ libs/backends/wavesaudio/wavesapi/Threads/WCThreadSafe.h
+ libs/backends/wavesaudio/wavesapi/devicemanager/WCMRAudioDeviceManager.h
+ libs/backends/wavesaudio/wavesapi/devicemanager/WCMRPortAudioDeviceManager.h
+ libs/backends/wavesaudio/wavesapi/devicemanager/WCMRCoreAudioDeviceManager.h
+ libs/backends/wavesaudio/wavesapi/devicemanager/IncludeWindows.h
+ libs/backends/wavesaudio/wavesapi/devicemanager/WCMRNativeAudio.h
+ libs/backends/wavesaudio/wavesapi/MiscUtils/safe_delete.h
+ libs/backends/wavesaudio/wavesapi/MiscUtils/WUErrors.h
+ libs/backends/wavesaudio/wavesapi/MiscUtils/MinMaxUtilities.h
+ libs/backends/wavesaudio/wavesapi/MiscUtils/UMicroseconds.h
+ libs/backends/wavesaudio/wavesapi/MiscUtils/WCFixedString.h
+ libs/backends/wavesaudio/waves_dataport.h
+ libs/backends/wavesaudio/waves_midi_event.cc
+ libs/backends/wavesaudio/waves_midi_device.h
+ libs/backends/wavesaudio/waves_audiobackend.midi.cc
+ libs/backends/wavesaudio/waves_midi_device_manager.h
+ libs/backends/wavesaudio/waves_audiobackend.latency.cc
+ libs/backends/wavesaudio/waves_audioport.cc
+ libs/backends/wavesaudio/waves_dataport.cc
+ libs/backends/wavesaudio/waves_midi_device_manager.cc
+ libs/backends/wavesaudio/waves_midiport.h
+ libs/backends/wavesaudio/waves_midi_buffer.h
+ libs/backends/wavesaudio/waves_audiobackend.port_engine.cc
+ libs/backends/wavesaudio/waves_audiobackend.cc
+ libs/backends/wavesaudio/waves_audiobackend.h
+Copyright:
+ 2013-2014 Waves Audio Ltd
+License: GPL-2+
 
-Files: ./libs/gtkmm2/atk/atkmm/wrap_init.h
-	./libs/gtkmm2/pango/pangomm/wrap_init.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/polygon.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/polygon.h
-Copyright: 1998-2001, The Gtk-- Development Team
+Files:
+ libs/clearlooks-newer/clearlooks_rc_style.c
+ libs/clearlooks-newer/clearlooks_rc_style.h
+ libs/clearlooks-older/clearlooks_rc_style.c
+ libs/clearlooks-older/clearlooks_rc_style.h
+ libs/clearlooks-older/clearlooks_style.h
+Copyright:
+ 2005 Richard Stellingwerff
 License: LGPL-2+
 
-Files: ./libs/glibmm2/glibmm/init.cc
-	./libs/glibmm2/glibmm/optioncontext.h
-	./libs/glibmm2/glibmm/optionentry.h
-	./libs/glibmm2/glibmm/optiongroup.h
-Copyright: 2003, The glibmm Development Team
-	2004, The glibmm Development Team
+Files:
+ gtk2_ardour/gettext.h
+ libs/ardour/gettext.h
+ libs/gtkmm2ext/gettext.h
+ libs/pbd/gettext.h
+Copyright:
+ 1995-1998, 2000-2002 Free Software Foundation, Inc
 License: LGPL-2+
 
-Files: ./libs/libgnomecanvasmm/libgnomecanvasmm/affinetrans.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/affinetrans.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/point.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/point.h
-Copyright: 1999, The gnomemm Development Team
-License: LGPL-2+
+Files:
+ libs/pbd/pbd/timersub.h
+Copyright:	
+ 1991-1994,1996-2003,2005,2006,2009 Free Software Foundation, Inc.
+License: LGPL-2.1+
 
-Files: ./gtk2_ardour/export_range_markers_dialog.h
-	./gtk2_ardour/export_region_dialog.h
-	./gtk2_ardour/export_session_dialog.h
-Copyright: 2006, Andre Raue
+Files:
+ gtk2_ardour/export_range_markers_dialog.h
+Copyright:
+ 2006 Andre Raue
 License: GPL-2+
 
-Files: ./libs/libgnomecanvasmm/libgnomecanvasmm/init.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/init.h
-	./libs/libgnomecanvasmm/libgnomecanvasmm/wrap_init.h
-Copyright: 1998-2001, The libgnomeuimm Development Team
-	2001, The libgnomeuimm Development Team
-License: LGPL-2+
+Files:
+ libs/pbd/msvc/getopt.h
+ libs/pbd/msvc/getopt_long.c
+Copyright:
+ 2000 The NetBSD Foundation, Inc
+License: BSD-4-clause
 
-Files: ./libs/pbd/pbd/undo.h
-	./libs/pbd/undo.cc
-Copyright: 2001-2002, Brett Viren & Paul Davis
+Files:
+ libs/pbd/pbd/undo.h
+ libs/pbd/undo.cc
+Copyright:
+ 2001-2002 Brett Viren & Paul Davis
 License: GPL-2+
 
-Files: ./libs/ardour/ardour/mtdm.h
-	./libs/ardour/mtdm.cc
-Copyright: 2003-2008, Fons Adriaensen <fons at kokkinizita.net>
+Files:
+ libs/ardour/ardour/mtdm.h
+ libs/ardour/mtdm.cc
+Copyright:
+ 2003-2008 Fons Adriaensen <fons at kokkinizita.net>
 License: GPL-2+
 
-Files: ./libs/midi++2/coremidi_midiport.cc
-	./libs/midi++2/midi++/coremidi_midiport.h
-Copyright: 2004, Grame
-	2004, Paul Davis
+Files:
+ libs/pbd/pbd/command.h
+ libs/pbd/pbd/memento_command.h
+Copyright:
+ 2006 Hans Fugal
+ 2006 Paul Davis
 License: GPL-2+
 
-Files: ./libs/pbd/pbd/command.h
-	./libs/pbd/pbd/memento_command.h
-Copyright: 2006, Hans Fugal
-	2006, Paul Davis
+Files:
+ libs/ardour/ardour/vestige/aeffectx.h
+Copyright:
+ 2006 Javier Serrano Polo <jasp00/at/users.sourceforge.net>
 License: GPL-2+
 
-Files: ./libs/pbd/pbd/ringbuffer.h
-	./libs/pbd/pbd/ringbufferNPT.h
-Copyright: 2000, Benno Senoner
-	2000, Paul Davis
+Files:
+ libs/pbd/pbd/ringbuffer.h
+ libs/pbd/pbd/ringbufferNPT.h
+Copyright:
+ 2000 Benno Senoner
+ 2000 Paul Davis
 License: GPL-2+
 
-Files: ./libs/clearlooks-newer/clearlooks_draw_gummy.c
-	./libs/clearlooks-newer/clearlooks_draw_inverted.c
-Copyright: 2007, Andrea Cimitan
-License: LGPL-2+
-
-Files: ./libs/clearlooks-newer/clearlooks_style.c
-	./libs/clearlooks-newer/support.c
-Copyright: 2005, Richard Stellingwerff
-	2007, Benjamin Berg <benjamin at sipsolutions.net>
-License: LGPL-2+
-
-Files: ./libs/libgnomecanvasmm/libgnomecanvasmm/properties.cc
-	./libs/libgnomecanvasmm/libgnomecanvasmm/properties.h
-Copyright: 1999-2002, The Free Software Foundation
-License: LGPL-2+
-
-Files: ./libs/gtkmm2/atk/atkmm/init.cc
-	./libs/gtkmm2/atk/atkmm/init.h
-Copyright: 2003, The atkmm Development Team
+Files:
+ libs/clearlooks-newer/clearlooks_draw_gummy.c
+ libs/clearlooks-newer/clearlooks_draw_inverted.c
+Copyright:
+ 2007 Andrea Cimitan
 License: LGPL-2+
 
-Files: ./libs/pbd/pbd/xml++.h
-	./libs/pbd/xml++.cc
-Copyright: 2000, Ari Johnson, and
+Files:
+ libs/clearlooks-newer/clearlooks_style.c
+ libs/clearlooks-newer/support.c
+Copyright:
+ 2005 Richard Stellingwerff
+ 2007, Benjamin Berg <benjamin at sipsolutions.net>
 License: LGPL-2+
 
-Files: ./gtk2_ardour/canvas-imageframe.c
-	./gtk2_ardour/canvas-imageframe.h
-Copyright: 1998, The Free Software Foundation
-License: UNKNOWN
-
-Files: ./libs/ardour/cycle_timer.cc
-Copyright: 2002, Andrew Morton
+Files:
+ libs/pbd/pbd/xml++.h
+ libs/pbd/xml++.cc
+Copyright:
+ 2000 Ari Johnson, and
 License: GPL-2+
 
-Files: ./libs/rubberband/src/main.cpp
-Copyright: 2007-2008, Chris Cannam
+Files:
+ libs/ardour/cycle_timer.cc
+Copyright:
+ 2002 Andrew Morton
 License: GPL-2+
 
 Files: ./gtk2_ardour/rgb_macros.h
 Copyright: 2000, EMC Capital Management, Inc
 License: GPL-2+
 
-Files: ./libs/ardour/pcm_utils.cc
-Copyright: 2006, Paul Davis
-	Erik de Castro Lopo
-License: GPL-2+
-
-Files: ./gtk2_ardour/utils.cc
-Copyright: 2000, Greg Ercolano <erco at 3dsite.com>
-	2003, Paul Davis
-License: GPL-2+
-
-Files: ./libs/fst/vestige/aeffectx.h
-Copyright: 2006, Javier Serrano Polo <jasp00 at users.sourceforge.net>
+Files:
+ libs/ardour/pcm_utils.cc
+Copyright:
+ 2006 Paul Davis
+ Erik de Castro Lopo
 License: GPL-2+
 
-Files: ./libs/fst/vsti.c
-Copyright: Kjetil S. Matheussen 2004, (k.s.matheussen at notam02.no)
+Files:
+ gtk2_ardour/utils.cc
+Copyright:
+ 2000 Greg Ercolano <erco at 3dsite.com>
+ 2003 Paul Davis
 License: GPL-2+
 
-Files: ./libs/surfaces/tranzport/tranzport_control_protocol.h
-Copyright: 2006, Paul Davis
-	2007, Mike Taht
+Files:
+ libs/ardour/audio_unit.cc
+Copyright:
+ 2006-2009, Paul Davis
+ Sophia Poirier
 License: GPL-2+
 
-Files: ./libs/ardour/audio_unit.cc
-Copyright: 2006-2009, Paul Davis
-	Sophia Poirier
+Files:
+ libs/ardour/ardour/pcm_utils.h
+Copyright:
+ 2006 Paul Davis
+ Erik de Castro Lopo
 License: GPL-2+
 
-Files: ./libs/ardour/ardour/pcm_utils.h
-Copyright: 2006, Paul Davis
-	Erik de Castro Lopo
+Files:
+ libs/ardour/sse_functions_64bit.s
+Copyright:
+ 2005-2006 John Rigg
+ 2005-2006 Paul Davis
 License: GPL-2+
 
-Files: ./libs/ardour/sse_functions_64bit.s
-Copyright: 2005-2006, John Rigg
-	2005-2006, Paul Davis
+Files:
+ libs/ardour/ardour/mix.h
+Copyright:
+ 2005 Sampo Savolainen
 License: GPL-2+
 
-Files: ./libs/ardour/ardour/mix.h
-Copyright: 2005, Sampo Savolainen
+Files:
+ libs/ardour/ardour/logcurve.h
+Copyright:
+ 2001 Paul Davis
+ 2001 Steve Harris
 License: GPL-2+
 
-Files: ./libs/ardour/ardour/logcurve.h
-Copyright: 2001, Paul Davis
-	2001, Steve Harris
+Files:
+ libs/audiographer/private/gdither/gdither_types.h
+ libs/audiographer/private/gdither/gdither.cc
+ libs/audiographer/private/gdither/gdither.h
+ libs/audiographer/private/gdither/gdither_types_internal.h
+Copyright:
+ 2002 Steve Harris
 License: GPL-2+
 
-Files: ./libs/gtkmm2ext/gtkmm2ext/application.h
-Copyright: 2009, Paul Davis
+Files:
+ libs/gtkmm2ext/gtkmm2ext/application.h
+Copyright:
+ 2009 Paul Davis
 License: LGPL-2.1
 
-Files: ./libs/clearlooks-newer/clearlooks_draw_glossy.c
-Copyright: 2006, Benjamin Berg
-	2007, Andrea Cimitan
+Files:
+ libs/clearlooks-newer/clearlooks_draw_glossy.c
+Copyright:
+ 2006 Benjamin Berg
+ 2007 Andrea Cimitan
 License: LGPL-2+
 
-Files: ./libs/clearlooks-newer/clearlooks_draw.c
-Copyright: 2006, Daniel Borgman
-	2006, Richard Stellingwerff
-	2007, Andrea Cimitan
-	2007, Benjamin Berg
+Files:
+ libs/clearlooks-newer/clearlooks_draw.c
+Copyright:
+ 2006 Daniel Borgman
+ 2006 Richard Stellingwerff
+ 2007 Andrea Cimitan
+ 2007 Benjamin Berg
 License: LGPL-2+
 
-Files: ./libs/clearlooks-newer/clearlooks_style.h
-Copyright: 2005, Richard Stellingwerff
-	2006, Benjamin Berg
+Files:
+ libs/clearlooks-newer/clearlooks_style.h
+Copyright:
+ 2005 Richard Stellingwerff
+ 2006 Benjamin Berg
 License: LGPL-2+
 
-Files: ./libs/clearlooks-newer/animation.c
-Copyright: 2006, Benjamin Berg <benjamin at sipsolutions.net>
-	2006, Kulyk Nazar <schamane at myeburg.net>
+Files:
+ libs/clearlooks-newer/animation.c
+Copyright:
+ 2006 Benjamin Berg <benjamin at sipsolutions.net>
+ 2006 Kulyk Nazar <schamane at myeburg.net>
 License: LGPL-2+
 
-Files: ./libs/ardour/ardour/spline.h
-Copyright: 1997, David Mosberger
+Files:
+ libs/ardour/ardour/spline.h
+Copyright:
+ 1997 David Mosberger
 License: LGPL-2+
 
-Files: ./libs/gtkmm2/gtk/gtkmm/layout.cc
-Copyright: 1998, EMC Capital Management Inc
-	1998-2002, The gtkmm Development Team
+Files:
+ libs/surfaces/mackie/timer.h
+Copyright:
+ 1998-2000, 2007 John Anderson
 License: LGPL-2+
 
-Files: ./libs/glibmm2/glibmm/class.h
-Copyright: 1998-2002, The gtkmm Development Team
-	2001, Free Software Foundation
-License: LGPL-2+
+Files:
+ libs/pbd/pbd/compose.h
+Copyright:
+ 2002 Ole Laursen <olau at hardworking.dk>
+License: LGPL-2.1+
 
-Files: ./libs/surfaces/mackie/timer.h
-Copyright: 1998-2000, 2007 John Anderson
-License: LGPL-2+
+Files:
+ libs/ardour/ardour/ladspa.h
+Copyright:
+ 2000-2002 Richard W.E. Furse, Paul Barton-Davis
+License: LGPL-2.1+
 
-Files: ./gtk2_ardour/canvas-noevent-text.h
-Copyright: 2009, Paul Davis <paul at linuxaudiosystems.com>
-License: LGPL-2+
+Files:
+ libs/vamp-plugins/AmplitudeFollower.cpp
+ libs/vamp-plugins/AmplitudeFollower.h
+Copyright:
+ 2006 Dan Stowell
+License: Expat and other-nopromo-Chris
 
-Files: ./libs/gtkmm2/gtk/gtkmm/accelmap.h
-Copyright: 2002, The Gtkmm Development Team
-License: LGPL-2+
+Files:
+ tools/session_exchange.py
+Copyright:
+ 2004-2005 Taybin Rutkin
+License: GPL-2+
+Comment: 
+ From: Paul Davis <paul at linuxaudiosystems.com> 
+ To: Jaromír Mikeš <mira.mikes at gmail.com>
+ Copy: Debian Multimedia Maintainers <pkg-multimedia-maintainers at lists.alioth.debian.org> 
+ Subject: Re: Ardour3 in debian
+ Date: Wed Sep 18 13:32:49 UTC 2013
+ .
+ GPL2+ is taybin's answer (actually, his real answer is that he doesn't care :)
+ .
+ --p
+ .
+ On Wed, Sep 18, 2013 at 2:53 AM, Jaromír Mikeš <mira.mikes at gmail.com> wrote:
+ .
+ >
+ > 2013/9/17 Paul Davis <paul at linuxaudiosystems.com>
+ >
+ >> Since that script (session_exchange.py) no longer works, I suggest you
+ >> remove it from your packaging.
+ >>
+ >> The licensing intent ... I will ask Taybin.
+ >>
+ >
+ > Thank you Paul for prompt answer,
+ >
+ > removing session_exchange.py would be solution, clarifying from Taybin
+ > would be great.
+ >
+ > mira
+
+Files:
+ libs/pbd/pbd/fastlog.h
+Copyright:
+ unknown. Code by Laurent de Soras <laurent at ohmforce.com>
+License: WTFPL-2
 
-Files: ./libs/gtkmm2/gtk/gtkmm/aboutdialog.h
-Copyright: 2004, The gtkmm Development Team
+Files:
+ tools/omf/loader.cc
+ tools/omf/omftool.cc
+Copyright:
+ 2009 Hannes Breul
+License: zlib
+
+Files:
+ libs/libltc/ltc/ltc.h
+ libs/libltc/ltc/decoder.h
+ libs/libltc/ltc/encoder.h
+ libs/libltc/timecode.c
+ libs/libltc/ltc.c
+ libs/libltc/encoder.c
+ libs/libltc/decoder.c
+Copyright:
+ 2006-2012 Robin Gareus <robin at gareus.org>
+License: LGPL-3+
+
+Files:
+ libs/timecode/timecode/time.h
+ libs/timecode/timecode/bbt_time.h
+ libs/timecode/src/time.cc
+ libs/timecode/src/bbt_time.cc
+Copyright:
+ 2002-2010 Paul Davis
 License: LGPL-2+
 
-Files: ./libs/glibmm2/glibmm/fileutils.h
-Copyright: 2002, The gtkmm Development Team
-License: LGPL-2+
+Files:
+ gtk2_ardour/gtk_pianokeyboard.c
+Copyright:
+ 2007, 2008 Edward Tomasz Napiera <trasz at FreeBSD.org>
+License: BSD-2-clause
 
-Files: ./libs/gtkmm2/pango/pangomm/init.cc
-Copyright: 2003, The pangomm Development Team
-License: LGPL-2+
+Files:
+ libs/pbd/boost-debug/shared_ptr.hpp
+Copyright:
+ 2001, 2002, 2003 Peter Dimov
+ 1998, 1999 Greg Colvin and Beman Dawes
+License: BSL-1.0
+
+Files:
+ libs/gtkmm2ext/sync-menu.c
+ libs/gtkmm2ext/gtkmm2ext/sync-menu.h
+ libs/gtkmm2ext/gtkmm2ext/gtkapplication.h
+Copyright:
+ 2007 Pioneer Research Center USA, Inc / 2007 Imendio AB
+License: LGPL-2.1
 
-Files: ./libs/pbd/pbd/compose.h
-Copyright: 2002, Ole Laursen <olau at hardworking.dk>
-License: LGPL-2.1+
+Files:
+ libs/gtkmm2ext/gtkapplication.c
+ libs/gtkmm2ext/application.cc
+ libs/gtkmm2ext/gtkapplication_x11.c
+ libs/gtkmm2ext/gtkmm2ext/gtkapplication-private.h
+ libs/gtkmm2ext/gtkapplication_win32.c
+Copyright:
+ 2009 Paul Davis / 2007 Imendio AB / 2007 Pioneer Research Center USA, Inc
+License: LGPL-2.1
+
+Files:
+ libs/evoral/evoral/Control.hpp
+ libs/evoral/evoral/Note.hpp
+ libs/evoral/evoral/Event.hpp
+ libs/evoral/evoral/Curve.hpp
+ libs/evoral/evoral/EventSink.hpp
+ libs/evoral/evoral/types.hpp
+ libs/evoral/evoral/Range.hpp
+ libs/evoral/evoral/Parameter.hpp
+ libs/evoral/evoral/ControlList.hpp
+ libs/evoral/evoral/Sequence.hpp
+ libs/evoral/evoral/MIDIEvent.hpp
+ libs/evoral/evoral/TypeMap.hpp
+ libs/evoral/evoral/ControlSet.hpp
+ libs/evoral/evoral/EventList.hpp
+ libs/evoral/evoral/PatchChange.hpp
+ libs/evoral/evoral/TimeConverter.hpp
+ libs/evoral/evoral/midi_util.h
+ libs/evoral/evoral/SMF.hpp
+ libs/evoral/evoral/OldSMF.hpp
+ libs/evoral/evoral/SMFReader.hpp
+ libs/evoral/test/SMFTest.hpp
+ libs/evoral/src/Curve.cpp
+ libs/evoral/src/SMF.cpp
+ libs/evoral/src/MIDIEvent.cpp
+ libs/evoral/src/Sequence.cpp
+ libs/evoral/src/midi_util.cpp
+ libs/evoral/src/Note.cpp
+ libs/evoral/src/ControlList.cpp
+ libs/evoral/src/OldSMF.cpp
+ libs/evoral/src/ControlSet.cpp
+ libs/evoral/src/Control.cpp
+ libs/evoral/src/SMFReader.cpp
+ libs/evoral/src/Event.cpp
+Copyright:
+ 2000-2008 Paul Davis / 2008 David Robillard <http://drobilla.net>
+License: GPL-2+
 
-Files: ./libs/ardour/ardour/ladspa.h
-Copyright: 2000-2002, Richard W.E. Furse, Paul Barton-Davis
+Files:
+ libs/evoral/evoral/midi_events.h
+ libs/midi++2/midi++/events.h
+Copyright:
+ *No copyright*
 License: LGPL-2.1+
 
-Files: ./libs/vamp-plugins/AmplitudeFollower.cpp
-	./libs/vamp-plugins/AmplitudeFollower.h
-Copyright: 2006, Dan Stowell
-License: Expat and other-nopromo-Chris
+Files:
+ libs/backends/jack/weak_libjack.h
+Copyright:
+ 2004 Jack O'Quin
+ 2001 Paul Davis
+ 2014 Robin Gareus <robin at gareus.org>
+License: GPL-2+
+
+Files:
+ libs/backends/alsa/alsa_rawmidi.cc
+Copyright:
+ 2010 Devin Anderson
+ 2014 Robin Gareus <robin at gareus.org>
+License: GPL-2+
+
+Files:
+ libs/backends/wavesaudio/portmidi/src/pm_mac/readbinaryplist.c
+Copyright:
+ 2007 Jens Ayton
+License: Expat
+
+Files:
+ libs/backends/wavesaudio/portmidi/portmidi.h	
+Copyright:
+ 2001-2006 Roger B. Dannenberg
+ 1999-2000 Ross Bencina and Phil Burk
+License: Expat
+
+Files:
+ libs/backends/coreaudio/coreaudio_pcmio_aggregate.cc
+Copyright:
+ 2004-2008 Grame
+ 2015 Robin Gareus <robin at gareus.org>
+License: GPL-2+
+
+Files:
+ libs/surfaces/frontier/kernel_drivers/tranzport.c	
+Copyright:
+ 2004 Greg Kroah-Hartman (greg at kroah.com)
+ 2005 Michael Hund <mhund at ld-didactic.de>
+ 2007 Michael Taht (m at taht.net)
+ 2003 David Glance <advidgsf at sourceforge.net>
+License: GPL-2+
+
+Files:
+ libs/pbd/msvc/getopt.c
+Copyright:
+ 1987, 1993, 1994 The Regents of the University of California
+License: BSD-4-clause
+
+Files:
+ libs/ardouralsautil/ardouralsautil/reserve.h
+ libs/ardouralsautil/reserve.c
+Copyright:
+ 2009 Lennart Poettering
+License: Expat
+
+Files:
+ libs/pbd/system_exec.cc
+Copyright:
+ 2005-2008 Lennart Poettering
+ 2010-2014 Robin Gareus <robin at gareus.org>
+ 2010 Paul Davis
+License: GPL-2+
+
+Files:
+ libs/qm-dsp/thread/BlockAllocator.h
+Copyright:
+ 2008 Juha Nieminen
+License: Expat
+
+Files:
+ libs/qm-dsp/base/Pitch.cpp
+ libs/qm-dsp/base/Window.h
+ libs/qm-dsp/base/Pitch.h
+Copyright:
+ 2006 Chris Cannam
+License: GPL-2+
+
+Files:
+ libs/qm-dsp/maths/CosineDistance.cpp
+ libs/qm-dsp/maths/CosineDistance.h
+Copyright:
+ 2008 Kurt Jacobson
+License: GPL-2+
+
+Files:
+ libs/qm-dsp/dsp/rhythm/BeatSpectrum.cpp
+ libs/qm-dsp/dsp/rhythm/BeatSpectrum.h
+Copyright:
+ 2008 Kurt Jacobson and QMUL
+License: GPL-2+
+
+Files:
+ libs/qm-dsp/maths/pca/pca.h
+ libs/qm-dsp/hmm/hmm.c
+ libs/qm-dsp/hmm/hmm.h
+ libs/qm-dsp/dsp/segmentation/segment.h
+ libs/qm-dsp/dsp/segmentation/Segmenter.cpp
+ libs/qm-dsp/dsp/segmentation/ClusterMeltSegmenter.cpp
+ libs/qm-dsp/dsp/segmentation/cluster_segmenter.c
+ libs/qm-dsp/dsp/segmentation/cluster_melt.c
+ libs/qm-dsp/dsp/segmentation/cluster_melt.h
+ libs/qm-dsp/dsp/segmentation/cluster_segmenter.h
+ libs/qm-dsp/dsp/segmentation/Segmenter.h
+ libs/qm-dsp/dsp/segmentation/ClusterMeltSegmenter.h
+Copyright:
+ 2006 Centre for Digital Music, Queen Mary, University of London
+License: GPL-2+
+
+Files:
+ libs/qm-dsp/maths/KLDivergence.cpp
+ libs/qm-dsp/maths/KLDivergence.h
+Copyright:
+ 2008 QMUL
+License: GPL-2+
+
+Files:
+ libs/qm-dsp/dsp/mfcc/MFCC.h
+ libs/qm-dsp/dsp/mfcc/MFCC.cpp
+Copyright:
+ 2005 Nicolas Chetry, 2008 QMUL
+License: GPL-2+
 
-Files: ./libs/pbd/dmalloc.cc
-Copyright: 1995, Gray Watson
-License: other-noncommercial-Watson
+Files:
+ libs/qm-dsp/dsp/tonal/TCSgram.h
+ libs/qm-dsp/dsp/tonal/TCSgram.cpp
+ libs/qm-dsp/dsp/tonal/ChangeDetectionFunction.h
+ libs/qm-dsp/dsp/tonal/ChangeDetectionFunction.cpp
+ libs/qm-dsp/dsp/tonal/TonalEstimator.cpp
+ libs/qm-dsp/dsp/tonal/TonalEstimator.h
+Copyright:
+ 2006 Martin Gasser
+License: GPL-2+
 
-Files: ./libs/surfaces/powermate/interface.cc
-Copyright: Harrison Audio, LLC, 2007
-License: UNKNOWN
+Files:
+ libs/qm-dsp/dsp/wavelet/Wavelet.cpp
+ libs/qm-dsp/dsp/wavelet/Wavelet.h
+Copyright:
+ 2009 Thomas Wilmering
+License: GPL-2+
 
-Files: tools/session_exchange.py
-Copyright: 2004-2005 Taybin Rutkin
-License: GPL
+Files:
+ libs/qm-dsp/dsp/tempotracking/TempoTrackV2.h
+ libs/qm-dsp/dsp/tempotracking/DownBeat.h
+ libs/qm-dsp/dsp/tempotracking/DownBeat.cpp
+ libs/qm-dsp/dsp/tempotracking/TempoTrackV2.cpp
+Copyright:
+ 2008-2009 Matthew Davies and QMUL
+License: GPL-2+
 
-Files: ./libs/pbd/pbd/fastlog.h
-Copyright: Laurent de Soras <laurent at ohmforce.com>
-License: UNKNOWN
+Files:
+ libs/qm-dsp/dsp/tempotracking/TempoTrack.cpp	
+Copyright:
+ 2005-2006 Christian Landone and Matthew Davies
+License: GPL-2+
+
+Files:
+ libs/ardour/rdff.c
+ libs/ardour/lv2_evbuf.h
+ libs/ardour/lv2_evbuf.c
+ libs/ardour/lv2_evbuf.h
+Copyright:
+ 2008-2012 David Robillard <http://drobilla.net>
+License: ISC
+
+Files:
+ libs/audiographer/private/sndfile.hh
+Copyright:
+ 2005-2007 Erik de Castro Lopo <erikd at mega-nerd.com>
+License: BSD-3-clause
 
 License: GPL-2+
  This program is free software; you can redistribute it and/or modify it
@@ -493,6 +909,23 @@ License: GPL-2+
  You should have received a copy of the GNU General Public License
  along with this program.  If not, see <http://www.gnu.org/licenses/>.
 
+License: GPL-3+
+ This program is free software: you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation, either version 3 of the License, or
+ (at your option) any later version.
+ .
+ This package is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+ GNU General Public License for more details.
+Comment:
+ On Debian systems, the complete text of the GNU General Public
+ License can be found in the file `/usr/share/common-licenses/GPL-3'.
+ .
+ You should have received a copy of the GNU General Public License
+ along with this program.  If not, see <http://www.gnu.org/licenses/>.
+
 License: LGPL-2.1+
  This library is free software; you can redistribute it and/or modify it
  under the terms of the GNU Lesser General Public License as published
@@ -615,31 +1048,140 @@ License: other-Apple
  STRICT LIABILITY OR OTHERWISE, EVEN IF APPLE HAS BEEN ADVISED OF THE
  POSSIBILITY OF SUCH DAMAGE.
 
-License: other-noncommercial-Watson
- Permission to use, copy, modify, and distribute this software for any
- NON-COMMERCIAL purpose and without fee is hereby granted, provided that
- the above copyright notice and this permission notice appear in all
- copies, and that the name of Gray Watson not be used in advertising or
- publicity pertaining to distribution of the document or software
- without specific, written prior permission.
+License: WTFPL-2
+ This work is free. You can redistribute it and/or modify it under the
+ terms of the Do What The Fuck You Want To Public License, Version 2,
+ as published by Sam Hocevar. See http://www.wtfpl.net/ for more details.
 
-License: GPL
- This program is free software; you can redistribute it and/or modify it
- under the terms of the GNU General Public License as published by the
- Free Software Foundation; either version 1, or (at your option) any
- later version.
+License: zlib
+ This software is provided 'as-is', without any express or implied
+ warranty.  In no event will the authors be held liable for any damages
+ arising from the use of this software.
  .
- This program is distributed in the hope that it will be useful, but
- WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
- General Public License for more details.
+ Permission is granted to anyone to use this software for any purpose,
+ including commercial applications, and to alter it and redistribute it
+ freely, subject to the following restrictions:
  .
- Some files differ from above by replacing "this program" with "this
- file".
+ 1. The origin of this software must not be misrepresented; you must not
+    claim that you wrote the original software. If you use this software
+    in a product, an acknowledgment in the product documentation would be
+    appreciated but is not required.
+ 2. Altered source versions must be plainly marked as such, and must not be
+    misrepresented as being the original software.
+ 3. This notice may not be removed or altered from any source distribution.
+
+License: LGPL-3+
+ This package is free software; you can redistribute it and/or
+ modify it under the terms of the GNU Lesser General Public
+ License as published by the Free Software Foundation; either
+ version 3 of the License, or (at your option) any later version.
  .
- On Debian GNU systems, the complete text of the GNU General Public
- License (GPL) version 1 or later can be found in
- '/usr/share/common-licenses/GPL-1'.
+ This package is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ Lesser General Public License for more details.
  .
- You should have received a copy of the GNU General Public License
+ You should have received a copy of the GNU Lesser General Public
+ License along with this library; if not, write to the Free Software
+ Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
+X-Comment: On Debian systems, the complete text of the GNU Lesser
+ General Public License can be found in `/usr/share/common-licenses/LGPL-3'.
+ .
+ You should have received a copy of the GNU Lesser General Public License
  along with this program.  If not, see <http://www.gnu.org/licenses/>.
+
+License: BSD-2-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions
+ are met:
+ 1. Redistributions of source code must retain the above copyright
+    notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+    notice, this list of conditions and the following disclaimer in the
+    documentation and/or other materials provided with the distribution.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
+ ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED.  IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
+ OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+ LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
+ OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
+ SUCH DAMAGE.
+
+License: BSD-4-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions
+ are met:
+ 1. Redistributions of source code must retain the above copyright
+    notice, this list of conditions and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above copyright
+    notice, this list of conditions and the following disclaimer in the
+    documentation and/or other materials provided with the distribution.
+ 3. All advertising materials mentioning features or use of this software
+    must display the following acknowledgement:
+    This product includes software developed by the University of
+    California, Berkeley and its contributors.
+ 4. Neither the name of the University nor the names of its contributors
+    may be used to endorse or promote products derived from this software
+    without specific prior written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND
+ ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
+ ARE DISCLAIMED.  IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE
+ FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+ DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
+ OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
+ HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+ LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
+ OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
+ SUCH DAMAGE.
+
+License: BSL-1.0
+ Distributed under the Boost Software License, Version 1.0. (See
+ accompanying file LICENSE_1_0.txt or copy at
+ http://www.boost.org/LICENSE_1_0.txt)
+
+License: ISC
+ Permission to use, copy, modify, and/or distribute this software for any
+ purpose with or without fee is hereby granted, provided that the above
+ copyright notice and this permission notice appear in all copies.
+ .
+ THIS SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
+ WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
+ MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
+ ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
+ WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
+ ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
+ OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
+
+License: BSD-3-clause
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+ .
+ * Redistributions of source code must retain the above copyright
+   notice, this list of conditions and the following disclaimer.
+ * Redistributions in binary form must reproduce the above copyright
+   notice, this list of conditions and the following disclaimer in
+   the documentation and/or other materials provided with the
+   distribution.
+ * Neither the author nor the names of any contributors may be used
+   to endorse or promote products derived from this software without
+   specific prior written permission.
+ .
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED
+ TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
+ CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+ EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+ PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS;
+ OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
+ WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
+ OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF
+ ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/debian/gbp.conf b/debian/gbp.conf
index 059dcc5..77d4d33 100644
--- a/debian/gbp.conf
+++ b/debian/gbp.conf
@@ -1,8 +1,7 @@
-# Configuration file for git-buildpackage and friends
-
 [DEFAULT]
 pristine-tar = True
-sign-tags = True
+upstream-branch = upstream-a4
+debian-branch = master-a4
 
-[git-buildpackage]
-compression = bzip2
+[import-orig]
+filter = waf
diff --git a/debian/patches/100_syslibs.patch b/debian/patches/100_syslibs.patch
deleted file mode 100644
index a7d97e7..0000000
--- a/debian/patches/100_syslibs.patch
+++ /dev/null
@@ -1,47 +0,0 @@
-Index: ardour/SConstruct
-===================================================================
---- ardour.orig/SConstruct	2013-10-01 22:33:14.872132489 +0200
-+++ ardour/SConstruct	2013-10-01 22:33:14.868132489 +0200
-@@ -515,6 +515,12 @@
-     libraries['aubio'] = LibraryInfo()
-     libraries['aubio'].ParseConfig('pkg-config --cflags --libs aubio')
- 
-+libraries['vamp'] = LibraryInfo ()
-+libraries['vamp'].ParseConfig('pkg-config --cflags --libs vamp-sdk')
-+
-+libraries['vamphost'] = LibraryInfo ()
-+libraries['vamphost'].ParseConfig('pkg-config --cflags --libs vamp-hostsdk')
-+
- env = conf.Finish ()
- 
- if env['FFT_ANALYSIS']:
-@@ -921,13 +927,6 @@
- # these are part of the Ardour source tree because they are C++
- # 
- 
--libraries['vamp'] = LibraryInfo (LIBS='vampsdk',
--                                 LIBPATH='#libs/vamp-sdk',
--                                 CPPPATH='#libs/vamp-sdk')
--libraries['vamphost'] = LibraryInfo (LIBS='vamphostsdk',
--                                 LIBPATH='#libs/vamp-sdk',
--                                 CPPPATH='#libs/vamp-sdk')
--
- env['RUBBERBAND'] = False
- 
- conf = Configure (env)
-@@ -1131,7 +1130,6 @@
-         'libs/pbd',
-         'libs/midi++2',
-         'libs/ardour',
--        'libs/vamp-sdk',
-         'libs/vamp-plugins/',
-     # these are unconditionally included but have
-     # tests internally to avoid compilation etc
-@@ -1194,7 +1192,6 @@
-         'libs/pbd',
-         'libs/midi++2',
-         'libs/ardour',
--        'libs/vamp-sdk',
-         'libs/vamp-plugins/',
-     # these are unconditionally included but have
-     # tests internally to avoid compilation etc
diff --git a/debian/patches/111_libardourvampplugins.patch b/debian/patches/111_libardourvampplugins.patch
deleted file mode 100644
index 974a78c..0000000
--- a/debian/patches/111_libardourvampplugins.patch
+++ /dev/null
@@ -1,17 +0,0 @@
-Description: Fix for broken symbol resolving in libardourvampplugins.so
-Forwarded: not-needed
-Origin: vendor, http://bugs.debian.org/cgi-bin/bugreport.cgi?msg=5;bug=560052
-Bug-Debian: http://bugs.debian.org/560052
-Acked-by: Klaumi Klingsporn <klaumikli at gmx.de>
-Author: Adrian Knoth <adi at drcomp.erfurt.thur.de>
-
---- a/libs/vamp-plugins/SConscript
-+++ b/libs/vamp-plugins/SConscript
-@@ -16,7 +16,6 @@
- Import('env install_prefix libraries')
- vampplugs = env.Clone()
- 
--vampplugs.Append (CPPATH='#libs/vamp-sdk/vamp', CXXFLAGS="-Ilibs/vamp-sdk")
- vampplugs.Merge ([libraries['vamp'],
-                   libraries['vamphost'],
-                   libraries['aubio'],
diff --git a/debian/patches/140_enable-ladish.patch b/debian/patches/140_enable-ladish.patch
deleted file mode 100644
index 08d93dc..0000000
--- a/debian/patches/140_enable-ladish.patch
+++ /dev/null
@@ -1,46 +0,0 @@
-Origin: http://tracker.ardour.org/view.php?id=2990
-Author: Nedko Arnaudov <nedko at arnaudov.name>
-Description: Add ladish level 1 support
---- a/gtk2_ardour/main.cc
-+++ b/gtk2_ardour/main.cc
-@@ -485,6 +485,27 @@
- 	}
- }
- 
-+static bool ladish_L1_save_request = false;
-+
-+static gboolean
-+maybe_ladish_L1_save (void* /* ignored */)
-+{
-+	if (ladish_L1_save_request) {
-+		cout << "ladish L1 save request" << endl;
-+		ladish_L1_save_request = false;
-+		ARDOUR_UI::instance()->save_state("");
-+	}
-+
-+	return true;
-+}
-+
-+static void
-+sigusr1_handler (int sig)
-+{
-+	//cout << "SIGUSR1 received!" << endl;
-+	ladish_L1_save_request = true;
-+}
-+
- #ifdef VST_SUPPORT
- 
- extern int gui_init (int* argc, char** argv[]);
-@@ -575,6 +596,12 @@
- 		cerr << _("Cannot install SIGPIPE error handler") << endl;
- 	}
- 
-+	g_timeout_add (300, maybe_ladish_L1_save, 0);
-+
-+	if (::signal (SIGUSR1, sigusr1_handler)) {
-+		cerr << _("Cannot install SIGUSR1 error handler") << endl;
-+	}
-+
-         try { 
- 		ui = new ARDOUR_UI (&argc, &argv);
- 	} catch (failed_constructor& err) {
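
The pattern the ladish patch adds is the usual async-signal-safe one: the SIGUSR1 handler only raises a flag, and the actual save runs later from the GTK main loop (polled every 300 ms via g_timeout_add). A minimal standalone sketch of the same idea, with a plain polling loop standing in for the glib timeout and puts() standing in for ARDOUR_UI::save_state():

    #include <csignal>
    #include <cstdio>
    #include <unistd.h>

    static volatile std::sig_atomic_t save_requested = 0;

    static void sigusr1_handler (int) { save_requested = 1; }

    int main ()
    {
        std::signal (SIGUSR1, sigusr1_handler);
        for (;;) {
            if (save_requested) {            /* polled, like maybe_ladish_L1_save() above */
                save_requested = 0;
                std::puts ("ladish L1 save request -- a real UI would save the session here");
            }
            usleep (300 * 1000);             /* the patch polls every 300 ms via g_timeout_add */
        }
    }
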
diff --git a/debian/patches/150_soundtouch.patch b/debian/patches/150_soundtouch.patch
deleted file mode 100644
index f4d84fa..0000000
--- a/debian/patches/150_soundtouch.patch
+++ /dev/null
@@ -1,20 +0,0 @@
-Description: Use the new soundtouch.pc file.
- The pkg-config soundtouch-1.0.pc file was obsolete and it is no longer
- provided; rely on the new soundtouch.pc instead.
-Author: Miguel Colon <debian.micove at gmail.com>
-Forwarded: no
-Last-Update: 2011-02-21
-
-Index: ardour/SConstruct
-===================================================================
---- ardour.orig/SConstruct	2011-10-04 00:45:37.007369394 +0200
-+++ ardour/SConstruct	2011-10-04 00:46:29.749835171 +0200
-@@ -1104,7 +1104,7 @@
- 
- #    libraries['flowcanvas'] = LibraryInfo(LIBS='flowcanvas', LIBPATH='#/libs/flowcanvas', CPPPATH='#libs/flowcanvas')
-     libraries['soundtouch'] = LibraryInfo()
--    libraries['soundtouch'].ParseConfig ('pkg-config --cflags --libs soundtouch-1.0')
-+    libraries['soundtouch'].ParseConfig ('pkg-config --cflags --libs soundtouch')
-     # Comment the previous line and uncomment this for old versions of Debian:
-     #libraries['soundtouch'].ParseConfig ('pkg-config --cflags --libs libSoundTouch')
- 
diff --git a/debian/patches/160_kfreebsd.patch b/debian/patches/160_kfreebsd.patch
deleted file mode 100644
index 26d0ccb..0000000
--- a/debian/patches/160_kfreebsd.patch
+++ /dev/null
@@ -1,184 +0,0 @@
-From: Steven McDonald <steven.mcdonald at libremail.me>
-Description: Remove assumptions about ALSA to make it work on non-Linux ports.
-Forwarded: No (unclean patch, DEB_HOST_ARCH_OS is Debian specific)
-Bug-Debian: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=636921
-Index: ardour/SConstruct
-===================================================================
---- ardour.orig/SConstruct	2013-10-01 22:34:23.260135270 +0200
-+++ ardour/SConstruct	2013-10-01 22:34:23.256135270 +0200
-@@ -1044,7 +1044,7 @@
- 
- conf = Configure(env)
- 
--if conf.CheckCHeader('alsa/asoundlib.h'):
-+if conf.CheckCHeader('alsa/asoundlib.h') and os.environ['DEB_HOST_ARCH_OS'] == 'linux':
-     libraries['sysmidi'] = LibraryInfo (LIBS='asound')
-     env['SYSMIDI'] = 'ALSA Sequencer'
-     subst_dict['%MIDITAG%'] = "seq"
-@@ -1062,8 +1062,10 @@
-     subst_dict['%MIDITAG%'] = "ardour"
-     subst_dict['%MIDITYPE%'] = "coremidi"
- else:
--    print "It appears you don't have the required MIDI libraries installed. For Linux this means you are missing the development package for ALSA libraries."
--    sys.exit (1)
-+    libraries['sysmidi'] = LibraryInfo ()
-+    env['SYSMIDI'] = 'none'
-+    subst_dict['%MIDITAG%'] = "none"
-+    subst_dict['%MIDITYPE%'] = "none"
- 
- pname = env['PROGRAM_NAME']
- subst_dict['%MIDI_DEVICE_NAME%'] = pname.lower()
-@@ -1348,9 +1350,12 @@
- if conf.CheckCHeader('/System/Library/Frameworks/CoreAudio.framework/Versions/A/Headers/CoreAudio.h'):
-     subst_dict['%JACK_INPUT%'] = "coreaudio:Built-in Audio:in"
-     subst_dict['%JACK_OUTPUT%'] = "coreaudio:Built-in Audio:out"
--else:
-+elif os.environ['DEB_HOST_ARCH_OS'] == 'linux':
-     subst_dict['%JACK_INPUT%'] = "alsa_pcm:playback_"
-     subst_dict['%JACK_OUTPUT%'] = "alsa_pcm:capture_"
-+else:
-+    subst_dict['%JACK_INPUT%'] = "oss:playback_"
-+    subst_dict['%JACK_OUTPUT%'] = "oss:capture_"
- 
- # posix_memalign available
- if not conf.CheckFunc('posix_memalign'):
-Index: ardour/libs/midi++2/SConscript
-===================================================================
---- ardour.orig/libs/midi++2/SConscript	2013-10-01 22:34:23.260135270 +0200
-+++ ardour/libs/midi++2/SConscript	2013-10-01 22:34:23.256135270 +0200
-@@ -39,9 +39,11 @@
-    midi2.Append (CCFLAGS="-DWITH_COREMIDI")
-    midi2.Append (LINKFLAGS="-framework CoreMIDI")
-    midi2.Append (LINKFLAGS="-framework CoreFoundation")
--else:
-+elif os.environ['DEB_HOST_ARCH_OS'] == 'linux':
-    sysdep_src = [ 'alsa_sequencer_midiport.cc' ]
-    midi2.Append (CCFLAGS="-DWITH_ALSA")
-+else:
-+   sysdep_src = [ ]
- 
- midi2.Append(CCFLAGS="-D_REENTRANT -D_LARGEFILE_SOURCE -D_LARGEFILE64_SOURCE")
- midi2.Append(CCFLAGS="-DLIBSIGC_DISABLE_DEPRECATED")
-Index: ardour/gtk2_ardour/engine_dialog.cc
-===================================================================
---- ardour.orig/gtk2_ardour/engine_dialog.cc	2013-10-01 22:34:23.260135270 +0200
-+++ ardour/gtk2_ardour/engine_dialog.cc	2013-10-01 22:34:23.256135270 +0200
-@@ -14,7 +14,7 @@
- #include <CoreFoundation/CFString.h>
- #include <sys/param.h>
- #include <mach-o/dyld.h>
--#else
-+#elif defined(__linux__)
- #include <alsa/asoundlib.h>
- #endif
- 
-@@ -117,7 +117,9 @@
- #ifdef __APPLE__
- 	strings.push_back (X_("CoreAudio"));
- #else
-+#ifdef __linux__
- 	strings.push_back (X_("ALSA"));
-+#endif
- 	strings.push_back (X_("OSS"));
- 	strings.push_back (X_("FFADO"));
- #endif
-@@ -162,7 +164,7 @@
- 	basic_packer.attach (period_size_combo, 1, 2, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
- 	row++;
- 
--#ifndef __APPLE__
-+#ifdef __linux__
- 	label = manage (new Label (_("Number of buffers")));
- 	basic_packer.attach (*label, 0, 1, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
- 	basic_packer.attach (periods_spinner, 1, 2, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
-@@ -183,7 +185,7 @@
- 	row++;
- 	/* no audio mode with CoreAudio, its duplex or nuthin' */
- 
--#ifndef __APPLE__
-+#ifdef __linux__
- 	label = manage (new Label (_("Audio Mode")));
- 	basic_packer.attach (*label, 0, 1, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
- 	basic_packer.attach (audio_mode_combo, 1, 2, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
-@@ -223,7 +225,7 @@
- 	realtime_button.signal_toggled().connect (mem_fun (*this, &EngineControl::realtime_changed));
- 	realtime_changed ();
- 
--#ifndef __APPLE__
-+#ifdef __linux__
- 	label = manage (new Label (_("Realtime Priority")));
- 	label->set_alignment (1.0, 0.5);
- 	options_packer.attach (*label, 0, 1, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
-@@ -273,7 +275,7 @@
- 	options_packer.attach (*label, 0, 1, row, row + 1, FILL|EXPAND, (AttachOptions) 0);
- 	++row;
- 
--#ifndef __APPLE__
-+#ifdef __linux
- 	label = manage (new Label (_("Dither")));	
- 	label->set_alignment (1.0, 0.5);
- 	options_packer.attach (dither_mode_combo, 1, 2, row, row + 1, FILL|EXPAND, AttachOptions(0));
-@@ -289,7 +291,7 @@
- 	device_packer.set_spacings (6);
- 	row = 0;
- 
--#ifndef __APPLE__
-+#ifdef __linux__
- 	label = manage (new Label (_("Input device")));
- 	label->set_alignment (1.0, 0.5);
- 	device_packer.attach (*label, 0, 1, row, row+1, FILL|EXPAND, (AttachOptions) 0);
-@@ -613,7 +615,7 @@
- void
- EngineControl::realtime_changed ()
- {
--#ifndef __APPLE__
-+#ifdef __linux__
- 	priority_spinner.set_sensitive (realtime_button.get_active());
- #endif
- }
-@@ -629,8 +631,10 @@
- #endif
- 
- #ifndef __APPLE__
-+#ifdef __linux__
- 	} else if (driver == "ALSA") {
- 		devices[driver] = enumerate_alsa_devices ();
-+#endif
- 	} else if (driver == "FFADO") {
- 		devices[driver] = enumerate_ffado_devices ();
- 	} else if (driver == "OSS") {
-@@ -757,6 +761,7 @@
- 	return devs;
- }
- #else
-+#ifdef __linux__
- vector<string>
- EngineControl::enumerate_alsa_devices ()
- {
-@@ -817,6 +822,7 @@
- 
- 	return devs;
- }
-+#endif
- 
- vector<string>
- EngineControl::enumerate_ffado_devices ()
-@@ -859,7 +865,9 @@
- 	vector<string>& strings = devices[driver];
- 
- 	if (strings.empty() && driver != "FFADO" && driver != "Dummy") {
-+#ifdef __linux__
- 		error << string_compose (_("No devices found for driver \"%1\""), driver) << endmsg;
-+#endif
- 		return;
- 	}
- 	
-@@ -912,7 +920,7 @@
- EngineControl::redisplay_latency ()
- {
- 	uint32_t rate = get_rate();
--#ifdef __APPLE_
-+#ifndef __linux__
- 	float periods = 2;
- #else
- 	float periods = periods_adjustment.get_value();
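
In short, the patch swaps "#ifndef __APPLE__" (which wrongly pulled kFreeBSD and Hurd into the ALSA code paths) for "#ifdef __linux__", and has the SConstruct consult DEB_HOST_ARCH_OS, which is only set inside a Debian build environment (hence the "unclean patch" note above). A minimal sketch of the compile-time selection, using only the standard predefined macros:

    #include <cstdio>

    #if defined(__APPLE__)
    static const char *midi_backend = "coremidi";
    #elif defined(__linux__)
    static const char *midi_backend = "alsa sequencer";
    #else
    static const char *midi_backend = "none";     /* e.g. GNU/kFreeBSD, as handled above */
    #endif

    int main ()
    {
        std::printf ("MIDI backend: %s\n", midi_backend);
        return 0;
    }
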
diff --git a/debian/patches/170_template-ftbfs.patch b/debian/patches/170_template-ftbfs.patch
deleted file mode 100644
index 77c4dd9..0000000
--- a/debian/patches/170_template-ftbfs.patch
+++ /dev/null
@@ -1,16 +0,0 @@
-From: David Henningsson <david.henningsson at canonical.com>
-Forwarded: Yes
-Bug-Debian: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=713713
-Description: Fix FTBFS with recent compilers
-
---- a/libs/gtkmm2ext/gtk_ui.cc
-+++ b/libs/gtkmm2ext/gtk_ui.cc
-@@ -63,6 +63,8 @@
- 
- #include <pbd/abstract_ui.cc>  /* instantiate the template */
- 
-+template class AbstractUI<Gtkmm2ext::UIRequest>;
-+
- UI::UI (string namestr, int *argc, char ***argv) 
- 	: AbstractUI<UIRequest> (namestr, true)
- {
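
The fix above is a single explicit instantiation: gtk_ui.cc includes the template implementation from pbd/abstract_ui.cc, and newer compilers only emit the template's member functions when an instantiation is requested explicitly. A self-contained sketch of the mechanism (the class and request type below are simplified stand-ins, not the real pbd API):

    template <class RequestType>
    class AbstractUI {
    public:
        void receive (const RequestType& r) { last = r; }
        RequestType last;
    };

    struct UIRequest { int type; };

    template class AbstractUI<UIRequest>;   /* the one line the patch adds */

    int main ()
    {
        AbstractUI<UIRequest> ui;
        UIRequest req;
        req.type = 0;
        ui.receive (req);
        return ui.last.type;
    }
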
diff --git a/debian/patches/180_aubio.patch b/debian/patches/180_aubio.patch
deleted file mode 100644
index 87fe482..0000000
--- a/debian/patches/180_aubio.patch
+++ /dev/null
@@ -1,305 +0,0 @@
-Description: merge latest vamp-aubio-plugins version to use aubio 0.4.0
- Update libs/vamp-plugins/Onset.{cpp,h} to new aubio.
- Merge with the latest vamp-aubio-plugins revision 798ef8d.
- See http://git.aubio.org/?p=vamp-aubio-plugins.git;a=summary.
-Author: Paul Brossier <piem at debian.org>
-Forwarded: not-needed
-Bug-Debian: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=733968
-Last-Update: 2013-12-29
-
---- a/SConstruct
-+++ b/SConstruct
-@@ -446,7 +446,7 @@
- 	'jack'                 : '0.120.0',
- 	'libgnomecanvas-2.0'   : '2.0',
- 	'sndfile'              : '1.0.18',
--        'aubio'                : '0.3.0',
-+        'aubio'                : '0.4.0',
- 	'liblo'                : '0.24'
- }
- 
---- a/libs/vamp-plugins/Onset.cpp
-+++ b/libs/vamp-plugins/Onset.cpp
-@@ -22,29 +22,32 @@
- using std::cerr;
- using std::endl;
- 
-+const char *getAubioNameForOnsetType(OnsetType t)
-+{
-+    // In the same order as the enum elements in the header
-+    static const char *const names[] = {
-+        "energy", "specdiff", "hfc", "complex", "phase", "kl", "mkl", "specflux"
-+    };
-+    return names[(int)t];
-+}
-+
- Onset::Onset(float inputSampleRate) :
-     Plugin(inputSampleRate),
-     m_ibuf(0),
--    m_fftgrain(0),
-     m_onset(0),
--    m_pv(0),
--    m_peakpick(0),
-     m_onsetdet(0),
--    m_onsettype(aubio_onset_complex),
-+    m_onsettype(OnsetComplex),
-     m_threshold(0.3),
--    m_silence(-90),
--    m_channelCount(1)
-+    m_silence(-70),
-+    m_minioi(4)
- {
- }
- 
- Onset::~Onset()
- {
--    if (m_onsetdet) aubio_onsetdetection_free(m_onsetdet);
-+    if (m_onsetdet) del_aubio_onset(m_onsetdet);
-     if (m_ibuf) del_fvec(m_ibuf);
-     if (m_onset) del_fvec(m_onset);
--    if (m_fftgrain) del_cvec(m_fftgrain);
--    if (m_pv) del_aubio_pvoc(m_pv);
--    if (m_peakpick) del_aubio_peakpicker(m_peakpick);
- }
- 
- string
-@@ -74,7 +77,7 @@
- int
- Onset::getPluginVersion() const
- {
--    return 1;
-+    return 2;
- }
- 
- string
-@@ -86,22 +89,18 @@
- bool
- Onset::initialise(size_t channels, size_t stepSize, size_t blockSize)
- {
--    m_channelCount = channels;
-+    if (channels != 1) {
-+        std::cerr << "Onset::initialise: channels must be 1" << std::endl;
-+        return false;
-+    }
-+
-     m_stepSize = stepSize;
-     m_blockSize = blockSize;
- 
--    m_ibuf = new_fvec(stepSize, channels);
--    m_onset = new_fvec(1, channels);
--    m_fftgrain = new_cvec(blockSize, channels);
--    m_pv = new_aubio_pvoc(blockSize, stepSize, channels);
--    m_peakpick = new_aubio_peakpicker(m_threshold);
--
--    m_onsetdet = new_aubio_onsetdetection(m_onsettype, blockSize, channels);
--    
--    m_delay = Vamp::RealTime::frame2RealTime(4 * stepSize,
--                                             lrintf(m_inputSampleRate));
-+    m_ibuf = new_fvec(stepSize);
-+    m_onset = new_fvec(1);
- 
--    m_lastOnset = Vamp::RealTime::zeroTime - m_delay - m_delay;
-+    reset();
- 
-     return true;
- }
-@@ -109,6 +108,22 @@
- void
- Onset::reset()
- {
-+    if (m_onsetdet) del_aubio_onset(m_onsetdet);
-+
-+    m_onsetdet = new_aubio_onset
-+        (const_cast<char *>(getAubioNameForOnsetType(m_onsettype)),
-+         m_blockSize,
-+         m_stepSize,
-+         lrintf(m_inputSampleRate));
-+    
-+    aubio_onset_set_threshold(m_onsetdet, m_threshold);
-+    aubio_onset_set_silence(m_onsetdet, m_silence);
-+    aubio_onset_set_minioi(m_onsetdet, m_minioi);
-+
-+    m_delay = Vamp::RealTime::frame2RealTime(4 * m_stepSize,
-+                                             lrintf(m_inputSampleRate));
-+
-+    m_lastOnset = Vamp::RealTime::zeroTime - m_delay - m_delay;
- }
- 
- size_t
-@@ -132,8 +147,8 @@
-     desc.identifier = "onsettype";
-     desc.name = "Onset Detection Function Type";
-     desc.minValue = 0;
--    desc.maxValue = 6;
--    desc.defaultValue = (int)aubio_onset_complex;
-+    desc.maxValue = 7;
-+    desc.defaultValue = (int)OnsetComplex;
-     desc.isQuantized = true;
-     desc.quantizeStep = 1;
-     desc.valueNames.push_back("Energy Based");
-@@ -143,6 +158,7 @@
-     desc.valueNames.push_back("Phase Deviation");
-     desc.valueNames.push_back("Kullback-Liebler");
-     desc.valueNames.push_back("Modified Kullback-Liebler");
-+    desc.valueNames.push_back("Spectral Flux");
-     list.push_back(desc);
- 
-     desc = ParameterDescriptor();
-@@ -159,11 +175,22 @@
-     desc.name = "Silence Threshold";
-     desc.minValue = -120;
-     desc.maxValue = 0;
--    desc.defaultValue = -90;
-+    desc.defaultValue = -70;
-     desc.unit = "dB";
-     desc.isQuantized = false;
-     list.push_back(desc);
- 
-+    desc = ParameterDescriptor();
-+    desc.identifier = "minioi";
-+    desc.name = "Minimum Inter-Onset Interval";
-+    desc.minValue = 0;
-+    desc.maxValue = 40;
-+    desc.defaultValue = 4;
-+    desc.unit = "ms";
-+    desc.isQuantized = true;
-+    desc.quantizeStep = 1;
-+    list.push_back(desc);
-+
-     return list;
- }
- 
-@@ -176,6 +203,8 @@
-         return m_threshold;
-     } else if (param == "silencethreshold") {
-         return m_silence;
-+    } else if (param == "minioi") {
-+        return m_minioi;
-     } else {
-         return 0.0;
-     }
-@@ -186,18 +215,21 @@
- {
-     if (param == "onsettype") {
-         switch (lrintf(value)) {
--        case 0: m_onsettype = aubio_onset_energy; break;
--        case 1: m_onsettype = aubio_onset_specdiff; break;
--        case 2: m_onsettype = aubio_onset_hfc; break;
--        case 3: m_onsettype = aubio_onset_complex; break;
--        case 4: m_onsettype = aubio_onset_phase; break;
--        case 5: m_onsettype = aubio_onset_kl; break;
--        case 6: m_onsettype = aubio_onset_mkl; break;
-+        case 0: m_onsettype = OnsetEnergy; break;
-+        case 1: m_onsettype = OnsetSpecDiff; break;
-+        case 2: m_onsettype = OnsetHFC; break;
-+        case 3: m_onsettype = OnsetComplex; break;
-+        case 4: m_onsettype = OnsetPhase; break;
-+        case 5: m_onsettype = OnsetKL; break;
-+        case 6: m_onsettype = OnsetMKL; break;
-+        case 7: m_onsettype = OnsetSpecFlux; break;
-         }
-     } else if (param == "peakpickthreshold") {
-         m_threshold = value;
-     } else if (param == "silencethreshold") {
-         m_silence = value;
-+    } else if (param == "minioi") {
-+        m_minioi = value;
-     }
- }
- 
-@@ -216,17 +248,6 @@
-     d.sampleRate = 0;
-     list.push_back(d);
- 
--    d = OutputDescriptor();
--    d.identifier = "detectionfunction";
--    d.name = "Onset Detection Function";
--    d.unit = "";
--    d.hasFixedBinCount = true;
--    d.binCount = m_channelCount;
--    d.hasKnownExtents = false;
--    d.isQuantized = false;
--    d.sampleType = OutputDescriptor::OneSamplePerStep;
--    list.push_back(d);
--
-     return list;
- }
- 
-@@ -235,21 +256,12 @@
-                Vamp::RealTime timestamp)
- {
-     for (size_t i = 0; i < m_stepSize; ++i) {
--        for (size_t j = 0; j < m_channelCount; ++j) {
--            fvec_write_sample(m_ibuf, inputBuffers[j][i], j, i);
--        }
-+        fvec_set_sample(m_ibuf, inputBuffers[0][i], i);
-     }
- 
--    aubio_pvoc_do(m_pv, m_ibuf, m_fftgrain);
--    aubio_onsetdetection(m_onsetdet, m_fftgrain, m_onset);
-+    aubio_onset_do(m_onsetdet, m_ibuf, m_onset);
- 
--    bool isonset = aubio_peakpick_pimrt(m_onset, m_peakpick);
--
--    if (isonset) {
--        if (aubio_silence_detection(m_ibuf, m_silence)) {
--            isonset = false;
--        }
--    }
-+    bool isonset = m_onset->data[0];
- 
-     FeatureSet returnFeatures;
- 
-@@ -263,11 +275,6 @@
-             m_lastOnset = timestamp;
-         }
-     }
--    Feature feature;
--    for (size_t j = 0; j < m_channelCount; ++j) {
--        feature.values.push_back(m_onset->data[j][0]);
--    }
--    returnFeatures[1].push_back(feature);
- 
-     return returnFeatures;
- }
---- a/libs/vamp-plugins/Onset.h
-+++ b/libs/vamp-plugins/Onset.h
-@@ -20,6 +20,17 @@
- #include <vamp-sdk/Plugin.h>
- #include <aubio/aubio.h>
- 
-+enum OnsetType {
-+    OnsetEnergy,
-+    OnsetSpecDiff,
-+    OnsetHFC,
-+    OnsetComplex,
-+    OnsetPhase,
-+    OnsetKL,
-+    OnsetMKL,
-+    OnsetSpecFlux // new in 0.4!
-+};
-+
- class Onset : public Vamp::Plugin
- {
- public:
-@@ -54,17 +65,14 @@
- 
- protected:
-     fvec_t *m_ibuf;
--    cvec_t *m_fftgrain;
-     fvec_t *m_onset;
--    aubio_pvoc_t *m_pv;
--    aubio_pickpeak_t *m_peakpick;
--    aubio_onsetdetection_t *m_onsetdet;
--    aubio_onsetdetection_type m_onsettype;
-+    aubio_onset_t *m_onsetdet;
-+    OnsetType m_onsettype;
-     float m_threshold;
-     float m_silence;
-+    float m_minioi;
-     size_t m_stepSize;
-     size_t m_blockSize;
--    size_t m_channelCount;
-     Vamp::RealTime m_delay;
-     Vamp::RealTime m_lastOnset;
- };
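
The migration boils down to the simpler aubio 0.4 onset API: a single aubio_onset_t replaces the separate phase vocoder, detection function and peak picker of 0.3. A minimal sketch of that API as the updated plugin uses it (buffer sizes are arbitrary; assumes the aubio 0.4 headers and library are installed):

    #include <aubio/aubio.h>

    int main ()
    {
        const unsigned block = 1024, hop = 512, rate = 44100;

        aubio_onset_t *det = new_aubio_onset (const_cast<char *> ("complex"), block, hop, rate);
        aubio_onset_set_threshold (det, 0.3);
        aubio_onset_set_silence (det, -70);

        fvec_t *in  = new_fvec (hop);   /* one hop of input samples (silence here) */
        fvec_t *out = new_fvec (1);     /* out->data[0] is non-zero when an onset is found */

        aubio_onset_do (det, in, out);
        bool onset = out->data[0] != 0;

        del_fvec (in);
        del_fvec (out);
        del_aubio_onset (det);
        return onset ? 0 : 1;
    }
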
diff --git a/debian/patches/60-libdir.patch b/debian/patches/60-libdir.patch
deleted file mode 100644
index 023d5bd..0000000
--- a/debian/patches/60-libdir.patch
+++ /dev/null
@@ -1,11 +0,0 @@
---- a/SConstruct
-+++ b/SConstruct
-@@ -785,7 +785,7 @@
- 
- if env['DIST_LIBDIR'] == '':
-     if env['DIST_TARGET'] == 'x86_64':
--        env['LIBDIR']='lib64'
-+        env['LIBDIR']='lib'
-     else:
-         env['LIBDIR']='lib'
- else:
diff --git a/debian/patches/80_ardourino.patch b/debian/patches/80_ardourino.patch
deleted file mode 100644
index 73cb22b..0000000
--- a/debian/patches/80_ardourino.patch
+++ /dev/null
@@ -1,364 +0,0 @@
-Index: ardour/gtk2_ardour/mixer_strip.cc
-===================================================================
---- ardour.orig/gtk2_ardour/mixer_strip.cc	2011-10-04 00:17:40.604134219 +0200
-+++ ardour/gtk2_ardour/mixer_strip.cc	2011-10-04 00:22:56.426947262 +0200
-@@ -57,6 +57,7 @@
- #include "io_selector.h"
- #include "utils.h"
- #include "gui_thread.h"
-+#include "opts.h"
- 
- #include "i18n.h"
- 
-@@ -213,10 +214,11 @@
- 
- 	group_label.set_name ("MixerGroupButtonLabel");
- 
--	comment_button.set_name ("MixerCommentButton");
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  comment_button.set_name ("MixerCommentButton");
-+	  comment_button.signal_clicked().connect (mem_fun(*this, &MixerStrip::comment_button_clicked));
-+	}
- 
--	comment_button.signal_clicked().connect (mem_fun(*this, &MixerStrip::comment_button_clicked));
--	
- 	global_vpacker.set_border_width (0);
- 	global_vpacker.set_spacing (0);
- 
-@@ -244,10 +246,16 @@
- 	global_vpacker.pack_start (middle_button_table,Gtk::PACK_SHRINK);
- 	global_vpacker.pack_start (*gain_meter_alignment,Gtk::PACK_SHRINK);
- 	global_vpacker.pack_start (bottom_button_table,Gtk::PACK_SHRINK);
--	global_vpacker.pack_start (post_redirect_box, true, true);
--	global_vpacker.pack_start (panners, Gtk::PACK_SHRINK);
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  global_vpacker.pack_start (post_redirect_box, true, true);
-+	}
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  global_vpacker.pack_start (panners, Gtk::PACK_SHRINK);
-+	}
- 	global_vpacker.pack_start (output_button, Gtk::PACK_SHRINK);
--	global_vpacker.pack_start (comment_button, Gtk::PACK_SHRINK);
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  global_vpacker.pack_start (comment_button, Gtk::PACK_SHRINK);
-+	}
- 
- 	global_frame.add (global_vpacker);
- 	global_frame.set_shadow_type (Gtk::SHADOW_IN);
-@@ -435,7 +443,10 @@
- 	/* now force an update of all the various elements */
- 
- 	pre_redirect_box.update();
--	post_redirect_box.update();
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  post_redirect_box.update();
-+	}
-+
- 	mute_changed (0);
- 	solo_changed (0);
- 	name_changed (0);
-@@ -444,7 +455,9 @@
- 
- 	connect_to_pan ();
- 
--	panners.setup_pan ();
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  panners.setup_pan ();
-+	}
- 
- 	if (is_audio_track()) {
- 		speed_changed ();
-@@ -486,10 +499,13 @@
- 	/* always set the gpm width again, things may be hidden */
- 
- 	gpm.set_width (w);
--	panners.set_width (w);
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  panners.set_width (w);
-+	}
- 	pre_redirect_box.set_width (w);
--	post_redirect_box.set_width (w);
--
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  post_redirect_box.set_width (w);
-+	}
- 	_width_owner = owner;
- 
- 	ensure_xml_node ();
-@@ -509,18 +525,21 @@
- 		((Gtk::Label*)mute_button->get_child())->set_text  (_("Mute"));
- 		((Gtk::Label*)solo_button->get_child())->set_text (_("Solo"));
- 
--		if (_route->comment() == "") {
--		       comment_button.unset_bg (STATE_NORMAL);
--		       ((Gtk::Label*)comment_button.get_child())->set_text (_("Comments"));
--		} else {
--		       comment_button.modify_bg (STATE_NORMAL, color());
--		       ((Gtk::Label*)comment_button.get_child())->set_text (_("*Comments*"));
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+		  if (_route->comment() == "") {
-+		         comment_button.unset_bg (STATE_NORMAL);
-+		         ((Gtk::Label*)comment_button.get_child())->set_text (_("Comments"));
-+		  } else {
-+		         comment_button.modify_bg (STATE_NORMAL, color());
-+		         ((Gtk::Label*)comment_button.get_child())->set_text (_("*Comments*"));
-+		  }
- 		}
--
- 		((Gtk::Label*)gpm.gain_automation_style_button.get_child())->set_text (gpm.astyle_string(_route->gain_automation_curve().automation_style()));
- 		((Gtk::Label*)gpm.gain_automation_state_button.get_child())->set_text (gpm.astate_string(_route->gain_automation_curve().automation_state()));
--		((Gtk::Label*)panners.pan_automation_style_button.get_child())->set_text (panners.astyle_string(_route->panner().automation_style()));
--		((Gtk::Label*)panners.pan_automation_state_button.get_child())->set_text (panners.astate_string(_route->panner().automation_state()));
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+		  ((Gtk::Label*)panners.pan_automation_style_button.get_child())->set_text (panners.astyle_string(_route->panner().automation_style()));
-+		  ((Gtk::Label*)panners.pan_automation_state_button.get_child())->set_text (panners.astate_string(_route->panner().automation_state()));
-+		}
- 		Gtkmm2ext::set_size_request_to_display_given_text (name_button, "long", 2, 2);
- 		set_size_request (-1, -1);
- 		break;
-@@ -532,18 +551,21 @@
- 		((Gtk::Label*)mute_button->get_child())->set_text (_("M"));
- 		((Gtk::Label*)solo_button->get_child())->set_text (_("S"));
- 
--		if (_route->comment() == "") {
--		       comment_button.unset_bg (STATE_NORMAL);
--		       ((Gtk::Label*)comment_button.get_child())->set_text (_("Cmt"));
--		} else {
--		       comment_button.modify_bg (STATE_NORMAL, color());
--		       ((Gtk::Label*)comment_button.get_child())->set_text (_("*Cmt*"));
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+		  if (_route->comment() == "") {
-+		    comment_button.unset_bg (STATE_NORMAL);
-+		    ((Gtk::Label*)comment_button.get_child())->set_text (_("Cmt"));
-+		  } else {
-+		    comment_button.modify_bg (STATE_NORMAL, color());
-+		    ((Gtk::Label*)comment_button.get_child())->set_text (_("*Cmt*"));
-+		  }
- 		}
--
- 		((Gtk::Label*)gpm.gain_automation_style_button.get_child())->set_text (gpm.short_astyle_string(_route->gain_automation_curve().automation_style()));
- 		((Gtk::Label*)gpm.gain_automation_state_button.get_child())->set_text (gpm.short_astate_string(_route->gain_automation_curve().automation_state()));
--		((Gtk::Label*)panners.pan_automation_style_button.get_child())->set_text (panners.short_astyle_string(_route->panner().automation_style()));
--		((Gtk::Label*)panners.pan_automation_state_button.get_child())->set_text (panners.short_astate_string(_route->panner().automation_state()));
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+		  ((Gtk::Label*)panners.pan_automation_style_button.get_child())->set_text (panners.short_astyle_string(_route->panner().automation_style()));
-+		  ((Gtk::Label*)panners.pan_automation_state_button.get_child())->set_text (panners.short_astate_string(_route->panner().automation_state()));
-+		}
- 		Gtkmm2ext::set_size_request_to_display_given_text (name_button, "longest label", 2, 2);
- 		set_size_request (max (50, gpm.get_gm_width()), -1);
- 		break;
-@@ -1010,7 +1032,9 @@
- {
-     update_io_button (_route, _width, false);
- 	gpm.setup_meters ();
--	panners.setup_pan ();
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  panners.setup_pan ();
-+	}
- }
- 
- void
-@@ -1406,13 +1430,17 @@
- 	if (at) {
- 		switch (at->freeze_state()) {
- 		case AudioTrack::Frozen:
--			pre_redirect_box.set_sensitive (false);
--			post_redirect_box.set_sensitive (false);
-+		        pre_redirect_box.set_sensitive (false);
-+			if (!ARDOUR_COMMAND_LINE::ardourino) {
-+			  post_redirect_box.set_sensitive (false);
-+			}
- 			speed_spinner.set_sensitive (false);
- 			break;
- 		default:
--			pre_redirect_box.set_sensitive (true);
--			post_redirect_box.set_sensitive (true);
-+		        pre_redirect_box.set_sensitive (true);
-+			if (!ARDOUR_COMMAND_LINE::ardourino) {
-+			  post_redirect_box.set_sensitive (true);
-+			}
- 			speed_spinner.set_sensitive (true);
- 			// XXX need some way, maybe, to retoggle redirect editors
- 			break;
-Index: ardour/gtk2_ardour/opts.cc
-===================================================================
---- ardour.orig/gtk2_ardour/opts.cc	2011-10-04 00:17:40.608134102 +0200
-+++ ardour/gtk2_ardour/opts.cc	2011-10-04 00:22:56.426947262 +0200
-@@ -41,6 +41,7 @@
- std::string ARDOUR_COMMAND_LINE::keybindings_path = ""; /* empty means use builtin default */
- std::string ARDOUR_COMMAND_LINE::menus_file = "ardour.menus";
- bool ARDOUR_COMMAND_LINE::finder_invoked_ardour = false;
-+bool ARDOUR_COMMAND_LINE::ardourino = false;
- 
- using namespace ARDOUR_COMMAND_LINE;
- 
-@@ -64,6 +65,7 @@
- 	     << _("  [session-name]                   Name of session to load\n")
- 	     << _("  -C, --curvetest filename         Curve algorithm debugger\n")
- 	     << _("  -k, --keybindings filename       Name of key bindings to load (default is ~/.ardour2/ardour.bindings)\n")
-+	     << _("  -a, --ardourino                  Fit on screens with 800x600 resolution\n")
- 		;
- 	return 1;
- 
-@@ -99,6 +101,7 @@
- 		{ "sync", 0, 0, 'S' },
- 		{ "curvetest", 1, 0, 'C' },
- 		{ "sillyAppleUndocumentedFinderFeature", 1, 0, 'p' },
-+		{ "ardourino", 0, 0, 'a' },
- 		{ 0, 0, 0, 0 }
- 	};
- 
-@@ -177,6 +180,10 @@
- 			keybindings_path = optarg;
- 			break;
- 
-+		case 'a':
-+			ardourino = true;
-+			break;
-+
- 		default:
- 			return print_help(execname);
- 		}
-Index: ardour/gtk2_ardour/opts.h
-===================================================================
---- ardour.orig/gtk2_ardour/opts.h	2011-10-04 00:17:40.608134102 +0200
-+++ ardour/gtk2_ardour/opts.h	2011-10-04 00:33:28.248568229 +0200
-@@ -35,6 +35,7 @@
- extern bool   try_hw_optimization;
- extern bool   use_gtk_theme;
- extern std::string keybindings_path;
-+extern bool   ardourino;
- extern std::string menus_file;
- extern bool   finder_invoked_ardour;
- 
-Index: ardour/gtk2_ardour/sfdb_ui.cc
-===================================================================
---- ardour.orig/gtk2_ardour/sfdb_ui.cc	2011-10-04 00:17:40.648132939 +0200
-+++ ardour/gtk2_ardour/sfdb_ui.cc	2011-10-04 00:22:56.430947145 +0200
-@@ -53,6 +53,7 @@
- #include "editing.h"
- #include "utils.h"
- #include "gain_meter.h"
-+#include "opts.h"
- 
- #ifdef FREESOUND
- #include "sfdb_freesound_mootcher.h"
-@@ -426,7 +427,9 @@
- 		chooser.add_filter (custom_filter);
- 		chooser.add_filter (matchall_filter);
- 		chooser.set_select_multiple (true);
--		chooser.signal_update_preview().connect(mem_fun(*this, &SoundFileBrowser::update_preview));
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+			chooser.signal_update_preview().connect(mem_fun(*this, &SoundFileBrowser::update_preview));
-+		}
- 		chooser.signal_file_activated().connect (mem_fun (*this, &SoundFileBrowser::chooser_file_activated));
- #ifdef GTKOSX
- 		/* some broken redraw behaviour - this is a bandaid */
-@@ -441,7 +444,9 @@
- 	
- 	hpacker.set_spacing (6);
- 	hpacker.pack_start (notebook, true, true);
--	hpacker.pack_start (preview, false, false);
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+		hpacker.pack_start (preview, false, false);
-+	}
- 	
- 	get_vbox()->pack_start (hpacker, true, true);
- 
-@@ -577,7 +582,9 @@
- SoundFileBrowser::set_session (Session* s)
- {
- 	ArdourDialog::set_session (s);
--	preview.set_session (s);
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+		preview.set_session (s);
-+	}
- 	if (s) {
- 		add_gain_meter ();
- 	} else {
-@@ -642,6 +649,9 @@
- void
- SoundFileBrowser::update_preview ()
- {
-+	if (ARDOUR_COMMAND_LINE::ardourino) {
-+		return;
-+	}
- 	if (preview.setup_labels (chooser.get_filename())) {
- 		if (preview.autoplay()) {
- 			Glib::signal_idle().connect (mem_fun (preview, &SoundFileBox::audition_oneshot));
-@@ -668,7 +678,9 @@
- 			set_response_sensitive (RESPONSE_OK, false);
- 		}
- 		
--		preview.setup_labels (file);
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+			preview.setup_labels (file);
-+		}
- 	}
- }
- 
-@@ -691,7 +703,9 @@
- 			set_response_sensitive (RESPONSE_OK, false);
- 		}
- 		
--		preview.setup_labels (file);
-+		if (!ARDOUR_COMMAND_LINE::ardourino) {
-+			preview.setup_labels (file);
-+		}
- 	}
- }
- 
-@@ -1249,29 +1263,30 @@
- 	channel_combo.set_active_text (str.front());
- 	channel_combo.set_sensitive (false);
- 
--	l = manage (new Label);
--	l->set_text (_("Conversion Quality:"));
--
--	hbox = manage (new HBox);
--	hbox->set_border_width (12);
--	hbox->set_spacing (6);
--	hbox->pack_start (*l, false, false);
--	hbox->pack_start (src_combo, false, false);
--	vbox = manage (new VBox);
--	vbox->pack_start (*hbox, false, false);
--	options.pack_start (*vbox, false, false);
--
--	str.clear ();
--	str.push_back (_("Best"));
--	str.push_back (_("Good"));
--	str.push_back (_("Quick"));
--	str.push_back (_("Fast"));
--	str.push_back (_("Fastest"));
--
--	set_popdown_strings (src_combo, str);
--	src_combo.set_active_text (str.front());
--	src_combo.set_sensitive (false);
--
-+	if (!ARDOUR_COMMAND_LINE::ardourino) {
-+	  l = manage (new Label);
-+	  l->set_text (_("Conversion Quality:"));
-+
-+	  hbox = manage (new HBox);
-+	  hbox->set_border_width (12);
-+	  hbox->set_spacing (6);
-+	  hbox->pack_start (*l, false, false);
-+	  hbox->pack_start (src_combo, false, false);
-+	  vbox = manage (new VBox);
-+	  vbox->pack_start (*hbox, false, false);
-+	  options.pack_start (*vbox, false, false);
-+
-+	  str.clear ();
-+	  str.push_back (_("Best"));
-+	  str.push_back (_("Good"));
-+	  str.push_back (_("Quick"));
-+	  str.push_back (_("Fast"));
-+	  str.push_back (_("Fastest"));
-+
-+	  set_popdown_strings (src_combo, str);
-+	  src_combo.set_active_text (str.front());
-+	  src_combo.set_sensitive (false);
-+	}
- 	reset_options ();
- 
- 	action_combo.signal_changed().connect (mem_fun (*this, &SoundFileOmega::reset_options_noret));
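
Besides hiding widgets, the patch plumbs the new flag through the standard getopt_long table in opts.cc. A tiny self-contained sketch of just that option-parsing part:

    #include <getopt.h>
    #include <cstdio>

    static bool ardourino = false;   /* mirrors ARDOUR_COMMAND_LINE::ardourino */

    int main (int argc, char *argv[])
    {
        const struct option longopts[] = {
            { "ardourino", 0, 0, 'a' },
            { 0, 0, 0, 0 }
        };

        int c;
        while ((c = getopt_long (argc, argv, "a", longopts, 0)) != -1) {
            if (c == 'a') {
                ardourino = true;    /* compact layout for 800x600 screens */
            }
        }

        std::printf ("ardourino mode: %s\n", ardourino ? "on" : "off");
        return 0;
    }
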
diff --git a/debian/patches/90_ardour-x-change.patch b/debian/patches/90_ardour-x-change.patch
deleted file mode 100644
index 69ab8ec..0000000
--- a/debian/patches/90_ardour-x-change.patch
+++ /dev/null
@@ -1,1887 +0,0 @@
-Index: ardour/gtk2_ardour/SConscript
-===================================================================
---- ardour.orig/gtk2_ardour/SConscript	2013-10-01 22:33:01.132131930 +0200
-+++ ardour/gtk2_ardour/SConscript	2013-10-01 22:33:01.128131930 +0200
-@@ -180,6 +180,7 @@
- imageframe_time_axis_group.cc
- imageframe_time_axis_view.cc
- imageframe_view.cc
-+import_helper_aaf.cc
- io_selector.cc
- keyboard.cc
- keyeditor.cc
-Index: ardour/gtk2_ardour/import_helper_aaf.cc
-===================================================================
---- /dev/null	1970-01-01 00:00:00.000000000 +0000
-+++ ardour/gtk2_ardour/import_helper_aaf.cc	2013-10-01 22:33:01.128131930 +0200
-@@ -0,0 +1,1389 @@
-+/*
-+    Copyright (C) 2005 Paul Davis
-+
-+    This program is free software; you can redistribute it and/or modify
-+    it under the terms of the GNU General Public License as published by
-+    the Free Software Foundation; either version 2 of the License, or
-+    (at your option) any later version.
-+
-+    This program is distributed in the hope that it will be useful,
-+    but WITHOUT ANY WARRANTY; without even the implied warranty of
-+    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-+    GNU General Public License for more details.
-+
-+    You should have received a copy of the GNU General Public License
-+    along with this program; if not, write to the Free Software
-+    Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
-+
-+*/
-+
-+#include "i18n.h"
-+#include "ardour_ui.h"
-+#include "import_helper_aaf.h"
-+#include <errno.h>
-+#include <unistd.h>
-+#include <sys/wait.h>
-+#include <sys/stat.h>
-+
-+#if defined (PORT_SYS_WINDOWS) && !defined(PORT_SYS_CYGWIN)
-+#include <direct.h>
-+#else
-+#include <glib/gstdio.h>
-+#endif
-+
-+#include <ardour/ardour.h>
-+#include <pbd/error.h>
-+#include <pbd/failed_constructor.h>
-+
-+using namespace ARDOUR;
-+
-+bool AafImportHelper::already_instantiated = 0;
-+
-+AafImportHelper::AafImportHelper (bool use_native_importer /*= false */)
-+{
-+    // We don't want multiple instances all trying
-+    // to import an AAF file simultaneously.
-+    if (already_instantiated)
-+        throw failed_constructor();
-+    else
-+        already_instantiated = true;
-+
-+    if (use_native_importer)
-+    {
-+        // If the importer is native,
-+        // we can't be using Wine.
-+        importer_uses_wine = false;
-+        importer_is_native = true;
-+    }
-+    else
-+    {
-+        // Set both variables to 'true'. Since 'is_native'
-+        // and 'uses_wine' are mutually exclusive, setting
-+        // them both to 'true' is used to indicate that we
-+        // haven't yet initialized the AAF import helper.
-+        // 'initialize_for_aaf_import()' will set them
-+        // appropriately when it gets called, later.
-+        importer_uses_wine = true;
-+        importer_is_native = true;
-+    }
-+
-+    string spath;
-+    string shome = Glib::get_home_dir();
-+
-+    /* * * * * * * * * * * * * * * * * * * * * * * * * * * */
-+    /*                                                     */
-+    /* If you write an (external) plugin for importing AAF */
-+    /* files add its path to the list of possible importer */
-+    /* locations. New apps should be nearer to the top of  */
-+    /* the list. Each application can have more than one   */
-+    /* possible location path. The least likely locations  */
-+    /* should be nearer to the top of the list and moos    */
-+    /* likely locations should be nearer the bottom. The   */
-+    /* list should be terminated by an empty string.       */
-+    /*                                                     */
-+    /* * * * * * * * * * * * * * * * * * * * * * * * * * * */
-+    // Push any paths that are likely to contain an importer
-+    spath = shome + "/.cxoffice/default/drive_c/Program Files/ArdourXchange/ArdourXchange.exe";
-+    _aryPossibleImporterLocations.push_back(new string(spath));
-+    spath = shome + "/.wine/drive_c/Program Files/ArdourXchange/ArdourXchange.exe";
-+    _aryPossibleImporterLocations.push_back(new string(spath));
-+    spath = "/usr/lib/ardour-x-change/ArdourXchange.exe";
-+    _aryPossibleImporterLocations.push_back(new string(spath));
-+    spath = ""; // Terminate with an empty string
-+    _aryPossibleImporterLocations.push_back(new string(spath));
-+
-+    // Push any paths that are likely to contain the Wine executable
-+    spath = "/opt/cxoffice/bin/wine";
-+    _aryPossibleWineLocations.push_back(new string(spath));
-+    spath = "/usr/local/bin/wine";
-+    _aryPossibleWineLocations.push_back(new string(spath));
-+    spath = "/usr/bin/wine";
-+    _aryPossibleWineLocations.push_back(new string(spath));
-+    spath = ""; // Terminate with an empty string
-+    _aryPossibleWineLocations.push_back(new string(spath));
-+}
-+
-+AafImportHelper::~AafImportHelper ()
-+{
-+vector<string*>::iterator iter;
-+
-+    if (_aryPossibleImporterLocations.size())
-+        for (iter=_aryPossibleImporterLocations.begin(); iter < _aryPossibleImporterLocations.end(); iter++)
-+            if (*iter)
-+                delete *iter;
-+
-+    if (_aryPossibleWineLocations.size())
-+        for (iter=_aryPossibleWineLocations.begin(); iter < _aryPossibleWineLocations.end(); iter++)
-+            if (*iter)
-+                delete *iter;
-+
-+    already_instantiated = false;
-+}
-+
-+
-+//***************************************************************
-+//
-+//	is_native()
-+//
-+// Finds out whether a native (i.e. internal) AAF importer is in
-+// use or whether we're using an external plugin. Will initialize
-+// this helper if initialization isn't already completed.
-+//
-+//	Returns:
-+//
-+//    If the importer is internal: TRUE
-+//    If the importer is external: FALSE
-+//
-+bool
-+AafImportHelper::is_native (bool bSilent /*= false */)
-+{
-+    if (!is_initialized())
-+        get_user_path("AAF Importer", bSilent); // Carries out initialization
-+
-+    return (importer_is_native);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	needs_wine()
-+//
-+// Finds out whether the AAF importer is an external application
-+// that needs Wine (or cxMac) to be present. Will initialize
-+// this helper if initialization isn't already completed.
-+//
-+//	Returns:
-+//
-+//    If the importer needs Wine: TRUE
-+//    Otherwise:                  FALSE
-+//
-+bool
-+AafImportHelper::needs_wine (bool bSilent /*= false */)
-+{
-+    if (!is_initialized())
-+        get_user_path("AAF Importer", bSilent); // Carries out initialization
-+
-+    return (importer_uses_wine);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	create_symlink()
-+//
-+// ArdourXchange needs 3 x symbolic links to be in place at run
-+// time. This master function calls the appropriate helper func
-+// to create a link if it didn't exist when ArdourXchange gets
-+// run for the first time. The helper functions can be called
-+// explicitly if necessary (e.g. to change a given link and
-+// replace its target). Call this function if you prefer the
-+// helper functions to choose automatically from an appropriate
-+// set of default options. Currently supported entries are:-
-+//
-+// 'Wine'         - creates a link to wine (or cxMac etc).
-+// 'Session Base' - creates the 'base' folder for storing Ardour
-+//                  sessions.
-+// 'AAF Importer' - finds the AAF importer app and creates a
-+//                  link to it.
-+//
-+// See the c'tor for AafImportHelper for a list of appropriate
-+// default path settings.
-+//
-+//	Returns:
-+//
-+//    The value returned from the called helper function or
-+//    AXERR_NOT_IMPLEMENTED if there's no matching helper func.
-+//
-+ax_error
-+AafImportHelper::create_symlink (const string& srequested_link, bool bReplaceExisting /*= false */, bool bSilent /*= false */)
-+{
-+ax_error axRet = AXERR_NONE;
-+
-+    if ("Wine" == srequested_link)
-+        axRet = create_link_to_wine(bReplaceExisting, bSilent);
-+    else if ("Session Base" == srequested_link)
-+        axRet = create_link_to_sessions(bReplaceExisting, bSilent);
-+    else if ("AAF Importer" == srequested_link)
-+        axRet = create_link_to_importer(bReplaceExisting, bSilent);
-+    else
-+    {
-+        axRet = AXERR_NOT_IMPLEMENTED;
-+
-+        if (!bSilent)
-+            error << _("AafImportHelper::create_symlink(): unsupported option") << endmsg;
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	create_link_to_importer()
-+//
-+// The AAF importer needs 3 x symbolic links to be in place at
-+// run time, one of which is a link to the AAF Import application
-+// itself. This function can be used to create a link to that
-+// application if the application exists but the link doesn't.
-+// When creating a link you can choose whether to overwrite (i.e.
-+// update) a previously existing link. You can also supply an
-+// explicit path for the import application in 'spath_to_importer'.
-+// If 'spath_to_importer' is empty, this function will try to
-+// find an import application based on an array of likely paths
-+// that were set up in this object's c'tor.
-+//
-+//
-+//	Returns:
-+//
-+//    If the link was successfully created
-+//    (or if a suitable link already existed): AXERR_NONE
-+//    On Failure: An appropriate ax_error
-+//
-+ax_error
-+AafImportHelper::create_link_to_importer (bool bReplaceExisting /*= false */, bool bSilent /*= false */, const string& spath_to_importer /*= "" */)
-+{
-+string spath_to_use = spath_to_importer;
-+ax_error axRet      = AXERR_NONE;
-+
-+    if ((0 == spath_to_use.length()) && (_aryPossibleImporterLocations.size()))
-+    {
-+        vector<string*>::iterator iter;
-+
-+        // If we weren't given a specific
-+        // path, try to find an importer
-+        for (iter=_aryPossibleImporterLocations.begin(); iter < _aryPossibleImporterLocations.end(); iter++)
-+        {
-+            if (*iter)
-+            {
-+                spath_to_use = ((string*)*iter)->c_str();
-+
-+                if (spath_to_use.length())
-+                    if (Glib::file_test(spath_to_use, Glib::FILE_TEST_EXISTS))
-+                        break;
-+            }
-+        }
-+    }
-+
-+    if (0 == spath_to_use.length())
-+    {
-+        axRet = AXERR_PROCESS_NOT_FOUND;
-+
-+        if (!bSilent)
-+            warning << _("AAF Importer could not be found") << endmsg;
-+    }
-+    else
-+    {
-+        string spath_to_symbolic_link = get_user_ardour_path();
-+        spath_to_symbolic_link += "AAF Importer";
-+
-+        // Test to see if the symbolic link already exists
-+        bool bFound = Glib::file_test(spath_to_symbolic_link.c_str(), Glib::FILE_TEST_EXISTS);
-+
-+        if ((!bFound) || (bReplaceExisting))
-+        {
-+            // If the symbolic link already exists but we've
-+            // been told to replace it, delete the existing link.
-+            if (bFound)
-+                if (0 != remove(spath_to_symbolic_link.c_str()))
-+                {
-+                    axRet = AXERR_INVALID_ACCESS;
-+
-+                    if (!bSilent)
-+                        error << _("Deletion error while making a symbolic link") << endmsg;
-+                }
-+
-+            if (AXERR_NONE == axRet)
-+            {
-+                // symlink() will fail if the symbolic link's path doesn't already exist.
-+                if ((axRet = create_path_folders(get_user_ardour_path().c_str())) == AXERR_NONE)
-+                {
-+                    if (0 != symlink(spath_to_use.c_str(), spath_to_symbolic_link.c_str()))
-+                    {
-+                        axRet = AXERR_INVALID_ACCESS;
-+
-+                        if (!bSilent)
-+                            error << _("Creation error while making a symbolic link") << endmsg;
-+                    }
-+                }
-+                else if (!bSilent)
-+                    error << _("Access error while making a symbolic link") << endmsg;
-+            }
-+        }
-+        else if ((bFound) && (!bReplaceExisting))
-+        {
-+            int len;
-+
-+            // See if the link already points to the requested destination
-+            if ((len = readlink(spath_to_symbolic_link.c_str(), temp_path, PATH_MAX)) > 0)
-+            {
-+                // Add a NUL terminator
-+                temp_path[len] = 0;
-+
-+                if (spath_to_use == temp_path);
-+                    // Do nothing here. Assume that it's okay to have a
-+                    // pre-existing link if we were told not to replace.
-+                else
-+                {
-+                    axRet = AXERR_DESTINATION_ERROR;
-+
-+                    if (!bSilent)
-+                        error << _("Cannot create symbolic link (an incompatible link already exists)") << endmsg;
-+
-+                }
-+            }
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	create_link_to_sessions()
-+//
-+// The AAF importer needs 3 x symbolic links to be in place at
-+// run time, one of which is a link to the 'base' folder that
-+// will be used for storing the user's Ardour sessions. This
-+// function can be used to create that folder (and create a
-+// suitable link to it) if the folder or link don't already
-+// exist. When creating the link you can choose whether to
-+// update (i.e. overwrite) a previously existing link. You can
-+// supply an explicit path for the folder in 'spath_to_sessions'.
-+// If 'spath_to_sessions' is empty, this function will try to
-+// create a default folder called "Ardour Sessions" inside the
-+// user's home folder.
-+//
-+//
-+//	Returns:
-+//
-+//    If the link was successfully created
-+//    (or if a suitable link already existed): AXERR_NONE
-+//    On Failure: An appropriate ax_error
-+//
-+ax_error
-+AafImportHelper::create_link_to_sessions (bool bReplaceExisting /*= false */, bool bSilent /*= false */, const string& spath_to_sessions /*= "" */)
-+{
-+string spath_to_use = spath_to_sessions;
-+string shome        = Glib::get_home_dir();
-+ax_error axRet      = AXERR_NONE;
-+
-+    if (0 == spath_to_use.length())
-+        spath_to_use = shome + "/Ardour Sessions";
-+    else
-+    {
-+        // Symbolic links don't like terminating forward slashes
-+        size_t ilen = spath_to_use.length();
-+
-+        while ((ilen) && ('/' == spath_to_use[ilen-1]))
-+        {
-+            spath_to_use[--ilen] = 0;
-+        }
-+    }
-+
-+    if (0 == spath_to_use.length())
-+    {
-+        axRet = AXERR_DESTINATION_ERROR;
-+
-+#if !defined(DEBUG) && !defined(_DEBUG)
-+        if (!bSilent)
-+#endif
-+            error << _("(AafImportHelper) Invalid target supplied for Session Base folder") << endmsg;
-+    }
-+    else
-+    {
-+        string spath_to_symbolic_link = get_user_ardour_path();
-+        spath_to_symbolic_link += "Ardour Sessions";
-+
-+        // Test to see if the symbolic link already exists
-+        bool bFound = Glib::file_test(spath_to_symbolic_link.c_str(), Glib::FILE_TEST_EXISTS);
-+
-+        if ((!bFound) || (bReplaceExisting))
-+        {
-+            // Firstly, check to see if the destination target
-+            // already exists. If it does, it MUST be a folder.
-+            if (Glib::file_test(spath_to_use.c_str(), Glib::FILE_TEST_EXISTS))
-+            {
-+                if (!Glib::file_test(spath_to_use.c_str(), Glib::FILE_TEST_IS_DIR))
-+                {
-+                    axRet = AXERR_DESTINATION_ERROR;
-+
-+#if !defined(DEBUG) && !defined(_DEBUG)
-+                    if (!bSilent)
-+#endif
-+                        error << _("(AafImportHelper) Invalid target supplied for Session Base folder") << endmsg;
-+                }
-+            }
-+            else
-+            {
-+               if ((axRet = create_path_folders(spath_to_use.c_str())) != AXERR_NONE)
-+#if !defined(DEBUG) && !defined(_DEBUG)
-+                    if (!bSilent)
-+#endif
-+                        error << _("Access error while creating the Session Base folder") << endmsg;
-+            }
-+
-+            // The target now exists and is guaranteed to be a folder. Now, if the symbolic
-+            // link already exists but we've been told to replace it, delete the existing link.
-+            if ((bFound) && (AXERR_NONE == axRet))
-+                if (0 != remove(spath_to_symbolic_link.c_str()))
-+                {
-+                    axRet = AXERR_INVALID_ACCESS;
-+
-+                    if (!bSilent)
-+                        error << _("Deletion error while making a symbolic link") << endmsg;
-+                }
-+
-+            if (AXERR_NONE == axRet)
-+            {
-+                // symlink() will fail if the symbolic link's path doesn't already exist.
-+                if ((axRet = create_path_folders(get_user_ardour_path().c_str())) == AXERR_NONE)
-+                {
-+                    if (0 != symlink(spath_to_use.c_str(), spath_to_symbolic_link.c_str()))
-+                    {
-+                        axRet = AXERR_INVALID_ACCESS;
-+
-+                        if (!bSilent)
-+                            error << _("Creation error while making a symbolic link") << endmsg;
-+                    }
-+                }
-+                else if (!bSilent)
-+                    error << _("Access error while making a symbolic link") << endmsg;
-+            }
-+        }
-+        else if ((bFound) && (!bReplaceExisting))
-+        {
-+            int len;
-+
-+            // See if the link already points to the requested destination
-+            if ((len = readlink(spath_to_symbolic_link.c_str(), temp_path, PATH_MAX)) > 0)
-+            {
-+                // Add a NUL terminator
-+                temp_path[len] = 0;
-+
-+                if (spath_to_use == temp_path);
-+                    // Do nothing here. Assume that it's okay to have a
-+                    // pre-existing link if we were told not to replace.
-+                else
-+                {
-+                    axRet = AXERR_DESTINATION_ERROR;
-+
-+                    if (!bSilent)
-+                        error << _("Cannot create symbolic link (an incompatible link already exists)") << endmsg;
-+
-+                }
-+            }
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	create_link_to_wine()
-+//
-+// The AAF importer needs 3 x symbolic links to be in place at
-+// run time, one of which is a link to the Wine utility (or cxMac
-+// or cxLinux utility). This function can be used to create a link
-+// to that utility if the utility exists but the link doesn't.
-+// When creating a link you can choose whether to overwrite (i.e.
-+// update) a previously existing link. You can also supply an
-+// explicit path for the utility you wish to use in 'spath_to_wine'.
-+// If 'spath_to_wine' is empty, this function will try to find a
-+// suitable utility based on an array of likely paths that were set
-+// up in this object's c'tor.
-+//
-+//
-+//	Returns:
-+//
-+//    If the link was successfully created
-+//    (or if a suitable link already existed): AXERR_NONE
-+//    On Failure: An appropriate ax_error
-+//
-+ax_error
-+AafImportHelper::create_link_to_wine (bool bReplaceExisting /*= false */, bool bSilent /*= false */, const string& spath_to_wine /*= "" */)
-+{
-+string spath_to_use = spath_to_wine;
-+ax_error axRet      = AXERR_NONE;
-+
-+    if ((0 == spath_to_use.length()) && (_aryPossibleWineLocations.size()))
-+    {
-+        vector<string*>::iterator iter;
-+
-+        // If we weren't given a specific
-+        // path for Wine, try to find it.
-+        for (iter=_aryPossibleWineLocations.begin(); iter < _aryPossibleWineLocations.end(); iter++)
-+        {
-+            if (*iter)
-+            {
-+                spath_to_use = ((string*)*iter)->c_str();
-+
-+                if (spath_to_use.length())
-+                    if (Glib::file_test(spath_to_use, Glib::FILE_TEST_EXISTS))
-+                        break;
-+            }
-+        }
-+    }
-+
-+    if (0 == spath_to_use.length())
-+    {
-+        axRet = AXERR_PROCESS_NOT_FOUND;
-+
-+        if (!bSilent)
-+            warning << _("Wine (or equivalent) could not be found") << endmsg;
-+    }
-+    else
-+    {
-+        string shome = Glib::get_home_dir();
-+        string spath_to_symbolic_link = shome + "/.Wine";
-+
-+        // Test to see if the symbolic link already exists
-+        bool bFound = Glib::file_test(spath_to_symbolic_link.c_str(), Glib::FILE_TEST_EXISTS);
-+
-+        if ((!bFound) || (bReplaceExisting))
-+        {
-+            // If the symbolic link already exists but we've
-+            // been told to replace it, delete the existing link.
-+            if (bFound)
-+                if (0 != remove(spath_to_symbolic_link.c_str()))
-+                {
-+                    axRet = AXERR_INVALID_ACCESS;
-+
-+                    if (!bSilent)
-+                        error << _("Deletion error while making a symbolic link") << endmsg;
-+                }
-+
-+            if (AXERR_NONE == axRet)
-+            {
-+                // It's very unlikely that the user's home folder wouldn't exist but we need to be
-+                // certain. symlink() will fail if the symbolic link's path doesn't already exist.
-+                if ((axRet = create_path_folders(shome.c_str())) == AXERR_NONE)
-+                {
-+                    if (0 != symlink(spath_to_use.c_str(), spath_to_symbolic_link.c_str()))
-+                    {
-+                        axRet = AXERR_INVALID_ACCESS;
-+
-+                        if (!bSilent)
-+                            error << _("Creation error while making a symbolic link") << endmsg;
-+                    }
-+                }
-+                else if (!bSilent)
-+                    error << _("Access error while making a symbolic link") << endmsg;
-+            }
-+        }
-+        else if ((bFound) && (!bReplaceExisting))
-+        {
-+            int len;
-+
-+            // See if the link already points to the requested destination
-+            if ((len = readlink(spath_to_symbolic_link.c_str(), temp_path, PATH_MAX)) > 0)
-+            {
-+                // Add a NUL terminator
-+                temp_path[len] = 0;
-+
-+                if (spath_to_use == temp_path)
-+                {
-+                    // Do nothing here. Assume that it's okay to have a
-+                    // pre-existing link if we were told not to replace.
-+                }
-+                else
-+                {
-+                    axRet = AXERR_DESTINATION_ERROR;
-+
-+                    if (!bSilent)
-+                        error << _("Cannot create symbolic link (an incompatible link already exists)") << endmsg;
-+
-+                }
-+            }
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	get_user_path()
-+//
-+// Obtains (where appropriate) a path to certain components that
-+// might be needed before we can carry out an AAF import. Will
-+// initialize this helper if initialization isn't already
-+// completed. Pass in a string which identifies the file or path
-+// that you want to locate. Currently supported entries are:-
-+//
-+// 'AAF Importer' - returns a path to the importer app
-+// 'Wine'         - returns the path to Wine, cxLinux or cxMac.
-+// 'Session Base' - returns the 'base' path for storing Ardour
-+//                  sessions.
-+//
-+// Note that if the file or path is successfully identified, this
-+// doesn't necessarily mean that it actually exists!
-+//
-+//	Returns:
-+//
-+//    If there was no relevant path
-+//    (e.g. the importer is native):        An empty string.
-+//    If the path couldn't be identified:   An empty string.
-+//    If the requested path was identified: A string containing
-+//                                          the path.
-+//
-+string
-+AafImportHelper::get_user_path (const string& srequested_path, bool bSilent /*= false */)
-+{
-+string   junk, sRet;
-+ax_error axRet = AXERR_NONE;
-+
-+    if (!is_initialized())
-+        axRet = initialize_for_aaf_import(importer_is_native, importer_uses_wine, &junk, NULL, NULL, NULL, bSilent);
-+
-+    if (AXERR_NONE == axRet)
-+    {
-+        if ("AAF Importer" == srequested_path)
-+            axRet = get_path_to_importer(sRet, bSilent);
-+        else if ("Wine" == srequested_path)
-+            axRet = get_path_to_wine(sRet, bSilent);
-+        else if ("Session Base" == srequested_path)
-+            axRet = get_path_to_sessions(sRet, bSilent);
-+        else
-+        {
-+            axRet = AXERR_NOT_IMPLEMENTED;
-+
-+            if (!bSilent)
-+                error << _("AafImportHelper::get_user_path(): unsupported option") << endmsg;
-+        }
-+    }
-+
-+    if (AXERR_NONE != axRet)
-+        sRet = "";
-+
-+    return (sRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	get_path_to_importer()
-+//
-+// Opens the 'Paths' config file (if available) and reads in the
-+// path for the AAF importer. If no 'Paths' file was found, it
-+// attempts to locate a symbolic link to the importer and then
-+// translates the symbolic link. The found path is returned in
-+// 'sfound_path'. An empty string is returned if the path could
-+// not be identified (or if an internal importer is specified).
-+// If 'bSilent' is false, displays an error message to the user
-+// if any error gets detected.
-+//
-+// This function will attempt to create a suitable symbolic link
-+// if a pre-existing one couldn't be found.
-+//
-+//	Returns:
-+//
-+//    On Success: AXERR_NONE
-+//    On Failure: AXERR_DESTINATION_ERROR
-+//
-+ax_error
-+AafImportHelper::get_path_to_importer (string& sfound_path, bool bSilent /*= false */)
-+{
-+ax_error axRet = AXERR_NONE;
-+
-+    if ((importer_is_native) && (!importer_uses_wine))
-+        sfound_path = "";
-+    else
-+    {
-+        sfound_path = "";
-+
-+        int  len;
-+        bool bPathFound = false; // TODO: Read this path from the 'Paths' file
-+
-+        if (!bPathFound)
-+        {
-+            string spath_to_symbolic_link = get_user_ardour_path();
-+            spath_to_symbolic_link += "AAF Importer";
-+
-+            if ((len = readlink(spath_to_symbolic_link.c_str(), temp_path, PATH_MAX)) >= 0)
-+            {
-+                // Add a NUL terminator
-+                temp_path[len] = 0;
-+                sfound_path = temp_path;
-+            }
-+            else
-+            {
-+                // We couldn't find a symbolic link for the
-+                // AAF Import application. Try to create one.
-+                if (AXERR_NONE == create_symlink("AAF Importer", false, bSilent))
-+                    axRet = get_path_to_importer(sfound_path, bSilent);
-+                else
-+                    axRet = AXERR_DESTINATION_ERROR;
-+            }
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	get_path_to_sessions()
-+//
-+// Opens the 'Paths' config file (if available) and reads in the
-+// user's preferred base path for saving Ardour sessions. If no
-+// 'Paths' file was found, it attempts to locate a symbolic link
-+// called "Ardour Sessions" and translates the symbolic link. The
-+// found path is returned in 'sfound_path'. An empty string is
-+// returned if the path could not be identified. If 'bSilent' is
-+// false, displays an error message to the user if any error
-+// gets detected.
-+//
-+// This function will attempt to create a suitable symbolic link
-+// if a pre-existing one couldn't be found. However, it will only
-+// attempt to create the link if an AAF import application is
-+// already installed on the user's system.
-+//
-+//	Returns:
-+//
-+//    On Success: AXERR_NONE
-+//    On Failure: AXERR_DESTINATION_ERROR
-+//
-+ax_error
-+AafImportHelper::get_path_to_sessions (string& sfound_path, bool bSilent /*= false */)
-+{
-+int len;
-+ax_error axRet = AXERR_NONE;
-+
-+    bool bPathFound = false; // TODO: Read this path from the 'Paths' file
-+
-+    sfound_path = "";
-+
-+    if (!bPathFound)
-+    {
-+        string spath_to_symbolic_link = get_user_ardour_path();
-+        spath_to_symbolic_link += "Ardour Sessions";
-+
-+        if ((len = readlink(spath_to_symbolic_link.c_str(), temp_path, PATH_MAX)) >= 0)
-+        {
-+            // Add a NUL terminator
-+            temp_path[len] = 0;
-+            sfound_path = temp_path;
-+        }
-+        else
-+        {
-+           string junk;
-+
-+            // We couldn't find a symbolic link for the user's session path.
-+            // Assuming that an AAF Import app is present, try to create one.
-+            if (AXERR_NONE == get_path_to_importer(junk, true))
-+            {
-+                if (AXERR_NONE == create_symlink("Session Base", false, bSilent))
-+                    axRet = get_path_to_sessions(sfound_path, bSilent);
-+                else
-+                    axRet = AXERR_DESTINATION_ERROR;
-+            }
-+            else
-+            {
-+                axRet = AXERR_DESTINATION_ERROR;
-+
-+                if (!bSilent)
-+                    warning << _("Symbolic link failure (could not detect an AAF importer app)") << endmsg;
-+            }
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	get_path_to_wine()
-+//
-+// Opens the 'Paths' config file (if available) and reads in the
-+// location of Wine (or cxLinux or cxMac) on the user's system.
-+// If no 'Paths' file was found, it attempts to locate a symbolic
-+// link called "Wine" and translates the symbolic link. The found
-+// path is returned in 'sfound_path' (regardless of whether or not
-+// Wine is actually needed). An empty string is returned if the
-+// path could not be identified. If 'bSilent' is false, displays
-+// an error message to the user if any error occurred.
-+//
-+// This function will attempt to create a suitable symbolic link
-+// if a pre-existing one couldn't be found. However, it will only
-+// attempt to create the link if an AAF import application is
-+// already installed on the user's system.
-+//
-+//	Returns:
-+//
-+//    On Success: AXERR_NONE
-+//    On Failure: AXERR_DESTINATION_ERROR
-+//
-+ax_error
-+AafImportHelper::get_path_to_wine (string& sfound_path, bool bSilent /*= false */)
-+{
-+int len;
-+ax_error axRet = AXERR_NONE;
-+
-+    bool bPathFound = false; // TODO: Read this path from the 'Paths' file
-+
-+    sfound_path = "";
-+
-+    if (!bPathFound)
-+    {
-+        string shome = Glib::get_home_dir();
-+        string spath_to_symbolic_link = shome + "/.Wine";
-+
-+        if ((len = readlink(spath_to_symbolic_link.c_str(), temp_path, PATH_MAX)) >= 0)
-+        {
-+            // Add a NUL terminator
-+            temp_path[len] = 0;
-+            sfound_path = temp_path;
-+        }
-+        else
-+        {
-+            string junk;
-+
-+            // We couldn't find a symbolic link for Wine. Assuming that
-+            // an AAF Import app is present, try to create a link to Wine.
-+            if (AXERR_NONE == get_path_to_importer(junk, true))
-+            {
-+                if (AXERR_NONE == create_symlink("Wine", false, bSilent))
-+                    axRet = get_path_to_wine(sfound_path, bSilent);
-+                else
-+                    axRet = AXERR_DESTINATION_ERROR;
-+            }
-+            else
-+            {
-+                axRet = AXERR_DESTINATION_ERROR;
-+
-+                if (!bSilent)
-+                    warning << _("Symbolic link failure (could not detect an AAF importer app)") << endmsg;
-+            }
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	initialize_for_aaf_import()
-+//
-+// Calls the various 'get_path_to' functions to obtain any paths
-+// that will be needed for importing an AAF file (e.g. the path
-+// to the import plugin and the base path for converted sessions).
-+// The base path is MANDATORY since no import process can take
-+// place if the importer doesn't know where to dump the imported
-+// sessions. AXERR_NOT_INITIALIZED will be issued if you pass a
-+// NULL pointer to 'ppath_to_ardour_sessions'. The other paths
-+// are optional. 'ppath_to_importer' returns an empty string if
-+// the importer is internal. 'initialize_for_aaf_import()' will
-+// optionally return (a) the user's preferred color scheme;
-+// (b) whether or not the importer needs Wine, and (c) whether
-+// the importer is native or external. If 'bSilent' is false, we
-+// display an error message to the user if any error is detected.
-+//
-+//	Returns:
-+//
-+//    On Success: AXERR_NONE
-+//    On Failure: An appropriate ax_error
-+//
-+ax_error
-+AafImportHelper::initialize_for_aaf_import (bool& is_native, bool& use_wine, string* ppath_to_ardour_sessions,
-+                    string* ppath_to_importer /*= NULL */, string* ppath_to_wine /*= NULL */,
-+                    string* ppreferred_color_scheme /*= NULL*/, bool bSilent /*= false */)
-+{
-+ax_error axRet = AXERR_NOT_INITIALIZED;
-+
-+    if (ppath_to_importer || ppath_to_wine || ppath_to_ardour_sessions)
-+    {
-+        string spath_to_importer;
-+        axRet = AXERR_NONE;
-+
-+        // Determine the current color scheme
-+        if (ppreferred_color_scheme)
-+        {
-+            string sardour_theme = ARDOUR_UI::config()->ui_rc_file.get();
-+
-+            if (string::npos != sardour_theme.find("ui_dark"))
-+                *ppreferred_color_scheme = "dark";
-+            else // assume a light theme
-+                *ppreferred_color_scheme = "light";
-+        }
-+
-+        // Always find a path to the importer since we
-+        // can't set 'is_native' or 'use_wine' without it
-+        if ((axRet = get_path_to_importer(spath_to_importer, bSilent)) != AXERR_NONE)
-+        {
-+            is_native = importer_is_native = false;
-+            axRet = AXERR_NOT_INITIALIZED;
-+        }
-+        else
-+        {
-+            if (0 == spath_to_importer.length())
-+            {
-+                // No external importer was found. If we're still
-+                // uninitialized, assume we want an internal importer.
-+                if ((importer_is_native) && (importer_uses_wine))
-+                {
-+                    is_native = importer_is_native = true;
-+                    use_wine  = importer_uses_wine = false;
-+                }
-+            }
-+            else
-+            {
-+                // The importer seems to be an external plugin. If
-+                // we haven't been told explicitly to use a native
-+                // importer, find out whether or not it needs Wine
-+                if ((importer_is_native) && (!importer_uses_wine))
-+                {
-+                    // We've got a problem. An internal (native) AAF importer was
-+                    // requested - but the only one we could find was external.
-+                    axRet = AXERR_INTERNAL_ERROR;
-+
-+                    if (!bSilent)
-+                        error << _("An internal error occurred while locating the AAF importer") << endmsg;
-+                }
-+                else
-+                {
-+                    is_native = importer_is_native = false;
-+
-+                    size_t ilen = spath_to_importer.length();
-+                    size_t ipos = spath_to_importer.rfind(".exe");
-+
-+                    if (ipos == string::npos)
-+                        ipos = spath_to_importer.rfind(".EXE");
-+
-+                    // Assume that Wine is needed if the importer ends in ".exe"
-+                    if ((ilen > 4) && (ipos == (ilen-4)))
-+                        use_wine = importer_uses_wine = true;
-+                    else
-+                        use_wine = importer_uses_wine = false;
-+                }
-+            }
-+
-+            if ((ppath_to_importer) && (!is_native))
-+            {
-+                *ppath_to_importer = spath_to_importer;
-+
-+                if (!axRet)
-+                {
-+                    if (ppath_to_importer->length())
-+                    {
-+                        // Find out if the import processor exists
-+                        axRet = Glib::file_test(*ppath_to_importer, Glib::FILE_TEST_EXISTS) ? AXERR_NONE : AXERR_PROCESS_NOT_FOUND;
-+                        if ((AXERR_PROCESS_NOT_FOUND == axRet) && (!bSilent))
-+                           error << _("AAF import processor could not be found") << endmsg;
-+                    }
-+                    else
-+                        axRet = AXERR_NOT_INITIALIZED;
-+                }
-+            }
-+            else if (is_native)
-+            {
-+                if (ppath_to_importer)
-+                    *ppath_to_importer = "";
-+            }
-+
-+            if ((ppath_to_wine) && (!axRet))
-+            {
-+                axRet = get_path_to_wine(*ppath_to_wine, bSilent);
-+
-+                if (!axRet)
-+                {
-+                    if (ppath_to_wine->length())
-+                    {
-+                        // Find out if 'wine' exists
-+                        axRet = Glib::file_test(*ppath_to_wine, Glib::FILE_TEST_EXISTS) ? AXERR_NONE : AXERR_WINE_NOT_FOUND;
-+                        if ((AXERR_WINE_NOT_FOUND == axRet) && (!bSilent))
-+                           error << _("Wine could not be found on this system") << endmsg;
-+                    }
-+                    else
-+                        axRet = AXERR_NOT_INITIALIZED;
-+                }
-+                else
-+                    axRet = AXERR_NOT_INITIALIZED;
-+            }
-+
-+            if ((ppath_to_ardour_sessions) && (!axRet))
-+            {
-+                get_path_to_sessions(*ppath_to_ardour_sessions, bSilent);
-+
-+                int len = ppath_to_ardour_sessions->length();
-+
-+                if (len)
-+                {
-+                    // Now make sure that the path is
-+                    // terminated by a forward slash.
-+                    if ('/' != (*ppath_to_ardour_sessions)[len-1])
-+                        *ppath_to_ardour_sessions += "/";
-+
-+                    // Simply check that the path starts with a forward slash
-+                    if ('/' != (*ppath_to_ardour_sessions)[0])
-+                        axRet = AXERR_DESTINATION_ERROR;
-+
-+                    if ((AXERR_DESTINATION_ERROR == axRet) && (!bSilent))
-+                    {
-+                        error << _("The target path for your imported sessions is not a valid path") << endmsg;
-+                        error << _("Unable to initialize the AAF importer") << endmsg;
-+                    }
-+                }
-+                else
-+                {
-+                    axRet = AXERR_DESTINATION_ERROR;
-+
-+                    if (!bSilent)
-+                    {
-+                        error << _("Could not locate the target path for your imported sessions") << endmsg;
-+                        error << _("Unable to initialize the AAF importer") << endmsg;
-+                    }
-+                }
-+            }
-+            else if (!axRet)
-+                axRet = AXERR_NOT_INITIALIZED;
-+        }
-+    }
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	import_from_aaf()
-+//
-+// This function identifies the AAF import process (if an import
-+// process exists) and instructs it to import the file specified
-+// by 'sfile_to_import'. If the import operation was successful,
-+// 'spath_to_new_project' should contain the path to the imported
-+// ".ardour" session.
-+//
-+//	Returns:
-+//
-+//    On Success: The (converted) exit code returned by the
-+//                AAF importer
-+//    On Failure: An (internally generated) ax_error
-+//
-+ax_error
-+AafImportHelper::import_from_aaf (const string& sfile_to_import, string& spath_to_new_project)
-+{
-+ax_error axRet = AXERR_SOURCE_ERROR;
-+
-+    // Make sure that the file to
-+    // be imported ends in ".aaf"
-+    size_t ilen = sfile_to_import.length();
-+    size_t ipos = sfile_to_import.rfind(".aaf");
-+
-+    if (ipos == string::npos)
-+        ipos = sfile_to_import.rfind(".AAF");
-+
-+    if ((ilen > 4) && (ipos == (ilen-4)))
-+    {
-+        string  spreferred_color_scheme,
-+                spath_to_importer,
-+                spath_to_wine,
-+                spath_for_ardour_sessions;
-+        bool    importer_is_native,
-+                use_wine;
-+
-+        axRet = initialize_for_aaf_import(importer_is_native, use_wine, &spath_for_ardour_sessions,
-+                        &spath_to_importer, &spath_to_wine, &spreferred_color_scheme);
-+
-+        if (AXERR_NONE == axRet)
-+        {
-+            if (importer_is_native)
-+            {
-+                axRet = AXERR_NOT_IMPLEMENTED;
-+            }
-+            else if (use_wine)
-+            {
-+                char tmpBuf[1024], tmpBuf2[1024];
-+
-+                // Find the last forward slash in our source file
-+                if (string::npos == (ipos = sfile_to_import.rfind("/")))
-+                    strcpy(tmpBuf, sfile_to_import.c_str());
-+                else
-+                    strcpy(tmpBuf, &sfile_to_import[ipos+1]);
-+
-+                // By this point, 'tmpBuf' should contain at least 5 characters
-+                tmpBuf[strlen(tmpBuf)-4] = 0;
-+
-+                // 'tmpBuf' now contains the name of the AAF file
-+                std::string aaf_name = tmpBuf;
-+
-+                // Use 'tmpBuf2' to build a path for the ardour session
-+                sprintf(tmpBuf2, "z:%s%s/%s.ardour", spath_for_ardour_sessions.c_str(), aaf_name.c_str(), aaf_name.c_str());
-+
-+                // and use 'tmpBuf' to build a path for the input file
-+                sprintf(tmpBuf, "z:%s", sfile_to_import.c_str());
-+
-+                // Obtain a flag for the color scheme
-+                spreferred_color_scheme = attribute_to_flag(spath_to_importer, spreferred_color_scheme);
-+
-+                // We now have enough information to spawn the child process
-+                int nRet; char cRet;
-+                pid_t pidChild = fork();
-+
-+                if (0 == pidChild)
-+                {
-+                    // We're running in the child process. Use exec()
-+                    // to replace the child process's executable
-+                    execl (spath_to_wine.c_str(), "wine", spath_to_importer.c_str(), tmpBuf, tmpBuf2, spreferred_color_scheme.c_str(), NULL);
-+                }
-+                else if (pidChild != (-1))
-+                {
-+                    // We're running in the parent process. Just
-+                    // wait for the child process to terminate
-+                    waitpid(pidChild, &nRet, 0); // waitpid() returns (qualified) status information in 'nRet'
-+                    cRet = (char)WEXITSTATUS(nRet);
-+                    nRet = cRet; // 'nRet' now equals the ACTUAL status value returned by the AAF importer
-+
-+                    spath_to_new_project = &tmpBuf2[2];
-+
-+                    return (assign_exit_status(spath_to_importer, nRet));
-+                }
-+                else
-+                {
-+                    // fork()/exec() failed. Consider
-+                    // this to be a terminal failure.
-+                    axRet = AXERR_PROCESS_FAILURE;
-+                }
-+            }
-+            else
-+            {
-+                axRet = AXERR_NOT_IMPLEMENTED;
-+            }
-+        }
-+    }
-+
-+    if (AXERR_PROCESS_FAILURE == axRet) // fork()/exec() failed. This is terminal
-+        fatal << _("A fatal error occurred while launching the AAF importer") << endmsg;
-+
-+    if (AXERR_SOURCE_ERROR == axRet)
-+        error << _("The selected source file is not a valid AAF file") << endmsg;
-+
-+    if (AXERR_NOT_INITIALIZED == axRet)
-+        error << _("Unable to initialize the AAF importer") << endmsg;
-+
-+    return (axRet);
-+}
-+
-+
-+//***************************************************************
-+//
-+//	create_path_folders()
-+//
-+// If a symbolic link needs to be created, this function ensures
-+// that all folders leading to the symlink are present.
-+//
-+//	Returns:
-+//
-+//    On success: AXERR_NONE
-+//    On Failure: An appropriate ax_error
-+//
-+ax_error
-+AafImportHelper::create_path_folders(const char *pRequestedPath)
-+{
-+char ch, *tmpPath  = NULL;
-+bool rootfound = false;
-+
-+#if defined (PORT_SYS_WINDOWS) && !defined(PORT_SYS_CYGWIN)
-+char separator = '\\';
-+#else
-+char separator = '/';
-+#endif
-+
-+   /* Note that this function shouldn't be used to create
-+	* a root drive or folder. Also, don't be tempted to
-+	* use 'g_mkdir_with_parents()' either because it
-+	* didn't become available until Glib v2.8, which is
-+	* higher than the base requirement for Ardour.
-+	*/
-+	if (int length = strlen(pRequestedPath))
-+    {
-+		tmpPath = new char[length+2];
-+		strcpy(tmpPath, pRequestedPath);
-+
-+		// Add a trailing separator, if there isn't one
-+		if (pRequestedPath[length-1] != separator)
-+		{
-+			tmpPath[length]   = separator;
-+			tmpPath[length+1] = '\0';
-+
-+			// Increment 'length'
-+			length += 1;
-+		}
-+
-+		// Locate the SECOND separator
-+		char* iter = tmpPath;
-+		while (strlen(iter))
-+		{
-+			ch = *iter;
-+			if ((ch == separator) && (rootfound))
-+			{
-+				// Replace it with a zero
-+				*iter = '\0';
-+
-+				// and make the first directory
-+				if (strlen(tmpPath))
-+#if defined (PORT_SYS_WINDOWS) && !defined(PORT_SYS_CYGWIN)
-+					if (0 != _mkdir(tmpPath))
-+#else
-+					if (0 != g_mkdir(tmpPath, (S_IRWXU | S_IRWXG | S_IRWXO)))
-+#endif
-+					{
-+                        if (EEXIST != errno)
-+                        {
-+                            delete[] tmpPath;
-+                            return (AXERR_PROCESS_FAILURE);
-+                        }
-+					}
-+
-+				// Now put the separator back
-+				*iter = separator;
-+
-+				// and move to the next character
-+				++iter;
-+
-+				break;
-+			}
-+			else if (ch == separator)
-+				rootfound = true;
-+
-+			iter++;
-+		}
-+
-+		// Now create the remaining directories
-+		while (strlen(iter))
-+		{
-+			ch = *iter;
-+			if (ch == separator)
-+			{
-+				// Replace it with a zero
-+				*iter = '\0';
-+
-+				// make the directory
-+				if (strlen(tmpPath))
-+#if defined (PORT_SYS_WINDOWS) && !defined(PORT_SYS_CYGWIN)
-+					if (0 != _mkdir(tmpPath))
-+#else
-+					if (0 != g_mkdir(tmpPath, (S_IRWXU | S_IRWXG | S_IRWXO)))
-+#endif
-+					{
-+                        if (EEXIST != errno)
-+                        {
-+                            delete[] tmpPath;
-+                            return (AXERR_PROCESS_FAILURE);
-+                        }
-+					}
-+
-+				// Now put the separator back
-+				*iter = separator;
-+			}
-+
-+			// move to the next character
-+			++iter;
-+		}
-+	}
-+
-+	delete[] tmpPath;
-+
-+	return (AXERR_NONE);
-+}
-+
-+
-+/***************************************************************
-+*                                                              *
-+* These functions are useful when we need to interface with an *
-+* external helper app (e.g. when we need to import audio from  *
-+* an unsupported session format such as OMF or AAF).           *
-+*                                                              *
-+****************************************************************/
-+//
-+//	attribute_to_flag()
-+//
-+// Internally, common flags may be used for specific program
-+// conditions (such as "light" or "dark" to indicate the user's
-+// preferred color scheme). This function converts any such
-+// internal flags to a format that might be applicable to an
-+// external process. For example, the string "overwrite" could
-+// be used internally as an indication that an external app
-+// should overwrite older versions of a file. App X might need
-+// this to be sent as "/O" whereas App Y might require "-o".
-+// Use this function to convert internal flags to external ones.
-+//
-+//	Returns:
-+//
-+//    On Success: A string representing the appropriate flag
-+//    On Failure: An empty string
-+//
-+string
-+AafImportHelper::attribute_to_flag (string s_application, string s_attribute)
-+{
-+string sFlag;
-+
-+    // Are we dealing with ArdourXchange?
-+    if (string::npos != s_application.find("ArdourXchange.exe"))
-+    {
-+        // Attributes supported by ArdourXchange
-+        if (s_attribute == "dark")
-+            sFlag = "-D";
-+        else if (s_attribute == "light")
-+            sFlag = "-L";
-+    }
-+    else if (string::npos != s_application.find("put your application name here"))
-+    {
-+       // Attributes supported by the next app
-+    }
-+
-+    return (sFlag);
-+}
-+
-+//***************************************************************
-+//
-+//	assign_exit_status()
-+//
-+// If an external helper app is capable of returning a status
-+// code, the returned status code can be converted to the nearest
-+// equivalent ax_error. Pass in the relevant code and a string
-+// identifying the application.
-+//
-+//	Returns:
-+//
-+//    On Success: An appropriate ax_error
-+//    On Failure: AXERR_UNKNOWN
-+//
-+ax_error
-+AafImportHelper::assign_exit_status (string s_application, int exit_code)
-+{
-+ax_error exit_status = AXERR_UNKNOWN;
-+
-+    // Are we dealing with ArdourXchange?
-+    if (string::npos != s_application.find("ArdourXchange.exe"))
-+    {
-+        // Exit codes supported by ArdourXchange
-+        switch (exit_code) {
-+            case -2: exit_status = AXERR_NOT_INITIALIZED;
-+                break;
-+            case -1: exit_status = AXERR_FATAL_EXCEPTION;
-+                break;
-+            case 0:  exit_status = AXERR_NONE;
-+                break;
-+            case 1:  exit_status = AXERR_USER_ABORTED;
-+                break;
-+            case 2:  exit_status = AXERR_MEDIA_NOT_FOUND;
-+                break;
-+        }
-+    }
-+    else if (string::npos != s_application.find("put your application name here"))
-+    {
-+       // Exit codes supported by the next app
-+    }
-+
-+    return (exit_status);
-+}
-+
-+
-+/***************************************************************
-+*                                                              *
-+****************************************************************/
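
A side note on create_path_folders() above: its header comment explains that
g_mkdir_with_parents() was avoided only because that helper first appeared in
GLib 2.8, which was newer than Ardour's baseline at the time. Purely as an
illustrative sketch, and assuming a GLib >= 2.8 baseline (which this patch does
not assume), the hand-rolled loop could collapse to something like the
following; it is not part of the patch:

    // Hypothetical replacement body for create_path_folders(), assuming
    // GLib >= 2.8. It would additionally need <glib/gstdio.h> and
    // <sys/stat.h> beyond the includes already used by this file.
    ax_error
    AafImportHelper::create_path_folders (const char* pRequestedPath)
    {
        if (pRequestedPath && *pRequestedPath)
        {
            // g_mkdir_with_parents() creates every missing path component
            // and returns 0 on success (or if the directory already exists).
            if (0 != g_mkdir_with_parents (pRequestedPath, (S_IRWXU | S_IRWXG | S_IRWXO)))
                return (AXERR_PROCESS_FAILURE);
        }

        return (AXERR_NONE);
    }
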
-Index: ardour/gtk2_ardour/import_helper_aaf.h
-===================================================================
---- /dev/null	1970-01-01 00:00:00.000000000 +0000
-+++ ardour/gtk2_ardour/import_helper_aaf.h	2013-10-01 22:33:01.128131930 +0200
-@@ -0,0 +1,81 @@
-+/*
-+    Copyright (C) 2005 Paul Davis
-+
-+    This program is free software; you can redistribute it and/or modify
-+    it under the terms of the GNU General Public License as published by
-+    the Free Software Foundation; either version 2 of the License, or
-+    (at your option) any later version.
-+
-+    This program is distributed in the hope that it will be useful,
-+    but WITHOUT ANY WARRANTY; without even the implied warranty of
-+    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-+    GNU General Public License for more details.
-+
-+    You should have received a copy of the GNU General Public License
-+    along with this program; if not, write to the Free Software
-+    Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
-+
-+*/
-+
-+// -*- c++ -*-
-+
-+#ifndef IMPORT_HELPER_AAF_H
-+#define IMPORT_HELPER_AAF_H
-+
-+#include <string>
-+#include <vector>
-+#include <limits.h>   // for PATH_MAX
-+#include <ardour/ax_errors.h>
-+
-+using namespace std;
-+
-+class AafImportHelper
-+{
-+public:
-+    AafImportHelper (bool use_native_importer = false);
-+    ~AafImportHelper ();
-+
-+    string   get_user_path (const string& /* in */srequested_path, bool /* in */bSilent = false);
-+    ax_error import_from_aaf (const string& /* in */sfile_to_import, string& /* out */spath_to_new_project);
-+    ax_error create_symlink (const string& /* in */srequested_link, bool /* in */bReplaceExisting = false, bool /* in */bSilent = false);
-+    ax_error create_link_to_importer (bool /* in */bReplaceExisting = false, bool /* in */bSilent = false, const string& /* in */spath_to_importer = "");
-+    ax_error create_link_to_sessions (bool /* in */bReplaceExisting = false, bool /* in */bSilent = false, const string& /* in */spath_to_sessions = "");
-+    ax_error create_link_to_wine  (bool /* in */bReplaceExisting = false, bool /* in */bSilent = false, const string& /* in */spath_to_wine = "");
-+    bool     is_native (bool /* in */bSilent = false);
-+    bool     needs_wine (bool /* in */bSilent = false);
-+    bool     is_initialized() { return (importer_is_native && importer_uses_wine) ? false : true; }
-+
-+protected:
-+    ax_error create_path_folders(const char *pRequestedPath);
-+    ax_error get_path_to_importer (string& /* out */sfound_path, bool /* in */bSilent = false);
-+    ax_error get_path_to_sessions (string& /* out */sfound_path, bool /* in */bSilent = false);
-+    ax_error get_path_to_wine     (string& /* out */sfound_path, bool /* in */bSilent = false);
-+    ax_error initialize_for_aaf_import (bool& /* out */is_native, bool& /* out */uses_wine, string* /* out */ppath_to_ardour_sessions,
-+                        string* /* out */ppath_to_importer = NULL, string* /* out */ppath_to_wine = NULL,
-+                        string* /* out */ppreferred_color_scheme = NULL, bool /* in */bSilent = false);
-+    ax_error assign_exit_status (string s_application, int exit_code);
-+    string   attribute_to_flag  (string s_application, string s_attribute);
-+    char     temp_path[PATH_MAX+1];
-+    vector<string*> _aryPossibleImporterLocations;
-+    vector<string*> _aryPossibleWineLocations;
-+
-+private:
-+    bool     importer_is_native;
-+    bool     importer_uses_wine;
-+    static bool already_instantiated;
-+
-+private:
-+    // AafImportHelper is not meant to get copied. These things
-+    // are private because they are never meant to be used.
-+    AafImportHelper& operator= (AafImportHelper&); // N/A
-+    AafImportHelper(AafImportHelper&); // N/A
-+};
-+
-+inline AafImportHelper& AafImportHelper::operator= (AafImportHelper&)
-+{
-+    return *this;
-+}
-+
-+inline AafImportHelper::AafImportHelper(AafImportHelper&)
-+{
-+}
-+
-+#endif // IMPORT_HELPER_AAF_H
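
For readers skimming the header above, here is a minimal caller-side sketch of
the public interface. It mirrors the way new_session_dialog.cc (further down in
this patch) drives the helper, but it is illustrative only and trims the error
handling to the essentials:

    // Illustrative use of the AafImportHelper public interface.
    #include <string>
    #include "import_helper_aaf.h"

    static void example_aaf_import (const std::string& aaf_file)
    {
        AafImportHelper importer;

        // Where does the external importer live? (empty if none, or if native)
        std::string importer_path = importer.get_user_path ("AAF Importer", true);

        if (importer_path.empty() && !importer.is_native())
            return; // no importer available on this system

        std::string new_session;

        if (AXERR_NONE == importer.import_from_aaf (aaf_file, new_session))
        {
            // 'new_session' now holds the path to the imported ".ardour" file
        }
    }
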
-Index: ardour/gtk2_ardour/new_session_dialog.cc
-===================================================================
---- ardour.orig/gtk2_ardour/new_session_dialog.cc	2013-10-01 22:33:01.132131930 +0200
-+++ ardour/gtk2_ardour/new_session_dialog.cc	2013-10-01 22:33:01.128131930 +0200
-@@ -1,5 +1,5 @@
- /*
--    Copyright (C) 2005 Paul Davis 
-+    Copyright (C) 2005 Paul Davis
- 
-     This program is free software; you can redistribute it and/or modify
-     it under the terms of the GNU General Public License as published by
-@@ -45,6 +45,12 @@
- #include "i18n.h"
- #include "new_session_dialog.h"
- 
-+// Needed to launch an AAF importer
-+#include "import_helper_aaf.h"
-+#include "ardour_ui.h"
-+#include <wait.h>
-+#include <unistd.h>
-+
- NewSessionDialog::NewSessionDialog()
- 	: ArdourDialog ("session control")
- {
-@@ -84,17 +90,15 @@
- 	session_template_label = new Gtk::Label(_("Template :"));
- 	m_template = new Gtk::FileChooserButton();
- 	m_create_control_bus = new Gtk::CheckButton(_("Create Monitor Bus"));
--	
- 	Gtk::Adjustment *m_control_bus_channel_count_adj = Gtk::manage(new Gtk::Adjustment(2, 0, 100, 1, 10));
- 	m_control_bus_channel_count = new Gtk::SpinButton(*m_control_bus_channel_count_adj, 1, 0);
--	
-+
- 	Gtk::Adjustment *m_master_bus_channel_count_adj = Gtk::manage(new Gtk::Adjustment(2, 0, 100, 1, 10));
- 	m_master_bus_channel_count = new Gtk::SpinButton(*m_master_bus_channel_count_adj, 1, 0);
- 	m_create_master_bus = new Gtk::CheckButton(_("Create Master Bus"));
- 	advanced_table = new Gtk::Table(2, 2, true);
- 	m_connect_inputs = new Gtk::CheckButton(_("Automatically Connect to Physical Inputs"));
- 	m_limit_input_ports = new Gtk::CheckButton(_("Use only"));
--	
- 	Gtk::Adjustment *m_input_limit_count_adj = Gtk::manage(new Gtk::Adjustment(1, 0, 100, 1, 10));
- 	m_input_limit_count = new Gtk::SpinButton(*m_input_limit_count_adj, 1, 0);
- 	input_port_limit_hbox = new Gtk::HBox(false, 0);
-@@ -103,16 +107,15 @@
- 
- 	bus_frame = new Gtk::Frame();
- 	bus_table = new Gtk::Table (2, 3, false);
--	
-+
- 	input_frame = new Gtk::Frame();
- 	m_connect_outputs = new Gtk::CheckButton(_("Automatically Connect Outputs"));
- 	m_limit_output_ports = new Gtk::CheckButton(_("Use only"));
--	
- 	Gtk::Adjustment *m_output_limit_count_adj = Gtk::manage(new Gtk::Adjustment(1, 0, 100, 1, 10));
- 	m_output_limit_count = new Gtk::SpinButton(*m_output_limit_count_adj, 1, 0);
- 	output_port_limit_hbox = new Gtk::HBox(false, 0);
- 	output_port_vbox = new Gtk::VBox(false, 0);
--	
-+
- 	Gtk::RadioButton::Group _RadioBGroup_m_connect_outputs_to_master;
- 	m_connect_outputs_to_master = new Gtk::RadioButton(_RadioBGroup_m_connect_outputs_to_master, _("... to Master Bus"));
- 	m_connect_outputs_to_physical = new Gtk::RadioButton(_RadioBGroup_m_connect_outputs_to_master, _("... to Physical Outputs"));
-@@ -175,7 +178,7 @@
- 	m_create_master_bus->set_border_width(0);
- 	advanced_table->set_row_spacings(0);
- 	advanced_table->set_col_spacings(0);
--	
-+
- 	m_connect_inputs->set_flags(Gtk::CAN_FOCUS);
- 	m_connect_inputs->set_relief(Gtk::RELIEF_NORMAL);
- 	m_connect_inputs->set_mode(true);
-@@ -206,7 +209,7 @@
- 	bus_frame->set_label_align(0,0.5);
- 	bus_frame->add(*bus_hbox);
- 	bus_frame->set_label_widget(*bus_label);
--	
-+
- 	bus_table->set_row_spacings (0);
- 	bus_table->set_col_spacings (0);
- 	bus_table->attach (*m_create_master_bus, 0, 1, 0, 1, Gtk::EXPAND|Gtk::FILL, Gtk::EXPAND|Gtk::FILL, 0, 0);
-@@ -348,12 +351,12 @@
- 
- 	m_notebook->set_flags(Gtk::CAN_FOCUS);
- 	m_notebook->set_scrollable(true);
--	
-+
- 	get_vbox()->set_homogeneous(false);
- 	get_vbox()->set_spacing(0);
- 	get_vbox()->pack_start(*m_notebook, Gtk::PACK_SHRINK, 0);
- 
--	/* 
-+	/*
- 	   icon setting is done again in the editor (for the whole app),
- 	   but its all chickens and eggs at this point.
- 	*/
-@@ -393,7 +396,7 @@
- 	m_treeview->get_selection()->set_mode (Gtk::SELECTION_SINGLE);
- 
- 	std::string path = ARDOUR::get_user_ardour_path();
--	
-+
- 	if (path.empty()) {
- 	        path = ARDOUR::get_system_data_path();
- 	}
-@@ -426,6 +429,21 @@
- 
- 	m_template->set_title(_("select template"));
- 	Gtk::FileFilter* session_filter = manage (new (Gtk::FileFilter));
-+
-+	// Find out if we have an AAF importer present
-+	FILE *xchange = NULL;
-+    AafImportHelper importer;
-+    string spath_to_importer = importer.get_user_path("AAF Importer", true);
-+    if (spath_to_importer.length())
-+         xchange = fopen(spath_to_importer.c_str(), "r");
-+    if ((xchange != NULL) || ((importer.is_native()) && (importer.is_initialized()))) {
-+        printf("AAF Converter found; adding support for import of AAF files.\n");
-+        session_filter->add_pattern(X_("*.aaf"));
-+
-+        if (xchange)
-+            fclose(xchange);
-+	}
-+
- 	session_filter->add_pattern(X_("*.ardour"));
- 	session_filter->add_pattern(X_("*.ardour.bak"));
- 	m_open_filechooser->set_filter (*session_filter);
-@@ -477,7 +495,7 @@
- 	m_treeview->signal_row_activated().connect (mem_fun (*this, &NewSessionDialog::recent_row_activated));
- 	m_open_filechooser->signal_selection_changed ().connect (mem_fun (*this, &NewSessionDialog::file_chosen));
- 	m_template->signal_selection_changed ().connect (mem_fun (*this, &NewSessionDialog::template_chosen));
--	
-+
- 	page_set = Pages (0);
- }
- 
-@@ -584,7 +602,7 @@
- 	    4) canonicalize_file_name() & realpath() have entirely
-                    different semantics on OS X and Linux when given
- 		   a non-existent path.
--		   
-+
- 	   as result of all this, we take two distinct pathways through the code.
- 	*/
- 
-@@ -601,7 +619,6 @@
- 		engine_page_session_folder = realdir;
- 	}
- 
--	
- #else 
- 	char* res;
- 	if (!Glib::file_test (dir, Glib::FILE_TEST_IS_DIR)) {
-@@ -613,7 +630,7 @@
- 		engine_page_session_folder = res;
- 		free (res);
- 	}
--	
-+
- #endif
- 
- }
-@@ -621,7 +638,7 @@
- std::string
- NewSessionDialog::session_name() const
- {
--        std::string str = Glib::filename_from_utf8 (m_open_filechooser->get_filename());
-+    std::string str = m_filename;
- 	std::string::size_type position = str.find_last_of (G_DIR_SEPARATOR);
- 	str = str.substr (position+1);
- 	position = str.find_last_of ('.');
-@@ -633,7 +650,7 @@
- 
- 	if ((position = str.rfind(".bak")) != string::npos) {
- 	        str = str.substr (0, position);
--	}	  
-+	}
- 	*/
- 
- 	switch (which_page()) {
-@@ -652,7 +669,7 @@
- 
- 	default:
- 		break;
--	} 
-+	}
- 
- 	if (m_treeview->get_selection()->count_selected_rows() == 0) {
- 		return Glib::filename_from_utf8(str);
-@@ -668,7 +685,6 @@
- 	switch (which_page()) {
- 	case NewPage:
- 	        return Glib::filename_from_utf8(m_folder->get_filename());
--		
- 	case EnginePage:
- 		if (!(page_set & (OpenPage|NewPage))) {
- 			return Glib::filename_from_utf8(engine_page_session_folder);
-@@ -683,9 +699,9 @@
- 	default:
- 		break;
- 	}
--	       
-+
- 	if (m_treeview->get_selection()->count_selected_rows() == 0) {
--		const string filename(Glib::filename_from_utf8(m_open_filechooser->get_filename()));
-+		const string filename( m_filename );
- 		return Glib::path_get_dirname(filename);
- 	}
- 
-@@ -876,7 +892,7 @@
- {
- 	m_name->set_text("");
- 	set_response_sensitive (Gtk::RESPONSE_OK, false);
--	
-+
- }
- 
- void
-@@ -972,11 +988,57 @@
- 	}
- 
- 	if (!m_open_filechooser->get_filename().empty()) {
--	        set_response_sensitive (Gtk::RESPONSE_OK, true);
--		response (Gtk::RESPONSE_OK);
-+        std::string selection = m_open_filechooser->get_filename();
-+        if (( selection.find(".aaf") != string::npos ) ||
-+            ( selection.find(".AAF") != string::npos )) {
-+
-+            // Let's establish that ".aaf" is actually at the end of the string
-+            size_t slen = selection.length();
-+            size_t spos = selection.rfind(".aaf");
-+
-+            if (spos == string::npos)
-+                spos = selection.rfind(".AAF");
-+
-+            if (spos == (slen-4)) {
-+                std::string spath_to_imported_session;
-+
-+                AafImportHelper importer;
-+
-+                ax_error axResult = importer.import_from_aaf (selection, spath_to_imported_session);
-+
-+                // Determine the error (if any) but assume that all errors
-+                // (apart from NOT_IMPLEMENTED) have already been reported.
-+                switch (axResult) {
-+                    case AXERR_NONE:            // These are the only two cases where
-+                    case AXERR_MEDIA_NOT_FOUND: // it's possible to load the session.
-+                             m_open_filechooser->set_filename( Glib::filename_to_utf8(spath_to_imported_session) );
-+                             m_filename = spath_to_imported_session;
-+                        break;
-+                    case AXERR_NOT_IMPLEMENTED:
-+                             error << _("The requested feature is not yet available") << endmsg;
-+                    default:
-+                             m_open_filechooser->set_filename("");
-+                             set_response_sensitive (Gtk::RESPONSE_OK, false);
-+                             if (win) {                                              // Can't return to the previous cursor
-+                                 win->set_cursor(/*Gdk::Cursor(Gdk::LAST_CURSOR)*/); // under 'X' so just select the desktop
-+                             }                                                       // cursor (will usually be the same thing)
-+
-+                        return;
-+                    }
-+                 } else
-+                     m_filename = selection;
-+        } else
-+            m_filename = selection;
-+
-+        set_response_sensitive (Gtk::RESPONSE_OK, true);
-+        response (Gtk::RESPONSE_OK);
- 	} else {
- 	        set_response_sensitive (Gtk::RESPONSE_OK, false);
- 	}
-+
-+    if (win) {                                              // Can't return to the previous cursor
-+        win->set_cursor(/*Gdk::Cursor(Gdk::LAST_CURSOR)*/); // under 'X' so just select the desktop
-+    }                                                       // cursor (will usually be the same thing)
- }
- 
- void
-Index: ardour/gtk2_ardour/new_session_dialog.h
-===================================================================
---- ardour.orig/gtk2_ardour/new_session_dialog.h	2013-10-01 22:33:01.132131930 +0200
-+++ ardour/gtk2_ardour/new_session_dialog.h	2013-10-01 22:33:01.128131930 +0200
-@@ -1,5 +1,5 @@
- /*
--    Copyright (C) 2005 Paul Davis 
-+    Copyright (C) 2005 Paul Davis
- 
-     This program is free software; you can redistribute it and/or modify
-     it under the terms of the GNU General Public License as published by
-@@ -53,7 +53,7 @@
- class NewSessionDialog : public ArdourDialog
- {
- public:
--		
-+
- 	enum Pages {
- 		NewPage = 0x1,
- 		OpenPage = 0x2,
-@@ -70,7 +70,7 @@
- 
- 	std::string session_name() const;
- 	std::string session_folder() const;
--	
-+
- 	bool use_session_template() const;
- 	std::string session_template_name() const;
- 
-@@ -109,7 +109,7 @@
- 
- 	void reset_name();
- 	void reset_template();
--	
-+
- 	Gtk::Label * session_name_label;
- 	Gtk::Label * session_location_label;
- 	Gtk::Label * session_template_label;
-@@ -155,7 +155,7 @@
- 
- 	Gtk::CheckButton* m_create_master_bus;
- 	Gtk::SpinButton* m_master_bus_channel_count;
--       	
-+
- 	Gtk::CheckButton* m_create_control_bus;
- 	Gtk::SpinButton* m_control_bus_channel_count;
- 
-@@ -163,7 +163,7 @@
- 	Gtk::CheckButton* m_limit_input_ports;
- 	Gtk::SpinButton* m_input_limit_count;
- 
--	Gtk::CheckButton* m_connect_outputs;	
-+	Gtk::CheckButton* m_connect_outputs;
- 	Gtk::CheckButton* m_limit_output_ports;
- 	Gtk::SpinButton* m_output_limit_count;
- 
-@@ -180,7 +180,7 @@
- 	Pages page_set;
- 
- 	struct RecentSessionModelColumns : public Gtk::TreeModel::ColumnRecord {
--	    RecentSessionModelColumns() { 
-+	    RecentSessionModelColumns() {
- 		    add (visible_name);
- 		    add (fullpath);
- 	    }
-@@ -217,6 +217,7 @@
- 	bool have_engine;
- 	std::string engine_page_session_folder;
- 	std::string engine_page_session_name;
-+	string m_filename;
- 
- 	sigc::connection ic_connection;
- 	void engine_interface_chosen();
-Index: ardour/libs/ardour/ardour/ax_errors.h
-===================================================================
---- /dev/null	1970-01-01 00:00:00.000000000 +0000
-+++ ardour/libs/ardour/ardour/ax_errors.h	2013-10-01 22:33:01.128131930 +0200
-@@ -0,0 +1,40 @@
-+/* ax_errors.h
-+
-+   A 'humanised' list of possible errors that Ardour might encounter
-+   if it launches an external application. These error codes are
-+   limited to the range that can be returned by the Linux API 'waitpid()'.
-+   Technically, it can return an int - but only the 8 least significant
-+   bits are used to represent the exit status of the spawned application.
-+   Therefore, although any number of error codes may be defined, their
-+   values must lie in the range -128 to +127. Feel free to add to the
-+   error codes already listed but please do not modify (e.g. renumber)
-+   any errors that someone has already defined before you.
-+*/
-+
-+#ifndef AXERRORS_INCLUDED
-+#define AXERRORS_INCLUDED
-+
-+enum ax_error { AXERR_LOWER_LIMIT       = (-128),
-+                AXERR_UNKNOWN           = (-127),
-+                AXERR_NOT_IMPLEMENTED   = (-126),
-+                AXERR_INTERNAL_ERROR    = (-125),
-+                AXERR_INVALID_ACCESS    = (-11),
-+                AXERR_INVALID_SWITCH    = (-10),
-+                AXERR_INVALID_PARAM     = (-9),
-+                AXERR_SOURCE_ERROR      = (-8),
-+                AXERR_DESTINATION_ERROR = (-7),
-+                AXERR_PROCESS_FAILURE   = (-6),
-+                AXERR_PROCESS_NOT_FOUND = (-5),
-+                AXERR_WINE_FAILURE      = (-4),
-+                AXERR_WINE_NOT_FOUND    = (-3),
-+                AXERR_NOT_INITIALIZED   = (-2),
-+                AXERR_FATAL_EXCEPTION   = (-1),
-+                AXERR_NONE              = 0,
-+                AXERR_USER_ABORTED      = 1,
-+                AXERR_MEDIA_NOT_FOUND   = 2,
-+                AXERR_UPPER_LIMIT       = 127
-+              };
-+
-+#endif /* AXERRORS_INCLUDED */
-+
-+/* EOF */
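
The header comment above is the reason the ax_error values are squeezed into
the range -128..127: the importer's result travels back through waitpid(),
which only preserves the low 8 bits of the child's exit status, and
import_from_aaf() recovers the signed value by casting WEXITSTATUS() through a
char. A small stand-alone sketch of that round trip (illustrative only, not
part of the patch):

    // Demonstrates why ax_error values must fit in a signed 8-bit range.
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>
    #include <cstdio>

    int main ()
    {
        pid_t pid = fork ();

        if (0 == pid)
            _exit (-2);                  // child: pretend we returned AXERR_NOT_INITIALIZED;
                                         // only the low 8 bits (254) survive

        int status = 0;
        waitpid (pid, &status, 0);       // parent: collect the raw status word

        // WEXITSTATUS() yields just the low 8 bits (0..255). Casting through a
        // signed 8-bit type restores the intended negative value; the patch
        // itself uses a plain (char) cast, which assumes 'char' is signed.
        signed char code = (signed char) WEXITSTATUS (status);
        std::printf ("recovered exit code: %d\n", (int) code);   // prints -2

        return 0;
    }
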
diff --git a/debian/patches/log-stdout.patch b/debian/patches/log-stdout.patch
new file mode 100644
index 0000000..dadc429
--- /dev/null
+++ b/debian/patches/log-stdout.patch
@@ -0,0 +1,25 @@
+Description: Log waf messages to stdout, not stderr
+ The buildds kill the build if they don't see messages on stdout for a
+ long time. Unfortunately, waf defaults to printing such messages to
+ stderr, so on slower archs the build produces no stdout output for too
+ long and gets killed. Work around that by printing to stdout.
+Author: Felipe Sateler <fsateler at debian.org>
+Forwarded: no
+--- ardour3.orig/wscript
++++ ardour3/wscript
+@@ -1066,6 +1066,15 @@ const char* const ardour_config_info = "
+ def build(bld):
+     create_stored_revision()
+ 
++    # Log command executions to stdout, not err
++    def our_log(msg):
++        if not msg:
++            return
++        sys.stdout.write(str(msg))
++        sys.stdout.flush()
++
++    bld.to_log = our_log
++
+     # add directories that contain only headers, to workaround an issue with waf
+ 
+     if not bld.is_defined('USE_EXTERNAL_LIBS'):
diff --git a/debian/patches/series b/debian/patches/series
index 6fa4858..c8a0a54 100644
--- a/debian/patches/series
+++ b/debian/patches/series
@@ -1,8 +1,3 @@
-60-libdir.patch
-90_ardour-x-change.patch
-100_syslibs.patch
-111_libardourvampplugins.patch
-140_enable-ladish.patch
-160_kfreebsd.patch
-170_template-ftbfs.patch
-180_aubio.patch
+waf.patch
+wscript.patch
+log-stdout.patch
diff --git a/debian/patches/waf.patch b/debian/patches/waf.patch
new file mode 100644
index 0000000..fa27c55
--- /dev/null
+++ b/debian/patches/waf.patch
@@ -0,0 +1,12785 @@
+Description: Reapply waflib as unpacked source, and add waf-light
+ waf was shipped by upstream, but in a non-DFSG binary form, so it was
+ stripped from the Debian-redistributed source.
+ .
+ This patch was made roughly with the following commands:
+ .
+    ./waf ; \
+    mv .waf*/waflib waflib && \
+    rm waf ; \
+    find ./waflib -iname "*.pyc" -delete ; \
+    wget http://waf.googlecode.com/git-history/waf-1.6.11/waf-light ; \
+    ln -s waf-light waf && chmod +x waf-light
+Author: Adrian Knoth <adi at drcomp.erfurt.thur.de>
+Bug-Debian: http://bugs.debian.org/654477
+Last-Update: 2015-04-19
+
+--- /dev/null
++++ ardour3/waf-light
+@@ -0,0 +1,163 @@
++#!/usr/bin/env python
++# encoding: ISO8859-1
++# Thomas Nagy, 2005-2011
++
++"""
++Redistribution and use in source and binary forms, with or without
++modification, are permitted provided that the following conditions
++are met:
++
++1. Redistributions of source code must retain the above copyright
++   notice, this list of conditions and the following disclaimer.
++
++2. Redistributions in binary form must reproduce the above copyright
++   notice, this list of conditions and the following disclaimer in the
++   documentation and/or other materials provided with the distribution.
++
++3. The name of the author may not be used to endorse or promote products
++   derived from this software without specific prior written permission.
++
++THIS SOFTWARE IS PROVIDED BY THE AUTHOR "AS IS" AND ANY EXPRESS OR
++IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
++WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
++DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT,
++INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
++(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
++SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
++HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
++STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING
++IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
++POSSIBILITY OF SUCH DAMAGE.
++"""
++
++import os, sys
++
++VERSION="1.6.11"
++REVISION="x"
++INSTALL="x"
++C1='x'
++C2='x'
++cwd = os.getcwd()
++join = os.path.join
++
++if sys.hexversion<0x206000f:
++	raise ImportError('Python >= 2.6 is required to create the waf file')
++
++WAF='waf'
++def b(x):
++	return x
++if sys.hexversion>0x300000f:
++	WAF='waf3'
++	def b(x):
++		return x.encode()
++
++def err(m):
++	print(('\033[91mError: %s\033[0m' % m))
++	sys.exit(1)
++
++def unpack_wafdir(dir):
++	f = open(sys.argv[0],'rb')
++	c = 'corrupt archive (%d)'
++	while 1:
++		line = f.readline()
++		if not line: err('run waf-light from a folder containing waflib')
++		if line == b('#==>\n'):
++			txt = f.readline()
++			if not txt: err(c % 1)
++			if f.readline() != b('#<==\n'): err(c % 2)
++			break
++	if not txt: err(c % 3)
++	txt = txt[1:-1].replace(b(C1), b('\n')).replace(b(C2), b('\r'))
++
++	import shutil, tarfile
++	try: shutil.rmtree(dir)
++	except OSError: pass
++	try:
++		for x in ['Tools', 'extras']:
++			os.makedirs(join(dir, 'waflib', x))
++	except OSError:
++		err("Cannot unpack waf lib into %s\nMove waf into a writeable directory" % dir)
++
++	os.chdir(dir)
++	tmp = 't.bz2'
++	t = open(tmp,'wb')
++	t.write(txt)
++	t.close()
++
++	try:
++		t = tarfile.open(tmp)
++	except:
++		try:
++			os.system('bunzip2 t.bz2')
++			t = tarfile.open('t')
++			tmp = 't'
++		except:
++			os.chdir(cwd)
++			try: shutil.rmtree(dir)
++			except OSError: pass
++			err("Waf cannot be unpacked, check that bzip2 support is present")
++
++	for x in t: t.extract(x)
++	t.close()
++
++	for x in ['Tools', 'extras']:
++		os.chmod(join('waflib',x), 493)
++
++	if sys.hexversion<0x300000f:
++		sys.path = [join(dir, 'waflib')] + sys.path
++		import fixpy2
++		fixpy2.fixdir(dir)
++
++	os.unlink(tmp)
++	os.chdir(cwd)
++
++	try: dir = unicode(dir, 'mbcs')
++	except: pass
++	try:
++		from ctypes import windll
++		windll.kernel32.SetFileAttributesW(dir, 2)
++	except:
++		pass
++
++def test(dir):
++	try:
++		os.stat(join(dir, 'waflib'))
++		return os.path.abspath(dir)
++	except OSError:
++		pass
++
++def find_lib():
++	name = sys.argv[0]
++	base = os.path.dirname(os.path.abspath(name))
++
++	#devs use $WAFDIR
++	w=test(os.environ.get('WAFDIR', ''))
++	if w: return w
++
++	#waf-light
++	if name.endswith('waf-light'):
++		w = test(base)
++		if w: return w
++		err('waf-light requires waflib -> export WAFDIR=/folder')
++
++	dirname = '%s-%s-%s' % (WAF, VERSION, REVISION)
++	for i in [INSTALL,'/usr','/usr/local','/opt']:
++		w = test(i + '/lib/' + dirname)
++		if w: return w
++
++	#waf-local
++	dir = join(base, (sys.platform != 'win32' and '.' or '') + dirname)
++	w = test(dir)
++	if w: return w
++
++	#unpack
++	unpack_wafdir(dir)
++	return dir
++
++wafdir = find_lib()
++sys.path.insert(0, wafdir)
++
++if __name__ == '__main__':
++	from waflib import Scripting
++	Scripting.waf_entry_point(cwd, VERSION, wafdir)
++
+--- /dev/null
++++ ardour3/waflib/ansiterm.py
+@@ -0,0 +1,177 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys,os
++try:
++	if not(sys.stderr.isatty()and sys.stdout.isatty()):
++		raise ValueError('not a tty')
++	from ctypes import*
++	class COORD(Structure):
++		_fields_=[("X",c_short),("Y",c_short)]
++	class SMALL_RECT(Structure):
++		_fields_=[("Left",c_short),("Top",c_short),("Right",c_short),("Bottom",c_short)]
++	class CONSOLE_SCREEN_BUFFER_INFO(Structure):
++		_fields_=[("Size",COORD),("CursorPosition",COORD),("Attributes",c_short),("Window",SMALL_RECT),("MaximumWindowSize",COORD)]
++	class CONSOLE_CURSOR_INFO(Structure):
++		_fields_=[('dwSize',c_ulong),('bVisible',c_int)]
++	sbinfo=CONSOLE_SCREEN_BUFFER_INFO()
++	csinfo=CONSOLE_CURSOR_INFO()
++	hconsole=windll.kernel32.GetStdHandle(-11)
++	windll.kernel32.GetConsoleScreenBufferInfo(hconsole,byref(sbinfo))
++	if sbinfo.Size.X<9 or sbinfo.Size.Y<9:raise ValueError('small console')
++	windll.kernel32.GetConsoleCursorInfo(hconsole,byref(csinfo))
++except Exception:
++	pass
++else:
++	import re,threading
++	is_vista=getattr(sys,"getwindowsversion",None)and sys.getwindowsversion()[0]>=6
++	try:
++		_type=unicode
++	except:
++		_type=str
++	to_int=lambda number,default:number and int(number)or default
++	wlock=threading.Lock()
++	STD_OUTPUT_HANDLE=-11
++	STD_ERROR_HANDLE=-12
++	class AnsiTerm(object):
++		def __init__(self):
++			self.encoding=sys.stdout.encoding
++			self.hconsole=windll.kernel32.GetStdHandle(STD_OUTPUT_HANDLE)
++			self.cursor_history=[]
++			self.orig_sbinfo=CONSOLE_SCREEN_BUFFER_INFO()
++			self.orig_csinfo=CONSOLE_CURSOR_INFO()
++			windll.kernel32.GetConsoleScreenBufferInfo(self.hconsole,byref(self.orig_sbinfo))
++			windll.kernel32.GetConsoleCursorInfo(hconsole,byref(self.orig_csinfo))
++		def screen_buffer_info(self):
++			sbinfo=CONSOLE_SCREEN_BUFFER_INFO()
++			windll.kernel32.GetConsoleScreenBufferInfo(self.hconsole,byref(sbinfo))
++			return sbinfo
++		def clear_line(self,param):
++			mode=param and int(param)or 0
++			sbinfo=self.screen_buffer_info()
++			if mode==1:
++				line_start=COORD(0,sbinfo.CursorPosition.Y)
++				line_length=sbinfo.Size.X
++			elif mode==2:
++				line_start=COORD(sbinfo.CursorPosition.X,sbinfo.CursorPosition.Y)
++				line_length=sbinfo.Size.X-sbinfo.CursorPosition.X
++			else:
++				line_start=sbinfo.CursorPosition
++				line_length=sbinfo.Size.X-sbinfo.CursorPosition.X
++			chars_written=c_int()
++			windll.kernel32.FillConsoleOutputCharacterA(self.hconsole,c_wchar(' '),line_length,line_start,byref(chars_written))
++			windll.kernel32.FillConsoleOutputAttribute(self.hconsole,sbinfo.Attributes,line_length,line_start,byref(chars_written))
++		def clear_screen(self,param):
++			mode=to_int(param,0)
++			sbinfo=self.screen_buffer_info()
++			if mode==1:
++				clear_start=COORD(0,0)
++				clear_length=sbinfo.CursorPosition.X*sbinfo.CursorPosition.Y
++			elif mode==2:
++				clear_start=COORD(0,0)
++				clear_length=sbinfo.Size.X*sbinfo.Size.Y
++				windll.kernel32.SetConsoleCursorPosition(self.hconsole,clear_start)
++			else:
++				clear_start=sbinfo.CursorPosition
++				clear_length=((sbinfo.Size.X-sbinfo.CursorPosition.X)+sbinfo.Size.X*(sbinfo.Size.Y-sbinfo.CursorPosition.Y))
++			chars_written=c_int()
++			windll.kernel32.FillConsoleOutputCharacterA(self.hconsole,c_wchar(' '),clear_length,clear_start,byref(chars_written))
++			windll.kernel32.FillConsoleOutputAttribute(self.hconsole,sbinfo.Attributes,clear_length,clear_start,byref(chars_written))
++		def push_cursor(self,param):
++			sbinfo=self.screen_buffer_info()
++			self.cursor_history.append(sbinfo.CursorPosition)
++		def pop_cursor(self,param):
++			if self.cursor_history:
++				old_pos=self.cursor_history.pop()
++				windll.kernel32.SetConsoleCursorPosition(self.hconsole,old_pos)
++		def set_cursor(self,param):
++			y,sep,x=param.partition(';')
++			x=to_int(x,1)-1
++			y=to_int(y,1)-1
++			sbinfo=self.screen_buffer_info()
++			new_pos=COORD(min(max(0,x),sbinfo.Size.X),min(max(0,y),sbinfo.Size.Y))
++			windll.kernel32.SetConsoleCursorPosition(self.hconsole,new_pos)
++		def set_column(self,param):
++			x=to_int(param,1)-1
++			sbinfo=self.screen_buffer_info()
++			new_pos=COORD(min(max(0,x),sbinfo.Size.X),sbinfo.CursorPosition.Y)
++			windll.kernel32.SetConsoleCursorPosition(self.hconsole,new_pos)
++		def move_cursor(self,x_offset=0,y_offset=0):
++			sbinfo=self.screen_buffer_info()
++			new_pos=COORD(min(max(0,sbinfo.CursorPosition.X+x_offset),sbinfo.Size.X),min(max(0,sbinfo.CursorPosition.Y+y_offset),sbinfo.Size.Y))
++			windll.kernel32.SetConsoleCursorPosition(self.hconsole,new_pos)
++		def move_up(self,param):
++			self.move_cursor(y_offset=-to_int(param,1))
++		def move_down(self,param):
++			self.move_cursor(y_offset=to_int(param,1))
++		def move_left(self,param):
++			self.move_cursor(x_offset=-to_int(param,1))
++		def move_right(self,param):
++			self.move_cursor(x_offset=to_int(param,1))
++		def next_line(self,param):
++			sbinfo=self.screen_buffer_info()
++			self.move_cursor(x_offset=-sbinfo.CursorPosition.X,y_offset=to_int(param,1))
++		def prev_line(self,param):
++			sbinfo=self.screen_buffer_info()
++			self.move_cursor(x_offset=-sbinfo.CursorPosition.X,y_offset=-to_int(param,1))
++		def rgb2bgr(self,c):
++			return((c&1)<<2)|(c&2)|((c&4)>>2)
++		def set_color(self,param):
++			cols=param.split(';')
++			sbinfo=CONSOLE_SCREEN_BUFFER_INFO()
++			windll.kernel32.GetConsoleScreenBufferInfo(self.hconsole,byref(sbinfo))
++			attr=sbinfo.Attributes
++			for c in cols:
++				if is_vista:
++					c=int(c)
++				else:
++					c=to_int(c,0)
++				if c in range(30,38):
++					attr=(attr&0xfff0)|self.rgb2bgr(c-30)
++				elif c in range(40,48):
++					attr=(attr&0xff0f)|(self.rgb2bgr(c-40)<<4)
++				elif c==0:
++					attr=self.orig_sbinfo.Attributes
++				elif c==1:
++					attr|=0x08
++				elif c==4:
++					attr|=0x80
++				elif c==7:
++					attr=(attr&0xff88)|((attr&0x70)>>4)|((attr&0x07)<<4)
++			windll.kernel32.SetConsoleTextAttribute(self.hconsole,attr)
++		def show_cursor(self,param):
++			csinfo.bVisible=1
++			windll.kernel32.SetConsoleCursorInfo(self.hconsole,byref(csinfo))
++		def hide_cursor(self,param):
++			csinfo.bVisible=0
++			windll.kernel32.SetConsoleCursorInfo(self.hconsole,byref(csinfo))
++		ansi_command_table={'A':move_up,'B':move_down,'C':move_right,'D':move_left,'E':next_line,'F':prev_line,'G':set_column,'H':set_cursor,'f':set_cursor,'J':clear_screen,'K':clear_line,'h':show_cursor,'l':hide_cursor,'m':set_color,'s':push_cursor,'u':pop_cursor,}
++		ansi_tokens=re.compile('(?:\x1b\[([0-9?;]*)([a-zA-Z])|([^\x1b]+))')
++		def write(self,text):
++			try:
++				wlock.acquire()
++				for param,cmd,txt in self.ansi_tokens.findall(text):
++					if cmd:
++						cmd_func=self.ansi_command_table.get(cmd)
++						if cmd_func:
++							cmd_func(self,param)
++					else:
++						self.writeconsole(txt)
++			finally:
++				wlock.release()
++		def writeconsole(self,txt):
++			chars_written=c_int()
++			writeconsole=windll.kernel32.WriteConsoleA
++			if isinstance(txt,_type):
++				writeconsole=windll.kernel32.WriteConsoleW
++			TINY_STEP=3000
++			for x in range(0,len(txt),TINY_STEP):
++				tiny=txt[x:x+TINY_STEP]
++				writeconsole(self.hconsole,tiny,len(tiny),byref(chars_written),None)
++		def flush(self):
++			pass
++		def isatty(self):
++			return True
++	sys.stderr=sys.stdout=AnsiTerm()
++	os.environ['TERM']='vt100'
+--- /dev/null
++++ ardour3/waflib/Build.py
+@@ -0,0 +1,731 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,errno,re,shutil
++try:import cPickle
++except:import pickle as cPickle
++from waflib import Runner,TaskGen,Utils,ConfigSet,Task,Logs,Options,Context,Errors
++import waflib.Node
++CACHE_DIR='c4che'
++CACHE_SUFFIX='_cache.py'
++INSTALL=1337
++UNINSTALL=-1337
++SAVED_ATTRS='root node_deps raw_deps task_sigs'.split()
++CFG_FILES='cfg_files'
++POST_AT_ONCE=0
++POST_LAZY=1
++POST_BOTH=2
++class BuildContext(Context.Context):
++	'''executes the build'''
++	cmd='build'
++	variant=''
++	def __init__(self,**kw):
++		super(BuildContext,self).__init__(**kw)
++		self.is_install=0
++		self.top_dir=kw.get('top_dir',Context.top_dir)
++		self.run_dir=kw.get('run_dir',Context.run_dir)
++		self.post_mode=POST_AT_ONCE
++		self.out_dir=kw.get('out_dir',Context.out_dir)
++		self.cache_dir=kw.get('cache_dir',None)
++		if not self.cache_dir:
++			self.cache_dir=self.out_dir+os.sep+CACHE_DIR
++		self.all_envs={}
++		self.task_sigs={}
++		self.node_deps={}
++		self.raw_deps={}
++		self.cache_dir_contents={}
++		self.task_gen_cache_names={}
++		self.launch_dir=Context.launch_dir
++		self.jobs=Options.options.jobs
++		self.targets=Options.options.targets
++		self.keep=Options.options.keep
++		self.cache_global=Options.cache_global
++		self.nocache=Options.options.nocache
++		self.progress_bar=Options.options.progress_bar
++		self.deps_man=Utils.defaultdict(list)
++		self.current_group=0
++		self.groups=[]
++		self.group_names={}
++	def get_variant_dir(self):
++		if not self.variant:
++			return self.out_dir
++		return os.path.join(self.out_dir,self.variant)
++	variant_dir=property(get_variant_dir,None)
++	def __call__(self,*k,**kw):
++		kw['bld']=self
++		ret=TaskGen.task_gen(*k,**kw)
++		self.task_gen_cache_names={}
++		self.add_to_group(ret,group=kw.get('group',None))
++		return ret
++	def __copy__(self):
++		raise Errors.WafError('build contexts are not supposed to be copied')
++	def install_files(self,*k,**kw):
++		pass
++	def install_as(self,*k,**kw):
++		pass
++	def symlink_as(self,*k,**kw):
++		pass
++	def load_envs(self):
++		node=self.root.find_node(self.cache_dir)
++		if not node:
++			raise Errors.WafError('The project was not configured: run "waf configure" first!')
++		lst=node.ant_glob('**/*%s'%CACHE_SUFFIX,quiet=True)
++		if not lst:
++			raise Errors.WafError('The cache directory is empty: reconfigure the project')
++		for x in lst:
++			name=x.path_from(node).replace(CACHE_SUFFIX,'').replace('\\','/')
++			env=ConfigSet.ConfigSet(x.abspath())
++			self.all_envs[name]=env
++			for f in env[CFG_FILES]:
++				newnode=self.root.find_resource(f)
++				try:
++					h=Utils.h_file(newnode.abspath())
++				except(IOError,AttributeError):
++					Logs.error('cannot find %r'%f)
++					h=Utils.SIG_NIL
++				newnode.sig=h
++	def init_dirs(self):
++		if not(os.path.isabs(self.top_dir)and os.path.isabs(self.out_dir)):
++			raise Errors.WafError('The project was not configured: run "waf configure" first!')
++		self.path=self.srcnode=self.root.find_dir(self.top_dir)
++		self.bldnode=self.root.make_node(self.variant_dir)
++		self.bldnode.mkdir()
++	def execute(self):
++		self.restore()
++		if not self.all_envs:
++			self.load_envs()
++		self.execute_build()
++	def execute_build(self):
++		Logs.info("Waf: Entering directory `%s'"%self.variant_dir)
++		self.recurse([self.run_dir])
++		self.pre_build()
++		self.timer=Utils.Timer()
++		if self.progress_bar:
++			sys.stderr.write(Logs.colors.cursor_off)
++		try:
++			self.compile()
++		finally:
++			if self.progress_bar==1:
++				c=len(self.returned_tasks)or 1
++				self.to_log(self.progress_line(c,c,Logs.colors.BLUE,Logs.colors.NORMAL))
++				print('')
++				sys.stdout.flush()
++				sys.stderr.write(Logs.colors.cursor_on)
++			Logs.info("Waf: Leaving directory `%s'"%self.variant_dir)
++		self.post_build()
++	def restore(self):
++		try:
++			env=ConfigSet.ConfigSet(os.path.join(self.cache_dir,'build.config.py'))
++		except(IOError,OSError):
++			pass
++		else:
++			if env['version']<Context.HEXVERSION:
++				raise Errors.WafError('Version mismatch! reconfigure the project')
++			for t in env['tools']:
++				self.setup(**t)
++		f=None
++		try:
++			dbfn=os.path.join(self.variant_dir,Context.DBFILE)
++			try:
++				f=open(dbfn,'rb')
++			except(IOError,EOFError):
++				Logs.debug('build: could not load the build cache %s (missing)'%dbfn)
++			else:
++				try:
++					waflib.Node.pickle_lock.acquire()
++					waflib.Node.Nod3=self.node_class
++					try:
++						data=cPickle.load(f)
++					except Exception ,e:
++						Logs.debug('build: could not pickle the build cache %s: %r'%(dbfn,e))
++					else:
++						for x in SAVED_ATTRS:
++							setattr(self,x,data[x])
++				finally:
++					waflib.Node.pickle_lock.release()
++		finally:
++			if f:
++				f.close()
++		self.init_dirs()
++	def store(self):
++		data={}
++		for x in SAVED_ATTRS:
++			data[x]=getattr(self,x)
++		db=os.path.join(self.variant_dir,Context.DBFILE)
++		try:
++			waflib.Node.pickle_lock.acquire()
++			waflib.Node.Nod3=self.node_class
++			f=None
++			try:
++				f=open(db+'.tmp','wb')
++				cPickle.dump(data,f)
++			finally:
++				if f:
++					f.close()
++		finally:
++			waflib.Node.pickle_lock.release()
++		try:
++			st=os.stat(db)
++			os.unlink(db)
++			if not Utils.is_win32:
++				os.chown(db+'.tmp',st.st_uid,st.st_gid)
++		except(AttributeError,OSError):
++			pass
++		os.rename(db+'.tmp',db)
++	def compile(self):
++		Logs.debug('build: compile()')
++		self.producer=Runner.Parallel(self,self.jobs)
++		self.producer.biter=self.get_build_iterator()
++		self.returned_tasks=[]
++		try:
++			self.producer.start()
++		except KeyboardInterrupt:
++			self.store()
++			raise
++		else:
++			if self.producer.dirty:
++				self.store()
++		if self.producer.error:
++			raise Errors.BuildError(self.producer.error)
++	def setup(self,tool,tooldir=None,funs=None):
++		if isinstance(tool,list):
++			for i in tool:self.setup(i,tooldir)
++			return
++		module=Context.load_tool(tool,tooldir)
++		if hasattr(module,"setup"):module.setup(self)
++	def get_env(self):
++		try:
++			return self.all_envs[self.variant]
++		except KeyError:
++			return self.all_envs['']
++	def set_env(self,val):
++		self.all_envs[self.variant]=val
++	env=property(get_env,set_env)
++	def add_manual_dependency(self,path,value):
++		if isinstance(path,waflib.Node.Node):
++			node=path
++		elif os.path.isabs(path):
++			node=self.root.find_resource(path)
++		else:
++			node=self.path.find_resource(path)
++		self.deps_man[id(node)].append(value)
++	def launch_node(self):
++		try:
++			return self.p_ln
++		except AttributeError:
++			self.p_ln=self.root.find_dir(self.launch_dir)
++			return self.p_ln
++	def hash_env_vars(self,env,vars_lst):
++		if not env.table:
++			env=env.parent
++			if not env:
++				return Utils.SIG_NIL
++		idx=str(id(env))+str(vars_lst)
++		try:
++			cache=self.cache_env
++		except AttributeError:
++			cache=self.cache_env={}
++		else:
++			try:
++				return self.cache_env[idx]
++			except KeyError:
++				pass
++		lst=[env[a]for a in vars_lst]
++		ret=Utils.h_list(lst)
++		Logs.debug('envhash: %s %r',Utils.to_hex(ret),lst)
++		cache[idx]=ret
++		return ret
++	def get_tgen_by_name(self,name):
++		cache=self.task_gen_cache_names
++		if not cache:
++			for g in self.groups:
++				for tg in g:
++					try:
++						cache[tg.name]=tg
++					except AttributeError:
++						pass
++		try:
++			return cache[name]
++		except KeyError:
++			raise Errors.WafError('Could not find a task generator for the name %r'%name)
++	def progress_line(self,state,total,col1,col2):
++		n=len(str(total))
++		Utils.rot_idx+=1
++		ind=Utils.rot_chr[Utils.rot_idx%4]
++		pc=(100.*state)/total
++		eta=str(self.timer)
++		fs="[%%%dd/%%%dd][%%s%%2d%%%%%%s][%s]["%(n,n,ind)
++		left=fs%(state,total,col1,pc,col2)
++		right='][%s%s%s]'%(col1,eta,col2)
++		cols=Logs.get_term_cols()-len(left)-len(right)+2*len(col1)+2*len(col2)
++		if cols<7:cols=7
++		ratio=((cols*state)//total)-1
++		bar=('='*ratio+'>').ljust(cols)
++		msg=Utils.indicator%(left,bar,right)
++		return msg
++	def declare_chain(self,*k,**kw):
++		return TaskGen.declare_chain(*k,**kw)
++	def pre_build(self):
++		for m in getattr(self,'pre_funs',[]):
++			m(self)
++	def post_build(self):
++		for m in getattr(self,'post_funs',[]):
++			m(self)
++	def add_pre_fun(self,meth):
++		try:
++			self.pre_funs.append(meth)
++		except AttributeError:
++			self.pre_funs=[meth]
++	def add_post_fun(self,meth):
++		try:
++			self.post_funs.append(meth)
++		except AttributeError:
++			self.post_funs=[meth]
++	def get_group(self,x):
++		if not self.groups:
++			self.add_group()
++		if x is None:
++			return self.groups[self.current_group]
++		if x in self.group_names:
++			return self.group_names[x]
++		return self.groups[x]
++	def add_to_group(self,tgen,group=None):
++		assert(isinstance(tgen,TaskGen.task_gen)or isinstance(tgen,Task.TaskBase))
++		tgen.bld=self
++		self.get_group(group).append(tgen)
++	def get_group_name(self,g):
++		if not isinstance(g,list):
++			g=self.groups[g]
++		for x in self.group_names:
++			if id(self.group_names[x])==id(g):
++				return x
++		return''
++	def get_group_idx(self,tg):
++		se=id(tg)
++		for i in range(len(self.groups)):
++			for t in self.groups[i]:
++				if id(t)==se:
++					return i
++		return None
++	def add_group(self,name=None,move=True):
++		if name and name in self.group_names:
++			Logs.error('add_group: name %s already present'%name)
++		g=[]
++		self.group_names[name]=g
++		self.groups.append(g)
++		if move:
++			self.current_group=len(self.groups)-1
++	def set_group(self,idx):
++		if isinstance(idx,str):
++			g=self.group_names[idx]
++			for i in range(len(self.groups)):
++				if id(g)==id(self.groups[i]):
++					self.current_group=i
++		else:
++			self.current_group=idx
++	def total(self):
++		total=0
++		for group in self.groups:
++			for tg in group:
++				try:
++					total+=len(tg.tasks)
++				except AttributeError:
++					total+=1
++		return total
++	def get_targets(self):
++		to_post=[]
++		min_grp=0
++		for name in self.targets.split(','):
++			tg=self.get_tgen_by_name(name)
++			if not tg:
++				raise Errors.WafError('target %r does not exist'%name)
++			m=self.get_group_idx(tg)
++			if m>min_grp:
++				min_grp=m
++				to_post=[tg]
++			elif m==min_grp:
++				to_post.append(tg)
++		return(min_grp,to_post)
++	def post_group(self):
++		if self.targets=='*':
++			for tg in self.groups[self.cur]:
++				try:
++					f=tg.post
++				except AttributeError:
++					pass
++				else:
++					f()
++		elif self.targets:
++			if self.cur<self._min_grp:
++				for tg in self.groups[self.cur]:
++					try:
++						f=tg.post
++					except AttributeError:
++						pass
++					else:
++						f()
++			else:
++				for tg in self._exact_tg:
++					tg.post()
++		else:
++			ln=self.launch_node()
++			for tg in self.groups[self.cur]:
++				try:
++					f=tg.post
++				except AttributeError:
++					pass
++				else:
++					if tg.path.is_child_of(ln):
++						f()
++	def get_tasks_group(self,idx):
++		tasks=[]
++		for tg in self.groups[idx]:
++			if isinstance(tg,Task.TaskBase):
++				tasks.append(tg)
++			else:
++				tasks.extend(tg.tasks)
++		return tasks
++	def get_build_iterator(self):
++		self.cur=0
++		if self.targets and self.targets!='*':
++			(self._min_grp,self._exact_tg)=self.get_targets()
++		global lazy_post
++		if self.post_mode!=POST_LAZY:
++			while self.cur<len(self.groups):
++				self.post_group()
++				self.cur+=1
++			self.cur=0
++		while self.cur<len(self.groups):
++			if self.post_mode!=POST_AT_ONCE:
++				self.post_group()
++			tasks=self.get_tasks_group(self.cur)
++			Task.set_file_constraints(tasks)
++			Task.set_precedence_constraints(tasks)
++			self.cur_tasks=tasks
++			self.cur+=1
++			if not tasks:
++				continue
++			yield tasks
++		while 1:
++			yield[]
++class inst(Task.Task):
++	color='CYAN'
++	def post(self):
++		buf=[]
++		for x in self.source:
++			if isinstance(x,waflib.Node.Node):
++				y=x
++			else:
++				y=self.path.find_resource(x)
++				if not y:
++					if Logs.verbose:
++						Logs.warn('Could not find %s immediately (may cause broken builds)'%x)
++					idx=self.generator.bld.get_group_idx(self)
++					for tg in self.generator.bld.groups[idx]:
++						if not isinstance(tg,inst)and id(tg)!=id(self):
++							tg.post()
++						y=self.path.find_resource(x)
++						if y:
++							break
++					else:
++						raise Errors.WafError('could not find %r in %r'%(x,self.path))
++			buf.append(y)
++		self.inputs=buf
++	def runnable_status(self):
++		ret=super(inst,self).runnable_status()
++		if ret==Task.SKIP_ME:
++			return Task.RUN_ME
++		return ret
++	def __str__(self):
++		return''
++	def run(self):
++		return self.generator.exec_task()
++	def get_install_path(self,destdir=True):
++		dest=Utils.subst_vars(self.dest,self.env)
++		dest=dest.replace('/',os.sep)
++		if destdir and Options.options.destdir:
++			dest=os.path.join(Options.options.destdir,os.path.splitdrive(dest)[1].lstrip(os.sep))
++		return dest
++	def exec_install_files(self):
++		destpath=self.get_install_path()
++		if not destpath:
++			raise Errors.WafError('unknown installation path %r'%self.generator)
++		for x,y in zip(self.source,self.inputs):
++			if self.relative_trick:
++				destfile=os.path.join(destpath,y.path_from(self.path))
++				Utils.check_dir(os.path.dirname(destfile))
++			else:
++				destfile=os.path.join(destpath,y.name)
++			self.generator.bld.do_install(y.abspath(),destfile,self.chmod)
++	def exec_install_as(self):
++		destfile=self.get_install_path()
++		self.generator.bld.do_install(self.inputs[0].abspath(),destfile,self.chmod)
++	def exec_symlink_as(self):
++		destfile=self.get_install_path()
++		self.generator.bld.do_link(self.link,destfile)
++class InstallContext(BuildContext):
++	'''installs the targets on the system'''
++	cmd='install'
++	def __init__(self,**kw):
++		super(InstallContext,self).__init__(**kw)
++		self.uninstall=[]
++		self.is_install=INSTALL
++	def do_install(self,src,tgt,chmod=Utils.O644):
++		d,_=os.path.split(tgt)
++		if not d:
++			raise Errors.WafError('Invalid installation given %r->%r'%(src,tgt))
++		Utils.check_dir(d)
++		srclbl=src.replace(self.srcnode.abspath()+os.sep,'')
++		if not Options.options.force:
++			try:
++				st1=os.stat(tgt)
++				st2=os.stat(src)
++			except OSError:
++				pass
++			else:
++				if st1.st_mtime+2>=st2.st_mtime and st1.st_size==st2.st_size:
++					if not self.progress_bar:
++						Logs.info('- install %s (from %s)'%(tgt,srclbl))
++					return False
++		if not self.progress_bar:
++			Logs.info('+ install %s (from %s)'%(tgt,srclbl))
++		try:
++			os.remove(tgt)
++		except OSError:
++			pass
++		try:
++			shutil.copy2(src,tgt)
++			os.chmod(tgt,chmod)
++		except IOError:
++			try:
++				os.stat(src)
++			except(OSError,IOError):
++				Logs.error('File %r does not exist'%src)
++			raise Errors.WafError('Could not install the file %r'%tgt)
++	def do_link(self,src,tgt):
++		d,_=os.path.split(tgt)
++		Utils.check_dir(d)
++		link=False
++		if not os.path.islink(tgt):
++			link=True
++		elif os.readlink(tgt)!=src:
++			link=True
++		if link:
++			try:os.remove(tgt)
++			except OSError:pass
++			if not self.progress_bar:
++				Logs.info('+ symlink %s (to %s)'%(tgt,src))
++			os.symlink(src,tgt)
++		else:
++			if not self.progress_bar:
++				Logs.info('- symlink %s (to %s)'%(tgt,src))
++	def run_task_now(self,tsk,postpone):
++		tsk.post()
++		if not postpone:
++			if tsk.runnable_status()==Task.ASK_LATER:
++				raise self.WafError('cannot post the task %r'%tsk)
++			tsk.run()
++	def install_files(self,dest,files,env=None,chmod=Utils.O644,relative_trick=False,cwd=None,add=True,postpone=True):
++		tsk=inst(env=env or self.env)
++		tsk.bld=self
++		tsk.path=cwd or self.path
++		tsk.chmod=chmod
++		if isinstance(files,waflib.Node.Node):
++			tsk.source=[files]
++		else:
++			tsk.source=Utils.to_list(files)
++		tsk.dest=dest
++		tsk.exec_task=tsk.exec_install_files
++		tsk.relative_trick=relative_trick
++		if add:self.add_to_group(tsk)
++		self.run_task_now(tsk,postpone)
++		return tsk
++	def install_as(self,dest,srcfile,env=None,chmod=Utils.O644,cwd=None,add=True,postpone=True):
++		tsk=inst(env=env or self.env)
++		tsk.bld=self
++		tsk.path=cwd or self.path
++		tsk.chmod=chmod
++		tsk.source=[srcfile]
++		tsk.dest=dest
++		tsk.exec_task=tsk.exec_install_as
++		if add:self.add_to_group(tsk)
++		self.run_task_now(tsk,postpone)
++		return tsk
++	def symlink_as(self,dest,src,env=None,cwd=None,add=True,postpone=True):
++		if Utils.is_win32:
++			return
++		tsk=inst(env=env or self.env)
++		tsk.bld=self
++		tsk.dest=dest
++		tsk.path=cwd or self.path
++		tsk.source=[]
++		tsk.link=src
++		tsk.exec_task=tsk.exec_symlink_as
++		if add:self.add_to_group(tsk)
++		self.run_task_now(tsk,postpone)
++		return tsk
++class UninstallContext(InstallContext):
++	'''removes the targets installed'''
++	cmd='uninstall'
++	def __init__(self,**kw):
++		super(UninstallContext,self).__init__(**kw)
++		self.is_install=UNINSTALL
++	def do_install(self,src,tgt,chmod=Utils.O644):
++		if not self.progress_bar:
++			Logs.info('- remove %s'%tgt)
++		self.uninstall.append(tgt)
++		try:
++			os.remove(tgt)
++		except OSError ,e:
++			if e.errno!=errno.ENOENT:
++				if not getattr(self,'uninstall_error',None):
++					self.uninstall_error=True
++					Logs.warn('build: some files could not be uninstalled (retry with -vv to list them)')
++				if Logs.verbose>1:
++					Logs.warn('could not remove %s (error code %r)'%(e.filename,e.errno))
++		while tgt:
++			tgt=os.path.dirname(tgt)
++			try:
++				os.rmdir(tgt)
++			except OSError:
++				break
++	def do_link(self,src,tgt):
++		try:
++			if not self.progress_bar:
++				Logs.info('- unlink %s'%tgt)
++			os.remove(tgt)
++		except OSError:
++			pass
++		while tgt:
++			tgt=os.path.dirname(tgt)
++			try:
++				os.rmdir(tgt)
++			except OSError:
++				break
++	def execute(self):
++		try:
++			def runnable_status(self):
++				return Task.SKIP_ME
++			setattr(Task.Task,'runnable_status_back',Task.Task.runnable_status)
++			setattr(Task.Task,'runnable_status',runnable_status)
++			super(UninstallContext,self).execute()
++		finally:
++			setattr(Task.Task,'runnable_status',Task.Task.runnable_status_back)
++class CleanContext(BuildContext):
++	'''cleans the project'''
++	cmd='clean'
++	def execute(self):
++		self.restore()
++		if not self.all_envs:
++			self.load_envs()
++		self.recurse([self.run_dir])
++		try:
++			self.clean()
++		finally:
++			self.store()
++	def clean(self):
++		Logs.debug('build: clean called')
++		if self.bldnode!=self.srcnode:
++			lst=[self.root.find_or_declare(f)for f in self.env[CFG_FILES]]
++			for n in self.bldnode.ant_glob('**/*',excl='lock* *conf_check_*/** config.log c4che/*',quiet=True):
++				if n in lst:
++					continue
++				n.delete()
++		self.root.children={}
++		for v in'node_deps task_sigs raw_deps'.split():
++			setattr(self,v,{})
++class ListContext(BuildContext):
++	'''lists the targets to execute'''
++	cmd='list'
++	def execute(self):
++		self.restore()
++		if not self.all_envs:
++			self.load_envs()
++		self.recurse([self.run_dir])
++		self.pre_build()
++		self.timer=Utils.Timer()
++		for g in self.groups:
++			for tg in g:
++				try:
++					f=tg.post
++				except AttributeError:
++					pass
++				else:
++					f()
++		try:
++			self.get_tgen_by_name('')
++		except:
++			pass
++		lst=list(self.task_gen_cache_names.keys())
++		lst.sort()
++		for k in lst:
++			Logs.pprint('GREEN',k)
++class StepContext(BuildContext):
++	'''executes tasks in a step-by-step fashion, for debugging'''
++	cmd='step'
++	def __init__(self,**kw):
++		super(StepContext,self).__init__(**kw)
++		self.files=Options.options.files
++	def compile(self):
++		if not self.files:
++			Logs.warn('Add a pattern for the debug build, for example "waf step --files=main.c,app"')
++			BuildContext.compile(self)
++			return
++		for g in self.groups:
++			for tg in g:
++				try:
++					f=tg.post
++				except AttributeError:
++					pass
++				else:
++					f()
++			for pat in self.files.split(','):
++				matcher=self.get_matcher(pat)
++				for tg in g:
++					if isinstance(tg,Task.TaskBase):
++						lst=[tg]
++					else:
++						lst=tg.tasks
++					for tsk in lst:
++						do_exec=False
++						for node in getattr(tsk,'inputs',[]):
++							if matcher(node,output=False):
++								do_exec=True
++								break
++						for node in getattr(tsk,'outputs',[]):
++							if matcher(node,output=True):
++								do_exec=True
++								break
++						if do_exec:
++							ret=tsk.run()
++							Logs.info('%s -> exit %r'%(str(tsk),ret))
++	def get_matcher(self,pat):
++		inn=True
++		out=True
++		if pat.startswith('in:'):
++			out=False
++			pat=pat.replace('in:','')
++		elif pat.startswith('out:'):
++			inn=False
++			pat=pat.replace('out:','')
++		anode=self.root.find_node(pat)
++		pattern=None
++		if not anode:
++			if not pat.startswith('^'):
++				pat='^.+?%s'%pat
++			if not pat.endswith('$'):
++				pat='%s$'%pat
++			pattern=re.compile(pat)
++		def match(node,output):
++			if output==True and not out:
++				return False
++			if output==False and not inn:
++				return False
++			if anode:
++				return anode==node
++			else:
++				return pattern.match(node.abspath())
++		return match
++BuildContext.store=Utils.nogc(BuildContext.store)
++BuildContext.restore=Utils.nogc(BuildContext.restore)
+--- /dev/null
++++ ardour3/waflib/ConfigSet.py
+@@ -0,0 +1,151 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import copy,re,os
++from waflib import Logs,Utils
++re_imp=re.compile('^(#)*?([^#=]*?)\ =\ (.*?)$',re.M)
++class ConfigSet(object):
++	__slots__=('table','parent')
++	def __init__(self,filename=None):
++		self.table={}
++		if filename:
++			self.load(filename)
++	def __contains__(self,key):
++		if key in self.table:return True
++		try:return self.parent.__contains__(key)
++		except AttributeError:return False
++	def keys(self):
++		keys=set()
++		cur=self
++		while cur:
++			keys.update(cur.table.keys())
++			cur=getattr(cur,'parent',None)
++		keys=list(keys)
++		keys.sort()
++		return keys
++	def __str__(self):
++		return"\n".join(["%r %r"%(x,self.__getitem__(x))for x in self.keys()])
++	def __getitem__(self,key):
++		try:
++			while 1:
++				x=self.table.get(key,None)
++				if not x is None:
++					return x
++				self=self.parent
++		except AttributeError:
++			return[]
++	def __setitem__(self,key,value):
++		self.table[key]=value
++	def __delitem__(self,key):
++		self[key]=[]
++	def __getattr__(self,name):
++		if name in self.__slots__:
++			return object.__getattr__(self,name)
++		else:
++			return self[name]
++	def __setattr__(self,name,value):
++		if name in self.__slots__:
++			object.__setattr__(self,name,value)
++		else:
++			self[name]=value
++	def __delattr__(self,name):
++		if name in self.__slots__:
++			object.__delattr__(self,name)
++		else:
++			del self[name]
++	def derive(self):
++		newenv=ConfigSet()
++		newenv.parent=self
++		return newenv
++	def detach(self):
++		tbl=self.get_merged_dict()
++		try:
++			delattr(self,'parent')
++		except AttributeError:
++			pass
++		else:
++			keys=tbl.keys()
++			for x in keys:
++				tbl[x]=copy.deepcopy(tbl[x])
++			self.table=tbl
++	def get_flat(self,key):
++		s=self[key]
++		if isinstance(s,str):return s
++		return' '.join(s)
++	def _get_list_value_for_modification(self,key):
++		try:
++			value=self.table[key]
++		except KeyError:
++			try:value=self.parent[key]
++			except AttributeError:value=[]
++			if isinstance(value,list):
++				value=value[:]
++			else:
++				value=[value]
++		else:
++			if not isinstance(value,list):
++				value=[value]
++		self.table[key]=value
++		return value
++	def append_value(self,var,val):
++		current_value=self._get_list_value_for_modification(var)
++		if isinstance(val,str):
++			val=[val]
++		current_value.extend(val)
++	def prepend_value(self,var,val):
++		if isinstance(val,str):
++			val=[val]
++		self.table[var]=val+self._get_list_value_for_modification(var)
++	def append_unique(self,var,val):
++		if isinstance(val,str):
++			val=[val]
++		current_value=self._get_list_value_for_modification(var)
++		for x in val:
++			if x not in current_value:
++				current_value.append(x)
++	def get_merged_dict(self):
++		table_list=[]
++		env=self
++		while 1:
++			table_list.insert(0,env.table)
++			try:env=env.parent
++			except AttributeError:break
++		merged_table={}
++		for table in table_list:
++			merged_table.update(table)
++		return merged_table
++	def store(self,filename):
++		try:
++			os.makedirs(os.path.split(filename)[0])
++		except OSError:
++			pass
++		f=None
++		try:
++			f=open(filename,'w')
++			merged_table=self.get_merged_dict()
++			keys=list(merged_table.keys())
++			keys.sort()
++			for k in keys:
++				if k!='undo_stack':
++					f.write('%s = %r\n'%(k,merged_table[k]))
++		finally:
++			if f:
++				f.close()
++	def load(self,filename):
++		tbl=self.table
++		code=Utils.readf(filename)
++		for m in re_imp.finditer(code):
++			g=m.group
++			tbl[g(2)]=eval(g(3))
++		Logs.debug('env: %s'%str(self.table))
++	def update(self,d):
++		for k,v in d.items():
++			self[k]=v
++	def stash(self):
++		self.undo_stack=self.undo_stack+[self.table]
++		self.table=self.table.copy()
++	def revert(self):
++		self.table=self.undo_stack.pop(-1)
+--- /dev/null
++++ ardour3/waflib/Configure.py
+@@ -0,0 +1,315 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,shlex,sys,time
++from waflib import ConfigSet,Utils,Options,Logs,Context,Build,Errors
++try:
++	from urllib import request
++except:
++	from urllib import urlopen
++else:
++	urlopen=request.urlopen
++BREAK='break'
++CONTINUE='continue'
++WAF_CONFIG_LOG='config.log'
++autoconfig=False
++conf_template='''# project %(app)s configured on %(now)s by
++# waf %(wafver)s (abi %(abi)s, python %(pyver)x on %(systype)s)
++# using %(args)s
++#'''
++def download_check(node):
++	pass
++def download_tool(tool,force=False,ctx=None):
++	for x in Utils.to_list(Context.remote_repo):
++		for sub in Utils.to_list(Context.remote_locs):
++			url='/'.join((x,sub,tool+'.py'))
++			try:
++				web=urlopen(url)
++				try:
++					if web.getcode()!=200:
++						continue
++				except AttributeError:
++					pass
++			except Exception:
++				continue
++			else:
++				tmp=ctx.root.make_node(os.sep.join((Context.waf_dir,'waflib','extras',tool+'.py')))
++				tmp.write(web.read())
++				Logs.warn('Downloaded %s from %s'%(tool,url))
++				download_check(tmp)
++				try:
++					module=Context.load_tool(tool)
++				except:
++					Logs.warn('The tool %s from %s is unusable'%(tool,url))
++					try:
++						tmp.delete()
++					except:
++						pass
++					continue
++				return module
++	raise Errors.WafError('Could not load the Waf tool')
++class ConfigurationContext(Context.Context):
++	'''configures the project'''
++	cmd='configure'
++	error_handlers=[]
++	def __init__(self,**kw):
++		super(ConfigurationContext,self).__init__(**kw)
++		self.environ=dict(os.environ)
++		self.all_envs={}
++		self.top_dir=None
++		self.out_dir=None
++		self.tools=[]
++		self.hash=0
++		self.files=[]
++		self.tool_cache=[]
++		self.setenv('')
++	def setenv(self,name,env=None):
++		if name not in self.all_envs or env:
++			if not env:
++				env=ConfigSet.ConfigSet()
++				self.prepare_env(env)
++			else:
++				env=env.derive()
++			self.all_envs[name]=env
++		self.variant=name
++	def get_env(self):
++		return self.all_envs[self.variant]
++	def set_env(self,val):
++		self.all_envs[self.variant]=val
++	env=property(get_env,set_env)
++	def init_dirs(self):
++		top=self.top_dir
++		if not top:
++			top=Options.options.top
++		if not top:
++			top=getattr(Context.g_module,Context.TOP,None)
++		if not top:
++			top=self.path.abspath()
++		top=os.path.abspath(top)
++		self.srcnode=(os.path.isabs(top)and self.root or self.path).find_dir(top)
++		assert(self.srcnode)
++		out=self.out_dir
++		if not out:
++			out=Options.options.out
++		if not out:
++			out=getattr(Context.g_module,Context.OUT,None)
++		if not out:
++			out=Options.lockfile.replace('.lock-waf_%s_'%sys.platform,'').replace('.lock-waf','')
++		self.bldnode=(os.path.isabs(out)and self.root or self.path).make_node(out)
++		self.bldnode.mkdir()
++		if not os.path.isdir(self.bldnode.abspath()):
++			conf.fatal('could not create the build directory %s'%self.bldnode.abspath())
++	def execute(self):
++		self.init_dirs()
++		self.cachedir=self.bldnode.make_node(Build.CACHE_DIR)
++		self.cachedir.mkdir()
++		path=os.path.join(self.bldnode.abspath(),WAF_CONFIG_LOG)
++		self.logger=Logs.make_logger(path,'cfg')
++		app=getattr(Context.g_module,'APPNAME','')
++		if app:
++			ver=getattr(Context.g_module,'VERSION','')
++			if ver:
++				app="%s (%s)"%(app,ver)
++		now=time.ctime()
++		pyver=sys.hexversion
++		systype=sys.platform
++		args=" ".join(sys.argv)
++		wafver=Context.WAFVERSION
++		abi=Context.ABI
++		self.to_log(conf_template%vars())
++		self.msg('Setting top to',self.srcnode.abspath())
++		self.msg('Setting out to',self.bldnode.abspath())
++		if id(self.srcnode)==id(self.bldnode):
++			Logs.warn('Setting top == out (remember to use "update_outputs")')
++		elif id(self.path)!=id(self.srcnode):
++			if self.srcnode.is_child_of(self.path):
++				Logs.warn('Are you certain that you do not want to set top="." ?')
++		super(ConfigurationContext,self).execute()
++		self.store()
++		Context.top_dir=self.srcnode.abspath()
++		Context.out_dir=self.bldnode.abspath()
++		env=ConfigSet.ConfigSet()
++		env['argv']=sys.argv
++		env['options']=Options.options.__dict__
++		env.run_dir=Context.run_dir
++		env.top_dir=Context.top_dir
++		env.out_dir=Context.out_dir
++		env['hash']=self.hash
++		env['files']=self.files
++		env['environ']=dict(self.environ)
++		if not self.env.NO_LOCK_IN_RUN:
++			env.store(Context.run_dir+os.sep+Options.lockfile)
++		if not self.env.NO_LOCK_IN_TOP:
++			env.store(Context.top_dir+os.sep+Options.lockfile)
++		if not self.env.NO_LOCK_IN_OUT:
++			env.store(Context.out_dir+os.sep+Options.lockfile)
++	def prepare_env(self,env):
++		if not env.PREFIX:
++			env.PREFIX=os.path.abspath(os.path.expanduser(Options.options.prefix))
++		if not env.BINDIR:
++			env.BINDIR=Utils.subst_vars('${PREFIX}/bin',env)
++		if not env.LIBDIR:
++			env.LIBDIR=Utils.subst_vars('${PREFIX}/lib',env)
++	def store(self):
++		n=self.cachedir.make_node('build.config.py')
++		n.write('version = 0x%x\ntools = %r\n'%(Context.HEXVERSION,self.tools))
++		if not self.all_envs:
++			self.fatal('nothing to store in the configuration context!')
++		for key in self.all_envs:
++			tmpenv=self.all_envs[key]
++			tmpenv.store(os.path.join(self.cachedir.abspath(),key+Build.CACHE_SUFFIX))
++	def load(self,input,tooldir=None,funs=None,download=True):
++		tools=Utils.to_list(input)
++		if tooldir:tooldir=Utils.to_list(tooldir)
++		for tool in tools:
++			mag=(tool,id(self.env),funs)
++			if mag in self.tool_cache:
++				self.to_log('(tool %s is already loaded, skipping)'%tool)
++				continue
++			self.tool_cache.append(mag)
++			module=None
++			try:
++				module=Context.load_tool(tool,tooldir)
++			except ImportError ,e:
++				if Options.options.download:
++					module=download_tool(tool,ctx=self)
++					if not module:
++						self.fatal('Could not load the Waf tool %r or download a suitable replacement from the repository (sys.path %r)\n%s'%(tool,sys.path,e))
++				else:
++					self.fatal('Could not load the Waf tool %r from %r (try the --download option?):\n%s'%(tool,sys.path,e))
++			except Exception ,e:
++				self.to_log('imp %r (%r & %r)'%(tool,tooldir,funs))
++				self.to_log(Utils.ex_stack())
++				raise
++			if funs is not None:
++				self.eval_rules(funs)
++			else:
++				func=getattr(module,'configure',None)
++				if func:
++					if type(func)is type(Utils.readf):func(self)
++					else:self.eval_rules(func)
++			self.tools.append({'tool':tool,'tooldir':tooldir,'funs':funs})
++	def post_recurse(self,node):
++		super(ConfigurationContext,self).post_recurse(node)
++		self.hash=hash((self.hash,node.read('rb')))
++		self.files.append(node.abspath())
++	def eval_rules(self,rules):
++		self.rules=Utils.to_list(rules)
++		for x in self.rules:
++			f=getattr(self,x)
++			if not f:self.fatal("No such method '%s'."%x)
++			try:
++				f()
++			except Exception ,e:
++				ret=self.err_handler(x,e)
++				if ret==BREAK:
++					break
++				elif ret==CONTINUE:
++					continue
++				else:
++					raise
++	def err_handler(self,fun,error):
++		pass
++def conf(f):
++	def fun(*k,**kw):
++		mandatory=True
++		if'mandatory'in kw:
++			mandatory=kw['mandatory']
++			del kw['mandatory']
++		try:
++			return f(*k,**kw)
++		except Errors.ConfigurationError ,e:
++			if mandatory:
++				raise e
++	setattr(ConfigurationContext,f.__name__,fun)
++	setattr(Build.BuildContext,f.__name__,fun)
++	return f
++def add_os_flags(self,var,dest=None):
++	try:self.env.append_value(dest or var,shlex.split(self.environ[var]))
++	except KeyError:pass
++def cmd_to_list(self,cmd):
++	if isinstance(cmd,str)and cmd.find(' '):
++		try:
++			os.stat(cmd)
++		except OSError:
++			return shlex.split(cmd)
++		else:
++			return[cmd]
++	return cmd
++def check_waf_version(self,mini='1.6.0',maxi='1.7.0'):
++	self.start_msg('Checking for waf version in %s-%s'%(str(mini),str(maxi)))
++	ver=Context.HEXVERSION
++	if Utils.num2ver(mini)>ver:
++		self.fatal('waf version should be at least %r (%r found)'%(Utils.num2ver(mini),ver))
++	if Utils.num2ver(maxi)<ver:
++		self.fatal('waf version should be at most %r (%r found)'%(Utils.num2ver(maxi),ver))
++	self.end_msg('ok')
++def find_file(self,filename,path_list=[]):
++	for n in Utils.to_list(filename):
++		for d in Utils.to_list(path_list):
++			p=os.path.join(d,n)
++			if os.path.exists(p):
++				return p
++	self.fatal('Could not find %r'%filename)
++def find_program(self,filename,**kw):
++	exts=kw.get('exts',Utils.is_win32 and'.exe,.com,.bat,.cmd'or',.sh,.pl,.py')
++	environ=kw.get('environ',os.environ)
++	ret=''
++	filename=Utils.to_list(filename)
++	var=kw.get('var','')
++	if not var:
++		var=filename[0].upper()
++	if self.env[var]:
++		ret=self.env[var]
++	elif var in environ:
++		ret=environ[var]
++	path_list=kw.get('path_list','')
++	if not ret:
++		if path_list:
++			path_list=Utils.to_list(path_list)
++		else:
++			path_list=environ.get('PATH','').split(os.pathsep)
++		if not isinstance(filename,list):
++			filename=[filename]
++		for a in exts.split(','):
++			if ret:
++				break
++			for b in filename:
++				if ret:
++					break
++				for c in path_list:
++					if ret:
++						break
++					x=os.path.expanduser(os.path.join(c,b+a))
++					if os.path.isfile(x):
++						ret=x
++	if not ret and Utils.winreg:
++		ret=Utils.get_registry_app_path(Utils.winreg.HKEY_CURRENT_USER,filename)
++	if not ret and Utils.winreg:
++		ret=Utils.get_registry_app_path(Utils.winreg.HKEY_LOCAL_MACHINE,filename)
++	self.msg('Checking for program '+','.join(filename),ret or False)
++	self.to_log('find program=%r paths=%r var=%r -> %r'%(filename,path_list,var,ret))
++	if not ret:
++		self.fatal(kw.get('errmsg','')or'Could not find the program %s'%','.join(filename))
++	if var:
++		self.env[var]=ret
++	return ret
++def find_perl_program(self,filename,path_list=[],var=None,environ=None,exts=''):
++	try:
++		app=self.find_program(filename,path_list=path_list,var=var,environ=environ,exts=exts)
++	except:
++		self.find_program('perl',var='PERL')
++		app=self.find_file(filename,os.environ['PATH'].split(os.pathsep))
++		if not app:
++			raise
++		if var:
++			self.env[var]=Utils.to_list(self.env['PERL'])+[app]
++	self.msg('Checking for %r'%filename,app)
++
++conf(add_os_flags)
++conf(cmd_to_list)
++conf(check_waf_version)
++conf(find_file)
++conf(find_program)
++conf(find_perl_program)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Context.py
+@@ -0,0 +1,299 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,imp,sys
++from waflib import Utils,Errors,Logs
++import waflib.Node
++HEXVERSION=0x1060b00
++WAFVERSION="1.6.11"
++WAFREVISION="a7e69d6b81b04729804754c4d5214da063779a65"
++ABI=98
++DBFILE='.wafpickle-%d'%ABI
++APPNAME='APPNAME'
++VERSION='VERSION'
++TOP='top'
++OUT='out'
++WSCRIPT_FILE='wscript'
++launch_dir=''
++run_dir=''
++top_dir=''
++out_dir=''
++waf_dir=''
++local_repo=''
++remote_repo='http://waf.googlecode.com/git/'
++remote_locs=['waflib/extras','waflib/Tools']
++g_module=None
++STDOUT=1
++STDERR=-1
++BOTH=0
++classes=[]
++def create_context(cmd_name,*k,**kw):
++	global classes
++	for x in classes:
++		if x.cmd==cmd_name:
++			return x(*k,**kw)
++	ctx=Context(*k,**kw)
++	ctx.fun=cmd_name
++	return ctx
++class store_context(type):
++	def __init__(cls,name,bases,dict):
++		super(store_context,cls).__init__(name,bases,dict)
++		name=cls.__name__
++		if name=='ctx'or name=='Context':
++			return
++		try:
++			cls.cmd
++		except AttributeError:
++			raise Errors.WafError('Missing command for the context class %r (cmd)'%name)
++		if not getattr(cls,'fun',None):
++			cls.fun=cls.cmd
++		global classes
++		classes.insert(0,cls)
++ctx=store_context('ctx',(object,),{})
++class Context(ctx):
++	errors=Errors
++	tools={}
++	def __init__(self,**kw):
++		try:
++			rd=kw['run_dir']
++		except KeyError:
++			global run_dir
++			rd=run_dir
++		class node_class(waflib.Node.Node):
++			pass
++		self.node_class=node_class
++		self.node_class.__module__="waflib.Node"
++		self.node_class.__name__="Nod3"
++		self.node_class.ctx=self
++		self.root=self.node_class('',None)
++		self.cur_script=None
++		self.path=self.root.find_dir(rd)
++		self.stack_path=[]
++		self.exec_dict={'ctx':self,'conf':self,'bld':self,'opt':self}
++		self.logger=None
++	def __hash__(self):
++		return id(self)
++	def load(self,tool_list,*k,**kw):
++		tools=Utils.to_list(tool_list)
++		path=Utils.to_list(kw.get('tooldir',''))
++		for t in tools:
++			module=load_tool(t,path)
++			fun=getattr(module,kw.get('name',self.fun),None)
++			if fun:
++				fun(self)
++	def execute(self):
++		global g_module
++		self.recurse([os.path.dirname(g_module.root_path)])
++	def pre_recurse(self,node):
++		self.stack_path.append(self.cur_script)
++		self.cur_script=node
++		self.path=node.parent
++	def post_recurse(self,node):
++		self.cur_script=self.stack_path.pop()
++		if self.cur_script:
++			self.path=self.cur_script.parent
++	def recurse(self,dirs,name=None,mandatory=True,once=True):
++		try:
++			cache=self.recurse_cache
++		except:
++			cache=self.recurse_cache={}
++		for d in Utils.to_list(dirs):
++			if not os.path.isabs(d):
++				d=os.path.join(self.path.abspath(),d)
++			WSCRIPT=os.path.join(d,WSCRIPT_FILE)
++			WSCRIPT_FUN=WSCRIPT+'_'+(name or self.fun)
++			node=self.root.find_node(WSCRIPT_FUN)
++			if node and(not once or node not in cache):
++				cache[node]=True
++				self.pre_recurse(node)
++				try:
++					function_code=node.read('rU')
++					exec(compile(function_code,node.abspath(),'exec'),self.exec_dict)
++				finally:
++					self.post_recurse(node)
++			elif not node:
++				node=self.root.find_node(WSCRIPT)
++				tup=(node,name or self.fun)
++				if node and(not once or tup not in cache):
++					cache[tup]=True
++					self.pre_recurse(node)
++					try:
++						wscript_module=load_module(node.abspath())
++						user_function=getattr(wscript_module,(name or self.fun),None)
++						if not user_function:
++							if not mandatory:
++								continue
++							raise Errors.WafError('No function %s defined in %s'%(name or self.fun,node.abspath()))
++						user_function(self)
++					finally:
++						self.post_recurse(node)
++				elif not node:
++					if not mandatory:
++						continue
++					raise Errors.WafError('No wscript file in directory %s'%d)
++	def exec_command(self,cmd,**kw):
++		subprocess=Utils.subprocess
++		kw['shell']=isinstance(cmd,str)
++		Logs.debug('runner: %r'%cmd)
++		Logs.debug('runner_env: kw=%s'%kw)
++		try:
++			if self.logger:
++				self.logger.info(cmd)
++				kw['stdout']=kw['stderr']=subprocess.PIPE
++				p=subprocess.Popen(cmd,**kw)
++				(out,err)=p.communicate()
++				if out:
++					self.logger.debug('out: %s'%out.decode(sys.stdout.encoding or'iso8859-1'))
++				if err:
++					self.logger.error('err: %s'%err.decode(sys.stdout.encoding or'iso8859-1'))
++				return p.returncode
++			else:
++				p=subprocess.Popen(cmd,**kw)
++				return p.wait()
++		except OSError:
++			return-1
++	def cmd_and_log(self,cmd,**kw):
++		subprocess=Utils.subprocess
++		kw['shell']=isinstance(cmd,str)
++		Logs.debug('runner: %r'%cmd)
++		if'quiet'in kw:
++			quiet=kw['quiet']
++			del kw['quiet']
++		else:
++			quiet=None
++		if'output'in kw:
++			to_ret=kw['output']
++			del kw['output']
++		else:
++			to_ret=STDOUT
++		kw['stdout']=kw['stderr']=subprocess.PIPE
++		if quiet is None:
++			self.to_log(cmd)
++		try:
++			p=subprocess.Popen(cmd,**kw)
++			(out,err)=p.communicate()
++		except Exception ,e:
++			raise Errors.WafError('Execution failure: %s'%str(e),ex=e)
++		if not isinstance(out,str):
++			out=out.decode(sys.stdout.encoding or'iso8859-1')
++		if not isinstance(err,str):
++			err=err.decode(sys.stdout.encoding or'iso8859-1')
++		if out and quiet!=STDOUT and quiet!=BOTH:
++			self.to_log('out: %s'%out)
++		if err and quiet!=STDERR and quiet!=BOTH:
++			self.to_log('err: %s'%err)
++		if p.returncode:
++			e=Errors.WafError('Command %r returned %r'%(cmd,p.returncode))
++			e.returncode=p.returncode
++			e.stderr=err
++			e.stdout=out
++			raise e
++		if to_ret==BOTH:
++			return(out,err)
++		elif to_ret==STDERR:
++			return err
++		return out
++	def fatal(self,msg,ex=None):
++		if self.logger:
++			self.logger.info('from %s: %s'%(self.path.abspath(),msg))
++		try:
++			msg='%s\n(complete log in %s)'%(msg,self.logger.handlers[0].baseFilename)
++		except:
++			pass
++		raise self.errors.ConfigurationError(msg,ex=ex)
++	def to_log(self,msg):
++		if not msg:
++			return
++		if self.logger:
++			self.logger.info(msg)
++		else:
++			sys.stderr.write(str(msg))
++			sys.stderr.flush()
++	def msg(self,msg,result,color=None):
++		self.start_msg(msg)
++		if not isinstance(color,str):
++			color=result and'GREEN'or'YELLOW'
++		self.end_msg(result,color)
++	def start_msg(self,msg):
++		try:
++			if self.in_msg:
++				self.in_msg+=1
++				return
++		except:
++			self.in_msg=0
++		self.in_msg+=1
++		try:
++			self.line_just=max(self.line_just,len(msg))
++		except AttributeError:
++			self.line_just=max(40,len(msg))
++		for x in(self.line_just*'-',msg):
++			self.to_log(x)
++		Logs.pprint('NORMAL',"%s :"%msg.ljust(self.line_just),sep='')
++	def end_msg(self,result,color=None):
++		self.in_msg-=1
++		if self.in_msg:
++			return
++		defcolor='GREEN'
++		if result==True:
++			msg='ok'
++		elif result==False:
++			msg='not found'
++			defcolor='YELLOW'
++		else:
++			msg=str(result)
++		self.to_log(msg)
++		Logs.pprint(color or defcolor,msg)
++	def load_special_tools(self,var,ban=[]):
++		global waf_dir
++		lst=self.root.find_node(waf_dir).find_node('waflib/extras').ant_glob(var)
++		for x in lst:
++			if not x.name in ban:
++				load_tool(x.name.replace('.py',''))
++cache_modules={}
++def load_module(path):
++	try:
++		return cache_modules[path]
++	except KeyError:
++		pass
++	module=imp.new_module(WSCRIPT_FILE)
++	try:
++		code=Utils.readf(path,m='rU')
++	except(IOError,OSError):
++		raise Errors.WafError('Could not read the file %r'%path)
++	module_dir=os.path.dirname(path)
++	sys.path.insert(0,module_dir)
++	exec(compile(code,path,'exec'),module.__dict__)
++	sys.path.remove(module_dir)
++	cache_modules[path]=module
++	return module
++def load_tool(tool,tooldir=None):
++	tool=tool.replace('++','xx')
++	tool=tool.replace('java','javaw')
++	tool=tool.replace('compiler_cc','compiler_c')
++	if tooldir:
++		assert isinstance(tooldir,list)
++		sys.path=tooldir+sys.path
++		try:
++			__import__(tool)
++			ret=sys.modules[tool]
++			Context.tools[tool]=ret
++			return ret
++		finally:
++			for d in tooldir:
++				sys.path.remove(d)
++	else:
++		global waf_dir
++		try:
++			os.stat(os.path.join(waf_dir,'waflib','extras',tool+'.py'))
++			d='waflib.extras.%s'%tool
++		except:
++			try:
++				os.stat(os.path.join(waf_dir,'waflib','Tools',tool+'.py'))
++				d='waflib.Tools.%s'%tool
++			except:
++				d=tool
++		__import__(d)
++		ret=sys.modules[d]
++		Context.tools[tool]=ret
++		return ret
+--- /dev/null
++++ ardour3/waflib/Errors.py
+@@ -0,0 +1,37 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import traceback,sys
++class WafError(Exception):
++	def __init__(self,msg='',ex=None):
++		self.msg=msg
++		assert not isinstance(msg,Exception)
++		self.stack=[]
++		if ex:
++			if not msg:
++				self.msg=str(ex)
++			if isinstance(ex,WafError):
++				self.stack=ex.stack
++			else:
++				self.stack=traceback.extract_tb(sys.exc_info()[2])
++		self.stack+=traceback.extract_stack()[:-1]
++		self.verbose_msg=''.join(traceback.format_list(self.stack))
++	def __str__(self):
++		return str(self.msg)
++class BuildError(WafError):
++	def __init__(self,error_tasks=[]):
++		self.tasks=error_tasks
++		WafError.__init__(self,self.format_error())
++	def format_error(self):
++		lst=['Build failed']
++		for tsk in self.tasks:
++			txt=tsk.format_error()
++			if txt:lst.append(txt)
++		return'\n'.join(lst)
++class ConfigurationError(WafError):
++	pass
++class TaskRescan(WafError):
++	pass
++class TaskNotReady(WafError):
++	pass
+--- /dev/null
++++ ardour3/waflib/extras/autowaf.py
+@@ -0,0 +1,490 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import glob
++import os
++import subprocess
++import sys
++import shutil
++from waflib import Configure,Context,Logs,Node,Options,Task,Utils
++from waflib.TaskGen import feature,before,after
++global g_is_child
++g_is_child=False
++global g_step
++g_step=0
++def include_config_h(self):
++	self.env.append_value('INCPATHS',self.bld.bldnode.abspath())
++def set_options(opt,debug_by_default=False):
++	global g_step
++	if g_step>0:
++		return
++	dirs_options=opt.add_option_group('Installation directories','')
++	for k in('--prefix','--destdir'):
++		option=opt.parser.get_option(k)
++		if option:
++			opt.parser.remove_option(k)
++			dirs_options.add_option(option)
++	dirs_options.add_option('--bindir',type='string',help="Executable programs [Default: PREFIX/bin]")
++	dirs_options.add_option('--configdir',type='string',help="Configuration data [Default: PREFIX/etc]")
++	dirs_options.add_option('--datadir',type='string',help="Shared data [Default: PREFIX/share]")
++	dirs_options.add_option('--includedir',type='string',help="Header files [Default: PREFIX/include]")
++	dirs_options.add_option('--libdir',type='string',help="Libraries [Default: PREFIX/lib]")
++	dirs_options.add_option('--mandir',type='string',help="Manual pages [Default: DATADIR/man]")
++	dirs_options.add_option('--docdir',type='string',help="HTML documentation [Default: DATADIR/doc]")
++	if debug_by_default:
++		opt.add_option('--optimize',action='store_false',default=True,dest='debug',help="Build optimized binaries")
++	else:
++		opt.add_option('--debug',action='store_true',default=False,dest='debug',help="Build debuggable binaries")
++	opt.add_option('--pardebug',action='store_true',default=False,dest='pardebug',help="Build parallel-installable debuggable libraries with D suffix")
++	opt.add_option('--grind',action='store_true',default=False,dest='grind',help="Run tests in valgrind")
++	opt.add_option('--strict',action='store_true',default=False,dest='strict',help="Use strict compiler flags and show all warnings")
++	opt.add_option('--ultra-strict',action='store_true',default=False,dest='ultra_strict',help="Use even stricter compiler flags (likely to trigger many warnings in library headers)")
++	opt.add_option('--docs',action='store_true',default=False,dest='docs',help="Build documentation - requires doxygen")
++	g_step=1
++def copyfile(task):
++	src=task.inputs[0].abspath()
++	tgt=task.outputs[0].abspath()
++	shutil.copy2(src,tgt)
++def check_header(conf,lang,name,define='',mandatory=True):
++	includes=''
++	if sys.platform=="darwin":
++		includes='/opt/local/include'
++	if lang=='c':
++		check_func=conf.check_cc
++	elif lang=='cxx':
++		check_func=conf.check_cxx
++	else:
++		Logs.error("Unknown header language `%s'"%lang)
++		return
++	if define!='':
++		check_func(header_name=name,includes=includes,define_name=define,mandatory=mandatory)
++	else:
++		check_func(header_name=name,includes=includes,mandatory=mandatory)
++def nameify(name):
++	return name.replace('/','_').replace('++','PP').replace('-','_').replace('.','_')
++def define(conf,var_name,value):
++	conf.define(var_name,value)
++	conf.env[var_name]=value
++def check_pkg(conf,name,**args):
++	if args['uselib_store'].lower()in conf.env['AUTOWAF_LOCAL_LIBS']:
++		return
++	class CheckType:
++		OPTIONAL=1
++		MANDATORY=2
++	var_name='CHECKED_'+nameify(args['uselib_store'])
++	check=not var_name in conf.env
++	mandatory=not'mandatory'in args or args['mandatory']
++	if not check and'atleast_version'in args:
++		checked_version=conf.env['VERSION_'+name]
++		if checked_version and checked_version<args['atleast_version']:
++			check=True;
++	if not check and mandatory and conf.env[var_name]==CheckType.OPTIONAL:
++		check=True;
++	if check:
++		found=None
++		pkg_var_name='PKG_'+name.replace('-','_')
++		pkg_name=name
++		if conf.env.PARDEBUG:
++			args['mandatory']=False
++			found=conf.check_cfg(package=pkg_name+'D',args="--cflags --libs",**args)
++			if found:
++				pkg_name+='D'
++		if mandatory:
++			args['mandatory']=True
++		if not found:
++			found=conf.check_cfg(package=pkg_name,args="--cflags --libs",**args)
++		if found:
++			conf.env[pkg_var_name]=pkg_name
++		if'atleast_version'in args:
++			conf.env['VERSION_'+name]=args['atleast_version']
++	if mandatory:
++		conf.env[var_name]=CheckType.MANDATORY
++	else:
++		conf.env[var_name]=CheckType.OPTIONAL
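++# Usage sketch (hypothetical package and version): a project wscript would
++# normally call check_pkg() from its configure() function, for example:
++#   autowaf.check_pkg(conf, 'glib-2.0', uselib_store='GLIB',
++#                     atleast_version='2.28', mandatory=True)
++# When --pardebug is active, a 'glib-2.0D' package is tried first.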
++def normpath(path):
++	if sys.platform=='win32':
++		return os.path.normpath(path).replace('\\','/')
++	else:
++		return os.path.normpath(path)
++def ensure_visible_symbols(bld,visible):
++	if bld.env['MSVC_COMPILER']:
++		if visible:
++			print('*** WARNING: MSVC does not allow symbols to be visible/exported by default while building '+bld.name)
++		else:
++			pass
++	else:
++		if not hasattr(bld,'cxxflags'):
++			bld.cxxflags=[]
++		if not hasattr(bld,'cflags'):
++			bld.cflags=[]
++		if visible:
++			bld.cxxflags+=['-fvisibility=default']
++			bld.cflags+=['-fvisibility=default']
++		else:
++			bld.cxxflags+=['-fvisibility=hidden']
++			bld.cflags+=['-fvisibility=hidden']
++def set_basic_compiler_flags(conf,flag_dict):
++	if Options.options.debug:
++		conf.env.append_value('CFLAGS',flag_dict['debuggable'])
++		conf.env.append_value('CXXFLAGS',flag_dict['debuggable'])
++		conf.env.append_value('LINKFLAGS',flag_dict['linker-debuggable'])
++	else:
++		conf.env.append_value('CFLAGS',flag_dict['nondebuggable'])
++		conf.env.append_value('CXXFLAGS',flag_dict['nondebuggable'])
++	if Options.options.ultra_strict:
++		Options.options.strict=True
++		conf.env.append_value('CFLAGS',flag_dict['ultra-strict'])
++	if Options.options.strict:
++		conf.env.append_value('CFLAGS',flag_dict['c-strict'])
++		conf.env.append_value('CXXFLAGS',flag_dict['cxx-strict'])
++		conf.env.append_value('CFLAGS',flag_dict['strict'])
++		conf.env.append_value('CXXFLAGS',flag_dict['strict'])
++	conf.env.append_value('CFLAGS',flag_dict['show-column'])
++	conf.env.append_value('CXXFLAGS',flag_dict['show-column'])
++def configure(conf):
++	global g_step
++	if g_step>1:
++		return
++	print('')
++	display_header('Global Configuration')
++	if Options.options.docs:
++		conf.load('doxygen')
++	conf.env['DOCS']=Options.options.docs
++	conf.env['DEBUG']=Options.options.debug or Options.options.pardebug
++	conf.env['PARDEBUG']=Options.options.pardebug
++	conf.env['PREFIX']=normpath(os.path.abspath(os.path.expanduser(conf.env['PREFIX'])))
++	def config_dir(var,opt,default):
++		if opt:
++			conf.env[var]=normpath(opt)
++		else:
++			conf.env[var]=normpath(default)
++	opts=Options.options
++	prefix=conf.env['PREFIX']
++	config_dir('BINDIR',opts.bindir,os.path.join(prefix,'bin'))
++	config_dir('SYSCONFDIR',opts.configdir,os.path.join(prefix,'etc'))
++	config_dir('DATADIR',opts.datadir,os.path.join(prefix,'share'))
++	config_dir('INCLUDEDIR',opts.includedir,os.path.join(prefix,'include'))
++	config_dir('LIBDIR',opts.libdir,os.path.join(prefix,'lib'))
++	config_dir('MANDIR',opts.mandir,os.path.join(conf.env['DATADIR'],'man'))
++	config_dir('DOCDIR',opts.docdir,os.path.join(conf.env['DATADIR'],'doc'))
++	if Options.options.docs:
++		doxygen=conf.find_program('doxygen')
++		if not doxygen:
++			conf.fatal("Doxygen is required to build with --docs")
++		dot=conf.find_program('dot')
++		if not dot:
++			conf.fatal("Graphviz (dot) is required to build with --docs")
++	conf.env.prepend_value('CFLAGS','-I'+os.path.abspath('.'))
++	conf.env.prepend_value('CXXFLAGS','-I'+os.path.abspath('.'))
++	display_msg(conf,"Install prefix",conf.env['PREFIX'])
++	display_msg(conf,"Debuggable build",str(conf.env['DEBUG']))
++	display_msg(conf,"Build documentation",str(conf.env['DOCS']))
++	print('')
++	g_step=2
++def set_local_lib(conf,name,has_objects):
++	var_name='HAVE_'+nameify(name.upper())
++	define(conf,var_name,1)
++	if has_objects:
++		if type(conf.env['AUTOWAF_LOCAL_LIBS'])!=dict:
++			conf.env['AUTOWAF_LOCAL_LIBS']={}
++		conf.env['AUTOWAF_LOCAL_LIBS'][name.lower()]=True
++	else:
++		if type(conf.env['AUTOWAF_LOCAL_HEADERS'])!=dict:
++			conf.env['AUTOWAF_LOCAL_HEADERS']={}
++		conf.env['AUTOWAF_LOCAL_HEADERS'][name.lower()]=True
++def append_property(obj,key,val):
++	if hasattr(obj,key):
++		setattr(obj,key,getattr(obj,key)+val)
++	else:
++		setattr(obj,key,val)
++def use_lib(bld,obj,libs):
++	abssrcdir=os.path.abspath('.')
++	libs_list=libs.split()
++	for l in libs_list:
++		in_headers=l.lower()in bld.env['AUTOWAF_LOCAL_HEADERS']
++		in_libs=l.lower()in bld.env['AUTOWAF_LOCAL_LIBS']
++		if in_libs:
++			append_property(obj,'use',' lib%s '%l.lower())
++			append_property(obj,'framework',bld.env['FRAMEWORK_'+l])
++		if in_headers or in_libs:
++			inc_flag='-iquote '+os.path.join(abssrcdir,l.lower())
++			for f in['CFLAGS','CXXFLAGS']:
++				if not inc_flag in bld.env[f]:
++					bld.env.prepend_value(f,inc_flag)
++		else:
++			append_property(obj,'uselib',' '+l)
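++# Usage sketch (hypothetical target and dependency names): use_lib() wires a
++# task generator to local libraries and to pkg-config results, for example:
++#   obj = bld(features='cxx cxxshlib', source='foo.cc', target='foo')
++#   autowaf.use_lib(bld, obj, 'GLIBMM LIBBAR')
++# Names found in AUTOWAF_LOCAL_LIBS/HEADERS get '-iquote' include flags;
++# anything else is appended to the object's uselib attribute.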
++def version_lib(self):
++	if sys.platform=='win32':
++		self.vnum=None
++	if self.env['PARDEBUG']:
++		applicable=['cshlib','cxxshlib','cstlib','cxxstlib']
++		if[x for x in applicable if x in self.features]:
++			self.target=self.target+'D'
++def set_lib_env(conf,name,version):
++	'Set up environment for local library as if found via pkg-config.'
++	NAME=name.upper()
++	major_ver=version.split('.')[0]
++	pkg_var_name='PKG_'+name.replace('-','_')
++	lib_name='%s-%s'%(name,major_ver)
++	if conf.env.PARDEBUG:
++		lib_name+='D'
++	conf.env[pkg_var_name]=lib_name
++	conf.env['INCLUDES_'+NAME]=['${INCLUDEDIR}/%s-%s'%(name,major_ver)]
++	conf.env['LIBPATH_'+NAME]=[conf.env.LIBDIR]
++	conf.env['LIB_'+NAME]=[lib_name]
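++# Usage sketch (hypothetical name and version): a locally built library can be
++# registered so sibling modules use it as if pkg-config had found it:
++#   autowaf.set_lib_env(conf, 'foo', '1.2.3')
++# which sets LIB_FOO to ['foo-1'] (or ['foo-1D'] under --pardebug).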
++def display_header(title):
++	Logs.pprint('BOLD',title)
++def display_msg(conf,msg,status=None,color=None):
++	color='CYAN'
++	if type(status)==bool and status or status=="True":
++		color='GREEN'
++	elif type(status)==bool and not status or status=="False":
++		color='YELLOW'
++	Logs.pprint('BOLD'," *",sep='')
++	Logs.pprint('NORMAL',"%s"%msg.ljust(conf.line_just-3),sep='')
++	Logs.pprint('BOLD',":",sep='')
++	Logs.pprint(color,status)
++def link_flags(env,lib):
++	return' '.join(map(lambda x:env['LIB_ST']%x,env['LIB_'+lib]))
++def compile_flags(env,lib):
++	return' '.join(map(lambda x:env['CPPPATH_ST']%x,env['INCLUDES_'+lib]))
++def set_recursive():
++	global g_is_child
++	g_is_child=True
++def is_child():
++	global g_is_child
++	return g_is_child
++def build_pc(bld,name,version,version_suffix,libs,subst_dict={}):
++	'''Build a pkg-config file for a library.
++    name           -- uppercase variable name     (e.g. 'SOMENAME')
++    version        -- version string              (e.g. '1.2.3')
++    version_suffix -- name version suffix         (e.g. '2')
++    libs           -- string/list of dependencies (e.g. 'LIBFOO GLIB')
++    '''
++	pkg_prefix=bld.env['PREFIX']
++	if pkg_prefix[-1]=='/':
++		pkg_prefix=pkg_prefix[:-1]
++	target=name.lower()
++	if version_suffix!='':
++		target+='-'+version_suffix
++	if bld.env['PARDEBUG']:
++		target+='D'
++	target+='.pc'
++	libdir=bld.env['LIBDIR']
++	if libdir.startswith(pkg_prefix):
++		libdir=libdir.replace(pkg_prefix,'${exec_prefix}')
++	includedir=bld.env['INCLUDEDIR']
++	if includedir.startswith(pkg_prefix):
++		includedir=includedir.replace(pkg_prefix,'${prefix}')
++	obj=bld(features='subst',source='%s.pc.in'%name.lower(),target=target,install_path=os.path.join(bld.env['LIBDIR'],'pkgconfig'),exec_prefix='${prefix}',PREFIX=pkg_prefix,EXEC_PREFIX='${prefix}',LIBDIR=libdir,INCLUDEDIR=includedir)
++	if type(libs)!=list:
++		libs=libs.split()
++	subst_dict[name+'_VERSION']=version
++	subst_dict[name+'_MAJOR_VERSION']=version[0:version.find('.')]
++	for i in libs:
++		subst_dict[i+'_LIBS']=link_flags(bld.env,i)
++		lib_cflags=compile_flags(bld.env,i)
++		if lib_cflags=='':
++			lib_cflags=' '
++		subst_dict[i+'_CFLAGS']=lib_cflags
++	obj.__dict__.update(subst_dict)
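++# Usage sketch (hypothetical values): given a 'foo.pc.in' template in the
++# source tree, a wscript's build() could emit and install foo-1.pc with:
++#   autowaf.build_pc(bld, 'FOO', '1.2.3', '1', 'GLIB')
++# The dependencies listed ('GLIB' here) must already have LIB_*/INCLUDES_*
++# entries in the environment so their link and compile flags can be inlined.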
++def build_dir(name,subdir):
++	if is_child():
++		return os.path.join('build',name,subdir)
++	else:
++		return os.path.join('build',subdir)
++def make_simple_dox(name):
++	name=name.lower()
++	NAME=name.upper()
++	try:
++		top=os.getcwd()
++		os.chdir(build_dir(name,'doc/html'))
++		page='group__%s.html'%name
++		if not os.path.exists(page):
++			return
++		for i in[['%s_API '%NAME,''],['%s_DEPRECATED '%NAME,''],['group__%s.html'%name,''],[' ',''],['<script.*><\/script>',''],['<hr\/><a name="details" id="details"><\/a><h2>.*<\/h2>',''],['<link href=\"tabs.css\" rel=\"stylesheet\" type=\"text\/css\"\/>',''],['<img class=\"footer\" src=\"doxygen.png\" alt=\"doxygen\"\/>','Doxygen']]:
++			os.system("sed -i 's/%s/%s/g' %s"%(i[0],i[1],page))
++		os.rename('group__%s.html'%name,'index.html')
++		for i in(glob.glob('*.png')+glob.glob('*.html')+glob.glob('*.js')+glob.glob('*.css')):
++			if i!='index.html'and i!='style.css':
++				os.remove(i)
++		os.chdir(top)
++		os.chdir(build_dir(name,'doc/man/man3'))
++		for i in glob.glob('*.3'):
++			os.system("sed -i 's/%s_API //' %s"%(NAME,i))
++		for i in glob.glob('_*'):
++			os.remove(i)
++		os.chdir(top)
++	except Exception ,e:
++		Logs.error("Failed to fix up %s documentation: %s"%(name,e))
++def build_dox(bld,name,version,srcdir,blddir,outdir=''):
++	if not bld.env['DOCS']:
++		return
++	if is_child():
++		src_dir=os.path.join(srcdir,name.lower())
++		doc_dir=os.path.join(blddir,name.lower(),'doc')
++	else:
++		src_dir=srcdir
++		doc_dir=os.path.join(blddir,'doc')
++	subst_tg=bld(features='subst',source='doc/reference.doxygen.in',target='doc/reference.doxygen',install_path='',name='doxyfile')
++	subst_dict={name+'_VERSION':version,name+'_SRCDIR':os.path.abspath(src_dir),name+'_DOC_DIR':os.path.abspath(doc_dir)}
++	subst_tg.__dict__.update(subst_dict)
++	subst_tg.post()
++	docs=bld(features='doxygen',doxyfile='doc/reference.doxygen')
++	docs.post()
++	major=int(version[0:version.find('.')])
++	bld.install_files(os.path.join('${DOCDIR}','%s-%d'%(name.lower(),major),outdir,'html'),bld.path.get_bld().ant_glob('doc/html/*'))
++	for i in range(1,8):
++		bld.install_files('${MANDIR}/man%d'%i,bld.path.get_bld().ant_glob('doc/man/man%d/*'%i,excl='**/_*'))
++def build_version_files(header_path,source_path,domain,major,minor,micro,exportname,visheader):
++	header_path=os.path.abspath(header_path)
++	source_path=os.path.abspath(source_path)
++	text="int "+domain+"_major_version = "+str(major)+";\n"
++	text+="int "+domain+"_minor_version = "+str(minor)+";\n"
++	text+="int "+domain+"_micro_version = "+str(micro)+";\n"
++	try:
++		o=open(source_path,'w')
++		o.write(text)
++		o.close()
++	except IOError:
++		Logs.error('Failed to open %s for writing\n'%source_path)
++		sys.exit(-1)
++	text="#ifndef __"+domain+"_version_h__\n"
++	text+="#define __"+domain+"_version_h__\n"
++	if visheader!='':
++		text+="#include \""+visheader+"\"\n"
++	text+=exportname+" extern const char* "+domain+"_revision;\n"
++	text+=exportname+" extern int "+domain+"_major_version;\n"
++	text+=exportname+" extern int "+domain+"_minor_version;\n"
++	text+=exportname+" extern int "+domain+"_micro_version;\n"
++	text+="#endif /* __"+domain+"_version_h__ */\n"
++	try:
++		o=open(header_path,'w')
++		o.write(text)
++		o.close()
++	except IOError:
++		Logs.warn('Failed to open %s for writing\n'%header_path)
++		sys.exit(-1)
++	return None
++def build_i18n_pot(bld,srcdir,dir,name,sources,copyright_holder=None):
++	Logs.info('Generating pot file from %s'%name)
++	pot_file='%s.pot'%name
++	cmd=['xgettext','--keyword=_','--keyword=N_','--keyword=S_','--keyword=P_:1,2','--from-code=UTF-8','-o',pot_file]
++	if copyright_holder:
++		cmd+=['--copyright-holder="%s"'%copyright_holder]
++	cmd+=sources
++	Logs.info('Updating '+pot_file)
++	subprocess.call(cmd,cwd=os.path.join(srcdir,dir))
++def build_i18n_po(bld,srcdir,dir,name,sources,copyright_holder=None):
++	pwd=os.getcwd()
++	os.chdir(os.path.join(srcdir,dir))
++	pot_file='%s.pot'%name
++	po_files=glob.glob('po/*.po')
++	for po_file in po_files:
++		cmd=['msgmerge','--update','--no-fuzzy-matching',po_file,pot_file]
++		Logs.info('Updating '+po_file)
++		subprocess.call(cmd)
++	os.chdir(pwd)
++def build_i18n_mo(bld,srcdir,dir,name,sources,copyright_holder=None):
++	pwd=os.getcwd()
++	os.chdir(os.path.join(srcdir,dir))
++	pot_file='%s.pot'%name
++	po_files=glob.glob('po/*.po')
++	for po_file in po_files:
++		mo_file=po_file.replace('.po','.mo')
++		cmd=['msgfmt','-c','-f','-o',mo_file,po_file]
++		Logs.info('Generating '+po_file)
++		subprocess.call(cmd)
++	os.chdir(pwd)
++def build_i18n(bld,srcdir,dir,name,sources,copyright_holder=None):
++	build_i18n_pot(bld,srcdir,dir,name,sources,copyright_holder)
++	build_i18n_po(bld,srcdir,dir,name,sources,copyright_holder)
++	build_i18n_mo(bld,srcdir,dir,name,sources,copyright_holder)
++def cd_to_build_dir(ctx,appname):
++	orig_dir=os.path.abspath(os.curdir)
++	top_level=(len(ctx.stack_path)>1)
++	if top_level:
++		os.chdir(os.path.join('build',appname))
++	else:
++		os.chdir('build')
++	Logs.pprint('GREEN',"Waf: Entering directory `%s'"%os.path.abspath(os.getcwd()))
++def cd_to_orig_dir(ctx,child):
++	if child:
++		os.chdir(os.path.join('..','..'))
++	else:
++		os.chdir('..')
++def pre_test(ctx,appname,dirs=['src']):
++	diropts=''
++	for i in dirs:
++		diropts+=' -d '+i
++	cd_to_build_dir(ctx,appname)
++	clear_log=open('lcov-clear.log','w')
++	try:
++		try:
++			subprocess.call(('lcov %s -z'%diropts).split(),stdout=clear_log,stderr=clear_log)
++		except:
++			Logs.warn('Failed to run lcov, no coverage report will be generated')
++	finally:
++		clear_log.close()
++def post_test(ctx,appname,dirs=['src'],remove=['*boost*','c++*']):
++	diropts=''
++	for i in dirs:
++		diropts+=' -d '+i
++	coverage_log=open('lcov-coverage.log','w')
++	coverage_lcov=open('coverage.lcov','w')
++	coverage_stripped_lcov=open('coverage-stripped.lcov','w')
++	try:
++		try:
++			base='.'
++			if g_is_child:
++				base='..'
++			subprocess.call(('lcov -c %s -b %s'%(diropts,base)).split(),stdout=coverage_lcov,stderr=coverage_log)
++			subprocess.call(['lcov','--remove','coverage.lcov']+remove,stdout=coverage_stripped_lcov,stderr=coverage_log)
++			if not os.path.isdir('coverage'):
++				os.makedirs('coverage')
++			subprocess.call('genhtml -o coverage coverage-stripped.lcov'.split(),stdout=coverage_log,stderr=coverage_log)
++		except:
++			Logs.warn('Failed to run lcov, no coverage report will be generated')
++	finally:
++		coverage_stripped_lcov.close()
++		coverage_lcov.close()
++		coverage_log.close()
++		print('')
++		Logs.pprint('GREEN',"Waf: Leaving directory `%s'"%os.path.abspath(os.getcwd()))
++		top_level=(len(ctx.stack_path)>1)
++		if top_level:
++			cd_to_orig_dir(ctx,top_level)
++	print('')
++	Logs.pprint('BOLD','Coverage:',sep='')
++	print('<file://%s>\n\n'%os.path.abspath('coverage/index.html'))
++def run_tests(ctx,appname,tests,desired_status=0,dirs=['src'],name='*'):
++	failures=0
++	diropts=''
++	for i in dirs:
++		diropts+=' -d '+i
++	for i in tests:
++		s=i
++		if type(i)==type([]):
++			s=' '.join(i)
++		print('')
++		Logs.pprint('BOLD','** Test',sep='')
++		Logs.pprint('NORMAL','%s'%s)
++		cmd=i
++		if Options.options.grind:
++			cmd='valgrind '+i
++		if subprocess.call(cmd,shell=True)==desired_status:
++			Logs.pprint('GREEN','** Pass')
++		else:
++			failures+=1
++			Logs.pprint('RED','** FAIL')
++	print('')
++	if failures==0:
++		Logs.pprint('GREEN','** Pass: All %s.%s tests passed'%(appname,name))
++	else:
++		Logs.pprint('RED','** FAIL: %d %s.%s tests failed'%(failures,appname,name))
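++# Usage sketch (hypothetical test commands): the test helpers above are meant
++# to be driven from a project's test/check command roughly as follows:
++#   autowaf.pre_test(ctx, APPNAME)
++#   autowaf.run_tests(ctx, APPNAME, ['./test_foo', './test_bar'], dirs=['src'])
++#   autowaf.post_test(ctx, APPNAME)
++# --grind prefixes each command with valgrind; post_test() writes an lcov
++# report to coverage/index.html inside the build directory when lcov is found.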
++
++feature('c','cxx')(include_config_h)
++after('apply_incpaths')(include_config_h)
++feature('c','cxx')(version_lib)
++before('apply_link')(version_lib)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/extras/doxygen.py
+@@ -0,0 +1,117 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from fnmatch import fnmatchcase
++import os,os.path,re,stat
++from waflib import Task,Utils,Node,Logs
++from waflib.TaskGen import feature
++DOXY_STR='${DOXYGEN} - '
++DOXY_FMTS='html latex man rtf xml'.split()
++DOXY_FILE_PATTERNS='*.'+' *.'.join('''
++c cc cxx cpp c++ java ii ixx ipp i++ inl h hh hxx hpp h++ idl odl cs php php3
++inc m mm py f90c cc cxx cpp c++ java ii ixx ipp i++ inl h hh hxx
++'''.split())
++re_nl=re.compile('\\\\\r*\n',re.MULTILINE)
++class doxygen(Task.Task):
++	vars=['DOXYGEN','DOXYFLAGS']
++	color='BLUE'
++	def runnable_status(self):
++		'''
++		self.pars are populated in runnable_status - because this function is being
++		run *before* both self.pars "consumers" - scan() and run()
++
++		set output_dir (node) for the output
++		'''
++		for x in self.run_after:
++			if not x.hasrun:
++				return Task.ASK_LATER
++		if not getattr(self,'pars',None):
++			txt=self.inputs[0].read()
++			txt=re_nl.sub('',txt)
++			self.pars=Utils.str_to_dict(txt)
++			if not self.pars.get('OUTPUT_DIRECTORY'):
++				self.pars['OUTPUT_DIRECTORY']=self.inputs[0].parent.get_bld().abspath()
++			if not self.pars.get('INPUT'):
++				self.pars['INPUT']=self.inputs[0].parent.abspath()
++		if not getattr(self,'output_dir',None):
++			self.output_dir=self.generator.bld.root.find_dir(self.pars['OUTPUT_DIRECTORY'])
++		self.signature()
++		return Task.Task.runnable_status(self)
++	def scan(self):
++		if self.pars.get('RECURSIVE')=='YES':
++			Logs.warn("Doxygen RECURSIVE dependencies are not supported")
++		inputs=self.pars.get('INPUT').split()
++		exclude_patterns=self.pars.get('EXCLUDE_PATTERNS','').split()
++		file_patterns=self.pars.get('FILE_PATTERNS','').split()
++		if not file_patterns:
++			file_patterns=DOXY_FILE_PATTERNS
++		nodes=[]
++		names=[]
++		for i in inputs:
++			node=self.generator.bld.root.make_node(i)
++			if node:
++				if os.path.isdir(node.abspath()):
++					for m in node.ant_glob(file_patterns):
++						nodes.append(self.generator.bld.root.make_node(m.abspath()))
++				else:
++					nodes.append(node)
++			else:
++				names.append(i)
++		return(nodes,names)
++	def run(self):
++		code='\n'.join(['%s = %s'%(x,self.pars[x])for x in self.pars])
++		code=code
++		cmd=Utils.subst_vars(DOXY_STR,self.env)
++		env=self.env.env or None
++		proc=Utils.subprocess.Popen(cmd,shell=True,stdin=Utils.subprocess.PIPE,env=env)
++		proc.communicate(code)
++		return proc.returncode
++	def post_run(self):
++		nodes=self.output_dir.ant_glob('**/*')
++		for x in nodes:
++			x.sig=Utils.h_file(x.abspath())
++		self.outputs+=nodes
++		return Task.Task.post_run(self)
++class tar(Task.Task):
++	run_str='${TAR} ${TAROPTS} ${TGT} ${SRC}'
++	color='RED'
++	after=['doxygen']
++	def runnable_status(self):
++		for x in getattr(self,'input_tasks',[]):
++			if not x.hasrun:
++				return Task.ASK_LATER
++		if not getattr(self,'tar_done_adding',None):
++			self.tar_done_adding=True
++			for x in getattr(self,'input_tasks',[]):
++				self.set_inputs(x.outputs)
++			if not self.inputs:
++				return Task.SKIP_ME
++		return Task.Task.runnable_status(self)
++	def __str__(self):
++		tgt_str=' '.join([a.nice_path(self.env)for a in self.outputs])
++		return'%s: %s\n'%(self.__class__.__name__,tgt_str)
++def process_doxy(self):
++	if not getattr(self,'doxyfile',None):
++		self.generator.bld.fatal('no doxyfile??')
++	node=self.doxyfile
++	if not isinstance(node,Node.Node):
++		node=self.path.find_resource(node)
++	if not node:
++		raise ValueError('doxygen file not found')
++	dsk=self.create_task('doxygen',node)
++	if getattr(self,'doxy_tar',None):
++		tsk=self.create_task('tar')
++		tsk.input_tasks=[dsk]
++		tsk.set_outputs(self.path.find_or_declare(self.doxy_tar))
++		if self.doxy_tar.endswith('bz2'):
++			tsk.env['TAROPTS']=['cjf']
++		elif self.doxy_tar.endswith('gz'):
++			tsk.env['TAROPTS']=['czf']
++		else:
++			tsk.env['TAROPTS']=['cf']
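++# Usage sketch (hypothetical doxyfile path): after conf.load('doxygen') has
++# found the tools, documentation is built from a task generator such as:
++#   bld(features='doxygen', doxyfile='doc/reference.doxygen',
++#       doxy_tar='reference.tar.bz2')
++# doxy_tar is optional; the tar compression is picked from its extension.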
++def configure(conf):
++	conf.find_program('doxygen',var='DOXYGEN')
++	conf.find_program('tar',var='TAR')
++
++feature('doxygen')(process_doxy)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/extras/__init__.py
+@@ -0,0 +1,4 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
+--- /dev/null
++++ ardour3/waflib/extras/misc.py
+@@ -0,0 +1,288 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import shutil,re,os
++from waflib import TaskGen,Node,Task,Utils,Build,Errors
++from waflib.TaskGen import feature,after_method,before_method
++from waflib.Logs import debug
++def copy_attrs(orig,dest,names,only_if_set=False):
++	for a in Utils.to_list(names):
++		u=getattr(orig,a,())
++		if u or not only_if_set:
++			setattr(dest,a,u)
++def copy_func(tsk):
++	env=tsk.env
++	infile=tsk.inputs[0].abspath()
++	outfile=tsk.outputs[0].abspath()
++	try:
++		shutil.copy2(infile,outfile)
++	except(OSError,IOError):
++		return 1
++	else:
++		if tsk.chmod:os.chmod(outfile,tsk.chmod)
++		return 0
++def action_process_file_func(tsk):
++	if not tsk.fun:raise Errors.WafError('task must have a function attached to it for copy_func to work!')
++	return tsk.fun(tsk)
++def apply_cmd(self):
++	if not self.fun:raise Errors.WafError('cmdobj needs a function!')
++	tsk=Task.TaskBase()
++	tsk.fun=self.fun
++	tsk.env=self.env
++	self.tasks.append(tsk)
++	tsk.install_path=self.install_path
++def apply_copy(self):
++	Utils.def_attrs(self,fun=copy_func)
++	self.default_install_path=0
++	lst=self.to_list(self.source)
++	self.meths.remove('process_source')
++	for filename in lst:
++		node=self.path.find_resource(filename)
++		if not node:raise Errors.WafError('cannot find input file %s for processing'%filename)
++		target=self.target
++		if not target or len(lst)>1:target=node.name
++		newnode=self.path.find_or_declare(target)
++		tsk=self.create_task('copy',node,newnode)
++		tsk.fun=self.fun
++		tsk.chmod=getattr(self,'chmod',Utils.O644)
++		if not tsk.env:
++			tsk.debug()
++			raise Errors.WafError('task without an environment')
++def subst_func(tsk):
++	m4_re=re.compile('@(\w+)@',re.M)
++	code=tsk.inputs[0].read()
++	code=code.replace('%','%%')
++	s=m4_re.sub(r'%(\1)s',code)
++	env=tsk.env
++	di=getattr(tsk,'dict',{})or getattr(tsk.generator,'dict',{})
++	if not di:
++		names=m4_re.findall(code)
++		for i in names:
++			di[i]=env.get_flat(i)or env.get_flat(i.upper())
++	tsk.outputs[0].write(s%di)
++def apply_subst(self):
++	Utils.def_attrs(self,fun=subst_func)
++	lst=self.to_list(self.source)
++	self.meths.remove('process_source')
++	self.dict=getattr(self,'dict',{})
++	for filename in lst:
++		node=self.path.find_resource(filename)
++		if not node:raise Errors.WafError('cannot find input file %s for processing'%filename)
++		if self.target:
++			newnode=self.path.find_or_declare(self.target)
++		else:
++			newnode=node.change_ext('')
++		try:
++			self.dict=self.dict.get_merged_dict()
++		except AttributeError:
++			pass
++		if self.dict and not self.env['DICT_HASH']:
++			self.env=self.env.derive()
++			keys=list(self.dict.keys())
++			keys.sort()
++			lst=[self.dict[x]for x in keys]
++			self.env['DICT_HASH']=str(Utils.h_list(lst))
++		tsk=self.create_task('copy',node,newnode)
++		tsk.fun=self.fun
++		tsk.dict=self.dict
++		tsk.dep_vars=['DICT_HASH']
++		tsk.chmod=getattr(self,'chmod',Utils.O644)
++		if not tsk.env:
++			tsk.debug()
++			raise Errors.WafError('task without an environment')
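++# Usage sketch (hypothetical files): the 'subst' feature implemented above
++# replaces @NAME@ markers in a template using the generator's dict, e.g.:
++#   bld(features='subst', source='version.h.in', target='version.h',
++#       dict={'VERSION': '1.2.3'})
++# Without a dict, each marker is looked up in the build environment instead.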
++class cmd_arg(object):
++	def __init__(self,name,template='%s'):
++		self.name=name
++		self.template=template
++		self.node=None
++class input_file(cmd_arg):
++	def find_node(self,base_path):
++		assert isinstance(base_path,Node.Node)
++		self.node=base_path.find_resource(self.name)
++		if self.node is None:
++			raise Errors.WafError("Input file %s not found in %s"%(self.name,base_path))
++	def get_path(self,env,absolute):
++		if absolute:
++			return self.template%self.node.abspath()
++		else:
++			return self.template%self.node.srcpath()
++class output_file(cmd_arg):
++	def find_node(self,base_path):
++		assert isinstance(base_path,Node.Node)
++		self.node=base_path.find_or_declare(self.name)
++		if self.node is None:
++			raise Errors.WafError("Output file %s not found in %s"%(self.name,base_path))
++	def get_path(self,env,absolute):
++		if absolute:
++			return self.template%self.node.abspath()
++		else:
++			return self.template%self.node.bldpath()
++class cmd_dir_arg(cmd_arg):
++	def find_node(self,base_path):
++		assert isinstance(base_path,Node.Node)
++		self.node=base_path.find_dir(self.name)
++		if self.node is None:
++			raise Errors.WafError("Directory %s not found in %s"%(self.name,base_path))
++class input_dir(cmd_dir_arg):
++	def get_path(self,dummy_env,dummy_absolute):
++		return self.template%self.node.abspath()
++class output_dir(cmd_dir_arg):
++	def get_path(self,env,dummy_absolute):
++		return self.template%self.node.abspath()
++class command_output(Task.Task):
++	color="BLUE"
++	def __init__(self,env,command,command_node,command_args,stdin,stdout,cwd,os_env,stderr):
++		Task.Task.__init__(self,env=env)
++		assert isinstance(command,(str,Node.Node))
++		self.command=command
++		self.command_args=command_args
++		self.stdin=stdin
++		self.stdout=stdout
++		self.cwd=cwd
++		self.os_env=os_env
++		self.stderr=stderr
++		if command_node is not None:self.dep_nodes=[command_node]
++		self.dep_vars=[]
++	def run(self):
++		task=self
++		def input_path(node,template):
++			if task.cwd is None:
++				return template%node.bldpath()
++			else:
++				return template%node.abspath()
++		def output_path(node,template):
++			fun=node.abspath
++			if task.cwd is None:fun=node.bldpath
++			return template%fun()
++		if isinstance(task.command,Node.Node):
++			argv=[input_path(task.command,'%s')]
++		else:
++			argv=[task.command]
++		for arg in task.command_args:
++			if isinstance(arg,str):
++				argv.append(arg)
++			else:
++				assert isinstance(arg,cmd_arg)
++				argv.append(arg.get_path(task.env,(task.cwd is not None)))
++		if task.stdin:
++			stdin=open(input_path(task.stdin,'%s'))
++		else:
++			stdin=None
++		if task.stdout:
++			stdout=open(output_path(task.stdout,'%s'),"w")
++		else:
++			stdout=None
++		if task.stderr:
++			stderr=open(output_path(task.stderr,'%s'),"w")
++		else:
++			stderr=None
++		if task.cwd is None:
++			cwd=('None (actually %r)'%os.getcwd())
++		else:
++			cwd=repr(task.cwd)
++		debug("command-output: cwd=%s, stdin=%r, stdout=%r, argv=%r"%(cwd,stdin,stdout,argv))
++		if task.os_env is None:
++			os_env=os.environ
++		else:
++			os_env=task.os_env
++		command=Utils.subprocess.Popen(argv,stdin=stdin,stdout=stdout,stderr=stderr,cwd=task.cwd,env=os_env)
++		return command.wait()
++def init_cmd_output(self):
++	Utils.def_attrs(self,stdin=None,stdout=None,stderr=None,command=None,command_is_external=False,argv=[],dependencies=[],dep_vars=[],hidden_inputs=[],hidden_outputs=[],cwd=None,os_env=None)
++def apply_cmd_output(self):
++	if self.command is None:
++		raise Errors.WafError("command-output missing command")
++	if self.command_is_external:
++		cmd=self.command
++		cmd_node=None
++	else:
++		cmd_node=self.path.find_resource(self.command)
++		assert cmd_node is not None,('''Could not find command '%s' in source tree.
++Hint: if this is an external command,
++use command_is_external=True''')%(self.command,)
++		cmd=cmd_node
++	if self.cwd is None:
++		cwd=None
++	else:
++		assert isinstance(self.cwd,cmd_dir_arg)
++		self.cwd.find_node(self.path)
++	args=[]
++	inputs=[]
++	outputs=[]
++	for arg in self.argv:
++		if isinstance(arg,cmd_arg):
++			arg.find_node(self.path)
++			if isinstance(arg,input_file):
++				inputs.append(arg.node)
++			if isinstance(arg,output_file):
++				outputs.append(arg.node)
++	if self.stdout is None:
++		stdout=None
++	else:
++		assert isinstance(self.stdout,str)
++		stdout=self.path.find_or_declare(self.stdout)
++		if stdout is None:
++			raise Errors.WafError("File %s not found"%(self.stdout,))
++		outputs.append(stdout)
++	if self.stderr is None:
++		stderr=None
++	else:
++		assert isinstance(self.stderr,str)
++		stderr=self.path.find_or_declare(self.stderr)
++		if stderr is None:
++			raise Errors.WafError("File %s not found"%(self.stderr,))
++		outputs.append(stderr)
++	if self.stdin is None:
++		stdin=None
++	else:
++		assert isinstance(self.stdin,str)
++		stdin=self.path.find_resource(self.stdin)
++		if stdin is None:
++			raise Errors.WafError("File %s not found"%(self.stdin,))
++		inputs.append(stdin)
++	for hidden_input in self.to_list(self.hidden_inputs):
++		node=self.path.find_resource(hidden_input)
++		if node is None:
++			raise Errors.WafError("File %s not found in dir %s"%(hidden_input,self.path))
++		inputs.append(node)
++	for hidden_output in self.to_list(self.hidden_outputs):
++		node=self.path.find_or_declare(hidden_output)
++		if node is None:
++			raise Errors.WafError("File %s not found in dir %s"%(hidden_output,self.path))
++		outputs.append(node)
++	if not(inputs or getattr(self,'no_inputs',None)):
++		raise Errors.WafError('command-output objects must have at least one input file or give self.no_inputs')
++	if not(outputs or getattr(self,'no_outputs',None)):
++		raise Errors.WafError('command-output objects must have at least one output file or give self.no_outputs')
++	cwd=self.bld.variant_dir
++	task=command_output(self.env,cmd,cmd_node,self.argv,stdin,stdout,cwd,self.os_env,stderr)
++	task.generator=self
++	copy_attrs(self,task,'before after ext_in ext_out',only_if_set=True)
++	self.tasks.append(task)
++	task.inputs=inputs
++	task.outputs=outputs
++	task.dep_vars=self.to_list(self.dep_vars)
++	for dep in self.dependencies:
++		assert dep is not self
++		dep.post()
++		for dep_task in dep.tasks:
++			task.set_run_after(dep_task)
++	if not task.inputs:
++		task.runnable_status=type(Task.TaskBase.run)(runnable_status,task,task.__class__)
++		task.post_run=type(Task.TaskBase.run)(post_run,task,task.__class__)
++def post_run(self):
++	for x in self.outputs:
++		x.sig=Utils.h_file(x.abspath())
++def runnable_status(self):
++	return self.RUN_ME
++Task.task_factory('copy',vars=[],func=action_process_file_func)
++
++feature('cmd')(apply_cmd)
++feature('copy')(apply_copy)
++before_method('process_source')(apply_copy)
++feature('subst')(apply_subst)
++before_method('process_source')(apply_subst)
++feature('command-output')(init_cmd_output)
++feature('command-output')(apply_cmd_output)
++after_method('init_cmd_output')(apply_cmd_output)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/fixpy2.py
+@@ -0,0 +1,50 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++all_modifs={}
++def fixdir(dir):
++	global all_modifs
++	for k in all_modifs:
++		for v in all_modifs[k]:
++			modif(os.path.join(dir,'waflib'),k,v)
++def modif(dir,name,fun):
++	if name=='*':
++		lst=[]
++		for y in'. Tools extras'.split():
++			for x in os.listdir(os.path.join(dir,y)):
++				if x.endswith('.py'):
++					lst.append(y+os.sep+x)
++		for x in lst:
++			modif(dir,x,fun)
++		return
++	filename=os.path.join(dir,name)
++	f=open(filename,'r')
++	txt=f.read()
++	f.close()
++	txt=fun(txt)
++	f=open(filename,'w')
++	f.write(txt)
++	f.close()
++def subst(*k):
++	def do_subst(fun):
++		global all_modifs
++		for x in k:
++			try:
++				all_modifs[x].append(fun)
++			except KeyError:
++				all_modifs[x]=[fun]
++		return fun
++	return do_subst
++def r1(code):
++	code=code.replace(',e:',',e:')
++	code=code.replace("",'')
++	code=code.replace('','')
++	return code
++def r4(code):
++	code=code.replace('next(self.biter)','self.biter.next()')
++	return code
++
++subst('*')(r1)
++subst('Runner.py')(r4)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/__init__.py
+@@ -0,0 +1,4 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
+--- /dev/null
++++ ardour3/waflib/Logs.py
+@@ -0,0 +1,149 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,re,traceback,sys
++_nocolor=os.environ.get('NOCOLOR','no')not in('no','0','false')
++try:
++	if not _nocolor:
++		import waflib.ansiterm
++except:
++	pass
++import logging
++LOG_FORMAT="%(asctime)s %(c1)s%(zone)s%(c2)s %(message)s"
++HOUR_FORMAT="%H:%M:%S"
++zones=''
++verbose=0
++colors_lst={'USE':True,'BOLD':'\x1b[01;1m','RED':'\x1b[01;31m','GREEN':'\x1b[32m','YELLOW':'\x1b[33m','PINK':'\x1b[35m','BLUE':'\x1b[01;34m','CYAN':'\x1b[36m','NORMAL':'\x1b[0m','cursor_on':'\x1b[?25h','cursor_off':'\x1b[?25l',}
++got_tty=not os.environ.get('TERM','dumb')in['dumb','emacs']
++if got_tty:
++	try:
++		got_tty=sys.stderr.isatty()
++	except AttributeError:
++		got_tty=False
++if(not got_tty and os.environ.get('TERM','dumb')!='msys')or _nocolor:
++	colors_lst['USE']=False
++def get_term_cols():
++	return 80
++try:
++	import struct,fcntl,termios
++except ImportError:
++	pass
++else:
++	if got_tty:
++		def get_term_cols_real():
++			dummy_lines,cols=struct.unpack("HHHH",fcntl.ioctl(sys.stderr.fileno(),termios.TIOCGWINSZ,struct.pack("HHHH",0,0,0,0)))[:2]
++			return cols
++		try:
++			get_term_cols_real()
++		except:
++			pass
++		else:
++			get_term_cols=get_term_cols_real
++get_term_cols.__doc__="""
++	Get the console width in characters.
++
++	:return: the number of characters per line
++	:rtype: int
++	"""
++def get_color(cl):
++	if not colors_lst['USE']:return''
++	return colors_lst.get(cl,'')
++class color_dict(object):
++	def __getattr__(self,a):
++		return get_color(a)
++	def __call__(self,a):
++		return get_color(a)
++colors=color_dict()
++re_log=re.compile(r'(\w+): (.*)',re.M)
++class log_filter(logging.Filter):
++	def __init__(self,name=None):
++		pass
++	def filter(self,rec):
++		rec.c1=colors.PINK
++		rec.c2=colors.NORMAL
++		rec.zone=rec.module
++		if rec.levelno>=logging.INFO:
++			if rec.levelno>=logging.ERROR:
++				rec.c1=colors.RED
++			elif rec.levelno>=logging.WARNING:
++				rec.c1=colors.YELLOW
++			else:
++				rec.c1=colors.GREEN
++			return True
++		m=re_log.match(rec.msg)
++		if m:
++			rec.zone=m.group(1)
++			rec.msg=m.group(2)
++		if zones:
++			return getattr(rec,'zone','')in zones or'*'in zones
++		elif not verbose>2:
++			return False
++		return True
++class formatter(logging.Formatter):
++	def __init__(self):
++		logging.Formatter.__init__(self,LOG_FORMAT,HOUR_FORMAT)
++	def format(self,rec):
++		if rec.levelno>=logging.WARNING or rec.levelno==logging.INFO:
++			try:
++				msg=rec.msg.decode('utf-8')
++			except:
++				msg=rec.msg
++			return'%s%s%s'%(rec.c1,msg,rec.c2)
++		return logging.Formatter.format(self,rec)
++log=None
++def debug(*k,**kw):
++	if verbose:
++		k=list(k)
++		k[0]=k[0].replace('\n',' ')
++		global log
++		log.debug(*k,**kw)
++def error(*k,**kw):
++	global log
++	log.error(*k,**kw)
++	if verbose>2:
++		st=traceback.extract_stack()
++		if st:
++			st=st[:-1]
++			buf=[]
++			for filename,lineno,name,line in st:
++				buf.append('  File "%s", line %d, in %s'%(filename,lineno,name))
++				if line:
++					buf.append('	%s'%line.strip())
++			if buf:log.error("\n".join(buf))
++def warn(*k,**kw):
++	global log
++	log.warn(*k,**kw)
++def info(*k,**kw):
++	global log
++	log.info(*k,**kw)
++def init_log():
++	global log
++	log=logging.getLogger('waflib')
++	log.handlers=[]
++	log.filters=[]
++	hdlr=logging.StreamHandler()
++	hdlr.setFormatter(formatter())
++	log.addHandler(hdlr)
++	log.addFilter(log_filter())
++	log.setLevel(logging.DEBUG)
++def make_logger(path,name):
++	logger=logging.getLogger(name)
++	hdlr=logging.FileHandler(path,'w')
++	formatter=logging.Formatter('%(message)s')
++	hdlr.setFormatter(formatter)
++	logger.addHandler(hdlr)
++	logger.setLevel(logging.DEBUG)
++	return logger
++def make_mem_logger(name,to_log,size=10000):
++	from logging.handlers import MemoryHandler
++	logger=logging.getLogger(name)
++	hdlr=MemoryHandler(size,target=to_log)
++	formatter=logging.Formatter('%(message)s')
++	hdlr.setFormatter(formatter)
++	logger.addHandler(hdlr)
++	logger.memhandler=hdlr
++	logger.setLevel(logging.DEBUG)
++	return logger
++def pprint(col,str,label='',sep='\n'):
++	sys.stderr.write("%s%s%s %s%s"%(colors(col),str,colors.NORMAL,label,sep))
+--- /dev/null
++++ ardour3/waflib/Node.py
+@@ -0,0 +1,506 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import os,re,sys,shutil
++from waflib import Utils,Errors
++exclude_regs='''
++**/*~
++**/#*#
++**/.#*
++**/%*%
++**/._*
++**/CVS
++**/CVS/**
++**/.cvsignore
++**/SCCS
++**/SCCS/**
++**/vssver.scc
++**/.svn
++**/.svn/**
++**/BitKeeper
++**/.git
++**/.git/**
++**/.gitignore
++**/.bzr
++**/.bzrignore
++**/.bzr/**
++**/.hg
++**/.hg/**
++**/_MTN
++**/_MTN/**
++**/.arch-ids
++**/{arch}
++**/_darcs
++**/_darcs/**
++**/.DS_Store'''
++def split_path(path):
++	return path.split('/')
++def split_path_cygwin(path):
++	if path.startswith('//'):
++		ret=path.split('/')[2:]
++		ret[0]='/'+ret[0]
++		return ret
++	return path.split('/')
++re_sp=re.compile('[/\\\\]')
++def split_path_win32(path):
++	if path.startswith('\\\\'):
++		ret=re.split(re_sp,path)[2:]
++		ret[0]='\\'+ret[0]
++		return ret
++	return re.split(re_sp,path)
++if sys.platform=='cygwin':
++	split_path=split_path_cygwin
++elif Utils.is_win32:
++	split_path=split_path_win32
++class Node(object):
++	__slots__=('name','sig','children','parent','cache_abspath','cache_isdir')
++	def __init__(self,name,parent):
++		self.name=name
++		self.parent=parent
++		if parent:
++			if name in parent.children:
++				raise Errors.WafError('node %s exists in the parent files %r already'%(name,parent))
++			parent.children[name]=self
++	def __setstate__(self,data):
++		self.name=data[0]
++		self.parent=data[1]
++		if data[2]is not None:
++			self.children=data[2]
++		if data[3]is not None:
++			self.sig=data[3]
++	def __getstate__(self):
++		return(self.name,self.parent,getattr(self,'children',None),getattr(self,'sig',None))
++	def __str__(self):
++		return self.name
++	def __repr__(self):
++		return self.abspath()
++	def __hash__(self):
++		return id(self)
++	def __eq__(self,node):
++		return id(self)==id(node)
++	def __copy__(self):
++		raise Errors.WafError('nodes are not supposed to be copied')
++	def read(self,flags='r'):
++		return Utils.readf(self.abspath(),flags)
++	def write(self,data,flags='w'):
++		f=None
++		try:
++			f=open(self.abspath(),flags)
++			f.write(data)
++		finally:
++			if f:
++				f.close()
++	def chmod(self,val):
++		os.chmod(self.abspath(),val)
++	def delete(self):
++		try:
++			if getattr(self,'children',None):
++				shutil.rmtree(self.abspath())
++			else:
++				os.unlink(self.abspath())
++		except:
++			pass
++		try:
++			delattr(self,'children')
++		except:
++			pass
++	def suffix(self):
++		k=max(0,self.name.rfind('.'))
++		return self.name[k:]
++	def height(self):
++		d=self
++		val=-1
++		while d:
++			d=d.parent
++			val+=1
++		return val
++	def listdir(self):
++		lst=Utils.listdir(self.abspath())
++		lst.sort()
++		return lst
++	def mkdir(self):
++		if getattr(self,'cache_isdir',None):
++			return
++		try:
++			self.parent.mkdir()
++		except:
++			pass
++		if self.name:
++			try:
++				os.makedirs(self.abspath())
++			except OSError:
++				pass
++			if not os.path.isdir(self.abspath()):
++				raise Errors.WafError('Could not create the directory %s'%self.abspath())
++			try:
++				self.children
++			except:
++				self.children={}
++		self.cache_isdir=True
++	def find_node(self,lst):
++		if isinstance(lst,str):
++			lst=[x for x in split_path(lst)if x and x!='.']
++		cur=self
++		for x in lst:
++			if x=='..':
++				cur=cur.parent or cur
++				continue
++			try:
++				if x in cur.children:
++					cur=cur.children[x]
++					continue
++			except:
++				cur.children={}
++			cur=self.__class__(x,cur)
++			try:
++				os.stat(cur.abspath())
++			except:
++				del cur.parent.children[x]
++				return None
++		ret=cur
++		try:
++			os.stat(ret.abspath())
++		except:
++			del ret.parent.children[ret.name]
++			return None
++		try:
++			while not getattr(cur.parent,'cache_isdir',None):
++				cur=cur.parent
++				cur.cache_isdir=True
++		except AttributeError:
++			pass
++		return ret
++	def make_node(self,lst):
++		if isinstance(lst,str):
++			lst=[x for x in split_path(lst)if x and x!='.']
++		cur=self
++		for x in lst:
++			if x=='..':
++				cur=cur.parent or cur
++				continue
++			if getattr(cur,'children',{}):
++				if x in cur.children:
++					cur=cur.children[x]
++					continue
++			else:
++				cur.children={}
++			cur=self.__class__(x,cur)
++		return cur
++	def search(self,lst):
++		if isinstance(lst,str):
++			lst=[x for x in split_path(lst)if x and x!='.']
++		cur=self
++		try:
++			for x in lst:
++				if x=='..':
++					cur=cur.parent or cur
++				else:
++					cur=cur.children[x]
++			return cur
++		except:
++			pass
++	def path_from(self,node):
++		c1=self
++		c2=node
++		c1h=c1.height()
++		c2h=c2.height()
++		lst=[]
++		up=0
++		while c1h>c2h:
++			lst.append(c1.name)
++			c1=c1.parent
++			c1h-=1
++		while c2h>c1h:
++			up+=1
++			c2=c2.parent
++			c2h-=1
++		while id(c1)!=id(c2):
++			lst.append(c1.name)
++			up+=1
++			c1=c1.parent
++			c2=c2.parent
++		for i in range(up):
++			lst.append('..')
++		lst.reverse()
++		return os.sep.join(lst)or'.'
++	def abspath(self):
++		try:
++			return self.cache_abspath
++		except:
++			pass
++		if os.sep=='/':
++			if not self.parent:
++				val=os.sep
++			elif not self.parent.name:
++				val=os.sep+self.name
++			else:
++				val=self.parent.abspath()+os.sep+self.name
++		else:
++			if not self.parent:
++				val=''
++			elif not self.parent.name:
++				val=self.name+os.sep
++			else:
++				val=self.parent.abspath().rstrip(os.sep)+os.sep+self.name
++		self.cache_abspath=val
++		return val
++	def is_child_of(self,node):
++		p=self
++		diff=self.height()-node.height()
++		while diff>0:
++			diff-=1
++			p=p.parent
++		return id(p)==id(node)
++	def ant_iter(self,accept=None,maxdepth=25,pats=[],dir=False,src=True,remove=True):
++		dircont=self.listdir()
++		dircont.sort()
++		try:
++			lst=set(self.children.keys())
++			if remove:
++				for x in lst-set(dircont):
++					del self.children[x]
++		except:
++			self.children={}
++		for name in dircont:
++			npats=accept(name,pats)
++			if npats and npats[0]:
++				accepted=[]in npats[0]
++				node=self.make_node([name])
++				isdir=os.path.isdir(node.abspath())
++				if accepted:
++					if isdir:
++						if dir:
++							yield node
++					else:
++						if src:
++							yield node
++				if getattr(node,'cache_isdir',None)or isdir:
++					node.cache_isdir=True
++					if maxdepth:
++						for k in node.ant_iter(accept=accept,maxdepth=maxdepth-1,pats=npats,dir=dir,src=src,remove=remove):
++							yield k
++		raise StopIteration
++	def ant_glob(self,*k,**kw):
++		src=kw.get('src',True)
++		dir=kw.get('dir',False)
++		excl=kw.get('excl',exclude_regs)
++		incl=k and k[0]or kw.get('incl','**')
++		def to_pat(s):
++			lst=Utils.to_list(s)
++			ret=[]
++			for x in lst:
++				x=x.replace('\\','/').replace('//','/')
++				if x.endswith('/'):
++					x+='**'
++				lst2=x.split('/')
++				accu=[]
++				for k in lst2:
++					if k=='**':
++						accu.append(k)
++					else:
++						k=k.replace('.','[.]').replace('*','.*').replace('?','.').replace('+','\\+')
++						k='^%s$'%k
++						try:
++							accu.append(re.compile(k))
++						except Exception ,e:
++							raise Errors.WafError("Invalid pattern: %s"%k,e)
++				ret.append(accu)
++			return ret
++		def filtre(name,nn):
++			ret=[]
++			for lst in nn:
++				if not lst:
++					pass
++				elif lst[0]=='**':
++					ret.append(lst)
++					if len(lst)>1:
++						if lst[1].match(name):
++							ret.append(lst[2:])
++					else:
++						ret.append([])
++				elif lst[0].match(name):
++					ret.append(lst[1:])
++			return ret
++		def accept(name,pats):
++			nacc=filtre(name,pats[0])
++			nrej=filtre(name,pats[1])
++			if[]in nrej:
++				nacc=[]
++			return[nacc,nrej]
++		ret=[x for x in self.ant_iter(accept=accept,pats=[to_pat(incl),to_pat(excl)],maxdepth=25,dir=dir,src=src,remove=kw.get('remove',True))]
++		if kw.get('flat',False):
++			return' '.join([x.path_from(self)for x in ret])
++		return ret
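++	# Usage sketch (hypothetical pattern): ant_glob() expands ant-style
++	# patterns below this node, e.g.
++	#   headers = node.ant_glob('**/*.h', excl=['**/private/**'])
++	# and returns Node objects, or a space-joined string when flat=True.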
++	def find_nodes(self,find_dirs=True,find_files=True,match_fun=lambda x:True):
++		x="""
++		Recursively finds nodes::
++
++			def configure(cnf):
++				cnf.find_nodes()
++
++		:param find_dirs: whether to return directories
++		:param find_files: whether to return files
++		:param match_fun: matching function, taking a node as parameter
++		:rtype generator
++		:return: a generator that iterates over all the requested files
++		"""
++		files=self.listdir()
++		for f in files:
++			node=self.make_node([f])
++			if os.path.isdir(node.abspath()):
++				if find_dirs and match_fun(node):
++					yield node
++				gen=node.find_nodes(find_dirs,find_files,match_fun)
++				for g in gen:
++					yield g
++			else:
++				if find_files and match_fun(node):
++					yield node
++	def is_src(self):
++		cur=self
++		x=id(self.ctx.srcnode)
++		y=id(self.ctx.bldnode)
++		while cur.parent:
++			if id(cur)==y:
++				return False
++			if id(cur)==x:
++				return True
++			cur=cur.parent
++		return False
++	def is_bld(self):
++		cur=self
++		y=id(self.ctx.bldnode)
++		while cur.parent:
++			if id(cur)==y:
++				return True
++			cur=cur.parent
++		return False
++	def get_src(self):
++		cur=self
++		x=id(self.ctx.srcnode)
++		y=id(self.ctx.bldnode)
++		lst=[]
++		while cur.parent:
++			if id(cur)==y:
++				lst.reverse()
++				return self.ctx.srcnode.make_node(lst)
++			if id(cur)==x:
++				return self
++			lst.append(cur.name)
++			cur=cur.parent
++		return self
++	def get_bld(self):
++		cur=self
++		x=id(self.ctx.srcnode)
++		y=id(self.ctx.bldnode)
++		lst=[]
++		while cur.parent:
++			if id(cur)==y:
++				return self
++			if id(cur)==x:
++				lst.reverse()
++				return self.ctx.bldnode.make_node(lst)
++			lst.append(cur.name)
++			cur=cur.parent
++		lst.reverse()
++		if lst and Utils.is_win32 and len(lst[0])==2 and lst[0].endswith(':'):
++			lst[0]=lst[0][0]
++		return self.ctx.bldnode.make_node(['__root__']+lst)
++	def find_resource(self,lst):
++		if isinstance(lst,str):
++			lst=[x for x in split_path(lst)if x and x!='.']
++		node=self.get_bld().search(lst)
++		if not node:
++			self=self.get_src()
++			node=self.find_node(lst)
++		try:
++			pat=node.abspath()
++			if os.path.isdir(pat):
++				return None
++		except:
++			pass
++		return node
++	def find_or_declare(self,lst):
++		if isinstance(lst,str):
++			lst=[x for x in split_path(lst)if x and x!='.']
++		node=self.get_bld().search(lst)
++		if node:
++			if not os.path.isfile(node.abspath()):
++				node.sig=None
++				try:
++					node.parent.mkdir()
++				except:
++					pass
++			return node
++		self=self.get_src()
++		node=self.find_node(lst)
++		if node:
++			if not os.path.isfile(node.abspath()):
++				node.sig=None
++				try:
++					node.parent.mkdir()
++				except:
++					pass
++			return node
++		node=self.get_bld().make_node(lst)
++		node.parent.mkdir()
++		return node
++	def find_dir(self,lst):
++		if isinstance(lst,str):
++			lst=[x for x in split_path(lst)if x and x!='.']
++		node=self.find_node(lst)
++		try:
++			if not os.path.isdir(node.abspath()):
++				return None
++		except(OSError,AttributeError):
++			return None
++		return node
++	def change_ext(self,ext,ext_in=None):
++		name=self.name
++		if ext_in is None:
++			k=name.rfind('.')
++			if k>=0:
++				name=name[:k]+ext
++			else:
++				name=name+ext
++		else:
++			name=name[:-len(ext_in)]+ext
++		return self.parent.find_or_declare([name])
++	def nice_path(self,env=None):
++		return self.path_from(self.ctx.launch_node())
++	def bldpath(self):
++		return self.path_from(self.ctx.bldnode)
++	def srcpath(self):
++		return self.path_from(self.ctx.srcnode)
++	def relpath(self):
++		cur=self
++		x=id(self.ctx.bldnode)
++		while cur.parent:
++			if id(cur)==x:
++				return self.bldpath()
++			cur=cur.parent
++		return self.srcpath()
++	def bld_dir(self):
++		return self.parent.bldpath()
++	def bld_base(self):
++		s=os.path.splitext(self.name)[0]
++		return self.bld_dir()+os.sep+s
++	def get_bld_sig(self):
++		try:
++			ret=self.ctx.hash_cache[id(self)]
++		except KeyError:
++			pass
++		except AttributeError:
++			self.ctx.hash_cache={}
++		else:
++			return ret
++		if not self.is_bld()or self.ctx.bldnode is self.ctx.srcnode:
++			self.sig=Utils.h_file(self.abspath())
++		self.ctx.hash_cache[id(self)]=ret=self.sig
++		return ret
++pickle_lock=Utils.threading.Lock()
++class Nod3(Node):
++	pass
+--- /dev/null
++++ ardour3/waflib/Options.py
+@@ -0,0 +1,134 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,tempfile,optparse,sys,re
++from waflib import Logs,Utils,Context
++cmds='distclean configure build install clean uninstall check dist distcheck'.split()
++options={}
++commands=[]
++lockfile=os.environ.get('WAFLOCK','.lock-waf_%s_build'%sys.platform)
++try:cache_global=os.path.abspath(os.environ['WAFCACHE'])
++except KeyError:cache_global=''
++platform=Utils.unversioned_sys_platform()
++class opt_parser(optparse.OptionParser):
++	def __init__(self,ctx):
++		optparse.OptionParser.__init__(self,conflict_handler="resolve",version='waf %s (%s)'%(Context.WAFVERSION,Context.WAFREVISION))
++		self.formatter.width=Logs.get_term_cols()
++		p=self.add_option
++		self.ctx=ctx
++		jobs=ctx.jobs()
++		p('-j','--jobs',dest='jobs',default=jobs,type='int',help='amount of parallel jobs (%r)'%jobs)
++		p('-k','--keep',dest='keep',default=0,action='count',help='keep running happily even if errors are found')
++		p('-v','--verbose',dest='verbose',default=0,action='count',help='verbosity level -v -vv or -vvv [default: 0]')
++		p('--nocache',dest='nocache',default=False,action='store_true',help='ignore the WAFCACHE (if set)')
++		p('--zones',dest='zones',default='',action='store',help='debugging zones (task_gen, deps, tasks, etc)')
++		gr=optparse.OptionGroup(self,'configure options')
++		self.add_option_group(gr)
++		gr.add_option('-o','--out',action='store',default='',help='build dir for the project',dest='out')
++		gr.add_option('-t','--top',action='store',default='',help='src dir for the project',dest='top')
++		default_prefix=os.environ.get('PREFIX')
++		if not default_prefix:
++			if platform=='win32':
++				d=tempfile.gettempdir()
++				default_prefix=d[0].upper()+d[1:]
++			else:
++				default_prefix='/usr/local/'
++		gr.add_option('--prefix',dest='prefix',default=default_prefix,help='installation prefix [default: %r]'%default_prefix)
++		gr.add_option('--download',dest='download',default=False,action='store_true',help='try to download the tools if missing')
++		gr=optparse.OptionGroup(self,'build and install options')
++		self.add_option_group(gr)
++		gr.add_option('-p','--progress',dest='progress_bar',default=0,action='count',help='-p: progress bar; -pp: ide output')
++		gr.add_option('--targets',dest='targets',default='',action='store',help='task generators, e.g. "target1,target2"')
++		gr=optparse.OptionGroup(self,'step options')
++		self.add_option_group(gr)
++		gr.add_option('--files',dest='files',default='',action='store',help='files to process, by regexp, e.g. "*/main.c,*/test/main.o"')
++		default_destdir=os.environ.get('DESTDIR','')
++		gr=optparse.OptionGroup(self,'install/uninstall options')
++		self.add_option_group(gr)
++		gr.add_option('--destdir',help='installation root [default: %r]'%default_destdir,default=default_destdir,dest='destdir')
++		gr.add_option('-f','--force',dest='force',default=False,action='store_true',help='force file installation')
++	def get_usage(self):
++		cmds_str={}
++		for cls in Context.classes:
++			if not cls.cmd or cls.cmd=='options':
++				continue
++			s=cls.__doc__ or''
++			cmds_str[cls.cmd]=s
++		if Context.g_module:
++			for(k,v)in Context.g_module.__dict__.items():
++				if k in['options','init','shutdown']:
++					continue
++				if type(v)is type(Context.create_context):
++					if v.__doc__ and not k.startswith('_'):
++						cmds_str[k]=v.__doc__
++		just=0
++		for k in cmds_str:
++			just=max(just,len(k))
++		lst=['  %s: %s'%(k.ljust(just),v)for(k,v)in cmds_str.items()]
++		lst.sort()
++		ret='\n'.join(lst)
++		return'''waf [commands] [options]
++
++Main commands (example: ./waf build -j4)
++%s
++'''%ret
++class OptionsContext(Context.Context):
++	cmd='options'
++	fun='options'
++	def __init__(self,**kw):
++		super(OptionsContext,self).__init__(**kw)
++		self.parser=opt_parser(self)
++		self.option_groups={}
++	def jobs(self):
++		count=int(os.environ.get('JOBS',0))
++		if count<1:
++			if'NUMBER_OF_PROCESSORS'in os.environ:
++				count=int(os.environ.get('NUMBER_OF_PROCESSORS',1))
++			else:
++				if hasattr(os,'sysconf_names'):
++					if'SC_NPROCESSORS_ONLN'in os.sysconf_names:
++						count=int(os.sysconf('SC_NPROCESSORS_ONLN'))
++					elif'SC_NPROCESSORS_CONF'in os.sysconf_names:
++						count=int(os.sysconf('SC_NPROCESSORS_CONF'))
++				if not count and os.name not in('nt','java'):
++					try:
++						tmp=self.cmd_and_log(['sysctl','-n','hw.ncpu'],quiet=0)
++					except Exception:
++						pass
++					else:
++						if re.match('^[0-9]+$',tmp):
++							count=int(tmp)
++		if count<1:
++			count=1
++		elif count>1024:
++			count=1024
++		return count
++	def add_option(self,*k,**kw):
++		self.parser.add_option(*k,**kw)
++	def add_option_group(self,*k,**kw):
++		try:
++			gr=self.option_groups[k[0]]
++		except:
++			gr=self.parser.add_option_group(*k,**kw)
++		self.option_groups[k[0]]=gr
++		return gr
++	def get_option_group(self,opt_str):
++		try:
++			return self.option_groups[opt_str]
++		except KeyError:
++			for group in self.parser.option_groups:
++				if group.title==opt_str:
++					return group
++			return None
++	def parse_args(self,_args=None):
++		global options,commands
++		(options,leftover_args)=self.parser.parse_args(args=_args)
++		commands=leftover_args
++		if options.destdir:
++			options.destdir=os.path.abspath(os.path.expanduser(options.destdir))
++		if options.verbose>=1:
++			self.load('errcheck')
++	def execute(self):
++		super(OptionsContext,self).execute()
++		self.parse_args()
+--- /dev/null
++++ ardour3/waflib/Runner.py
+@@ -0,0 +1,197 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import random,atexit
++try:
++	from queue import Queue
++except:
++	from Queue import Queue
++from waflib import Utils,Task,Errors,Logs
++GAP=10
++class TaskConsumer(Utils.threading.Thread):
++	def __init__(self):
++		Utils.threading.Thread.__init__(self)
++		self.ready=Queue()
++		self.setDaemon(1)
++		self.start()
++	def run(self):
++		try:
++			self.loop()
++		except:
++			pass
++	def loop(self):
++		while 1:
++			tsk=self.ready.get()
++			if not isinstance(tsk,Task.TaskBase):
++				tsk(self)
++			else:
++				tsk.process()
++pool=Queue()
++def get_pool():
++	try:
++		return pool.get(False)
++	except:
++		return TaskConsumer()
++def put_pool(x):
++	pool.put(x)
++def _free_resources():
++	global pool
++	lst=[]
++	while pool.qsize():
++		lst.append(pool.get())
++	for x in lst:
++		x.ready.put(None)
++	for x in lst:
++		x.join()
++	pool=None
++atexit.register(_free_resources)
++class Parallel(object):
++	def __init__(self,bld,j=2):
++		self.numjobs=j
++		self.bld=bld
++		self.outstanding=[]
++		self.frozen=[]
++		self.out=Queue(0)
++		self.count=0
++		self.processed=1
++		self.stop=False
++		self.error=[]
++		self.biter=None
++		self.dirty=False
++	def get_next_task(self):
++		if not self.outstanding:
++			return None
++		return self.outstanding.pop(0)
++	def postpone(self,tsk):
++		if random.randint(0,1):
++			self.frozen.insert(0,tsk)
++		else:
++			self.frozen.append(tsk)
++	def refill_task_list(self):
++		while self.count>self.numjobs*GAP:
++			self.get_out()
++		while not self.outstanding:
++			if self.count:
++				self.get_out()
++			elif self.frozen:
++				try:
++					cond=self.deadlock==self.processed
++				except:
++					pass
++				else:
++					if cond:
++						msg='check the build order for the tasks'
++						for tsk in self.frozen:
++							if not tsk.run_after:
++								msg='check the methods runnable_status'
++								break
++						lst=[]
++						for tsk in self.frozen:
++							lst.append('%s\t-> %r'%(repr(tsk),[id(x)for x in tsk.run_after]))
++						raise Errors.WafError('Deadlock detected: %s%s'%(msg,''.join(lst)))
++				self.deadlock=self.processed
++			if self.frozen:
++				self.outstanding+=self.frozen
++				self.frozen=[]
++			elif not self.count:
++				self.outstanding.extend(self.biter.next())
++				self.total=self.bld.total()
++				break
++	def add_more_tasks(self,tsk):
++		if getattr(tsk,'more_tasks',None):
++			self.outstanding+=tsk.more_tasks
++			self.total+=len(tsk.more_tasks)
++	def get_out(self):
++		tsk=self.out.get()
++		if not self.stop:
++			self.add_more_tasks(tsk)
++		self.count-=1
++		self.dirty=True
++		return tsk
++	def error_handler(self,tsk):
++		if not self.bld.keep:
++			self.stop=True
++		self.error.append(tsk)
++	def add_task(self,tsk):
++		try:
++			self.pool
++		except AttributeError:
++			self.init_task_pool()
++		self.ready.put(tsk)
++	def init_task_pool(self):
++		pool=self.pool=[get_pool()for i in range(self.numjobs)]
++		self.ready=Queue(0)
++		def setq(consumer):
++			consumer.ready=self.ready
++		for x in pool:
++			x.ready.put(setq)
++		return pool
++	def free_task_pool(self):
++		def setq(consumer):
++			consumer.ready=Queue(0)
++			self.out.put(self)
++		try:
++			pool=self.pool
++		except:
++			pass
++		else:
++			for x in pool:
++				self.ready.put(setq)
++			for x in pool:
++				self.get_out()
++			for x in pool:
++				put_pool(x)
++			self.pool=[]
++	def start(self):
++		self.total=self.bld.total()
++		while not self.stop:
++			self.refill_task_list()
++			tsk=self.get_next_task()
++			if not tsk:
++				if self.count:
++					continue
++				else:
++					break
++			if tsk.hasrun:
++				self.processed+=1
++				continue
++			if self.stop:
++				break
++			try:
++				st=tsk.runnable_status()
++			except Exception:
++				self.processed+=1
++				tsk.err_msg=Utils.ex_stack()
++				if not self.stop and self.bld.keep:
++					tsk.hasrun=Task.SKIPPED
++					if self.bld.keep==1:
++						if Logs.verbose>1 or not self.error:
++							self.error.append(tsk)
++						self.stop=True
++					else:
++						if Logs.verbose>1:
++							self.error.append(tsk)
++					continue
++				tsk.hasrun=Task.EXCEPTION
++				self.error_handler(tsk)
++				continue
++			if st==Task.ASK_LATER:
++				self.postpone(tsk)
++			elif st==Task.SKIP_ME:
++				self.processed+=1
++				tsk.hasrun=Task.SKIPPED
++				self.add_more_tasks(tsk)
++			else:
++				tsk.position=(self.processed,self.total)
++				self.count+=1
++				tsk.master=self
++				self.processed+=1
++				if self.numjobs==1:
++					tsk.process()
++				else:
++					self.add_task(tsk)
++		while self.error and self.count:
++			self.get_out()
++		assert(self.count==0 or self.stop)
++		self.free_task_pool()
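
The hunk above is waflib/Runner.py, the parallel scheduler: TaskConsumer threads block on a shared `ready` queue, every finished task is pushed onto an `out` queue, and Parallel.start() keeps at most numjobs*GAP tasks in flight, postponing tasks whose runnable_status() answers ASK_LATER and raising a deadlock error when nothing can make progress. A minimal standalone sketch of that producer/consumer pattern (illustration only; real waf tasks are Task.TaskBase objects):

    import threading
    try:
        from queue import Queue          # Python 3
    except ImportError:
        from Queue import Queue          # Python 2

    def consumer(ready, out):
        while True:
            job = ready.get()
            if job is None:              # sentinel used to stop the worker
                break
            out.put(job())               # run the callable, report the result

    def run_parallel(jobs, numjobs=2):
        ready, out = Queue(), Queue()
        workers = []
        for _ in range(numjobs):
            t = threading.Thread(target=consumer, args=(ready, out))
            t.daemon = True
            t.start()
            workers.append(t)
        for job in jobs:
            ready.put(job)
        results = [out.get() for _ in jobs]    # like Parallel.get_out()
        for _ in workers:
            ready.put(None)                    # like _free_resources()
        return results

    print(run_parallel([lambda i=i: i * i for i in range(8)], numjobs=4))
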
+--- /dev/null
++++ ardour3/waflib/Scripting.py
+@@ -0,0 +1,367 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,shutil,traceback,errno,sys,stat
++from waflib import Utils,Configure,Logs,Options,ConfigSet,Context,Errors,Build,Node
++build_dir_override=None
++no_climb_commands=['configure']
++default_cmd="build"
++def waf_entry_point(current_directory,version,wafdir):
++	Logs.init_log()
++	if Context.WAFVERSION!=version:
++		Logs.error('Waf script %r and library %r do not match (directory %r)'%(version,Context.WAFVERSION,wafdir))
++		sys.exit(1)
++	if'--version'in sys.argv:
++		Context.run_dir=current_directory
++		ctx=Context.create_context('options')
++		ctx.curdir=current_directory
++		ctx.parse_args()
++		sys.exit(0)
++	Context.waf_dir=wafdir
++	Context.launch_dir=current_directory
++	no_climb=os.environ.get('NOCLIMB',None)
++	if not no_climb:
++		for k in no_climb_commands:
++			if k in sys.argv:
++				no_climb=True
++				break
++	cur=current_directory
++	while cur:
++		lst=os.listdir(cur)
++		if Options.lockfile in lst:
++			env=ConfigSet.ConfigSet()
++			try:
++				env.load(os.path.join(cur,Options.lockfile))
++				ino=os.stat(cur)[stat.ST_INO]
++			except Exception:
++				pass
++			else:
++				for x in[env.run_dir,env.top_dir,env.out_dir]:
++					if Utils.is_win32:
++						if cur==x:
++							load=True
++							break
++					else:
++						try:
++							ino2=os.stat(x)[stat.ST_INO]
++						except:
++							pass
++						else:
++							if ino==ino2:
++								load=True
++								break
++				else:
++					Logs.warn('invalid lock file in %s'%cur)
++					load=False
++				if load:
++					Context.run_dir=env.run_dir
++					Context.top_dir=env.top_dir
++					Context.out_dir=env.out_dir
++					break
++		if not Context.run_dir:
++			if Context.WSCRIPT_FILE in lst:
++				Context.run_dir=cur
++		next=os.path.dirname(cur)
++		if next==cur:
++			break
++		cur=next
++		if no_climb:
++			break
++	if not Context.run_dir:
++		if'-h'in sys.argv or'--help'in sys.argv:
++			Logs.warn('No wscript file found: the help message may be incomplete')
++			Context.run_dir=current_directory
++			ctx=Context.create_context('options')
++			ctx.curdir=current_directory
++			ctx.parse_args()
++			sys.exit(0)
++		Logs.error('Waf: Run from a directory containing a file named %r'%Context.WSCRIPT_FILE)
++		sys.exit(1)
++	try:
++		os.chdir(Context.run_dir)
++	except OSError:
++		Logs.error('Waf: The folder %r is unreadable'%Context.run_dir)
++		sys.exit(1)
++	try:
++		set_main_module(Context.run_dir+os.sep+Context.WSCRIPT_FILE)
++	except Errors.WafError ,e:
++		Logs.pprint('RED',e.verbose_msg)
++		Logs.error(str(e))
++		sys.exit(1)
++	except Exception ,e:
++		Logs.error('Waf: The wscript in %r is unreadable'%Context.run_dir,e)
++		traceback.print_exc(file=sys.stdout)
++		sys.exit(2)
++	try:
++		run_commands()
++	except Errors.WafError ,e:
++		if Logs.verbose>1:
++			Logs.pprint('RED',e.verbose_msg)
++		Logs.error(e.msg)
++		sys.exit(1)
++	except Exception ,e:
++		traceback.print_exc(file=sys.stdout)
++		sys.exit(2)
++	except KeyboardInterrupt:
++		Logs.pprint('RED','Interrupted')
++		sys.exit(68)
++def set_main_module(file_path):
++	Context.g_module=Context.load_module(file_path)
++	Context.g_module.root_path=file_path
++	def set_def(obj):
++		name=obj.__name__
++		if not name in Context.g_module.__dict__:
++			setattr(Context.g_module,name,obj)
++	for k in[update,dist,distclean,distcheck,update]:
++		set_def(k)
++	if not'init'in Context.g_module.__dict__:
++		Context.g_module.init=Utils.nada
++	if not'shutdown'in Context.g_module.__dict__:
++		Context.g_module.shutdown=Utils.nada
++	if not'options'in Context.g_module.__dict__:
++		Context.g_module.options=Utils.nada
++def parse_options():
++	Context.create_context('options').execute()
++	if not Options.commands:
++		Options.commands=[default_cmd]
++	Options.commands=[x for x in Options.commands if x!='options']
++	Logs.verbose=Options.options.verbose
++	Logs.init_log()
++	if Options.options.zones:
++		Logs.zones=Options.options.zones.split(',')
++		if not Logs.verbose:
++			Logs.verbose=1
++	elif Logs.verbose>0:
++		Logs.zones=['runner']
++	if Logs.verbose>2:
++		Logs.zones=['*']
++def run_command(cmd_name):
++	ctx=Context.create_context(cmd_name)
++	ctx.options=Options.options
++	ctx.cmd=cmd_name
++	ctx.execute()
++	return ctx
++def run_commands():
++	parse_options()
++	run_command('init')
++	while Options.commands:
++		cmd_name=Options.commands.pop(0)
++		timer=Utils.Timer()
++		run_command(cmd_name)
++		if not Options.options.progress_bar:
++			elapsed=' (%s)'%str(timer)
++			Logs.info('%r finished successfully%s'%(cmd_name,elapsed))
++	run_command('shutdown')
++def _can_distclean(name):
++	for k in'.o .moc .exe'.split():
++		if name.endswith(k):
++			return True
++	return False
++def distclean_dir(dirname):
++	for(root,dirs,files)in os.walk(dirname):
++		for f in files:
++			if _can_distclean(f):
++				fname=root+os.sep+f
++				try:
++					os.unlink(fname)
++				except:
++					Logs.warn('could not remove %r'%fname)
++	for x in[Context.DBFILE,'config.log']:
++		try:
++			os.unlink(x)
++		except:
++			pass
++	try:
++		shutil.rmtree('c4che')
++	except:
++		pass
++def distclean(ctx):
++	'''removes the build directory'''
++	lst=os.listdir('.')
++	for f in lst:
++		if f==Options.lockfile:
++			try:
++				proj=ConfigSet.ConfigSet(f)
++			except:
++				Logs.warn('could not read %r'%f)
++				continue
++			if proj['out_dir']!=proj['top_dir']:
++				try:
++					shutil.rmtree(proj['out_dir'])
++				except IOError:
++					pass
++				except OSError ,e:
++					if e.errno!=errno.ENOENT:
++						Logs.warn('project %r cannot be removed'%proj[Context.OUT])
++			else:
++				distclean_dir(proj['out_dir'])
++			for k in(proj['out_dir'],proj['top_dir'],proj['run_dir']):
++				try:
++					os.remove(os.path.join(k,Options.lockfile))
++				except OSError ,e:
++					if e.errno!=errno.ENOENT:
++						Logs.warn('file %r cannot be removed'%f)
++		if f.startswith('.waf')and not Options.commands:
++			shutil.rmtree(f,ignore_errors=True)
++class Dist(Context.Context):
++	cmd='dist'
++	fun='dist'
++	algo='tar.bz2'
++	ext_algo={}
++	def execute(self):
++		self.recurse([os.path.dirname(Context.g_module.root_path)])
++		self.archive()
++	def archive(self):
++		import tarfile
++		arch_name=self.get_arch_name()
++		try:
++			self.base_path
++		except:
++			self.base_path=self.path
++		node=self.base_path.make_node(arch_name)
++		try:
++			node.delete()
++		except:
++			pass
++		files=self.get_files()
++		if self.algo.startswith('tar.'):
++			tar=tarfile.open(arch_name,'w:'+self.algo.replace('tar.',''))
++			for x in files:
++				self.add_tar_file(x,tar)
++			tar.close()
++		elif self.algo=='zip':
++			import zipfile
++			zip=zipfile.ZipFile(arch_name,'w',compression=zipfile.ZIP_DEFLATED)
++			for x in files:
++				archive_name=self.get_base_name()+'/'+x.path_from(self.base_path)
++				zip.write(x.abspath(),archive_name,zipfile.ZIP_DEFLATED)
++			zip.close()
++		else:
++			self.fatal('Valid algo types are tar.bz2, tar.gz or zip')
++		try:
++			from hashlib import sha1 as sha
++		except ImportError:
++			from sha import sha
++		try:
++			digest=" (sha=%r)"%sha(node.read()).hexdigest()
++		except:
++			digest=''
++		Logs.info('New archive created: %s%s'%(self.arch_name,digest))
++	def get_tar_path(self,node):
++		return node.abspath()
++	def add_tar_file(self,x,tar):
++		p=self.get_tar_path(x)
++		tinfo=tar.gettarinfo(name=p,arcname=self.get_tar_prefix()+'/'+x.path_from(self.base_path))
++		tinfo.uid=0
++		tinfo.gid=0
++		tinfo.uname='root'
++		tinfo.gname='root'
++		fu=None
++		try:
++			fu=open(p,'rb')
++			tar.addfile(tinfo,fileobj=fu)
++		finally:
++			if fu:
++				fu.close()
++	def get_tar_prefix(self):
++		try:
++			return self.tar_prefix
++		except:
++			return self.get_base_name()
++	def get_arch_name(self):
++		try:
++			self.arch_name
++		except:
++			self.arch_name=self.get_base_name()+'.'+self.ext_algo.get(self.algo,self.algo)
++		return self.arch_name
++	def get_base_name(self):
++		try:
++			self.base_name
++		except:
++			appname=getattr(Context.g_module,Context.APPNAME,'noname')
++			version=getattr(Context.g_module,Context.VERSION,'1.0')
++			self.base_name=appname+'-'+version
++		return self.base_name
++	def get_excl(self):
++		try:
++			return self.excl
++		except:
++			self.excl=Node.exclude_regs+' **/waf-1.6.* **/.waf-1.6* **/waf3-1.6.* **/.waf3-1.6* **/*~ **/*.rej **/*.orig **/*.pyc **/*.pyo **/*.bak **/*.swp **/.lock-w*'
++			nd=self.root.find_node(Context.out_dir)
++			if nd:
++				self.excl+=' '+nd.path_from(self.base_path)
++			return self.excl
++	def get_files(self):
++		try:
++			files=self.files
++		except:
++			files=self.base_path.ant_glob('**/*',excl=self.get_excl())
++		return files
++def dist(ctx):
++	'''makes a tarball for redistributing the sources'''
++	pass
++class DistCheck(Dist):
++	fun='distcheck'
++	cmd='distcheck'
++	def execute(self):
++		self.recurse([os.path.dirname(Context.g_module.root_path)])
++		self.archive()
++		self.check()
++	def check(self):
++		import tempfile,tarfile
++		t=None
++		try:
++			t=tarfile.open(self.get_arch_name())
++			for x in t:
++				t.extract(x)
++		finally:
++			if t:
++				t.close()
++		instdir=tempfile.mkdtemp('.inst',self.get_base_name())
++		ret=Utils.subprocess.Popen([sys.argv[0],'configure','install','uninstall','--destdir='+instdir],cwd=self.get_base_name()).wait()
++		if ret:
++			raise Errors.WafError('distcheck failed with code %i'%ret)
++		if os.path.exists(instdir):
++			raise Errors.WafError('distcheck succeeded, but files were left in %s'%instdir)
++		shutil.rmtree(self.get_base_name())
++def distcheck(ctx):
++	'''checks if the project compiles (tarball from 'dist')'''
++	pass
++def update(ctx):
++	'''updates the plugins from the *waflib/extras* directory'''
++	lst=Options.options.files.split(',')
++	if not lst:
++		lst=[x for x in Utils.listdir(Context.waf_dir+'/waflib/extras')if x.endswith('.py')]
++	for x in lst:
++		tool=x.replace('.py','')
++		try:
++			Configure.download_tool(tool,force=True,ctx=ctx)
++		except Errors.WafError:
++			Logs.error('Could not find the tool %s in the remote repository'%x)
++def autoconfigure(execute_method):
++	def execute(self):
++		if not Configure.autoconfig:
++			return execute_method(self)
++		env=ConfigSet.ConfigSet()
++		do_config=False
++		try:
++			env.load(os.path.join(Context.top_dir,Options.lockfile))
++		except Exception:
++			Logs.warn('Configuring the project')
++			do_config=True
++		else:
++			if env.run_dir!=Context.run_dir:
++				do_config=True
++			else:
++				h=0
++				for f in env['files']:
++					h=hash((h,Utils.readf(f,'rb')))
++				do_config=h!=env.hash
++		if do_config:
++			Options.commands.insert(0,self.cmd)
++			Options.commands.insert(0,'configure')
++			return
++		return execute_method(self)
++	return execute
++Build.BuildContext.execute=autoconfigure(Build.BuildContext.execute)
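
waflib/Scripting.py above is the command-line driver: waf_entry_point() climbs parent directories until it finds a wscript or a lock file, run_commands() turns each remaining command name into a context and executes it, and the autoconfigure wrapper re-queues 'configure' ahead of a build whenever the recorded configuration no longer matches (a different run_dir, or a changed hash over the files recorded at configure time). A rough sketch of that re-configure test in plain Python (names are illustrative, not waf's API):

    import hashlib

    def files_hash(paths):
        # hash the configure-time inputs, roughly like the loop over env['files']
        h = hashlib.md5()
        for p in paths:
            with open(p, 'rb') as fh:
                h.update(fh.read())
        return h.hexdigest()

    def needs_reconfigure(lock, run_dir, files):
        # 'lock' is a dict standing in for the ConfigSet stored in the lock file
        if not lock:
            return True                              # never configured
        if lock.get('run_dir') != run_dir:
            return True                              # project moved
        return lock.get('hash') != files_hash(files)
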
+--- /dev/null
++++ ardour3/waflib/TaskGen.py
+@@ -0,0 +1,353 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import copy,re,os
++from waflib import Task,Utils,Logs,Errors,ConfigSet
++feats=Utils.defaultdict(set)
++class task_gen(object):
++	mappings={}
++	prec=Utils.defaultdict(list)
++	def __init__(self,*k,**kw):
++		self.source=''
++		self.target=''
++		self.meths=[]
++		self.prec=Utils.defaultdict(list)
++		self.mappings={}
++		self.features=[]
++		self.tasks=[]
++		if not'bld'in kw:
++			self.env=ConfigSet.ConfigSet()
++			self.idx=0
++			self.path=None
++		else:
++			self.bld=kw['bld']
++			self.env=self.bld.env.derive()
++			self.path=self.bld.path
++			try:
++				self.idx=self.bld.idx[id(self.path)]=self.bld.idx.get(id(self.path),0)+1
++			except AttributeError:
++				self.bld.idx={}
++				self.idx=self.bld.idx[id(self.path)]=1
++		for key,val in kw.items():
++			setattr(self,key,val)
++	def __str__(self):
++		return"<task_gen %r declared in %s>"%(self.name,self.path.abspath())
++	def __repr__(self):
++		lst=[]
++		for x in self.__dict__.keys():
++			if x not in['env','bld','compiled_tasks','tasks']:
++				lst.append("%s=%s"%(x,repr(getattr(self,x))))
++		return"bld(%s) in %s"%(", ".join(lst),self.path.abspath())
++	def get_name(self):
++		try:
++			return self._name
++		except AttributeError:
++			if isinstance(self.target,list):
++				lst=[str(x)for x in self.target]
++				name=self._name=','.join(lst)
++			else:
++				name=self._name=str(self.target)
++			return name
++	def set_name(self,name):
++		self._name=name
++	name=property(get_name,set_name)
++	def to_list(self,val):
++		if isinstance(val,str):return val.split()
++		else:return val
++	def post(self):
++		if getattr(self,'posted',None):
++			return False
++		self.posted=True
++		keys=set(self.meths)
++		self.features=Utils.to_list(self.features)
++		for x in self.features+['*']:
++			st=feats[x]
++			if not st:
++				if not x in Task.classes:
++					Logs.warn('feature %r does not exist - bind at least one method to it'%x)
++			keys.update(list(st))
++		prec={}
++		prec_tbl=self.prec or task_gen.prec
++		for x in prec_tbl:
++			if x in keys:
++				prec[x]=prec_tbl[x]
++		tmp=[]
++		for a in keys:
++			for x in prec.values():
++				if a in x:break
++			else:
++				tmp.append(a)
++		out=[]
++		while tmp:
++			e=tmp.pop()
++			if e in keys:out.append(e)
++			try:
++				nlst=prec[e]
++			except KeyError:
++				pass
++			else:
++				del prec[e]
++				for x in nlst:
++					for y in prec:
++						if x in prec[y]:
++							break
++					else:
++						tmp.append(x)
++		if prec:
++			raise Errors.WafError('Cycle detected in the method execution %r'%prec)
++		out.reverse()
++		self.meths=out
++		Logs.debug('task_gen: posting %s %d'%(self,id(self)))
++		for x in out:
++			try:
++				v=getattr(self,x)
++			except AttributeError:
++				raise Errors.WafError('%r is not a valid task generator method'%x)
++			Logs.debug('task_gen: -> %s (%d)'%(x,id(self)))
++			v()
++		Logs.debug('task_gen: posted %s'%self.name)
++		return True
++	def get_hook(self,node):
++		name=node.name
++		for k in self.mappings:
++			if name.endswith(k):
++				return self.mappings[k]
++		for k in task_gen.mappings:
++			if name.endswith(k):
++				return task_gen.mappings[k]
++		raise Errors.WafError("File %r has no mapping in %r (did you forget to load a waf tool?)"%(node,task_gen.mappings.keys()))
++	def create_task(self,name,src=None,tgt=None):
++		task=Task.classes[name](env=self.env.derive(),generator=self)
++		if src:
++			task.set_inputs(src)
++		if tgt:
++			task.set_outputs(tgt)
++		self.tasks.append(task)
++		return task
++	def clone(self,env):
++		newobj=self.bld()
++		for x in self.__dict__:
++			if x in['env','bld']:
++				continue
++			elif x in['path','features']:
++				setattr(newobj,x,getattr(self,x))
++			else:
++				setattr(newobj,x,copy.copy(getattr(self,x)))
++		newobj.posted=False
++		if isinstance(env,str):
++			newobj.env=self.bld.all_envs[env].derive()
++		else:
++			newobj.env=env.derive()
++		return newobj
++def declare_chain(name='',rule=None,reentrant=None,color='BLUE',ext_in=[],ext_out=[],before=[],after=[],decider=None,scan=None,install_path=None,shell=False):
++	ext_in=Utils.to_list(ext_in)
++	ext_out=Utils.to_list(ext_out)
++	if not name:
++		name=rule
++	cls=Task.task_factory(name,rule,color=color,ext_in=ext_in,ext_out=ext_out,before=before,after=after,scan=scan,shell=shell)
++	def x_file(self,node):
++		ext=decider and decider(self,node)or cls.ext_out
++		if ext_in:
++			_ext_in=ext_in[0]
++		tsk=self.create_task(name,node)
++		cnt=0
++		keys=self.mappings.keys()+self.__class__.mappings.keys()
++		for x in ext:
++			k=node.change_ext(x,ext_in=_ext_in)
++			tsk.outputs.append(k)
++			if reentrant!=None:
++				if cnt<int(reentrant):
++					self.source.append(k)
++			else:
++				for y in keys:
++					if k.name.endswith(y):
++						self.source.append(k)
++						break
++			cnt+=1
++		if install_path:
++			self.bld.install_files(install_path,tsk.outputs)
++		return tsk
++	for x in cls.ext_in:
++		task_gen.mappings[x]=x_file
++	return x_file
++def taskgen_method(func):
++	setattr(task_gen,func.__name__,func)
++	return func
++def feature(*k):
++	def deco(func):
++		setattr(task_gen,func.__name__,func)
++		for name in k:
++			feats[name].update([func.__name__])
++		return func
++	return deco
++def before_method(*k):
++	def deco(func):
++		setattr(task_gen,func.__name__,func)
++		for fun_name in k:
++			if not func.__name__ in task_gen.prec[fun_name]:
++				task_gen.prec[fun_name].append(func.__name__)
++		return func
++	return deco
++before=before_method
++def after_method(*k):
++	def deco(func):
++		setattr(task_gen,func.__name__,func)
++		for fun_name in k:
++			if not fun_name in task_gen.prec[func.__name__]:
++				task_gen.prec[func.__name__].append(fun_name)
++		return func
++	return deco
++after=after_method
++def extension(*k):
++	def deco(func):
++		setattr(task_gen,func.__name__,func)
++		for x in k:
++			task_gen.mappings[x]=func
++		return func
++	return deco
++def to_nodes(self,lst,path=None):
++	tmp=[]
++	path=path or self.path
++	find=path.find_resource
++	if isinstance(lst,self.path.__class__):
++		lst=[lst]
++	for x in Utils.to_list(lst):
++		if isinstance(x,str):
++			node=find(x)
++		else:
++			node=x
++		if not node:
++			raise Errors.WafError("source not found: %r in %r"%(x,self))
++		tmp.append(node)
++	return tmp
++def process_source(self):
++	self.source=self.to_nodes(getattr(self,'source',[]))
++	for node in self.source:
++		self.get_hook(node)(self,node)
++def process_rule(self):
++	if not getattr(self,'rule',None):
++		return
++	name=str(getattr(self,'name',None)or self.target or self.rule)
++	cls=Task.task_factory(name,self.rule,getattr(self,'vars',[]),shell=getattr(self,'shell',True),color=getattr(self,'color','BLUE'))
++	tsk=self.create_task(name)
++	if getattr(self,'target',None):
++		if isinstance(self.target,str):
++			self.target=self.target.split()
++		if not isinstance(self.target,list):
++			self.target=[self.target]
++		for x in self.target:
++			if isinstance(x,str):
++				tsk.outputs.append(self.path.find_or_declare(x))
++			else:
++				x.parent.mkdir()
++				tsk.outputs.append(x)
++		if getattr(self,'install_path',None):
++			self.bld.install_files(self.install_path,tsk.outputs)
++	if getattr(self,'source',None):
++		tsk.inputs=self.to_nodes(self.source)
++		self.source=[]
++	if getattr(self,'scan',None):
++		cls.scan=self.scan
++	elif getattr(self,'deps',None):
++		def scan(self):
++			nodes=[]
++			for x in self.generator.to_list(self.generator.deps):
++				node=self.generator.path.find_resource(x)
++				if not node:
++					self.generator.bld.fatal('Could not find %r (was it declared?)'%x)
++				nodes.append(node)
++			return[nodes,[]]
++		cls.scan=scan
++	if getattr(self,'cwd',None):
++		tsk.cwd=self.cwd
++	if getattr(self,'update_outputs',None)or getattr(self,'on_results',None):
++		Task.update_outputs(cls)
++	if getattr(self,'always',None):
++		Task.always_run(cls)
++	for x in['after','before','ext_in','ext_out']:
++		setattr(cls,x,getattr(self,x,[]))
++def sequence_order(self):
++	if self.meths and self.meths[-1]!='sequence_order':
++		self.meths.append('sequence_order')
++		return
++	if getattr(self,'seq_start',None):
++		return
++	if getattr(self.bld,'prev',None):
++		self.bld.prev.post()
++		for x in self.bld.prev.tasks:
++			for y in self.tasks:
++				y.set_run_after(x)
++	self.bld.prev=self
++re_m4=re.compile('@(\w+)@',re.M)
++class subst_pc(Task.Task):
++	def run(self):
++		code=self.inputs[0].read()
++		code=code.replace('%','%%')
++		lst=[]
++		def repl(match):
++			g=match.group
++			if g(1):
++				lst.append(g(1))
++				return"%%(%s)s"%g(1)
++			return''
++		code=re_m4.sub(repl,code)
++		try:
++			d=self.generator.dct
++		except AttributeError:
++			d={}
++			for x in lst:
++				tmp=getattr(self.generator,x,'')or self.env.get_flat(x)or self.env.get_flat(x.upper())
++				d[x]=str(tmp)
++		self.outputs[0].write(code%d)
++		self.generator.bld.raw_deps[self.uid()]=self.dep_vars=lst
++		try:delattr(self,'cache_sig')
++		except AttributeError:pass
++		if getattr(self.generator,'chmod',None):
++			os.chmod(self.outputs[0].abspath(),self.generator.chmod)
++	def sig_vars(self):
++		bld=self.generator.bld
++		env=self.env
++		upd=self.m.update
++		vars=self.generator.bld.raw_deps.get(self.uid(),[])
++		act_sig=bld.hash_env_vars(env,vars)
++		upd(act_sig)
++		lst=[getattr(self.generator,x,'')for x in vars]
++		upd(Utils.h_list(lst))
++		return self.m.digest()
++def add_pcfile(self,node):
++	tsk=self.create_task('subst_pc',node,node.change_ext('.pc','.pc.in'))
++	self.bld.install_files(getattr(self,'install_path','${LIBDIR}/pkgconfig/'),tsk.outputs)
++class subst(subst_pc):
++	pass
++def process_subst(self):
++	src=self.to_nodes(getattr(self,'source',[]))
++	tgt=getattr(self,'target',[])
++	if isinstance(tgt,self.path.__class__):
++		tgt=[tgt]
++	tgt=[isinstance(x,self.path.__class__)and x or self.path.find_or_declare(x)for x in Utils.to_list(tgt)]
++	if len(src)!=len(tgt):
++		raise Errors.WafError('invalid source or target for %r'%self)
++	for x,y in zip(src,tgt):
++		if not(x and y):
++			raise Errors.WafError('invalid source or target for %r'%self)
++		tsk=self.create_task('subst',x,y)
++		for a in('after','before','ext_in','ext_out'):
++			val=getattr(self,a,None)
++			if val:
++				setattr(tsk,a,val)
++	inst_to=getattr(self,'install_path',None)
++	if inst_to:
++		self.bld.install_files(inst_to,tgt,chmod=getattr(self,'chmod',Utils.O644))
++	self.source=[]
++
++taskgen_method(to_nodes)
++feature('*')(process_source)
++feature('*')(process_rule)
++before_method('process_source')(process_rule)
++feature('seq')(sequence_order)
++extension('.pc.in')(add_pcfile)
++feature('subst')(process_subst)
++before_method('process_source','process_rule')(process_subst)
+\ No newline at end of file
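
waflib/TaskGen.py above defines task_gen, the object behind bld(...) declarations: its 'features' pick which registered methods run, post() orders them using the prec table (raising on cycles), and @extension maps source suffixes to per-file handlers such as the .pc.in substitution task. A toy version of the registration pattern (standalone illustration, not the waf API):

    from collections import defaultdict

    feats = defaultdict(set)     # feature name -> method names to run
    mappings = {}                # file suffix  -> handler function

    def feature(*names):
        def deco(func):
            for n in names:
                feats[n].add(func.__name__)
            return func
        return deco

    def extension(*exts):
        def deco(func):
            for e in exts:
                mappings[e] = func
            return func
        return deco

    @feature('c')
    def apply_link(gen):
        print('link step for', gen)

    @extension('.c')
    def c_hook(gen, node):
        print('compile', node, 'for', gen)

    print(sorted(feats['c']), sorted(mappings))
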
+--- /dev/null
++++ ardour3/waflib/Task.py
+@@ -0,0 +1,672 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import os,shutil,re,tempfile
++from waflib import Utils,Logs,Errors
++NOT_RUN=0
++MISSING=1
++CRASHED=2
++EXCEPTION=3
++SKIPPED=8
++SUCCESS=9
++ASK_LATER=-1
++SKIP_ME=-2
++RUN_ME=-3
++COMPILE_TEMPLATE_SHELL='''
++def f(tsk):
++	env = tsk.env
++	gen = tsk.generator
++	bld = gen.bld
++	wd = getattr(tsk, 'cwd', None)
++	p = env.get_flat
++	tsk.last_cmd = cmd = \'\'\' %s \'\'\' % s
++	return tsk.exec_command(cmd, cwd=wd, env=env.env or None)
++'''
++COMPILE_TEMPLATE_NOSHELL='''
++def f(tsk):
++	env = tsk.env
++	gen = tsk.generator
++	bld = gen.bld
++	wd = getattr(tsk, 'cwd', None)
++	def to_list(xx):
++		if isinstance(xx, str): return [xx]
++		return xx
++	tsk.last_cmd = lst = []
++	%s
++	lst = [x for x in lst if x]
++	return tsk.exec_command(lst, cwd=wd, env=env.env or None)
++'''
++def cache_outputs(cls):
++	m1=cls.run
++	def run(self):
++		bld=self.generator.bld
++		if bld.cache_global and not bld.nocache:
++			if self.can_retrieve_cache():
++				return 0
++		return m1(self)
++	cls.run=run
++	m2=cls.post_run
++	def post_run(self):
++		bld=self.generator.bld
++		ret=m2(self)
++		if bld.cache_global and not bld.nocache:
++			self.put_files_cache()
++		return ret
++	cls.post_run=post_run
++	return cls
++classes={}
++class store_task_type(type):
++	def __init__(cls,name,bases,dict):
++		super(store_task_type,cls).__init__(name,bases,dict)
++		name=cls.__name__
++		if name.endswith('_task'):
++			name=name.replace('_task','')
++		if name!='evil'and name!='TaskBase':
++			global classes
++			if getattr(cls,'run_str',None):
++				(f,dvars)=compile_fun(cls.run_str,cls.shell)
++				cls.hcode=cls.run_str
++				cls.run_str=None
++				cls.run=f
++				cls.vars=list(set(cls.vars+dvars))
++				cls.vars.sort()
++			elif getattr(cls,'run',None)and not'hcode'in cls.__dict__:
++				cls.hcode=Utils.h_fun(cls.run)
++			if not getattr(cls,'nocache',None):
++				cls=cache_outputs(cls)
++			classes[name]=cls
++evil=store_task_type('evil',(object,),{})
++class TaskBase(evil):
++	color='GREEN'
++	ext_in=[]
++	ext_out=[]
++	before=[]
++	after=[]
++	hcode=''
++	def __init__(self,*k,**kw):
++		self.hasrun=NOT_RUN
++		try:
++			self.generator=kw['generator']
++		except KeyError:
++			self.generator=self
++	def __repr__(self):
++		return'\n\t{task %r: %s %s}'%(self.__class__.__name__,id(self),str(getattr(self,'fun','')))
++	def __str__(self):
++		if hasattr(self,'fun'):
++			return'executing: %s\n'%self.fun.__name__
++		return self.__class__.__name__+'\n'
++	def __hash__(self):
++		return id(self)
++	def exec_command(self,cmd,**kw):
++		bld=self.generator.bld
++		try:
++			if not kw.get('cwd',None):
++				kw['cwd']=bld.cwd
++		except AttributeError:
++			bld.cwd=kw['cwd']=bld.variant_dir
++		return bld.exec_command(cmd,**kw)
++	def runnable_status(self):
++		return RUN_ME
++	def process(self):
++		m=self.master
++		if m.stop:
++			m.out.put(self)
++			return
++		try:
++			del self.generator.bld.task_sigs[self.uid()]
++		except:
++			pass
++		try:
++			self.generator.bld.returned_tasks.append(self)
++			self.log_display(self.generator.bld)
++			ret=self.run()
++		except Exception:
++			self.err_msg=Utils.ex_stack()
++			self.hasrun=EXCEPTION
++			m.error_handler(self)
++			m.out.put(self)
++			return
++		if ret:
++			self.err_code=ret
++			self.hasrun=CRASHED
++		else:
++			try:
++				self.post_run()
++			except Errors.WafError:
++				pass
++			except Exception:
++				self.err_msg=Utils.ex_stack()
++				self.hasrun=EXCEPTION
++			else:
++				self.hasrun=SUCCESS
++		if self.hasrun!=SUCCESS:
++			m.error_handler(self)
++		m.out.put(self)
++	def run(self):
++		if hasattr(self,'fun'):
++			return self.fun(self)
++		return 0
++	def post_run(self):
++		pass
++	def log_display(self,bld):
++		bld.to_log(self.display())
++	def display(self):
++		col1=Logs.colors(self.color)
++		col2=Logs.colors.NORMAL
++		master=self.master
++		def cur():
++			tmp=-1
++			if hasattr(master,'ready'):
++				tmp-=master.ready.qsize()
++			return master.processed+tmp
++		if self.generator.bld.progress_bar==1:
++			return self.generator.bld.progress_line(cur(),master.total,col1,col2)
++		if self.generator.bld.progress_bar==2:
++			ela=str(self.generator.bld.timer)
++			try:
++				ins=','.join([n.name for n in self.inputs])
++			except AttributeError:
++				ins=''
++			try:
++				outs=','.join([n.name for n in self.outputs])
++			except AttributeError:
++				outs=''
++			return'|Total %s|Current %s|Inputs %s|Outputs %s|Time %s|\n'%(master.total,cur(),ins,outs,ela)
++		s=str(self)
++		if not s:
++			return None
++		total=master.total
++		n=len(str(total))
++		fs='[%%%dd/%%%dd] %%s%%s%%s'%(n,n)
++		return fs%(cur(),total,col1,s,col2)
++	def attr(self,att,default=None):
++		ret=getattr(self,att,self)
++		if ret is self:return getattr(self.__class__,att,default)
++		return ret
++	def hash_constraints(self):
++		cls=self.__class__
++		tup=(str(cls.before),str(cls.after),str(cls.ext_in),str(cls.ext_out),cls.__name__,cls.hcode)
++		h=hash(tup)
++		return h
++	def format_error(self):
++		msg=getattr(self,'last_cmd','')
++		name=getattr(self.generator,'name','')
++		if getattr(self,"err_msg",None):
++			return self.err_msg
++		elif not self.hasrun:
++			return'task in %r was not executed for some reason: %r'%(name,self)
++		elif self.hasrun==CRASHED:
++			try:
++				return' -> task in %r failed (exit status %r): %r\n%r'%(name,self.err_code,self,msg)
++			except AttributeError:
++				return' -> task in %r failed: %r\n%r'%(name,self,msg)
++		elif self.hasrun==MISSING:
++			return' -> missing files in %r: %r\n%r'%(name,self,msg)
++		else:
++			return'invalid status for task in %r: %r'%(name,self.hasrun)
++	def colon(self,var1,var2):
++		tmp=self.env[var1]
++		if isinstance(var2,str):
++			it=self.env[var2]
++		else:
++			it=var2
++		if isinstance(tmp,str):
++			return[tmp%x for x in it]
++		else:
++			if Logs.verbose and not tmp and it:
++				Logs.warn('Missing env variable %r for task %r (generator %r)'%(var1,self,self.generator))
++			lst=[]
++			for y in it:
++				lst.extend(tmp)
++				lst.append(y)
++			return lst
++class Task(TaskBase):
++	vars=[]
++	shell=False
++	def __init__(self,*k,**kw):
++		TaskBase.__init__(self,*k,**kw)
++		self.env=kw['env']
++		self.inputs=[]
++		self.outputs=[]
++		self.dep_nodes=[]
++		self.run_after=set([])
++	def __str__(self):
++		env=self.env
++		src_str=' '.join([a.nice_path(env)for a in self.inputs])
++		tgt_str=' '.join([a.nice_path(env)for a in self.outputs])
++		if self.outputs:sep=' -> '
++		else:sep=''
++		return'%s: %s%s%s\n'%(self.__class__.__name__.replace('_task',''),src_str,sep,tgt_str)
++	def __repr__(self):
++		return"".join(['\n\t{task %r: '%id(self),self.__class__.__name__," ",",".join([x.name for x in self.inputs])," -> ",",".join([x.name for x in self.outputs]),'}'])
++	def uid(self):
++		try:
++			return self.uid_
++		except AttributeError:
++			m=Utils.md5()
++			up=m.update
++			up(self.__class__.__name__)
++			for x in self.inputs+self.outputs:
++				up(x.abspath())
++			self.uid_=m.digest()
++			return self.uid_
++	def set_inputs(self,inp):
++		if isinstance(inp,list):self.inputs+=inp
++		else:self.inputs.append(inp)
++	def set_outputs(self,out):
++		if isinstance(out,list):self.outputs+=out
++		else:self.outputs.append(out)
++	def set_run_after(self,task):
++		assert isinstance(task,TaskBase)
++		self.run_after.add(task)
++	def signature(self):
++		try:return self.cache_sig
++		except AttributeError:pass
++		self.m=Utils.md5()
++		self.m.update(self.hcode)
++		self.sig_explicit_deps()
++		self.sig_vars()
++		if self.scan:
++			try:
++				self.sig_implicit_deps()
++			except Errors.TaskRescan:
++				return self.signature()
++		ret=self.cache_sig=self.m.digest()
++		return ret
++	def runnable_status(self):
++		for t in self.run_after:
++			if not t.hasrun:
++				return ASK_LATER
++		bld=self.generator.bld
++		try:
++			new_sig=self.signature()
++		except Errors.TaskNotReady:
++			return ASK_LATER
++		key=self.uid()
++		try:
++			prev_sig=bld.task_sigs[key]
++		except KeyError:
++			Logs.debug("task: task %r must run as it was never run before or the task code changed"%self)
++			return RUN_ME
++		for node in self.outputs:
++			try:
++				if node.sig!=new_sig:
++					return RUN_ME
++			except AttributeError:
++				Logs.debug("task: task %r must run as the output nodes do not exist"%self)
++				return RUN_ME
++		if new_sig!=prev_sig:
++			return RUN_ME
++		return SKIP_ME
++	def post_run(self):
++		bld=self.generator.bld
++		sig=self.signature()
++		for node in self.outputs:
++			try:
++				os.stat(node.abspath())
++			except OSError:
++				self.hasrun=MISSING
++				self.err_msg='-> missing file: %r'%node.abspath()
++				raise Errors.WafError(self.err_msg)
++			node.sig=sig
++		bld.task_sigs[self.uid()]=self.cache_sig
++	def sig_explicit_deps(self):
++		bld=self.generator.bld
++		upd=self.m.update
++		for x in self.inputs+self.dep_nodes:
++			try:
++				upd(x.get_bld_sig())
++			except(AttributeError,TypeError):
++				raise Errors.WafError('Missing node signature for %r (required by %r)'%(x,self))
++		if bld.deps_man:
++			additional_deps=bld.deps_man
++			for x in self.inputs+self.outputs:
++				try:
++					d=additional_deps[id(x)]
++				except KeyError:
++					continue
++				for v in d:
++					if isinstance(v,bld.root.__class__):
++						try:
++							v=v.get_bld_sig()
++						except AttributeError:
++							raise Errors.WafError('Missing node signature for %r (required by %r)'%(v,self))
++					elif hasattr(v,'__call__'):
++						v=v()
++					upd(v)
++		return self.m.digest()
++	def sig_vars(self):
++		bld=self.generator.bld
++		env=self.env
++		upd=self.m.update
++		act_sig=bld.hash_env_vars(env,self.__class__.vars)
++		upd(act_sig)
++		dep_vars=getattr(self,'dep_vars',None)
++		if dep_vars:
++			upd(bld.hash_env_vars(env,dep_vars))
++		return self.m.digest()
++	scan=None
++	def sig_implicit_deps(self):
++		bld=self.generator.bld
++		key=self.uid()
++		prev=bld.task_sigs.get((key,'imp'),[])
++		if prev:
++			try:
++				if prev==self.compute_sig_implicit_deps():
++					return prev
++			except:
++				for x in bld.node_deps.get(self.uid(),[]):
++					if x.is_child_of(bld.srcnode):
++						try:
++							os.stat(x.abspath())
++						except:
++							try:
++								del x.parent.children[x.name]
++							except:
++								pass
++			del bld.task_sigs[(key,'imp')]
++			raise Errors.TaskRescan('rescan')
++		(nodes,names)=self.scan()
++		if Logs.verbose:
++			Logs.debug('deps: scanner for %s returned %s %s'%(str(self),str(nodes),str(names)))
++		bld.node_deps[key]=nodes
++		bld.raw_deps[key]=names
++		self.are_implicit_nodes_ready()
++		try:
++			bld.task_sigs[(key,'imp')]=sig=self.compute_sig_implicit_deps()
++		except:
++			if Logs.verbose:
++				for k in bld.node_deps.get(self.uid(),[]):
++					try:
++						k.get_bld_sig()
++					except:
++						Logs.warn('Missing signature for node %r (may cause rebuilds)'%k)
++		else:
++			return sig
++	def compute_sig_implicit_deps(self):
++		upd=self.m.update
++		bld=self.generator.bld
++		self.are_implicit_nodes_ready()
++		for k in bld.node_deps.get(self.uid(),[]):
++			upd(k.get_bld_sig())
++		return self.m.digest()
++	def are_implicit_nodes_ready(self):
++		bld=self.generator.bld
++		try:
++			cache=bld.dct_implicit_nodes
++		except:
++			bld.dct_implicit_nodes=cache={}
++		try:
++			dct=cache[bld.cur]
++		except KeyError:
++			dct=cache[bld.cur]={}
++			for tsk in bld.cur_tasks:
++				for x in tsk.outputs:
++					dct[x]=tsk
++		modified=False
++		for x in bld.node_deps.get(self.uid(),[]):
++			if x in dct:
++				self.run_after.add(dct[x])
++				modified=True
++		if modified:
++			for tsk in self.run_after:
++				if not tsk.hasrun:
++					raise Errors.TaskNotReady('not ready')
++	def can_retrieve_cache(self):
++		if not getattr(self,'outputs',None):
++			return None
++		sig=self.signature()
++		ssig=Utils.to_hex(self.uid())+Utils.to_hex(sig)
++		dname=os.path.join(self.generator.bld.cache_global,ssig)
++		try:
++			t1=os.stat(dname).st_mtime
++		except OSError:
++			return None
++		for node in self.outputs:
++			orig=os.path.join(dname,node.name)
++			try:
++				shutil.copy2(orig,node.abspath())
++				os.utime(orig,None)
++			except(OSError,IOError):
++				Logs.debug('task: failed retrieving file')
++				return None
++		try:
++			t2=os.stat(dname).st_mtime
++		except OSError:
++			return None
++		if t1!=t2:
++			return None
++		for node in self.outputs:
++			node.sig=sig
++			if self.generator.bld.progress_bar<1:
++				self.generator.bld.to_log('restoring from cache %r\n'%node.abspath())
++		self.cached=True
++		return True
++	def put_files_cache(self):
++		if getattr(self,'cached',None):
++			return None
++		if not getattr(self,'outputs',None):
++			return None
++		sig=self.signature()
++		ssig=Utils.to_hex(self.uid())+Utils.to_hex(sig)
++		dname=os.path.join(self.generator.bld.cache_global,ssig)
++		tmpdir=tempfile.mkdtemp(prefix=self.generator.bld.cache_global+os.sep+'waf')
++		try:
++			shutil.rmtree(dname)
++		except:
++			pass
++		try:
++			for node in self.outputs:
++				dest=os.path.join(tmpdir,node.name)
++				shutil.copy2(node.abspath(),dest)
++		except(OSError,IOError):
++			try:
++				shutil.rmtree(tmpdir)
++			except:
++				pass
++		else:
++			try:
++				os.rename(tmpdir,dname)
++			except OSError:
++				try:
++					shutil.rmtree(tmpdir)
++				except:
++					pass
++			else:
++				try:
++					os.chmod(dname,Utils.O755)
++				except:
++					pass
++def is_before(t1,t2):
++	to_list=Utils.to_list
++	for k in to_list(t2.ext_in):
++		if k in to_list(t1.ext_out):
++			return 1
++	if t1.__class__.__name__ in to_list(t2.after):
++		return 1
++	if t2.__class__.__name__ in to_list(t1.before):
++		return 1
++	return 0
++def set_file_constraints(tasks):
++	ins=Utils.defaultdict(set)
++	outs=Utils.defaultdict(set)
++	for x in tasks:
++		for a in getattr(x,'inputs',[])+getattr(x,'dep_nodes',[]):
++			ins[id(a)].add(x)
++		for a in getattr(x,'outputs',[]):
++			outs[id(a)].add(x)
++	links=set(ins.keys()).intersection(outs.keys())
++	for k in links:
++		for a in ins[k]:
++			a.run_after.update(outs[k])
++def set_precedence_constraints(tasks):
++	cstr_groups=Utils.defaultdict(list)
++	for x in tasks:
++		h=x.hash_constraints()
++		cstr_groups[h].append(x)
++	keys=list(cstr_groups.keys())
++	maxi=len(keys)
++	for i in range(maxi):
++		t1=cstr_groups[keys[i]][0]
++		for j in range(i+1,maxi):
++			t2=cstr_groups[keys[j]][0]
++			if is_before(t1,t2):
++				a=i
++				b=j
++			elif is_before(t2,t1):
++				a=j
++				b=i
++			else:
++				continue
++			for x in cstr_groups[keys[b]]:
++				x.run_after.update(cstr_groups[keys[a]])
++def funex(c):
++	dc={}
++	exec(c,dc)
++	return dc['f']
++reg_act=re.compile(r"(?P<backslash>\\)|(?P<dollar>\$\$)|(?P<subst>\$\{(?P<var>\w+)(?P<code>.*?)\})",re.M)
++def compile_fun_shell(line):
++	extr=[]
++	def repl(match):
++		g=match.group
++		if g('dollar'):return"$"
++		elif g('backslash'):return'\\\\'
++		elif g('subst'):extr.append((g('var'),g('code')));return"%s"
++		return None
++	line=reg_act.sub(repl,line)or line
++	parm=[]
++	dvars=[]
++	app=parm.append
++	for(var,meth)in extr:
++		if var=='SRC':
++			if meth:app('tsk.inputs%s'%meth)
++			else:app('" ".join([a.path_from(bld.bldnode) for a in tsk.inputs])')
++		elif var=='TGT':
++			if meth:app('tsk.outputs%s'%meth)
++			else:app('" ".join([a.path_from(bld.bldnode) for a in tsk.outputs])')
++		elif meth:
++			if meth.startswith(':'):
++				m=meth[1:]
++				if m=='SRC':
++					m='[a.path_from(bld.bldnode) for a in tsk.inputs]'
++				elif m=='TGT':
++					m='[a.path_from(bld.bldnode) for a in tsk.outputs]'
++				elif m[:3]not in('tsk','gen','bld'):
++					dvars.extend([var,meth[1:]])
++					m='%r'%m
++				app('" ".join(tsk.colon(%r, %s))'%(var,m))
++			else:
++				app('%s%s'%(var,meth))
++		else:
++			if not var in dvars:dvars.append(var)
++			app("p('%s')"%var)
++	if parm:parm="%% (%s) "%(',\n\t\t'.join(parm))
++	else:parm=''
++	c=COMPILE_TEMPLATE_SHELL%(line,parm)
++	Logs.debug('action: %s'%c)
++	return(funex(c),dvars)
++def compile_fun_noshell(line):
++	extr=[]
++	def repl(match):
++		g=match.group
++		if g('dollar'):return"$"
++		elif g('subst'):extr.append((g('var'),g('code')));return"<<|@|>>"
++		return None
++	line2=reg_act.sub(repl,line)
++	params=line2.split('<<|@|>>')
++	assert(extr)
++	buf=[]
++	dvars=[]
++	app=buf.append
++	for x in range(len(extr)):
++		params[x]=params[x].strip()
++		if params[x]:
++			app("lst.extend(%r)"%params[x].split())
++		(var,meth)=extr[x]
++		if var=='SRC':
++			if meth:app('lst.append(tsk.inputs%s)'%meth)
++			else:app("lst.extend([a.path_from(bld.bldnode) for a in tsk.inputs])")
++		elif var=='TGT':
++			if meth:app('lst.append(tsk.outputs%s)'%meth)
++			else:app("lst.extend([a.path_from(bld.bldnode) for a in tsk.outputs])")
++		elif meth:
++			if meth.startswith(':'):
++				m=meth[1:]
++				if m=='SRC':
++					m='[a.path_from(bld.bldnode) for a in tsk.inputs]'
++				elif m=='TGT':
++					m='[a.path_from(bld.bldnode) for a in tsk.outputs]'
++				elif m[:3]not in('tsk','gen','bld'):
++					dvars.extend([var,m])
++					m='%r'%m
++				app('lst.extend(tsk.colon(%r, %s))'%(var,m))
++			else:
++				app('lst.extend(gen.to_list(%s%s))'%(var,meth))
++		else:
++			app('lst.extend(to_list(env[%r]))'%var)
++			if not var in dvars:dvars.append(var)
++	if extr:
++		if params[-1]:
++			app("lst.extend(%r)"%params[-1].split())
++	fun=COMPILE_TEMPLATE_NOSHELL%"\n\t".join(buf)
++	Logs.debug('action: %s'%fun)
++	return(funex(fun),dvars)
++def compile_fun(line,shell=False):
++	if line.find('<')>0 or line.find('>')>0 or line.find('&&')>0:
++		shell=True
++	if shell:
++		return compile_fun_shell(line)
++	else:
++		return compile_fun_noshell(line)
++def task_factory(name,func=None,vars=None,color='GREEN',ext_in=[],ext_out=[],before=[],after=[],shell=False,scan=None):
++	params={'vars':vars or[],'color':color,'name':name,'ext_in':Utils.to_list(ext_in),'ext_out':Utils.to_list(ext_out),'before':Utils.to_list(before),'after':Utils.to_list(after),'shell':shell,'scan':scan,}
++	if isinstance(func,str):
++		params['run_str']=func
++	else:
++		params['run']=func
++	cls=type(Task)(name,(Task,),params)
++	global classes
++	classes[name]=cls
++	return cls
++def always_run(cls):
++	old=cls.runnable_status
++	def always(self):
++		ret=old(self)
++		if ret==SKIP_ME:
++			ret=RUN_ME
++		return ret
++	cls.runnable_status=always
++	return cls
++def update_outputs(cls):
++	old_post_run=cls.post_run
++	def post_run(self):
++		old_post_run(self)
++		for node in self.outputs:
++			node.sig=Utils.h_file(node.abspath())
++			self.generator.bld.task_sigs[node.abspath()]=self.uid()
++	cls.post_run=post_run
++	old_runnable_status=cls.runnable_status
++	def runnable_status(self):
++		status=old_runnable_status(self)
++		if status!=RUN_ME:
++			return status
++		try:
++			bld=self.generator.bld
++			prev_sig=bld.task_sigs[self.uid()]
++			if prev_sig==self.signature():
++				for x in self.outputs:
++					if not x.sig or bld.task_sigs[x.abspath()]!=self.uid():
++						return RUN_ME
++				return SKIP_ME
++		except KeyError:
++			pass
++		except IndexError:
++			pass
++		except AttributeError:
++			pass
++		return RUN_ME
++	cls.runnable_status=runnable_status
++	return cls
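
waflib/Task.py above gives tasks their rebuild logic: signature() hashes the task class code (hcode), the explicit input node signatures, the relevant environment variables and any scanner-found implicit dependencies, and runnable_status() compares the result against the value stored in bld.task_sigs to answer RUN_ME, SKIP_ME or ASK_LATER; run_str strings are compiled into Python functions by compile_fun(). A toy sketch of the signature comparison (illustrative only):

    import hashlib

    task_sigs = {}                            # uid -> signature of the last successful run

    def signature(hcode, input_blobs, env_items):
        m = hashlib.md5()
        m.update(hcode.encode())              # like cls.hcode
        for blob in input_blobs:              # like sig_explicit_deps()
            m.update(blob)
        for k, v in sorted(env_items):        # like sig_vars()
            m.update(('%s=%s' % (k, v)).encode())
        return m.digest()

    def runnable_status(uid, new_sig):
        return 'SKIP_ME' if task_sigs.get(uid) == new_sig else 'RUN_ME'

    def post_run(uid, new_sig):
        task_sigs[uid] = new_sig              # recorded only after the task succeeded
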
+--- /dev/null
++++ ardour3/waflib/Tools/ar.py
+@@ -0,0 +1,12 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib.Configure import conf
++def find_ar(conf):
++	conf.load('ar')
++def configure(conf):
++	conf.find_program('ar',var='AR')
++	conf.env.ARFLAGS='rcs'
++
++conf(find_ar)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/asm.py
+@@ -0,0 +1,25 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib import Task,Utils
++import waflib.Task
++from waflib.Tools.ccroot import link_task,stlink_task
++from waflib.TaskGen import extension,feature
++class asm(Task.Task):
++	color='BLUE'
++	run_str='${AS} ${ASFLAGS} ${CPPPATH_ST:INCPATHS} ${AS_SRC_F}${SRC} ${AS_TGT_F}${TGT}'
++def asm_hook(self,node):
++	return self.create_compiled_task('asm',node)
++class asmprogram(link_task):
++	run_str='${ASLINK} ${ASLINKFLAGS} ${ASLNK_TGT_F}${TGT} ${ASLNK_SRC_F}${SRC}'
++	ext_out=['.bin']
++	inst_to='${BINDIR}'
++	chmod=Utils.O755
++class asmshlib(asmprogram):
++	inst_to='${LIBDIR}'
++class asmstlib(stlink_task):
++	pass
++
++extension('.s','.S','.asm','.ASM','.spp','.SPP')(asm_hook)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/bison.py
+@@ -0,0 +1,29 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import Task
++from waflib.TaskGen import extension
++class bison(Task.Task):
++	color='BLUE'
++	run_str='${BISON} ${BISONFLAGS} ${SRC[0].abspath()} -o ${TGT[0].name}'
++	ext_out=['.h']
++def big_bison(self,node):
++	has_h='-d'in self.env['BISONFLAGS']
++	outs=[]
++	if node.name.endswith('.yc'):
++		outs.append(node.change_ext('.tab.cc'))
++		if has_h:
++			outs.append(node.change_ext('.tab.hh'))
++	else:
++		outs.append(node.change_ext('.tab.c'))
++		if has_h:
++			outs.append(node.change_ext('.tab.h'))
++	tsk=self.create_task('bison',node,outs)
++	tsk.cwd=node.parent.get_bld().abspath()
++	self.source.append(outs[0])
++def configure(conf):
++	conf.find_program('bison',var='BISON')
++	conf.env.BISONFLAGS=['-d']
++
++extension('.y','.yc','.yy')(big_bison)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/c_aliases.py
+@@ -0,0 +1,56 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,re
++from waflib import Utils,Build
++from waflib.Configure import conf
++def get_extensions(lst):
++	ret=[]
++	for x in Utils.to_list(lst):
++		try:
++			if not isinstance(x,str):
++				x=x.name
++			ret.append(x[x.rfind('.')+1:])
++		except:
++			pass
++	return ret
++def sniff_features(**kw):
++	exts=get_extensions(kw['source'])
++	type=kw['_type']
++	feats=[]
++	if'cxx'in exts or'cpp'in exts or'c++'in exts or'cc'in exts or'C'in exts:
++		feats.append('cxx')
++	if'c'in exts or'vala'in exts:
++		feats.append('c')
++	if'd'in exts:
++		feats.append('d')
++	if'java'in exts:
++		feats.append('java')
++	if'java'in exts:
++		return'java'
++	if type in['program','shlib','stlib']:
++		for x in feats:
++			if x in['cxx','d','c']:
++				feats.append(x+type)
++	return feats
++def set_features(kw,_type):
++	kw['_type']=_type
++	kw['features']=Utils.to_list(kw.get('features',[]))+Utils.to_list(sniff_features(**kw))
++def program(bld,*k,**kw):
++	set_features(kw,'program')
++	return bld(*k,**kw)
++def shlib(bld,*k,**kw):
++	set_features(kw,'shlib')
++	return bld(*k,**kw)
++def stlib(bld,*k,**kw):
++	set_features(kw,'stlib')
++	return bld(*k,**kw)
++def objects(bld,*k,**kw):
++	set_features(kw,'objects')
++	return bld(*k,**kw)
++
++conf(program)
++conf(shlib)
++conf(stlib)
++conf(objects)
+\ No newline at end of file
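
waflib/Tools/c_aliases.py above adds the bld.program/bld.shlib/bld.stlib/bld.objects shortcuts: sniff_features() guesses the required features ('c', 'cxx', then 'cprogram', 'cxxshlib', ...) from the source file extensions and the target kind. A hypothetical wscript fragment using them (example names only, not taken from the ardour build):

    def build(bld):
        bld.program(source='main.c util.c', target='demo')    # -> features 'c cprogram'
        bld.shlib(source='plugin.cc', target='plugin')        # -> features 'cxx cxxshlib'
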
+--- /dev/null
++++ ardour3/waflib/Tools/c_config.py
+@@ -0,0 +1,713 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import os,imp,sys,re,shlex,shutil
++from waflib import Build,Utils,Configure,Task,Options,Logs,TaskGen,Errors,ConfigSet,Runner
++from waflib.TaskGen import before_method,after_method,feature
++from waflib.Configure import conf
++WAF_CONFIG_H='config.h'
++DEFKEYS='define_key'
++INCKEYS='include_key'
++cfg_ver={'atleast-version':'>=','exact-version':'==','max-version':'<=',}
++SNIP_FUNCTION='''
++	int main() {
++	void *p;
++	p=(void*)(%s);
++	return 0;
++}
++'''
++SNIP_TYPE='''
++int main() {
++	if ((%(type_name)s *) 0) return 0;
++	if (sizeof (%(type_name)s)) return 0;
++}
++'''
++SNIP_CLASS='''
++int main() {
++	if (
++}
++'''
++SNIP_EMPTY_PROGRAM='''
++int main() {
++	return 0;
++}
++'''
++SNIP_FIELD='''
++int main() {
++	char *off;
++	off = (char*) &((%(type_name)s*)0)->%(field_name)s;
++	return (size_t) off < sizeof(%(type_name)s);
++}
++'''
++MACRO_TO_DESTOS={'__linux__':'linux','__GNU__':'gnu','__FreeBSD__':'freebsd','__NetBSD__':'netbsd','__OpenBSD__':'openbsd','__sun':'sunos','__hpux':'hpux','__sgi':'irix','_AIX':'aix','__CYGWIN__':'cygwin','__MSYS__':'msys','_UWIN':'uwin','_WIN64':'win32','_WIN32':'win32','__ENVIRONMENT_MAC_OS_X_VERSION_MIN_REQUIRED__':'darwin','__ENVIRONMENT_IPHONE_OS_VERSION_MIN_REQUIRED__':'darwin','__QNX__':'qnx','__native_client__':'nacl'}
++MACRO_TO_DEST_CPU={'__x86_64__':'x86_64','__i386__':'x86','__ia64__':'ia','__mips__':'mips','__sparc__':'sparc','__alpha__':'alpha','__arm__':'arm','__hppa__':'hppa','__powerpc__':'powerpc',}
++def parse_flags(self,line,uselib,env=None,force_static=False):
++	assert(isinstance(line,str))
++	env=env or self.env
++	app=env.append_value
++	appu=env.append_unique
++	lex=shlex.shlex(line,posix=False)
++	lex.whitespace_split=True
++	lex.commenters=''
++	lst=list(lex)
++	while lst:
++		x=lst.pop(0)
++		st=x[:2]
++		ot=x[2:]
++		if st=='-I'or st=='/I':
++			if not ot:ot=lst.pop(0)
++			appu('INCLUDES_'+uselib,[ot])
++		elif st=='-include':
++			tmp=[x,lst.pop(0)]
++			app('CFLAGS',tmp)
++			app('CXXFLAGS',tmp)
++		elif st=='-D'or(self.env.CXX_NAME=='msvc'and st=='/D'):
++			if not ot:ot=lst.pop(0)
++			app('DEFINES_'+uselib,[ot])
++		elif st=='-l':
++			if not ot:ot=lst.pop(0)
++			prefix=force_static and'STLIB_'or'LIB_'
++			appu(prefix+uselib,[ot])
++		elif st=='-L':
++			if not ot:ot=lst.pop(0)
++			appu('LIBPATH_'+uselib,[ot])
++		elif x=='-pthread'or x.startswith('+')or x.startswith('-std'):
++			app('CFLAGS_'+uselib,[x])
++			app('CXXFLAGS_'+uselib,[x])
++			app('LINKFLAGS_'+uselib,[x])
++		elif x=='-framework':
++			appu('FRAMEWORK_'+uselib,[lst.pop(0)])
++		elif x.startswith('-F'):
++			appu('FRAMEWORKPATH_'+uselib,[x[2:]])
++		elif x.startswith('-Wl'):
++			app('LINKFLAGS_'+uselib,[x])
++		elif x.startswith('-m')or x.startswith('-f')or x.startswith('-dynamic'):
++			app('CFLAGS_'+uselib,[x])
++			app('CXXFLAGS_'+uselib,[x])
++		elif x.startswith('-bundle'):
++			app('LINKFLAGS_'+uselib,[x])
++		elif x.startswith('-undefined'):
++			arg=lst.pop(0)
++			app('LINKFLAGS_'+uselib,[x,arg])
++		elif x.startswith('-arch')or x.startswith('-isysroot'):
++			tmp=[x,lst.pop(0)]
++			app('CFLAGS_'+uselib,tmp)
++			app('CXXFLAGS_'+uselib,tmp)
++			app('LINKFLAGS_'+uselib,tmp)
++		elif x.endswith('.a')or x.endswith('.so')or x.endswith('.dylib'):
++			appu('LINKFLAGS_'+uselib,[x])
++def ret_msg(self,f,kw):
++	if isinstance(f,str):
++		return f
++	return f(kw)
++def validate_cfg(self,kw):
++	if not'path'in kw:
++		if not self.env.PKGCONFIG:
++			self.find_program('pkg-config',var='PKGCONFIG')
++		kw['path']=self.env.PKGCONFIG
++	if'atleast_pkgconfig_version'in kw:
++		if not'msg'in kw:
++			kw['msg']='Checking for pkg-config version >= %r'%kw['atleast_pkgconfig_version']
++		return
++	if not'okmsg'in kw:
++		kw['okmsg']='yes'
++	if not'errmsg'in kw:
++		kw['errmsg']='not found'
++	if'modversion'in kw:
++		if not'msg'in kw:
++			kw['msg']='Checking for %r version'%kw['modversion']
++		return
++	for x in cfg_ver.keys():
++		y=x.replace('-','_')
++		if y in kw:
++			if not'package'in kw:
++				raise ValueError('%s requires a package'%x)
++			if not'msg'in kw:
++				kw['msg']='Checking for %r %s %s'%(kw['package'],cfg_ver[x],kw[y])
++			return
++	if not'msg'in kw:
++		kw['msg']='Checking for %r'%(kw['package']or kw['path'])
++def exec_cfg(self,kw):
++	if'atleast_pkgconfig_version'in kw:
++		cmd=[kw['path'],'--atleast-pkgconfig-version=%s'%kw['atleast_pkgconfig_version']]
++		self.cmd_and_log(cmd)
++		if not'okmsg'in kw:
++			kw['okmsg']='yes'
++		return
++	for x in cfg_ver:
++		y=x.replace('-','_')
++		if y in kw:
++			self.cmd_and_log([kw['path'],'--%s=%s'%(x,kw[y]),kw['package']])
++			if not'okmsg'in kw:
++				kw['okmsg']='yes'
++			self.define(self.have_define(kw.get('uselib_store',kw['package'])),1,0)
++			break
++	if'modversion'in kw:
++		version=self.cmd_and_log([kw['path'],'--modversion',kw['modversion']]).strip()
++		self.define('%s_VERSION'%Utils.quote_define_name(kw.get('uselib_store',kw['modversion'])),version)
++		return version
++	lst=[kw['path']]
++	defi=kw.get('define_variable',None)
++	if not defi:
++		defi=self.env.PKG_CONFIG_DEFINES or{}
++	for key,val in defi.items():
++		lst.append('--define-variable=%s=%s'%(key,val))
++	if kw['package']:
++		lst.extend(Utils.to_list(kw['package']))
++	if'variables'in kw:
++		env=kw.get('env',self.env)
++		uselib=kw.get('uselib_store',kw['package'].upper())
++		vars=Utils.to_list(kw['variables'])
++		for v in vars:
++			val=self.cmd_and_log(lst+['--variable='+v]).strip()
++			var='%s_%s'%(uselib,v)
++			env[var]=val
++		if not'okmsg'in kw:
++			kw['okmsg']='yes'
++		return
++	static=False
++	if'args'in kw:
++		args=Utils.to_list(kw['args'])
++		if'--static'in args or'--static-libs'in args:
++			static=True
++		lst+=args
++	ret=self.cmd_and_log(lst)
++	if not'okmsg'in kw:
++		kw['okmsg']='yes'
++	self.define(self.have_define(kw.get('uselib_store',kw['package'])),1,0)
++	self.parse_flags(ret,kw.get('uselib_store',kw['package'].upper()),kw.get('env',self.env),force_static=static)
++	return ret
++def check_cfg(self,*k,**kw):
++	if k:
++		lst=k[0].split()
++		kw['package']=lst[0]
++		kw['args']=' '.join(lst[1:])
++	self.validate_cfg(kw)
++	if'msg'in kw:
++		self.start_msg(kw['msg'])
++	ret=None
++	try:
++		ret=self.exec_cfg(kw)
++	except self.errors.WafError ,e:
++		if'errmsg'in kw:
++			self.end_msg(kw['errmsg'],'YELLOW')
++		if Logs.verbose>1:
++			raise
++		else:
++			self.fatal('The configuration failed')
++	else:
++		kw['success']=ret
++		if'okmsg'in kw:
++			self.end_msg(self.ret_msg(kw['okmsg'],kw))
++	return ret
++def validate_c(self,kw):
++	if not'env'in kw:
++		kw['env']=self.env.derive()
++	env=kw['env']
++	if not'compiler'in kw and not'features'in kw:
++		kw['compiler']='c'
++		if env['CXX_NAME']and Task.classes.get('cxx',None):
++			kw['compiler']='cxx'
++			if not self.env['CXX']:
++				self.fatal('a c++ compiler is required')
++		else:
++			if not self.env['CC']:
++				self.fatal('a c compiler is required')
++	if not'compile_mode'in kw:
++		kw['compile_mode']='c'
++		if'cxx'in Utils.to_list(kw.get('features',[]))or kw.get('compiler','')=='cxx':
++			kw['compile_mode']='cxx'
++	if not'type'in kw:
++		kw['type']='cprogram'
++	if not'features'in kw:
++		kw['features']=[kw['compile_mode'],kw['type']]
++	else:
++		kw['features']=Utils.to_list(kw['features'])
++	if not'compile_filename'in kw:
++		kw['compile_filename']='test.c'+((kw['compile_mode']=='cxx')and'pp'or'')
++	def to_header(dct):
++		if'header_name'in dct:
++			dct=Utils.to_list(dct['header_name'])
++			return''.join(['#include <%s>\n'%x for x in dct])
++		return''
++	if'framework_name'in kw:
++		fwkname=kw['framework_name']
++		if not'uselib_store'in kw:
++			kw['uselib_store']=fwkname.upper()
++		if not kw.get('no_header',False):
++			if not'header_name'in kw:
++				kw['header_name']=[]
++			fwk='%s/%s.h'%(fwkname,fwkname)
++			if kw.get('remove_dot_h',None):
++				fwk=fwk[:-2]
++			kw['header_name']=Utils.to_list(kw['header_name'])+[fwk]
++		kw['msg']='Checking for framework %s'%fwkname
++		kw['framework']=fwkname
++	if'function_name'in kw:
++		fu=kw['function_name']
++		if not'msg'in kw:
++			kw['msg']='Checking for function %s'%fu
++		kw['code']=to_header(kw)+SNIP_FUNCTION%fu
++		if not'uselib_store'in kw:
++			kw['uselib_store']=fu.upper()
++		if not'define_name'in kw:
++			kw['define_name']=self.have_define(fu)
++	elif'type_name'in kw:
++		tu=kw['type_name']
++		if not'header_name'in kw:
++			kw['header_name']='stdint.h'
++		if'field_name'in kw:
++			field=kw['field_name']
++			kw['code']=to_header(kw)+SNIP_FIELD%{'type_name':tu,'field_name':field}
++			if not'msg'in kw:
++				kw['msg']='Checking for field %s in %s'%(field,tu)
++			if not'define_name'in kw:
++				kw['define_name']=self.have_define((tu+'_'+field).upper())
++		else:
++			kw['code']=to_header(kw)+SNIP_TYPE%{'type_name':tu}
++			if not'msg'in kw:
++				kw['msg']='Checking for type %s'%tu
++			if not'define_name'in kw:
++				kw['define_name']=self.have_define(tu.upper())
++	elif'header_name'in kw:
++		if not'msg'in kw:
++			kw['msg']='Checking for header %s'%kw['header_name']
++		l=Utils.to_list(kw['header_name'])
++		assert len(l)>0,'list of headers in header_name is empty'
++		kw['code']=to_header(kw)+SNIP_EMPTY_PROGRAM
++		if not'uselib_store'in kw:
++			kw['uselib_store']=l[0].upper()
++		if not'define_name'in kw:
++			kw['define_name']=self.have_define(l[0])
++	if'lib'in kw:
++		if not'msg'in kw:
++			kw['msg']='Checking for library %s'%kw['lib']
++		if not'uselib_store'in kw:
++			kw['uselib_store']=kw['lib'].upper()
++	if'stlib'in kw:
++		if not'msg'in kw:
++			kw['msg']='Checking for static library %s'%kw['stlib']
++		if not'uselib_store'in kw:
++			kw['uselib_store']=kw['stlib'].upper()
++	if'fragment'in kw:
++		kw['code']=kw['fragment']
++		if not'msg'in kw:
++			kw['msg']='Checking for code snippet'
++		if not'errmsg'in kw:
++			kw['errmsg']='no'
++	for(flagsname,flagstype)in[('cxxflags','compiler'),('cflags','compiler'),('linkflags','linker')]:
++		if flagsname in kw:
++			if not'msg'in kw:
++				kw['msg']='Checking for %s flags %s'%(flagstype,kw[flagsname])
++			if not'errmsg'in kw:
++				kw['errmsg']='no'
++	if not'execute'in kw:
++		kw['execute']=False
++	if kw['execute']:
++		kw['features'].append('test_exec')
++	if not'errmsg'in kw:
++		kw['errmsg']='not found'
++	if not'okmsg'in kw:
++		kw['okmsg']='yes'
++	if not'code'in kw:
++		kw['code']=SNIP_EMPTY_PROGRAM
++	if self.env[INCKEYS]:
++		kw['code']='\n'.join(['#include <%s>'%x for x in self.env[INCKEYS]])+'\n'+kw['code']
++	if not kw.get('success'):kw['success']=None
++	if'define_name'in kw:
++		self.undefine(kw['define_name'])
++	assert'msg'in kw,'invalid parameters, read http://freehackers.org/~tnagy/wafbook/single.html#config_helpers_c'
++def post_check(self,*k,**kw):
++	is_success=0
++	if kw['execute']:
++		if kw['success']is not None:
++			if kw.get('define_ret',False):
++				is_success=kw['success']
++			else:
++				is_success=(kw['success']==0)
++	else:
++		is_success=(kw['success']==0)
++	if'define_name'in kw:
++		if'header_name'in kw or'function_name'in kw or'type_name'in kw or'fragment'in kw:
++			nm=kw['define_name']
++			if kw['execute']and kw.get('define_ret',None)and isinstance(is_success,str):
++				self.define(kw['define_name'],is_success,quote=kw.get('quote',1))
++			else:
++				self.define_cond(kw['define_name'],is_success)
++		else:
++			self.define_cond(kw['define_name'],is_success)
++	if'header_name'in kw:
++		if kw.get('auto_add_header_name',False):
++			self.env.append_value(INCKEYS,Utils.to_list(kw['header_name']))
++	if is_success and'uselib_store'in kw:
++		from waflib.Tools import ccroot
++		_vars=set([])
++		for x in kw['features']:
++			if x in ccroot.USELIB_VARS:
++				_vars|=ccroot.USELIB_VARS[x]
++		for k in _vars:
++			lk=k.lower()
++			if k=='INCLUDES':lk='includes'
++			if k=='DEFINES':lk='defines'
++			if lk in kw:
++				val=kw[lk]
++				if isinstance(val,str):
++					val=val.rstrip(os.path.sep)
++				self.env.append_unique(k+'_'+kw['uselib_store'],val)
++	return is_success
++def check(self,*k,**kw):
++	self.validate_c(kw)
++	self.start_msg(kw['msg'])
++	ret=None
++	try:
++		ret=self.run_c_code(*k,**kw)
++	except self.errors.ConfigurationError ,e:
++		self.end_msg(kw['errmsg'],'YELLOW')
++		if Logs.verbose>1:
++			raise
++		else:
++			self.fatal('The configuration failed')
++	else:
++		kw['success']=ret
++		self.end_msg(self.ret_msg(kw['okmsg'],kw))
++	ret=self.post_check(*k,**kw)
++	if not ret:
++		self.fatal('The configuration failed %r'%ret)
++	return ret
++class test_exec(Task.Task):
++	color='PINK'
++	def run(self):
++		if getattr(self.generator,'rpath',None):
++			if getattr(self.generator,'define_ret',False):
++				self.generator.bld.retval=self.generator.bld.cmd_and_log([self.inputs[0].abspath()])
++			else:
++				self.generator.bld.retval=self.generator.bld.exec_command([self.inputs[0].abspath()])
++		else:
++			env=self.env.env or{}
++			env.update(dict(os.environ))
++			for var in('LD_LIBRARY_PATH','DYLD_LIBRARY_PATH','PATH'):
++				env[var]=self.inputs[0].parent.abspath()+os.path.pathsep+env.get(var,'')
++			if getattr(self.generator,'define_ret',False):
++				self.generator.bld.retval=self.generator.bld.cmd_and_log([self.inputs[0].abspath()],env=env)
++			else:
++				self.generator.bld.retval=self.generator.bld.exec_command([self.inputs[0].abspath()],env=env)
++def test_exec_fun(self):
++	self.create_task('test_exec',self.link_task.outputs[0])
++CACHE_RESULTS=1
++COMPILE_ERRORS=2
++def run_c_code(self,*k,**kw):
++	lst=[str(v)for(p,v)in kw.items()if p!='env']
++	h=Utils.h_list(lst)
++	dir=self.bldnode.abspath()+os.sep+(not Utils.is_win32 and'.'or'')+'conf_check_'+Utils.to_hex(h)
++	try:
++		os.makedirs(dir)
++	except:
++		pass
++	try:
++		os.stat(dir)
++	except:
++		self.fatal('cannot use the configuration test folder %r'%dir)
++	cachemode=getattr(Options.options,'confcache',None)
++	if cachemode==CACHE_RESULTS:
++		try:
++			proj=ConfigSet.ConfigSet(os.path.join(dir,'cache_run_c_code'))
++			ret=proj['cache_run_c_code']
++		except:
++			pass
++		else:
++			if isinstance(ret,str)and ret.startswith('Test does not build'):
++				self.fatal(ret)
++			return ret
++	bdir=os.path.join(dir,'testbuild')
++	if not os.path.exists(bdir):
++		os.makedirs(bdir)
++	self.test_bld=bld=Build.BuildContext(top_dir=dir,out_dir=bdir)
++	bld.init_dirs()
++	bld.progress_bar=0
++	bld.targets='*'
++	if kw['compile_filename']:
++		node=bld.srcnode.make_node(kw['compile_filename'])
++		node.write(kw['code'])
++	bld.logger=self.logger
++	bld.all_envs.update(self.all_envs)
++	bld.env=kw['env']
++	o=bld(features=kw['features'],source=kw['compile_filename'],target='testprog')
++	for k,v in kw.items():
++		setattr(o,k,v)
++	self.to_log("==>\n%s\n<=="%kw['code'])
++	bld.targets='*'
++	ret=-1
++	try:
++		try:
++			bld.compile()
++		except Errors.WafError:
++			ret='Test does not build: %s'%Utils.ex_stack()
++			self.fatal(ret)
++		else:
++			ret=getattr(bld,'retval',0)
++	finally:
++		proj=ConfigSet.ConfigSet()
++		proj['cache_run_c_code']=ret
++		proj.store(os.path.join(dir,'cache_run_c_code'))
++	return ret
++def check_cxx(self,*k,**kw):
++	kw['compiler']='cxx'
++	return self.check(*k,**kw)
++def check_cc(self,*k,**kw):
++	kw['compiler']='c'
++	return self.check(*k,**kw)
++def define(self,key,val,quote=True):
++	assert key and isinstance(key,str)
++	if isinstance(val,int)or isinstance(val,float):
++		s='%s=%s'
++	else:
++		s=quote and'%s="%s"'or'%s=%s'
++	app=s%(key,str(val))
++	ban=key+'='
++	lst=self.env['DEFINES']
++	for x in lst:
++		if x.startswith(ban):
++			lst[lst.index(x)]=app
++			break
++	else:
++		self.env.append_value('DEFINES',app)
++	self.env.append_unique(DEFKEYS,key)
++def undefine(self,key):
++	assert key and isinstance(key,str)
++	ban=key+'='
++	lst=[x for x in self.env['DEFINES']if not x.startswith(ban)]
++	self.env['DEFINES']=lst
++	self.env.append_unique(DEFKEYS,key)
++def define_cond(self,key,val):
++	assert key and isinstance(key,str)
++	if val:
++		self.define(key,1)
++	else:
++		self.undefine(key)
++def is_defined(self,key):
++	assert key and isinstance(key,str)
++	ban=key+'='
++	for x in self.env['DEFINES']:
++		if x.startswith(ban):
++			return True
++	return False
++def get_define(self,key):
++	assert key and isinstance(key,str)
++	ban=key+'='
++	for x in self.env['DEFINES']:
++		if x.startswith(ban):
++			return x[len(ban):]
++	return None
++def have_define(self,key):
++	return self.__dict__.get('HAVE_PAT','HAVE_%s')%Utils.quote_define_name(key)
++def write_config_header(self,configfile='',guard='',top=False,env=None,defines=True,headers=False,remove=True):
++	if not configfile:configfile=WAF_CONFIG_H
++	waf_guard=guard or'_%s_WAF'%Utils.quote_define_name(configfile)
++	node=top and self.bldnode or self.path.get_bld()
++	node=node.make_node(configfile)
++	node.parent.mkdir()
++	lst=['/* WARNING! All changes made to this file will be lost! */\n']
++	lst.append('#ifndef %s\n#define %s\n'%(waf_guard,waf_guard))
++	lst.append(self.get_config_header(defines,headers))
++	lst.append('\n#endif /* %s */\n'%waf_guard)
++	node.write('\n'.join(lst))
++	env=env or self.env
++	env.append_unique(Build.CFG_FILES,[node.abspath()])
++	if remove:
++		for key in self.env[DEFKEYS]:
++			self.undefine(key)
++		self.env[DEFKEYS]=[]
++def get_config_header(self,defines=True,headers=False):
++	lst=[]
++	if headers:
++		for x in self.env[INCKEYS]:
++			lst.append('#include <%s>'%x)
++	if defines:
++		for x in self.env[DEFKEYS]:
++			if self.is_defined(x):
++				val=self.get_define(x)
++				lst.append('#define %s %s'%(x,val))
++			else:
++				lst.append('/* #undef %s */'%x)
++	return"\n".join(lst)
++def cc_add_flags(conf):
++	conf.add_os_flags('CPPFLAGS','CFLAGS')
++	conf.add_os_flags('CFLAGS')
++def cxx_add_flags(conf):
++	conf.add_os_flags('CPPFLAGS','CXXFLAGS')
++	conf.add_os_flags('CXXFLAGS')
++def link_add_flags(conf):
++	conf.add_os_flags('LINKFLAGS')
++	conf.add_os_flags('LDFLAGS','LINKFLAGS')
++def cc_load_tools(conf):
++	if not conf.env.DEST_OS:
++		conf.env.DEST_OS=Utils.unversioned_sys_platform()
++	conf.load('c')
++def cxx_load_tools(conf):
++	if not conf.env.DEST_OS:
++		conf.env.DEST_OS=Utils.unversioned_sys_platform()
++	conf.load('cxx')
++def get_cc_version(conf,cc,gcc=False,icc=False):
++	cmd=cc+['-dM','-E','-']
++	env=conf.env.env or None
++	try:
++		p=Utils.subprocess.Popen(cmd,stdin=Utils.subprocess.PIPE,stdout=Utils.subprocess.PIPE,stderr=Utils.subprocess.PIPE,env=env)
++		p.stdin.write('\n')
++		out=p.communicate()[0]
++	except:
++		conf.fatal('Could not determine the compiler version %r'%cmd)
++	if not isinstance(out,str):
++		out=out.decode(sys.stdout.encoding)
++	if gcc:
++		if out.find('__INTEL_COMPILER')>=0:
++			conf.fatal('The intel compiler pretends to be gcc')
++		if out.find('__GNUC__')<0:
++			conf.fatal('Could not determine the compiler type')
++	if icc and out.find('__INTEL_COMPILER')<0:
++		conf.fatal('Not icc/icpc')
++	k={}
++	if icc or gcc:
++		out=out.split('\n')
++		for line in out:
++			lst=shlex.split(line)
++			if len(lst)>2:
++				key=lst[1]
++				val=lst[2]
++				k[key]=val
++		def isD(var):
++			return var in k
++		def isT(var):
++			return var in k and k[var]!='0'
++		if not conf.env.DEST_OS:
++			conf.env.DEST_OS=''
++		for i in MACRO_TO_DESTOS:
++			if isD(i):
++				conf.env.DEST_OS=MACRO_TO_DESTOS[i]
++				break
++		else:
++			if isD('__APPLE__')and isD('__MACH__'):
++				conf.env.DEST_OS='darwin'
++			elif isD('__unix__'):
++				conf.env.DEST_OS='generic'
++		if isD('__ELF__'):
++			conf.env.DEST_BINFMT='elf'
++		elif isD('__WINNT__')or isD('__CYGWIN__'):
++			conf.env.DEST_BINFMT='pe'
++			conf.env.LIBDIR=conf.env['PREFIX']+'/bin'
++		elif isD('__APPLE__'):
++			conf.env.DEST_BINFMT='mac-o'
++		if not conf.env.DEST_BINFMT:
++			conf.env.DEST_BINFMT=Utils.destos_to_binfmt(conf.env.DEST_OS)
++		for i in MACRO_TO_DEST_CPU:
++			if isD(i):
++				conf.env.DEST_CPU=MACRO_TO_DEST_CPU[i]
++				break
++		Logs.debug('ccroot: dest platform: '+' '.join([conf.env[x]or'?'for x in('DEST_OS','DEST_BINFMT','DEST_CPU')]))
++		if icc:
++			ver=k['__INTEL_COMPILER']
++			conf.env['CC_VERSION']=(ver[:-2],ver[-2],ver[-1])
++		else:
++			conf.env['CC_VERSION']=(k['__GNUC__'],k['__GNUC_MINOR__'],k['__GNUC_PATCHLEVEL__'])
++	return k
++def get_xlc_version(conf,cc):
++	version_re=re.compile(r"IBM XL C/C\+\+.*, V(?P<major>\d*)\.(?P<minor>\d*)",re.I).search
++	cmd=cc+['-qversion']
++	try:
++		out,err=conf.cmd_and_log(cmd,output=0)
++	except Errors.WafError:
++		conf.fatal('Could not find xlc %r'%cmd)
++	if out:match=version_re(out)
++	else:match=version_re(err)
++	if not match:
++		conf.fatal('Could not determine the XLC version.')
++	k=match.groupdict()
++	conf.env['CC_VERSION']=(k['major'],k['minor'])
++def add_as_needed(self):
++	if self.env.DEST_BINFMT=='elf'and'gcc'in(self.env.CXX_NAME,self.env.CC_NAME):
++		self.env.append_unique('LINKFLAGS','--as-needed')
++class cfgtask(Task.TaskBase):
++	def display(self):
++		return''
++	def runnable_status(self):
++		return Task.RUN_ME
++	def run(self):
++		conf=self.conf
++		bld=Build.BuildContext(top_dir=conf.srcnode.abspath(),out_dir=conf.bldnode.abspath())
++		bld.env=conf.env
++		bld.init_dirs()
++		bld.in_msg=1
++		bld.logger=self.logger
++		try:
++			bld.check(**self.args)
++		except:
++			return 1
++def multicheck(self,*k,**kw):
++	self.start_msg(kw.get('msg','Executing %d configuration tests'%len(k)))
++	class par(object):
++		def __init__(self):
++			self.keep=False
++			self.cache_global=Options.cache_global
++			self.nocache=Options.options.nocache
++			self.returned_tasks=[]
++		def total(self):
++			return len(tasks)
++		def to_log(self,*k,**kw):
++			return
++	bld=par()
++	tasks=[]
++	for dct in k:
++		x=cfgtask(bld=bld)
++		tasks.append(x)
++		x.args=dct
++		x.bld=bld
++		x.conf=self
++		x.args=dct
++		x.logger=Logs.make_mem_logger(str(id(x)),self.logger)
++	def it():
++		yield tasks
++		while 1:
++			yield[]
++	p=Runner.Parallel(bld,Options.options.jobs)
++	p.biter=it()
++	p.start()
++	for x in tasks:
++		x.logger.memhandler.flush()
++	for x in tasks:
++		if x.hasrun!=Task.SUCCESS:
++			self.end_msg(kw.get('errmsg','no'),color='YELLOW')
++			self.fatal(kw.get('fatalmsg',None)or'One of the tests has failed, see the config.log for more information')
++	self.end_msg('ok')
++
++conf(parse_flags)
++conf(ret_msg)
++conf(validate_cfg)
++conf(exec_cfg)
++conf(check_cfg)
++conf(validate_c)
++conf(post_check)
++conf(check)
++feature('test_exec')(test_exec_fun)
++after_method('apply_link')(test_exec_fun)
++conf(run_c_code)
++conf(check_cxx)
++conf(check_cc)
++conf(define)
++conf(undefine)
++conf(define_cond)
++conf(is_defined)
++conf(get_define)
++conf(have_define)
++conf(write_config_header)
++conf(get_config_header)
++conf(cc_add_flags)
++conf(cxx_add_flags)
++conf(link_add_flags)
++conf(cc_load_tools)
++conf(cxx_load_tools)
++conf(get_cc_version)
++conf(get_xlc_version)
++conf(add_as_needed)
++conf(multicheck)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/ccroot.py
+@@ -0,0 +1,375 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import os,sys,re
++from waflib import TaskGen,Task,Utils,Logs,Build,Options,Node,Errors
++from waflib.Logs import error,debug,warn
++from waflib.TaskGen import after_method,before_method,feature,taskgen_method,extension
++from waflib.Tools import c_aliases,c_preproc,c_config,c_osx,c_tests
++from waflib.Configure import conf
++USELIB_VARS=Utils.defaultdict(set)
++USELIB_VARS['c']=set(['INCLUDES','FRAMEWORKPATH','DEFINES','CPPFLAGS','CCDEPS','CFLAGS','ARCH'])
++USELIB_VARS['cxx']=set(['INCLUDES','FRAMEWORKPATH','DEFINES','CPPFLAGS','CXXDEPS','CXXFLAGS','ARCH'])
++USELIB_VARS['d']=set(['INCLUDES','DFLAGS'])
++USELIB_VARS['cprogram']=USELIB_VARS['cxxprogram']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS','FRAMEWORK','FRAMEWORKPATH','ARCH'])
++USELIB_VARS['cshlib']=USELIB_VARS['cxxshlib']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS','FRAMEWORK','FRAMEWORKPATH','ARCH'])
++USELIB_VARS['cstlib']=USELIB_VARS['cxxstlib']=set(['ARFLAGS','LINKDEPS'])
++USELIB_VARS['dprogram']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS'])
++USELIB_VARS['dshlib']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS'])
++USELIB_VARS['dstlib']=set(['ARFLAGS','LINKDEPS'])
++USELIB_VARS['go']=set(['GOCFLAGS'])
++USELIB_VARS['goprogram']=set(['GOLFLAGS'])
++USELIB_VARS['asm']=set(['ASFLAGS'])
++def create_compiled_task(self,name,node):
++	out='%s.%d.o'%(node.name,self.idx)
++	task=self.create_task(name,node,node.parent.find_or_declare(out))
++	try:
++		self.compiled_tasks.append(task)
++	except AttributeError:
++		self.compiled_tasks=[task]
++	return task
++def to_incnodes(self,inlst):
++	lst=[]
++	seen=set([])
++	for x in self.to_list(inlst):
++		if x in seen or not x:
++			continue
++		seen.add(x)
++		if isinstance(x,Node.Node):
++			lst.append(x)
++		else:
++			if os.path.isabs(x):
++				lst.append(self.bld.root.make_node(x)or x)
++			else:
++				if x[0]=='#':
++					p=self.bld.bldnode.make_node(x[1:])
++					v=self.bld.srcnode.make_node(x[1:])
++				else:
++					p=self.path.get_bld().make_node(x)
++					v=self.path.make_node(x)
++				if p.is_child_of(self.bld.bldnode):
++					p.mkdir()
++				lst.append(p)
++				lst.append(v)
++	return lst
++def apply_incpaths(self):
++	lst=self.to_incnodes(self.to_list(getattr(self,'includes',[]))+self.env['INCLUDES'])
++	self.includes_nodes=lst
++	self.env['INCPATHS']=[x.abspath()for x in lst]
++class link_task(Task.Task):
++	color='YELLOW'
++	inst_to=None
++	chmod=Utils.O644
++	def add_target(self,target):
++		if isinstance(target,str):
++			pattern=self.env[self.__class__.__name__+'_PATTERN']
++			if not pattern:
++				pattern='%s'
++			folder,name=os.path.split(target)
++			if self.__class__.__name__.find('shlib')>0:
++				if self.env.DEST_BINFMT=='pe'and getattr(self.generator,'vnum',None):
++					name=name+'-'+self.generator.vnum.split('.')[0]
++			tmp=folder+os.sep+pattern%name
++			target=self.generator.path.find_or_declare(tmp)
++		self.set_outputs(target)
++class stlink_task(link_task):
++	run_str='${AR} ${ARFLAGS} ${AR_TGT_F}${TGT} ${AR_SRC_F}${SRC}'
++def rm_tgt(cls):
++	old=cls.run
++	def wrap(self):
++		try:os.remove(self.outputs[0].abspath())
++		except OSError:pass
++		return old(self)
++	setattr(cls,'run',wrap)
++rm_tgt(stlink_task)
++def apply_link(self):
++	for x in self.features:
++		if x=='cprogram'and'cxx'in self.features:
++			x='cxxprogram'
++		elif x=='cshlib'and'cxx'in self.features:
++			x='cxxshlib'
++		if x in Task.classes:
++			if issubclass(Task.classes[x],link_task):
++				link=x
++				break
++	else:
++		return
++	objs=[t.outputs[0]for t in getattr(self,'compiled_tasks',[])]
++	self.link_task=self.create_task(link,objs)
++	self.link_task.add_target(self.target)
++	try:
++		inst_to=self.install_path
++	except AttributeError:
++		inst_to=self.link_task.__class__.inst_to
++	if inst_to:
++		self.install_task=self.bld.install_files(inst_to,self.link_task.outputs[:],env=self.env,chmod=self.link_task.chmod)
++def use_rec(self,name,**kw):
++	if name in self.tmp_use_not or name in self.tmp_use_seen:
++		return
++	try:
++		y=self.bld.get_tgen_by_name(name)
++	except Errors.WafError:
++		self.uselib.append(name)
++		self.tmp_use_not.add(name)
++		return
++	self.tmp_use_seen.append(name)
++	y.post()
++	y.tmp_use_objects=objects=kw.get('objects',True)
++	y.tmp_use_stlib=stlib=kw.get('stlib',True)
++	try:
++		link_task=y.link_task
++	except AttributeError:
++		y.tmp_use_var=''
++	else:
++		objects=False
++		if not isinstance(y.link_task,stlink_task):
++			stlib=False
++			y.tmp_use_var='LIB'
++		else:
++			y.tmp_use_var='STLIB'
++	p=self.tmp_use_prec
++	for x in self.to_list(getattr(y,'use',[])):
++		try:
++			p[x].append(name)
++		except:
++			p[x]=[name]
++		self.use_rec(x,objects=objects,stlib=stlib)
++def process_use(self):
++	use_not=self.tmp_use_not=set([])
++	use_seen=self.tmp_use_seen=[]
++	use_prec=self.tmp_use_prec={}
++	self.uselib=self.to_list(getattr(self,'uselib',[]))
++	self.includes=self.to_list(getattr(self,'includes',[]))
++	names=self.to_list(getattr(self,'use',[]))
++	for x in names:
++		self.use_rec(x)
++	for x in use_not:
++		if x in use_prec:
++			del use_prec[x]
++	out=[]
++	tmp=[]
++	for x in self.tmp_use_seen:
++		for k in use_prec.values():
++			if x in k:
++				break
++		else:
++			tmp.append(x)
++	while tmp:
++		e=tmp.pop()
++		out.append(e)
++		try:
++			nlst=use_prec[e]
++		except KeyError:
++			pass
++		else:
++			del use_prec[e]
++			for x in nlst:
++				for y in use_prec:
++					if x in use_prec[y]:
++						break
++				else:
++					tmp.append(x)
++	if use_prec:
++		raise Errors.WafError('Cycle detected in the use processing %r'%use_prec)
++	out.reverse()
++	link_task=getattr(self,'link_task',None)
++	for x in out:
++		y=self.bld.get_tgen_by_name(x)
++		var=y.tmp_use_var
++		if var and link_task:
++			if var=='LIB'or y.tmp_use_stlib:
++				self.env.append_value(var,[y.target[y.target.rfind(os.sep)+1:]])
++				self.link_task.dep_nodes.extend(y.link_task.outputs)
++				tmp_path=y.link_task.outputs[0].parent.path_from(self.bld.bldnode)
++				self.env.append_value(var+'PATH',[tmp_path])
++		else:
++			if y.tmp_use_objects:
++				self.add_objects_from_tgen(y)
++		if getattr(y,'export_includes',None):
++			self.includes.extend(y.to_incnodes(y.export_includes))
++	for x in names:
++		try:
++			y=self.bld.get_tgen_by_name(x)
++		except:
++			if not self.env['STLIB_'+x]and not x in self.uselib:
++				self.uselib.append(x)
++		else:
++			for k in self.to_list(getattr(y,'uselib',[])):
++				if not self.env['STLIB_'+k]and not k in self.uselib:
++					self.uselib.append(k)
++def add_objects_from_tgen(self,tg):
++	try:
++		link_task=self.link_task
++	except AttributeError:
++		pass
++	else:
++		for tsk in getattr(tg,'compiled_tasks',[]):
++			for x in tsk.outputs:
++				if x.name.endswith('.o')or x.name.endswith('.obj'):
++					link_task.inputs.append(x)
++def get_uselib_vars(self):
++	_vars=set([])
++	for x in self.features:
++		if x in USELIB_VARS:
++			_vars|=USELIB_VARS[x]
++	return _vars
++def propagate_uselib_vars(self):
++	_vars=self.get_uselib_vars()
++	env=self.env
++	for x in _vars:
++		y=x.lower()
++		env.append_unique(x,self.to_list(getattr(self,y,[])))
++	for x in self.features:
++		for var in _vars:
++			compvar='%s_%s'%(var,x)
++			env.append_value(var,env[compvar])
++	for x in self.to_list(getattr(self,'uselib',[])):
++		for v in _vars:
++			env.append_value(v,env[v+'_'+x])
++def apply_implib(self):
++	if not self.env.DEST_BINFMT=='pe':
++		return
++	dll=self.link_task.outputs[0]
++	if isinstance(self.target,Node.Node):
++		name=self.target.name
++	else:
++		name=os.path.split(self.target)[1]
++	implib=self.env['implib_PATTERN']%name
++	implib=dll.parent.find_or_declare(implib)
++	self.env.append_value('LINKFLAGS',self.env['IMPLIB_ST']%implib.bldpath())
++	self.link_task.outputs.append(implib)
++	if getattr(self,'defs',None)and self.env.DEST_BINFMT=='pe':
++		node=self.path.find_resource(self.defs)
++		if not node:
++			raise Errors.WafError('invalid def file %r'%self.defs)
++		if'msvc'in(self.env.CC_NAME,self.env.CXX_NAME):
++			self.env.append_value('LINKFLAGS','/def:%s'%node.path_from(self.bld.bldnode))
++			self.link_task.dep_nodes.append(node)
++		else:
++			self.link_task.inputs.append(node)
++	try:
++		inst_to=self.install_path
++	except AttributeError:
++		inst_to=self.link_task.__class__.inst_to
++	if not inst_to:
++		return
++	self.implib_install_task=self.bld.install_as('${PREFIX}/lib/%s'%implib.name,implib,self.env)
++def apply_vnum(self):
++	if not getattr(self,'vnum','')or os.name!='posix'or self.env.DEST_BINFMT not in('elf','mac-o'):
++		return
++	link=self.link_task
++	nums=self.vnum.split('.')
++	node=link.outputs[0]
++	libname=node.name
++	if libname.endswith('.dylib'):
++		name3=libname.replace('.dylib','.%s.dylib'%self.vnum)
++		name2=libname.replace('.dylib','.%s.dylib'%nums[0])
++	else:
++		name3=libname+'.'+self.vnum
++		name2=libname+'.'+nums[0]
++	if self.env.SONAME_ST:
++		v=self.env.SONAME_ST%name2
++		self.env.append_value('LINKFLAGS',v.split())
++	tsk=self.create_task('vnum',node,[node.parent.find_or_declare(name2),node.parent.find_or_declare(name3)])
++	if getattr(self.bld,'is_install',None):
++		self.install_task.hasrun=Task.SKIP_ME
++		bld=self.bld
++		path=self.install_task.dest
++		t1=bld.install_as(path+os.sep+name3,node,env=self.env,chmod=self.link_task.chmod)
++		t2=bld.symlink_as(path+os.sep+name2,name3)
++		t3=bld.symlink_as(path+os.sep+libname,name3)
++		self.vnum_install_task=(t1,t2,t3)
++	if'-dynamiclib'in self.env['LINKFLAGS']and getattr(self,'install_task',None):
++		path=os.path.join(self.install_task.get_install_path(),self.link_task.outputs[0].name)
++		self.env.append_value('LINKFLAGS',['-install_name',path])
++class vnum(Task.Task):
++	color='CYAN'
++	quient=True
++	ext_in=['.bin']
++	def run(self):
++		for x in self.outputs:
++			path=x.abspath()
++			try:
++				os.remove(path)
++			except OSError:
++				pass
++			try:
++				os.symlink(self.inputs[0].name,path)
++			except OSError:
++				return 1
++class fake_shlib(link_task):
++	def runnable_status(self):
++		for t in self.run_after:
++			if not t.hasrun:
++				return Task.ASK_LATER
++		for x in self.outputs:
++			x.sig=Utils.h_file(x.abspath())
++		return Task.SKIP_ME
++class fake_stlib(stlink_task):
++	def runnable_status(self):
++		for t in self.run_after:
++			if not t.hasrun:
++				return Task.ASK_LATER
++		for x in self.outputs:
++			x.sig=Utils.h_file(x.abspath())
++		return Task.SKIP_ME
++def read_shlib(self,name,paths=[]):
++	return self(name=name,features='fake_lib',lib_paths=paths,lib_type='shlib')
++def read_stlib(self,name,paths=[]):
++	return self(name=name,features='fake_lib',lib_paths=paths,lib_type='stlib')
++lib_patterns={'shlib':['lib%s.so','%s.so','lib%s.dll','%s.dll'],'stlib':['lib%s.a','%s.a','lib%s.dll','%s.dll','lib%s.lib','%s.lib'],}
++def process_lib(self):
++	node=None
++	names=[x%self.name for x in lib_patterns[self.lib_type]]
++	for x in self.lib_paths+[self.path,'/usr/lib64','/usr/lib','/usr/local/lib64','/usr/local/lib']:
++		if not isinstance(x,Node.Node):
++			x=self.bld.root.find_node(x)or self.path.find_node(x)
++			if not x:
++				continue
++		for y in names:
++			node=x.find_node(y)
++			if node:
++				node.sig=Utils.h_file(node.abspath())
++				break
++		else:
++			continue
++		break
++	else:
++		raise Errors.WafError('could not find library %r'%self.name)
++	self.link_task=self.create_task('fake_%s'%self.lib_type,[],[node])
++	self.target=self.name
++class fake_o(Task.Task):
++	def runnable_status(self):
++		return Task.SKIP_ME
++def add_those_o_files(self,node):
++	tsk=self.create_task('fake_o',[],node)
++	try:
++		self.compiled_tasks.append(tsk)
++	except AttributeError:
++		self.compiled_tasks=[tsk]
++
++taskgen_method(create_compiled_task)
++taskgen_method(to_incnodes)
++feature('c','cxx','d','go','asm','fc','includes')(apply_incpaths)
++after_method('propagate_uselib_vars','process_source')(apply_incpaths)
++feature('c','cxx','d','go','fc','asm')(apply_link)
++after_method('process_source')(apply_link)
++taskgen_method(use_rec)
++feature('c','cxx','d','use','fc')(process_use)
++before_method('apply_incpaths','propagate_uselib_vars')(process_use)
++after_method('apply_link','process_source')(process_use)
++taskgen_method(add_objects_from_tgen)
++taskgen_method(get_uselib_vars)
++feature('c','cxx','d','fc','javac','cs','uselib')(propagate_uselib_vars)
++after_method('process_use')(propagate_uselib_vars)
++feature('cshlib','cxxshlib','fcshlib')(apply_implib)
++after_method('apply_link')(apply_implib)
++feature('cshlib','cxxshlib','dshlib','fcshlib','vnum')(apply_vnum)
++after_method('apply_link')(apply_vnum)
++conf(read_shlib)
++conf(read_stlib)
++feature('fake_lib')(process_lib)
++extension('.o','.obj')(add_those_o_files)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/compiler_c.py
+@@ -0,0 +1,39 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,imp,types
++from waflib.Tools import ccroot
++from waflib import Utils,Configure
++from waflib.Logs import debug
++c_compiler={'win32':['msvc','gcc'],'cygwin':['gcc'],'darwin':['gcc'],'aix':['xlc','gcc'],'linux':['gcc','icc'],'sunos':['suncc','gcc'],'irix':['gcc','irixcc'],'hpux':['gcc'],'gnu':['gcc'],'java':['gcc','msvc','icc'],'default':['gcc'],}
++def configure(conf):
++	try:test_for_compiler=conf.options.check_c_compiler
++	except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_c')")
++	for compiler in test_for_compiler.split():
++		conf.env.stash()
++		conf.start_msg('Checking for %r (c compiler)'%compiler)
++		try:
++			conf.load(compiler)
++		except conf.errors.ConfigurationError ,e:
++			conf.env.revert()
++			conf.end_msg(False)
++			debug('compiler_c: %r'%e)
++		else:
++			if conf.env['CC']:
++				conf.end_msg(conf.env.get_flat('CC'))
++				conf.env['COMPILER_CC']=compiler
++				break
++			conf.end_msg(False)
++	else:
++		conf.fatal('could not configure a c compiler!')
++def options(opt):
++	opt.load_special_tools('c_*.py',ban=['c_dumbpreproc.py'])
++	global c_compiler
++	build_platform=Utils.unversioned_sys_platform()
++	possible_compiler_list=c_compiler[build_platform in c_compiler and build_platform or'default']
++	test_for_compiler=' '.join(possible_compiler_list)
++	cc_compiler_opts=opt.add_option_group("C Compiler Options")
++	cc_compiler_opts.add_option('--check-c-compiler',default="%s"%test_for_compiler,help='On this platform (%s) the following C-Compiler will be checked by default: "%s"'%(build_platform,test_for_compiler),dest="check_c_compiler")
++	for x in test_for_compiler.split():
++		opt.load('%s'%x)
+--- /dev/null
++++ ardour3/waflib/Tools/compiler_cxx.py
+@@ -0,0 +1,39 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,imp,types
++from waflib.Tools import ccroot
++from waflib import Utils,Configure
++from waflib.Logs import debug
++cxx_compiler={'win32':['msvc','g++'],'cygwin':['g++'],'darwin':['g++'],'aix':['xlc++','g++'],'linux':['g++','icpc'],'sunos':['sunc++','g++'],'irix':['g++'],'hpux':['g++'],'gnu':['g++'],'java':['g++','msvc','icpc'],'default':['g++']}
++def configure(conf):
++	try:test_for_compiler=conf.options.check_cxx_compiler
++	except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_cxx')")
++	for compiler in test_for_compiler.split():
++		conf.env.stash()
++		conf.start_msg('Checking for %r (c++ compiler)'%compiler)
++		try:
++			conf.load(compiler)
++		except conf.errors.ConfigurationError ,e:
++			conf.env.revert()
++			conf.end_msg(False)
++			debug('compiler_cxx: %r'%e)
++		else:
++			if conf.env['CXX']:
++				conf.end_msg(conf.env.get_flat('CXX'))
++				conf.env['COMPILER_CXX']=compiler
++				break
++			conf.end_msg(False)
++	else:
++		conf.fatal('could not configure a c++ compiler!')
++def options(opt):
++	opt.load_special_tools('cxx_*.py')
++	global cxx_compiler
++	build_platform=Utils.unversioned_sys_platform()
++	possible_compiler_list=cxx_compiler[build_platform in cxx_compiler and build_platform or'default']
++	test_for_compiler=' '.join(possible_compiler_list)
++	cxx_compiler_opts=opt.add_option_group('C++ Compiler Options')
++	cxx_compiler_opts.add_option('--check-cxx-compiler',default="%s"%test_for_compiler,help='On this platform (%s) the following C++ Compiler will be checked by default: "%s"'%(build_platform,test_for_compiler),dest="check_cxx_compiler")
++	for x in test_for_compiler.split():
++		opt.load('%s'%x)
+--- /dev/null
++++ ardour3/waflib/Tools/compiler_d.py
+@@ -0,0 +1,30 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,imp,types
++from waflib import Utils,Configure,Options,Logs
++def configure(conf):
++	for compiler in conf.options.dcheck.split(','):
++		conf.env.stash()
++		conf.start_msg('Checking for %r (d compiler)'%compiler)
++		try:
++			conf.load(compiler)
++		except conf.errors.ConfigurationError ,e:
++			conf.env.revert()
++			conf.end_msg(False)
++			Logs.debug('compiler_cxx: %r'%e)
++		else:
++			if conf.env.D:
++				conf.end_msg(conf.env.get_flat('D'))
++				conf.env['COMPILER_D']=compiler
++				conf.env.D_COMPILER=conf.env.D
++				break
++			conf.end_msg(False)
++	else:
++		conf.fatal('no suitable d compiler was found')
++def options(opt):
++	d_compiler_opts=opt.add_option_group('D Compiler Options')
++	d_compiler_opts.add_option('--check-d-compiler',default='gdc,dmd',action='store',help='check for the compiler [Default:gdc,dmd]',dest='dcheck')
++	for d_compiler in['gdc','dmd']:
++		opt.load('%s'%d_compiler)
+--- /dev/null
++++ ardour3/waflib/Tools/compiler_fc.py
+@@ -0,0 +1,43 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,imp,types
++from waflib import Utils,Configure,Options,Logs,Errors
++from waflib.Tools import fc
++fc_compiler={'win32':['gfortran','ifort'],'darwin':['gfortran','g95','ifort'],'linux':['gfortran','g95','ifort'],'java':['gfortran','g95','ifort'],'default':['gfortran'],'aix':['gfortran']}
++def __list_possible_compiler(platform):
++	try:
++		return fc_compiler[platform]
++	except KeyError:
++		return fc_compiler["default"]
++def configure(conf):
++	try:test_for_compiler=conf.options.check_fc
++	except AttributeError:conf.fatal("Add options(opt): opt.load('compiler_fc')")
++	for compiler in test_for_compiler.split():
++		conf.env.stash()
++		conf.start_msg('Checking for %r (fortran compiler)'%compiler)
++		try:
++			conf.load(compiler)
++		except conf.errors.ConfigurationError ,e:
++			conf.env.revert()
++			conf.end_msg(False)
++			Logs.debug('compiler_fortran: %r'%e)
++		else:
++			if conf.env['FC']:
++				conf.end_msg(conf.env.get_flat('FC'))
++				conf.env.COMPILER_FORTRAN=compiler
++				break
++			conf.end_msg(False)
++	else:
++		conf.fatal('could not configure a fortran compiler!')
++def options(opt):
++	opt.load_special_tools('fc_*.py')
++	build_platform=Utils.unversioned_sys_platform()
++	detected_platform=Options.platform
++	possible_compiler_list=__list_possible_compiler(detected_platform)
++	test_for_compiler=' '.join(possible_compiler_list)
++	fortran_compiler_opts=opt.add_option_group("Fortran Compiler Options")
++	fortran_compiler_opts.add_option('--check-fortran-compiler',default="%s"%test_for_compiler,help='On this platform (%s) the following Fortran Compiler will be checked by default: "%s"'%(detected_platform,test_for_compiler),dest="check_fc")
++	for compiler in test_for_compiler.split():
++		opt.load('%s'%compiler)
+--- /dev/null
++++ ardour3/waflib/Tools/c_osx.py
+@@ -0,0 +1,121 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,shutil,sys,platform
++from waflib import TaskGen,Task,Build,Options,Utils,Errors
++from waflib.TaskGen import taskgen_method,feature,after_method,before_method
++app_info='''
++<?xml version="1.0" encoding="UTF-8"?>
++<!DOCTYPE plist SYSTEM "file://localhost/System/Library/DTDs/PropertyList.dtd">
++<plist version="0.9">
++<dict>
++	<key>CFBundlePackageType</key>
++	<string>APPL</string>
++	<key>CFBundleGetInfoString</key>
++	<string>Created by Waf</string>
++	<key>CFBundleSignature</key>
++	<string>????</string>
++	<key>NOTE</key>
++	<string>THIS IS A GENERATED FILE, DO NOT MODIFY</string>
++	<key>CFBundleExecutable</key>
++	<string>%s</string>
++</dict>
++</plist>
++'''
++def set_macosx_deployment_target(self):
++	if self.env['MACOSX_DEPLOYMENT_TARGET']:
++		os.environ['MACOSX_DEPLOYMENT_TARGET']=self.env['MACOSX_DEPLOYMENT_TARGET']
++	elif'MACOSX_DEPLOYMENT_TARGET'not in os.environ:
++		if Utils.unversioned_sys_platform()=='darwin':
++			os.environ['MACOSX_DEPLOYMENT_TARGET']='.'.join(platform.mac_ver()[0].split('.')[:2])
++def create_bundle_dirs(self,name,out):
++	bld=self.bld
++	dir=out.parent.find_or_declare(name)
++	dir.mkdir()
++	macos=dir.find_or_declare(['Contents','MacOS'])
++	macos.mkdir()
++	return dir
++def bundle_name_for_output(out):
++	name=out.name
++	k=name.rfind('.')
++	if k>=0:
++		name=name[:k]+'.app'
++	else:
++		name=name+'.app'
++	return name
++def create_task_macapp(self):
++	if self.env['MACAPP']or getattr(self,'mac_app',False):
++		out=self.link_task.outputs[0]
++		name=bundle_name_for_output(out)
++		dir=self.create_bundle_dirs(name,out)
++		n1=dir.find_or_declare(['Contents','MacOS',out.name])
++		self.apptask=self.create_task('macapp',self.link_task.outputs,n1)
++		inst_to=getattr(self,'install_path','/Applications')+'/%s/Contents/MacOS/'%name
++		self.bld.install_files(inst_to,n1,chmod=Utils.O755)
++		if getattr(self,'mac_resources',None):
++			res_dir=n1.parent.parent.make_node('Resources')
++			inst_to=getattr(self,'install_path','/Applications')+'/%s/Resources'%name
++			for x in self.to_list(self.mac_resources):
++				node=self.path.find_node(x)
++				if not node:
++					raise Errors.WafError('Missing mac_resource %r in %r'%(x,self))
++				parent=node.parent
++				if os.path.isdir(node.abspath()):
++					nodes=node.ant_glob('**')
++				else:
++					nodes=[node]
++				for node in nodes:
++					rel=node.path_from(parent)
++					tsk=self.create_task('macapp',node,res_dir.make_node(rel))
++					self.bld.install_as(inst_to+'/%s'%rel,node)
++		if getattr(self.bld,'is_install',None):
++			self.install_task.hasrun=Task.SKIP_ME
++def create_task_macplist(self):
++	if self.env['MACAPP']or getattr(self,'mac_app',False):
++		out=self.link_task.outputs[0]
++		name=bundle_name_for_output(out)
++		dir=self.create_bundle_dirs(name,out)
++		n1=dir.find_or_declare(['Contents','Info.plist'])
++		self.plisttask=plisttask=self.create_task('macplist',[],n1)
++		if getattr(self,'mac_plist',False):
++			node=self.path.find_resource(self.mac_plist)
++			if node:
++				plisttask.inputs.append(node)
++			else:
++				plisttask.code=self.mac_plist
++		else:
++			plisttask.code=app_info%self.link_task.outputs[0].name
++		inst_to=getattr(self,'install_path','/Applications')+'/%s/Contents/'%name
++		self.bld.install_files(inst_to,n1)
++def apply_bundle(self):
++	if self.env['MACBUNDLE']or getattr(self,'mac_bundle',False):
++		self.env['LINKFLAGS_cshlib']=self.env['LINKFLAGS_cxxshlib']=[]
++		self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['macbundle_PATTERN']
++		use=self.use=self.to_list(getattr(self,'use',[]))
++		if not'MACBUNDLE'in use:
++			use.append('MACBUNDLE')
++app_dirs=['Contents','Contents/MacOS','Contents/Resources']
++class macapp(Task.Task):
++	color='PINK'
++	def run(self):
++		self.outputs[0].parent.mkdir()
++		shutil.copy2(self.inputs[0].srcpath(),self.outputs[0].abspath())
++class macplist(Task.Task):
++	color='PINK'
++	ext_in=['.bin']
++	def run(self):
++		if getattr(self,'code',None):
++			txt=self.code
++		else:
++			txt=self.inputs[0].read()
++		self.outputs[0].write(txt)
++
++feature('c','cxx')(set_macosx_deployment_target)
++taskgen_method(create_bundle_dirs)
++feature('cprogram','cxxprogram')(create_task_macapp)
++after_method('apply_link')(create_task_macapp)
++feature('cprogram','cxxprogram')(create_task_macplist)
++after_method('apply_link')(create_task_macplist)
++feature('cshlib','cxxshlib')(apply_bundle)
++before_method('apply_link','propagate_uselib_vars')(apply_bundle)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/c_preproc.py
+@@ -0,0 +1,606 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import re,sys,os,string,traceback
++from waflib import Logs,Build,Utils,Errors
++from waflib.Logs import debug,error
++class PreprocError(Errors.WafError):
++	pass
++POPFILE='-'
++recursion_limit=150
++go_absolute=False
++standard_includes=['/usr/include']
++if Utils.is_win32:
++	standard_includes=[]
++use_trigraphs=0
++strict_quotes=0
++g_optrans={'not':'!','and':'&&','bitand':'&','and_eq':'&=','or':'||','bitor':'|','or_eq':'|=','xor':'^','xor_eq':'^=','compl':'~',}
++re_lines=re.compile('^[ \t]*(#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*)\r*$',re.IGNORECASE|re.MULTILINE)
++re_mac=re.compile("^[a-zA-Z_]\w*")
++re_fun=re.compile('^[a-zA-Z_][a-zA-Z0-9_]*[(]')
++re_pragma_once=re.compile('^\s*once\s*',re.IGNORECASE)
++re_nl=re.compile('\\\\\r*\n',re.MULTILINE)
++re_cpp=re.compile(r"""(/\*[^*]*\*+([^/*][^*]*\*+)*/)|//[^\n]*|("(\\.|[^"\\])*"|'(\\.|[^'\\])*'|.[^/"'\\]*)""",re.MULTILINE)
++trig_def=[('??'+a,b)for a,b in zip("=-/!'()<>",r'#~\|^[]{}')]
++chr_esc={'0':0,'a':7,'b':8,'t':9,'n':10,'f':11,'v':12,'r':13,'\\':92,"'":39}
++NUM='i'
++OP='O'
++IDENT='T'
++STR='s'
++CHAR='c'
++tok_types=[NUM,STR,IDENT,OP]
++exp_types=[r"""0[xX](?P<hex>[a-fA-F0-9]+)(?P<qual1>[uUlL]*)|L*?'(?P<char>(\\.|[^\\'])+)'|(?P<n1>\d+)[Ee](?P<exp0>[+-]*?\d+)(?P<float0>[fFlL]*)|(?P<n2>\d*\.\d+)([Ee](?P<exp1>[+-]*?\d+))?(?P<float1>[fFlL]*)|(?P<n4>\d+\.\d*)([Ee](?P<exp2>[+-]*?\d+))?(?P<float2>[fFlL]*)|(?P<oct>0*)(?P<n0>\d+)(?P<qual2>[uUlL]*)""",r'L?"([^"\\]|\\.)*"',r'[a-zA-Z_]\w*',r'%:%:|<<=|>>=|\.\.\.|<<|<%|<:|<=|>>|>=|\+\+|\+=|--|->|-=|\*=|/=|%:|%=|%>|==|&&|&=|\|\||\|=|\^=|:>|!=|##|[\(\)\{\}\[\]<>\?\|\^\*\+&=:!#;,%/\-\?\~\.]',]
++re_clexer=re.compile('|'.join(["(?P<%s>%s)"%(name,part)for name,part in zip(tok_types,exp_types)]),re.M)
++accepted='a'
++ignored='i'
++undefined='u'
++skipped='s'
++def repl(m):
++	s=m.group(1)
++	if s:
++		return' '
++	return m.group(3)or''
++def filter_comments(filename):
++	code=Utils.readf(filename)
++	if use_trigraphs:
++		for(a,b)in trig_def:code=code.split(a).join(b)
++	code=re_nl.sub('',code)
++	code=re_cpp.sub(repl,code)
++	return[(m.group(2),m.group(3))for m in re.finditer(re_lines,code)]
++prec={}
++ops=['* / %','+ -','<< >>','< <= >= >','== !=','& | ^','&& ||',',']
++for x in range(len(ops)):
++	syms=ops[x]
++	for u in syms.split():
++		prec[u]=x
++def trimquotes(s):
++	if not s:return''
++	s=s.rstrip()
++	if s[0]=="'"and s[-1]=="'":return s[1:-1]
++	return s
++def reduce_nums(val_1,val_2,val_op):
++	try:a=0+val_1
++	except TypeError:a=int(val_1)
++	try:b=0+val_2
++	except TypeError:b=int(val_2)
++	d=val_op
++	if d=='%':c=a%b
++	elif d=='+':c=a+b
++	elif d=='-':c=a-b
++	elif d=='*':c=a*b
++	elif d=='/':c=a/b
++	elif d=='^':c=a^b
++	elif d=='|':c=a|b
++	elif d=='||':c=int(a or b)
++	elif d=='&':c=a&b
++	elif d=='&&':c=int(a and b)
++	elif d=='==':c=int(a==b)
++	elif d=='!=':c=int(a!=b)
++	elif d=='<=':c=int(a<=b)
++	elif d=='<':c=int(a<b)
++	elif d=='>':c=int(a>b)
++	elif d=='>=':c=int(a>=b)
++	elif d=='^':c=int(a^b)
++	elif d=='<<':c=a<<b
++	elif d=='>>':c=a>>b
++	else:c=0
++	return c
++def get_num(lst):
++	if not lst:raise PreprocError("empty list for get_num")
++	(p,v)=lst[0]
++	if p==OP:
++		if v=='(':
++			count_par=1
++			i=1
++			while i<len(lst):
++				(p,v)=lst[i]
++				if p==OP:
++					if v==')':
++						count_par-=1
++						if count_par==0:
++							break
++					elif v=='(':
++						count_par+=1
++				i+=1
++			else:
++				raise PreprocError("rparen expected %r"%lst)
++			(num,_)=get_term(lst[1:i])
++			return(num,lst[i+1:])
++		elif v=='+':
++			return get_num(lst[1:])
++		elif v=='-':
++			num,lst=get_num(lst[1:])
++			return(reduce_nums('-1',num,'*'),lst)
++		elif v=='!':
++			num,lst=get_num(lst[1:])
++			return(int(not int(num)),lst)
++		elif v=='~':
++			return(~int(num),lst)
++		else:
++			raise PreprocError("Invalid op token %r for get_num"%lst)
++	elif p==NUM:
++		return v,lst[1:]
++	elif p==IDENT:
++		return 0,lst[1:]
++	else:
++		raise PreprocError("Invalid token %r for get_num"%lst)
++def get_term(lst):
++	if not lst:raise PreprocError("empty list for get_term")
++	num,lst=get_num(lst)
++	if not lst:
++		return(num,[])
++	(p,v)=lst[0]
++	if p==OP:
++		if v=='&&'and not num:
++			return(num,[])
++		elif v=='||'and num:
++			return(num,[])
++		elif v==',':
++			return get_term(lst[1:])
++		elif v=='?':
++			count_par=0
++			i=1
++			while i<len(lst):
++				(p,v)=lst[i]
++				if p==OP:
++					if v==')':
++						count_par-=1
++					elif v=='(':
++						count_par+=1
++					elif v==':':
++						if count_par==0:
++							break
++				i+=1
++			else:
++				raise PreprocError("rparen expected %r"%lst)
++			if int(num):
++				return get_term(lst[1:i])
++			else:
++				return get_term(lst[i+1:])
++		else:
++			num2,lst=get_num(lst[1:])
++			if not lst:
++				num2=reduce_nums(num,num2,v)
++				return get_term([(NUM,num2)]+lst)
++			p2,v2=lst[0]
++			if p2!=OP:
++				raise PreprocError("op expected %r"%lst)
++			if prec[v2]>=prec[v]:
++				num2=reduce_nums(num,num2,v)
++				return get_term([(NUM,num2)]+lst)
++			else:
++				num3,lst=get_num(lst[1:])
++				num3=reduce_nums(num2,num3,v2)
++				return get_term([(NUM,num),(p,v),(NUM,num3)]+lst)
++	raise PreprocError("cannot reduce %r"%lst)
++def reduce_eval(lst):
++	num,lst=get_term(lst)
++	return(NUM,num)
++def stringize(lst):
++	lst=[str(v2)for(p2,v2)in lst]
++	return"".join(lst)
++def paste_tokens(t1,t2):
++	p1=None
++	if t1[0]==OP and t2[0]==OP:
++		p1=OP
++	elif t1[0]==IDENT and(t2[0]==IDENT or t2[0]==NUM):
++		p1=IDENT
++	elif t1[0]==NUM and t2[0]==NUM:
++		p1=NUM
++	if not p1:
++		raise PreprocError('tokens do not make a valid paste %r and %r'%(t1,t2))
++	return(p1,t1[1]+t2[1])
++def reduce_tokens(lst,defs,ban=[]):
++	i=0
++	while i<len(lst):
++		(p,v)=lst[i]
++		if p==IDENT and v=="defined":
++			del lst[i]
++			if i<len(lst):
++				(p2,v2)=lst[i]
++				if p2==IDENT:
++					if v2 in defs:
++						lst[i]=(NUM,1)
++					else:
++						lst[i]=(NUM,0)
++				elif p2==OP and v2=='(':
++					del lst[i]
++					(p2,v2)=lst[i]
++					del lst[i]
++					if v2 in defs:
++						lst[i]=(NUM,1)
++					else:
++						lst[i]=(NUM,0)
++				else:
++					raise PreprocError("Invalid define expression %r"%lst)
++		elif p==IDENT and v in defs:
++			if isinstance(defs[v],str):
++				a,b=extract_macro(defs[v])
++				defs[v]=b
++			macro_def=defs[v]
++			to_add=macro_def[1]
++			if isinstance(macro_def[0],list):
++				del lst[i]
++				for x in range(len(to_add)):
++					lst.insert(i,to_add[x])
++					i+=1
++			else:
++				args=[]
++				del lst[i]
++				if i>=len(lst):
++					raise PreprocError("expected '(' after %r (got nothing)"%v)
++				(p2,v2)=lst[i]
++				if p2!=OP or v2!='(':
++					raise PreprocError("expected '(' after %r"%v)
++				del lst[i]
++				one_param=[]
++				count_paren=0
++				while i<len(lst):
++					p2,v2=lst[i]
++					del lst[i]
++					if p2==OP and count_paren==0:
++						if v2=='(':
++							one_param.append((p2,v2))
++							count_paren+=1
++						elif v2==')':
++							if one_param:args.append(one_param)
++							break
++						elif v2==',':
++							if not one_param:raise PreprocError("empty param in funcall %s"%p)
++							args.append(one_param)
++							one_param=[]
++						else:
++							one_param.append((p2,v2))
++					else:
++						one_param.append((p2,v2))
++						if v2=='(':count_paren+=1
++						elif v2==')':count_paren-=1
++				else:
++					raise PreprocError('malformed macro')
++				accu=[]
++				arg_table=macro_def[0]
++				j=0
++				while j<len(to_add):
++					(p2,v2)=to_add[j]
++					if p2==OP and v2=='#':
++						if j+1<len(to_add)and to_add[j+1][0]==IDENT and to_add[j+1][1]in arg_table:
++							toks=args[arg_table[to_add[j+1][1]]]
++							accu.append((STR,stringize(toks)))
++							j+=1
++						else:
++							accu.append((p2,v2))
++					elif p2==OP and v2=='##':
++						if accu and j+1<len(to_add):
++							t1=accu[-1]
++							if to_add[j+1][0]==IDENT and to_add[j+1][1]in arg_table:
++								toks=args[arg_table[to_add[j+1][1]]]
++								if toks:
++									accu[-1]=paste_tokens(t1,toks[0])
++									accu.extend(toks[1:])
++								else:
++									accu.append((p2,v2))
++									accu.extend(toks)
++							elif to_add[j+1][0]==IDENT and to_add[j+1][1]=='__VA_ARGS__':
++								va_toks=[]
++								st=len(macro_def[0])
++								pt=len(args)
++								for x in args[pt-st+1:]:
++									va_toks.extend(x)
++									va_toks.append((OP,','))
++								if va_toks:va_toks.pop()
++								if len(accu)>1:
++									(p3,v3)=accu[-1]
++									(p4,v4)=accu[-2]
++									if v3=='##':
++										accu.pop()
++										if v4==','and pt<st:
++											accu.pop()
++								accu+=va_toks
++							else:
++								accu[-1]=paste_tokens(t1,to_add[j+1])
++							j+=1
++						else:
++							accu.append((p2,v2))
++					elif p2==IDENT and v2 in arg_table:
++						toks=args[arg_table[v2]]
++						reduce_tokens(toks,defs,ban+[v])
++						accu.extend(toks)
++					else:
++						accu.append((p2,v2))
++					j+=1
++				reduce_tokens(accu,defs,ban+[v])
++				for x in range(len(accu)-1,-1,-1):
++					lst.insert(i,accu[x])
++		i+=1
++def eval_macro(lst,defs):
++	reduce_tokens(lst,defs,[])
++	if not lst:raise PreprocError("missing tokens to evaluate")
++	(p,v)=reduce_eval(lst)
++	return int(v)!=0
++def extract_macro(txt):
++	t=tokenize(txt)
++	if re_fun.search(txt):
++		p,name=t[0]
++		p,v=t[1]
++		if p!=OP:raise PreprocError("expected open parenthesis")
++		i=1
++		pindex=0
++		params={}
++		prev='('
++		while 1:
++			i+=1
++			p,v=t[i]
++			if prev=='(':
++				if p==IDENT:
++					params[v]=pindex
++					pindex+=1
++					prev=p
++				elif p==OP and v==')':
++					break
++				else:
++					raise PreprocError("unexpected token (3)")
++			elif prev==IDENT:
++				if p==OP and v==',':
++					prev=v
++				elif p==OP and v==')':
++					break
++				else:
++					raise PreprocError("comma or ... expected")
++			elif prev==',':
++				if p==IDENT:
++					params[v]=pindex
++					pindex+=1
++					prev=p
++				elif p==OP and v=='...':
++					raise PreprocError("not implemented (1)")
++				else:
++					raise PreprocError("comma or ... expected (2)")
++			elif prev=='...':
++				raise PreprocError("not implemented (2)")
++			else:
++				raise PreprocError("unexpected else")
++		return(name,[params,t[i+1:]])
++	else:
++		(p,v)=t[0]
++		return(v,[[],t[1:]])
++re_include=re.compile('^\s*(<(?P<a>.*)>|"(?P<b>.*)")')
++def extract_include(txt,defs):
++	m=re_include.search(txt)
++	if m:
++		if m.group('a'):return'<',m.group('a')
++		if m.group('b'):return'"',m.group('b')
++	toks=tokenize(txt)
++	reduce_tokens(toks,defs,['waf_include'])
++	if not toks:
++		raise PreprocError("could not parse include %s"%txt)
++	if len(toks)==1:
++		if toks[0][0]==STR:
++			return'"',toks[0][1]
++	else:
++		if toks[0][1]=='<'and toks[-1][1]=='>':
++			return stringize(toks).lstrip('<').rstrip('>')
++	raise PreprocError("could not parse include %s."%txt)
++def parse_char(txt):
++	if not txt:raise PreprocError("attempted to parse a null char")
++	if txt[0]!='\\':
++		return ord(txt)
++	c=txt[1]
++	if c=='x':
++		if len(txt)==4 and txt[3]in string.hexdigits:return int(txt[2:],16)
++		return int(txt[2:],16)
++	elif c.isdigit():
++		if c=='0'and len(txt)==2:return 0
++		for i in 3,2,1:
++			if len(txt)>i and txt[1:1+i].isdigit():
++				return(1+i,int(txt[1:1+i],8))
++	else:
++		try:return chr_esc[c]
++		except KeyError:raise PreprocError("could not parse char literal '%s'"%txt)
++def tokenize(s):
++	ret=[]
++	for match in re_clexer.finditer(s):
++		m=match.group
++		for name in tok_types:
++			v=m(name)
++			if v:
++				if name==IDENT:
++					try:v=g_optrans[v];name=OP
++					except KeyError:
++						if v.lower()=="true":
++							v=1
++							name=NUM
++						elif v.lower()=="false":
++							v=0
++							name=NUM
++				elif name==NUM:
++					if m('oct'):v=int(v,8)
++					elif m('hex'):v=int(m('hex'),16)
++					elif m('n0'):v=m('n0')
++					else:
++						v=m('char')
++						if v:v=parse_char(v)
++						else:v=m('n2')or m('n4')
++				elif name==OP:
++					if v=='%:':v='#'
++					elif v=='%:%:':v='##'
++				elif name==STR:
++					v=v[1:-1]
++				ret.append((name,v))
++				break
++	return ret
++def define_name(line):
++	return re_mac.match(line).group(0)
++class c_parser(object):
++	def __init__(self,nodepaths=None,defines=None):
++		self.lines=[]
++		if defines is None:
++			self.defs={}
++		else:
++			self.defs=dict(defines)
++		self.state=[]
++		self.count_files=0
++		self.currentnode_stack=[]
++		self.nodepaths=nodepaths or[]
++		self.nodes=[]
++		self.names=[]
++		self.curfile=''
++		self.ban_includes=set([])
++	def cached_find_resource(self,node,filename):
++		try:
++			nd=node.ctx.cache_nd
++		except:
++			nd=node.ctx.cache_nd={}
++		tup=(node,filename)
++		try:
++			return nd[tup]
++		except KeyError:
++			ret=node.find_resource(filename)
++			if ret:
++				if getattr(ret,'children',None):
++					ret=None
++				elif ret.is_child_of(node.ctx.bldnode):
++					tmp=node.ctx.srcnode.search(ret.path_from(node.ctx.bldnode))
++					if tmp and getattr(tmp,'children',None):
++						ret=None
++			nd[tup]=ret
++			return ret
++	def tryfind(self,filename):
++		self.curfile=filename
++		found=self.cached_find_resource(self.currentnode_stack[-1],filename)
++		for n in self.nodepaths:
++			if found:
++				break
++			found=self.cached_find_resource(n,filename)
++		if found:
++			self.nodes.append(found)
++			if filename[-4:]!='.moc':
++				self.addlines(found)
++		else:
++			if not filename in self.names:
++				self.names.append(filename)
++		return found
++	def addlines(self,node):
++		self.currentnode_stack.append(node.parent)
++		filepath=node.abspath()
++		self.count_files+=1
++		if self.count_files>recursion_limit:
++			raise PreprocError("recursion limit exceeded")
++		pc=self.parse_cache
++		debug('preproc: reading file %r',filepath)
++		try:
++			lns=pc[filepath]
++		except KeyError:
++			pass
++		else:
++			self.lines.extend(lns)
++			return
++		try:
++			lines=filter_comments(filepath)
++			lines.append((POPFILE,''))
++			lines.reverse()
++			pc[filepath]=lines
++			self.lines.extend(lines)
++		except IOError:
++			raise PreprocError("could not read the file %s"%filepath)
++		except Exception:
++			if Logs.verbose>0:
++				error("parsing %s failed"%filepath)
++				traceback.print_exc()
++	def start(self,node,env):
++		debug('preproc: scanning %s (in %s)',node.name,node.parent.name)
++		bld=node.ctx
++		try:
++			self.parse_cache=bld.parse_cache
++		except AttributeError:
++			bld.parse_cache={}
++			self.parse_cache=bld.parse_cache
++		self.addlines(node)
++		if env['DEFINES']:
++			try:
++				lst=['%s %s'%(x[0],trimquotes('='.join(x[1:])))for x in[y.split('=')for y in env['DEFINES']]]
++				lst.reverse()
++				self.lines.extend([('define',x)for x in lst])
++			except AttributeError:
++				pass
++		while self.lines:
++			(token,line)=self.lines.pop()
++			if token==POPFILE:
++				self.count_files-=1
++				self.currentnode_stack.pop()
++				continue
++			try:
++				ve=Logs.verbose
++				if ve:debug('preproc: line is %s - %s state is %s',token,line,self.state)
++				state=self.state
++				if token[:2]=='if':
++					state.append(undefined)
++				elif token=='endif':
++					state.pop()
++				if token[0]!='e':
++					if skipped in self.state or ignored in self.state:
++						continue
++				if token=='if':
++					ret=eval_macro(tokenize(line),self.defs)
++					if ret:state[-1]=accepted
++					else:state[-1]=ignored
++				elif token=='ifdef':
++					m=re_mac.match(line)
++					if m and m.group(0)in self.defs:state[-1]=accepted
++					else:state[-1]=ignored
++				elif token=='ifndef':
++					m=re_mac.match(line)
++					if m and m.group(0)in self.defs:state[-1]=ignored
++					else:state[-1]=accepted
++				elif token=='include'or token=='import':
++					(kind,inc)=extract_include(line,self.defs)
++					if inc in self.ban_includes:
++						continue
++					if token=='import':self.ban_includes.add(inc)
++					if ve:debug('preproc: include found %s    (%s) ',inc,kind)
++					if kind=='"'or not strict_quotes:
++						self.tryfind(inc)
++				elif token=='elif':
++					if state[-1]==accepted:
++						state[-1]=skipped
++					elif state[-1]==ignored:
++						if eval_macro(tokenize(line),self.defs):
++							state[-1]=accepted
++				elif token=='else':
++					if state[-1]==accepted:state[-1]=skipped
++					elif state[-1]==ignored:state[-1]=accepted
++				elif token=='define':
++					try:
++						self.defs[define_name(line)]=line
++					except:
++						raise PreprocError("Invalid define line %s"%line)
++				elif token=='undef':
++					m=re_mac.match(line)
++					if m and m.group(0)in self.defs:
++						self.defs.__delitem__(m.group(0))
++				elif token=='pragma':
++					if re_pragma_once.match(line.lower()):
++						self.ban_includes.add(self.curfile)
++			except Exception ,e:
++				if Logs.verbose:
++					debug('preproc: line parsing failed (%s): %s %s',e,line,Utils.ex_stack())
++def scan(task):
++	global go_absolute
++	try:
++		incn=task.generator.includes_nodes
++	except AttributeError:
++		raise Errors.WafError('%r is missing a feature such as "c", "cxx" or "includes": '%task.generator)
++	if go_absolute:
++		nodepaths=incn+standard_includes
++	else:
++		nodepaths=[x for x in incn if x.is_child_of(x.ctx.srcnode)or x.is_child_of(x.ctx.bldnode)]
++	tmp=c_parser(nodepaths)
++	tmp.start(task.inputs[0],task.env)
++	if Logs.verbose:
++		debug('deps: deps for %r: %r; unresolved %r'%(task.inputs,tmp.nodes,tmp.names))
++	return(tmp.nodes,tmp.names)
++
++Utils.run_once(tokenize)
++Utils.run_once(define_name)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/c.py
+@@ -0,0 +1,27 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import TaskGen,Task,Utils
++from waflib.Tools import c_preproc
++from waflib.Tools.ccroot import link_task,stlink_task
++def c_hook(self,node):
++	return self.create_compiled_task('c',node)
++class c(Task.Task):
++	run_str='${CC} ${ARCH_ST:ARCH} ${CFLAGS} ${CPPFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CC_SRC_F}${SRC} ${CC_TGT_F}${TGT}'
++	vars=['CCDEPS']
++	ext_in=['.h']
++	scan=c_preproc.scan
++Task.classes['cc']=cc=c
++class cprogram(link_task):
++	run_str='${LINK_CC} ${LINKFLAGS} ${CCLNK_SRC_F}${SRC} ${CCLNK_TGT_F}${TGT[0].abspath()} ${RPATH_ST:RPATH} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${FRAMEWORK_ST:FRAMEWORK} ${ARCH_ST:ARCH} ${STLIB_MARKER} ${STLIBPATH_ST:STLIBPATH} ${STLIB_ST:STLIB} ${SHLIB_MARKER} ${LIBPATH_ST:LIBPATH} ${LIB_ST:LIB}'
++	ext_out=['.bin']
++	vars=['LINKDEPS']
++	inst_to='${BINDIR}'
++	chmod=Utils.O755
++class cshlib(cprogram):
++	inst_to='${LIBDIR}'
++class cstlib(stlink_task):
++	pass
++
++TaskGen.extension('.c')(c_hook)
+\ No newline at end of file
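
The c.py tool above registers the '.c' extension hook and the c/cprogram/cshlib task classes. A minimal wscript exercising it might look like the following sketch; the file names and project layout are assumed for illustration, and 'gcc' refers to the compiler tool added further down in this same patch.

	# wscript -- illustrative sketch only, not part of the packaged files
	def configure(conf):
		conf.load('gcc')    # detects gcc/cc and fills in CC, link flags, patterns
	def build(bld):
		# 'c cprogram' matches the feature/extension hooks defined in c.py
		bld(features='c cprogram', source='main.c', target='app')
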
+--- /dev/null
++++ ardour3/waflib/Tools/cs.py
+@@ -0,0 +1,98 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++from waflib import Utils,Task,Options,Logs,Errors
++from waflib.TaskGen import before_method,after_method,feature
++from waflib.Tools import ccroot
++from waflib.Configure import conf
++ccroot.USELIB_VARS['cs']=set(['CSFLAGS','ASSEMBLIES','RESOURCES'])
++ccroot.lib_patterns['csshlib']=['%s']
++def apply_cs(self):
++	cs_nodes=[]
++	no_nodes=[]
++	for x in self.to_nodes(self.source):
++		if x.name.endswith('.cs'):
++			cs_nodes.append(x)
++		else:
++			no_nodes.append(x)
++	self.source=no_nodes
++	bintype=getattr(self,'type',self.gen.endswith('.dll')and'library'or'exe')
++	self.cs_task=tsk=self.create_task('mcs',cs_nodes,self.path.find_or_declare(self.gen))
++	tsk.env.CSTYPE='/target:%s'%bintype
++	tsk.env.OUT='/out:%s'%tsk.outputs[0].abspath()
++	inst_to=getattr(self,'install_path',bintype=='exe'and'${BINDIR}'or'${LIBDIR}')
++	if inst_to:
++		mod=getattr(self,'chmod',bintype=='exe'and Utils.O755 or Utils.O644)
++		self.install_task=self.bld.install_files(inst_to,self.cs_task.outputs[:],env=self.env,chmod=mod)
++def use_cs(self):
++	names=self.to_list(getattr(self,'use',[]))
++	get=self.bld.get_tgen_by_name
++	for x in names:
++		try:
++			y=get(x)
++		except Errors.WafError:
++			self.cs_task.env.append_value('CSFLAGS','/reference:%s'%x)
++			continue
++		y.post()
++		tsk=getattr(y,'cs_task',None)or getattr(y,'link_task',None)
++		if not tsk:
++			self.bld.fatal('cs task has no link task for use %r'%self)
++		self.cs_task.dep_nodes.extend(tsk.outputs)
++		self.cs_task.set_run_after(tsk)
++		self.cs_task.env.append_value('CSFLAGS','/reference:%s'%tsk.outputs[0].abspath())
++def debug_cs(self):
++	csdebug=getattr(self,'csdebug',self.env.CSDEBUG)
++	if not csdebug:
++		return
++	node=self.cs_task.outputs[0]
++	if self.env.CS_NAME=='mono':
++		out=node.parent.find_or_declare(node.name+'.mdb')
++	else:
++		out=node.change_ext('.pdb')
++	self.cs_task.outputs.append(out)
++	try:
++		self.install_task.source.append(out)
++	except AttributeError:
++		pass
++	if csdebug=='pdbonly':
++		val=['/debug+','/debug:pdbonly']
++	elif csdebug=='full':
++		val=['/debug+','/debug:full']
++	else:
++		val=['/debug-']
++	self.cs_task.env.append_value('CSFLAGS',val)
++class mcs(Task.Task):
++	color='YELLOW'
++	run_str='${MCS} ${CSTYPE} ${CSFLAGS} ${ASS_ST:ASSEMBLIES} ${RES_ST:RESOURCES} ${OUT} ${SRC}'
++def configure(conf):
++	csc=getattr(Options.options,'cscbinary',None)
++	if csc:
++		conf.env.MCS=csc
++	conf.find_program(['csc','mcs','gmcs'],var='MCS')
++	conf.env.ASS_ST='/r:%s'
++	conf.env.RES_ST='/resource:%s'
++	conf.env.CS_NAME='csc'
++	if str(conf.env.MCS).lower().find('mcs')>-1:
++		conf.env.CS_NAME='mono'
++def options(opt):
++	opt.add_option('--with-csc-binary',type='string',dest='cscbinary')
++class fake_csshlib(Task.Task):
++	color='YELLOW'
++	inst_to=None
++	def runnable_status(self):
++		for x in self.outputs:
++			x.sig=Utils.h_file(x.abspath())
++		return Task.SKIP_ME
++def read_csshlib(self,name,paths=[]):
++	return self(name=name,features='fake_lib',lib_paths=paths,lib_type='csshlib')
++
++feature('cs')(apply_cs)
++before_method('process_source')(apply_cs)
++feature('cs')(use_cs)
++after_method('apply_cs')(use_cs)
++feature('cs')(debug_cs)
++after_method('apply_cs','use_cs')(debug_cs)
++conf(read_csshlib)
+\ No newline at end of file
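
The cs.py tool drives mcs/csc through the 'cs' feature and takes the output assembly name from the 'gen' attribute (see apply_cs above). A hedged usage sketch, with the source and assembly names assumed:

	# wscript -- illustrative sketch only
	def configure(conf):
		conf.load('cs')    # finds csc/mcs/gmcs and sets MCS, ASS_ST, RES_ST
	def build(bld):
		# a 'gen' name ending in .dll selects /target:library, otherwise an exe
		bld(features='cs', source='hello.cs', gen='hello.exe')
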
+--- /dev/null
++++ ardour3/waflib/Tools/c_tests.py
+@@ -0,0 +1,146 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import Task
++from waflib.Configure import conf
++from waflib.TaskGen import feature,before_method,after_method
++import sys
++LIB_CODE='''
++#ifdef _MSC_VER
++#define testEXPORT __declspec(dllexport)
++#else
++#define testEXPORT
++#endif
++testEXPORT int lib_func(void) { return 9; }
++'''
++MAIN_CODE='''
++#ifdef _MSC_VER
++#define testEXPORT __declspec(dllimport)
++#else
++#define testEXPORT
++#endif
++testEXPORT int lib_func(void);
++int main(void) {return !(lib_func() == 9);}
++'''
++def link_lib_test_fun(self):
++	def write_test_file(task):
++		task.outputs[0].write(task.generator.code)
++	rpath=[]
++	if getattr(self,'add_rpath',False):
++		rpath=[self.bld.path.get_bld().abspath()]
++	mode=self.mode
++	m='%s %s'%(mode,mode)
++	ex=self.test_exec and'test_exec'or''
++	bld=self.bld
++	bld(rule=write_test_file,target='test.'+mode,code=LIB_CODE)
++	bld(rule=write_test_file,target='main.'+mode,code=MAIN_CODE)
++	bld(features='%sshlib'%m,source='test.'+mode,target='test')
++	bld(features='%sprogram %s'%(m,ex),source='main.'+mode,target='app',use='test',rpath=rpath)
++def check_library(self,mode=None,test_exec=True):
++	if not mode:
++		mode='c'
++		if self.env.CXX:
++			mode='cxx'
++	self.check(compile_filename=[],features='link_lib_test',msg='Checking for libraries',mode=mode,test_exec=test_exec,)
++INLINE_CODE='''
++typedef int foo_t;
++static %s foo_t static_foo () {return 0; }
++%s foo_t foo () {
++	return 0;
++}
++'''
++INLINE_VALUES=['inline','__inline__','__inline']
++def check_inline(self,**kw):
++	self.start_msg('Checking for inline')
++	if not'define_name'in kw:
++		kw['define_name']='INLINE_MACRO'
++	if not'features'in kw:
++		if self.env.CXX:
++			kw['features']=['cxx']
++		else:
++			kw['features']=['c']
++	for x in INLINE_VALUES:
++		kw['fragment']=INLINE_CODE%(x,x)
++		try:
++			self.check(**kw)
++		except self.errors.ConfigurationError:
++			continue
++		else:
++			self.end_msg(x)
++			if x!='inline':
++				self.define('inline',x,quote=False)
++			return x
++	self.fatal('could not use inline functions')
++LARGE_FRAGMENT='#include <unistd.h>\nint main() { return !(sizeof(off_t) >= 8); }\n'
++def check_large_file(self,**kw):
++	if not'define_name'in kw:
++		kw['define_name']='HAVE_LARGEFILE'
++	if not'execute'in kw:
++		kw['execute']=True
++	if not'features'in kw:
++		if self.env.CXX:
++			kw['features']=['cxx','cxxprogram']
++		else:
++			kw['features']=['c','cprogram']
++	kw['fragment']=LARGE_FRAGMENT
++	kw['msg']='Checking for large file support'
++	ret=True
++	try:
++		if self.env.DEST_BINFMT!='pe':
++			ret=self.check(**kw)
++	except self.errors.ConfigurationError:
++		pass
++	else:
++		if ret:
++			return True
++	kw['msg']='Checking for -D_FILE_OFFSET_BITS=64'
++	kw['defines']=['_FILE_OFFSET_BITS=64']
++	try:
++		ret=self.check(**kw)
++	except self.errors.ConfigurationError:
++		pass
++	else:
++		self.define('_FILE_OFFSET_BITS',64)
++		return ret
++	self.fatal('There is no support for large files')
++ENDIAN_FRAGMENT='''
++short int ascii_mm[] = { 0x4249, 0x4765, 0x6E44, 0x6961, 0x6E53, 0x7953, 0 };
++short int ascii_ii[] = { 0x694C, 0x5454, 0x656C, 0x6E45, 0x6944, 0x6E61, 0 };
++int use_ascii (int i) {
++	return ascii_mm[i] + ascii_ii[i];
++}
++short int ebcdic_ii[] = { 0x89D3, 0xE3E3, 0x8593, 0x95C5, 0x89C4, 0x9581, 0 };
++short int ebcdic_mm[] = { 0xC2C9, 0xC785, 0x95C4, 0x8981, 0x95E2, 0xA8E2, 0 };
++int use_ebcdic (int i) {
++	return ebcdic_mm[i] + ebcdic_ii[i];
++}
++extern int foo;
++'''
++class grep_for_endianness(Task.Task):
++	color='PINK'
++	def run(self):
++		txt=self.inputs[0].read(flags='rb').decode('iso8859-1')
++		if txt.find('LiTTleEnDian')>-1:
++			self.generator.tmp.append('little')
++		elif txt.find('BIGenDianSyS')>-1:
++			self.generator.tmp.append('big')
++		else:
++			return-1
++def grep_for_endianness_fun(self):
++	self.create_task('grep_for_endianness',self.compiled_tasks[0].outputs[0])
++def check_endianness(self):
++	tmp=[]
++	def check_msg(self):
++		return tmp[0]
++	self.check(fragment=ENDIAN_FRAGMENT,features='c grep_for_endianness',msg="Checking for endianness",define='ENDIANNESS',tmp=tmp,okmsg=check_msg)
++	return tmp[0]
++
++feature('link_lib_test')(link_lib_test_fun)
++before_method('process_source')(link_lib_test_fun)
++conf(check_library)
++conf(check_inline)
++conf(check_large_file)
++feature('grep_for_endianness')(grep_for_endianness_fun)
++after_method('process_source')(grep_for_endianness_fun)
++conf(check_endianness)
+\ No newline at end of file
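
c_tests.py registers several configuration helpers (check_library, check_inline, check_large_file, check_endianness) as conf methods. They would typically be called from a project's configure step after a C or C++ compiler has been loaded; a sketch under that assumption:

	# wscript configure -- illustrative sketch only
	def configure(conf):
		conf.load('gcc')
		conf.check_inline()                    # defines INLINE_MACRO, and 'inline' if needed
		conf.check_large_file()                # defines HAVE_LARGEFILE / _FILE_OFFSET_BITS=64
		endian = conf.check_endianness()       # returns 'little' or 'big', defines ENDIANNESS
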
+--- /dev/null
++++ ardour3/waflib/Tools/cxx.py
+@@ -0,0 +1,27 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import TaskGen,Task,Utils
++from waflib.Tools import c_preproc
++from waflib.Tools.ccroot import link_task,stlink_task
++def cxx_hook(self,node):
++	return self.create_compiled_task('cxx',node)
++TaskGen.extension('.cpp','.cc','.cxx','.C','.c++')(cxx_hook)
++if not'.c'in TaskGen.task_gen.mappings:
++	TaskGen.task_gen.mappings['.c']=TaskGen.task_gen.mappings['.cpp']
++class cxx(Task.Task):
++	run_str='${CXX} ${ARCH_ST:ARCH} ${CXXFLAGS} ${CPPFLAGS} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${CXX_SRC_F}${SRC} ${CXX_TGT_F}${TGT}'
++	vars=['CXXDEPS']
++	ext_in=['.h']
++	scan=c_preproc.scan
++class cxxprogram(link_task):
++	run_str='${LINK_CXX} ${LINKFLAGS} ${CXXLNK_SRC_F}${SRC} ${CXXLNK_TGT_F}${TGT[0].abspath()} ${RPATH_ST:RPATH} ${FRAMEWORKPATH_ST:FRAMEWORKPATH} ${FRAMEWORK_ST:FRAMEWORK} ${ARCH_ST:ARCH} ${STLIB_MARKER} ${STLIBPATH_ST:STLIBPATH} ${STLIB_ST:STLIB} ${SHLIB_MARKER} ${LIBPATH_ST:LIBPATH} ${LIB_ST:LIB}'
++	vars=['LINKDEPS']
++	ext_out=['.bin']
++	inst_to='${BINDIR}'
++	chmod=Utils.O755
++class cxxshlib(cxxprogram):
++	inst_to='${LIBDIR}'
++class cxxstlib(stlink_task):
++	pass
+--- /dev/null
++++ ardour3/waflib/Tools/dbus.py
+@@ -0,0 +1,30 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import Task,Errors
++from waflib.TaskGen import taskgen_method,before_method
++def add_dbus_file(self,filename,prefix,mode):
++	if not hasattr(self,'dbus_lst'):
++		self.dbus_lst=[]
++	if not'process_dbus'in self.meths:
++		self.meths.append('process_dbus')
++	self.dbus_lst.append([filename,prefix,mode])
++def process_dbus(self):
++	for filename,prefix,mode in getattr(self,'dbus_lst',[]):
++		node=self.path.find_resource(filename)
++		if not node:
++			raise Errors.WafError('file not found '+filename)
++		tsk=self.create_task('dbus_binding_tool',node,node.change_ext('.h'))
++		tsk.env.DBUS_BINDING_TOOL_PREFIX=prefix
++		tsk.env.DBUS_BINDING_TOOL_MODE=mode
++class dbus_binding_tool(Task.Task):
++	color='BLUE'
++	ext_out=['.h']
++	run_str='${DBUS_BINDING_TOOL} --prefix=${DBUS_BINDING_TOOL_PREFIX} --mode=${DBUS_BINDING_TOOL_MODE} --output=${TGT} ${SRC}'
++	shell=True
++def configure(conf):
++	dbus_binding_tool=conf.find_program('dbus-binding-tool',var='DBUS_BINDING_TOOL')
++
++taskgen_method(add_dbus_file)
++before_method('apply_core')(process_dbus)
+\ No newline at end of file
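
dbus.py exposes add_dbus_file() as a task-generator method that queues dbus-binding-tool runs producing a header next to each interface description. A usage sketch; the interface file, prefix and mode value are assumptions for illustration (the mode string is passed straight through to dbus-binding-tool's --mode switch):

	# wscript -- illustrative sketch only
	def configure(conf):
		conf.load('gcc')
		conf.load('dbus')    # finds dbus-binding-tool
	def build(bld):
		tg = bld(features='c cprogram', source='client.c', target='client')
		tg.add_dbus_file('org.example.Iface.xml', 'example_iface', 'glib-client')
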
+--- /dev/null
++++ ardour3/waflib/Tools/d_config.py
+@@ -0,0 +1,47 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import Utils
++from waflib.Configure import conf
++def d_platform_flags(self):
++	v=self.env
++	if not v.DEST_OS:
++		v.DEST_OS=Utils.unversioned_sys_platform()
++	if Utils.destos_to_binfmt(self.env.DEST_OS)=='pe':
++		v['dprogram_PATTERN']='%s.exe'
++		v['dshlib_PATTERN']='lib%s.dll'
++		v['dstlib_PATTERN']='lib%s.a'
++	else:
++		v['dprogram_PATTERN']='%s'
++		v['dshlib_PATTERN']='lib%s.so'
++		v['dstlib_PATTERN']='lib%s.a'
++DLIB='''
++version(D_Version2) {
++	import std.stdio;
++	int main() {
++		writefln("phobos2");
++		return 0;
++	}
++} else {
++	version(Tango) {
++		import tango.stdc.stdio;
++		int main() {
++			printf("tango");
++			return 0;
++		}
++	} else {
++		import std.stdio;
++		int main() {
++			writefln("phobos1");
++			return 0;
++		}
++	}
++}
++'''
++def check_dlibrary(self):
++	ret=self.check_cc(features='d dprogram',fragment=DLIB,compile_filename='test.d',execute=True,define_ret=True)
++	self.env.DLIBRARY=ret.strip()
++
++conf(d_platform_flags)
++conf(check_dlibrary)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/dmd.py
+@@ -0,0 +1,47 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++from waflib.Tools import ar,d
++from waflib.Configure import conf
++def find_dmd(conf):
++	conf.find_program(['dmd','ldc'],var='D')
++def common_flags_ldc(conf):
++	v=conf.env
++	v['DFLAGS']=['-d-version=Posix']
++	v['LINKFLAGS']=[]
++	v['DFLAGS_dshlib']=['-relocation-model=pic']
++def common_flags_dmd(conf):
++	v=conf.env
++	v['D_SRC_F']=['-c']
++	v['D_TGT_F']='-of%s'
++	v['D_LINKER']=v['D']
++	v['DLNK_SRC_F']=''
++	v['DLNK_TGT_F']='-of%s'
++	v['DINC_ST']='-I%s'
++	v['DSHLIB_MARKER']=v['DSTLIB_MARKER']=''
++	v['DSTLIB_ST']=v['DSHLIB_ST']='-L-l%s'
++	v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L-L%s'
++	v['LINKFLAGS_dprogram']=['-quiet']
++	v['DFLAGS_dshlib']=['-fPIC']
++	v['LINKFLAGS_dshlib']=['-L-shared']
++	v['DHEADER_ext']='.di'
++	v.DFLAGS_d_with_header=['-H','-Hf']
++	v['D_HDR_F']='%s'
++def configure(conf):
++	conf.find_dmd()
++	if sys.platform=='win32':
++		out=conf.cmd_and_log([conf.env.D,'--help'])
++		if out.find("D Compiler v2.")>-1:
++			conf.fatal('dmd2 on Windows is not supported, use gdc or ldc instead')
++	conf.load('ar')
++	conf.load('d')
++	conf.common_flags_dmd()
++	conf.d_platform_flags()
++	if str(conf.env.D).find('ldc')>-1:
++		conf.common_flags_ldc()
++
++conf(find_dmd)
++conf(common_flags_ldc)
++conf(common_flags_dmd)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/d.py
+@@ -0,0 +1,56 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import Utils,Task,Errors
++from waflib.TaskGen import taskgen_method,feature,extension
++from waflib.Tools import d_scan,d_config
++from waflib.Tools.ccroot import link_task,stlink_task
++class d(Task.Task):
++	color='GREEN'
++	run_str='${D} ${DFLAGS} ${DINC_ST:INCPATHS} ${D_SRC_F:SRC} ${D_TGT_F:TGT}'
++	scan=d_scan.scan
++class d_with_header(d):
++	run_str='${D} ${DFLAGS} ${DINC_ST:INCPATHS} ${D_HDR_F:tgt.outputs[1].bldpath()} ${D_SRC_F:SRC} ${D_TGT_F:tgt.outputs[0].bldpath()}'
++class d_header(Task.Task):
++	color='BLUE'
++	run_str='${D} ${D_HEADER} ${SRC}'
++class dprogram(link_task):
++	run_str='${D_LINKER} ${LINKFLAGS} ${DLNK_SRC_F}${SRC} ${DLNK_TGT_F:TGT} ${RPATH_ST:RPATH} ${DSTLIB_MARKER} ${DSTLIBPATH_ST:STLIBPATH} ${DSTLIB_ST:STLIB} ${DSHLIB_MARKER} ${DLIBPATH_ST:LIBPATH} ${DSHLIB_ST:LIB}'
++	inst_to='${BINDIR}'
++	chmod=Utils.O755
++class dshlib(dprogram):
++	inst_to='${LIBDIR}'
++class dstlib(stlink_task):
++	pass
++def d_hook(self,node):
++	ext=Utils.destos_to_binfmt(self.env.DEST_OS)=='pe'and'obj'or'o'
++	out='%s.%d.%s'%(node.name,self.idx,ext)
++	def create_compiled_task(self,name,node):
++		task=self.create_task(name,node,node.parent.find_or_declare(out))
++		try:
++			self.compiled_tasks.append(task)
++		except AttributeError:
++			self.compiled_tasks=[task]
++		return task
++	if getattr(self,'generate_headers',None):
++		tsk=create_compiled_task(self,'d_with_header',node)
++		tsk.outputs.append(node.change_ext(self.env['DHEADER_ext']))
++	else:
++		tsk=create_compiled_task(self,'d',node)
++	return tsk
++def generate_header(self,filename,install_path=None):
++	try:
++		self.header_lst.append([filename,install_path])
++	except AttributeError:
++		self.header_lst=[[filename,install_path]]
++def process_header(self):
++	for i in getattr(self,'header_lst',[]):
++		node=self.path.find_resource(i[0])
++		if not node:
++			raise Errors.WafError('file %r not found on d obj'%i[0])
++		self.create_task('d_header',node,node.change_ext('.di'))
++
++extension('.d','.di','.D')(d_hook)
++taskgen_method(generate_header)
++feature('d')(process_header)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/d_scan.py
+@@ -0,0 +1,133 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import re
++from waflib import Utils,Logs
++def filter_comments(filename):
++	txt=Utils.readf(filename)
++	i=0
++	buf=[]
++	max=len(txt)
++	begin=0
++	while i<max:
++		c=txt[i]
++		if c=='"'or c=="'":
++			buf.append(txt[begin:i])
++			delim=c
++			i+=1
++			while i<max:
++				c=txt[i]
++				if c==delim:break
++				elif c=='\\':
++					i+=1
++				i+=1
++			i+=1
++			begin=i
++		elif c=='/':
++			buf.append(txt[begin:i])
++			i+=1
++			if i==max:break
++			c=txt[i]
++			if c=='+':
++				i+=1
++				nesting=1
++				c=None
++				while i<max:
++					prev=c
++					c=txt[i]
++					if prev=='/'and c=='+':
++						nesting+=1
++						c=None
++					elif prev=='+'and c=='/':
++						nesting-=1
++						if nesting==0:break
++						c=None
++					i+=1
++			elif c=='*':
++				i+=1
++				c=None
++				while i<max:
++					prev=c
++					c=txt[i]
++					if prev=='*'and c=='/':break
++					i+=1
++			elif c=='/':
++				i+=1
++				while i<max and txt[i]!='\n':
++					i+=1
++			else:
++				begin=i-1
++				continue
++			i+=1
++			begin=i
++			buf.append(' ')
++		else:
++			i+=1
++	buf.append(txt[begin:])
++	return buf
++class d_parser(object):
++	def __init__(self,env,incpaths):
++		self.allnames=[]
++		self.re_module=re.compile("module\s+([^;]+)")
++		self.re_import=re.compile("import\s+([^;]+)")
++		self.re_import_bindings=re.compile("([^:]+):(.*)")
++		self.re_import_alias=re.compile("[^=]+=(.+)")
++		self.env=env
++		self.nodes=[]
++		self.names=[]
++		self.incpaths=incpaths
++	def tryfind(self,filename):
++		found=0
++		for n in self.incpaths:
++			found=n.find_resource(filename.replace('.','/')+'.d')
++			if found:
++				self.nodes.append(found)
++				self.waiting.append(found)
++				break
++		if not found:
++			if not filename in self.names:
++				self.names.append(filename)
++	def get_strings(self,code):
++		self.module=''
++		lst=[]
++		mod_name=self.re_module.search(code)
++		if mod_name:
++			self.module=re.sub('\s+','',mod_name.group(1))
++		import_iterator=self.re_import.finditer(code)
++		if import_iterator:
++			for import_match in import_iterator:
++				import_match_str=re.sub('\s+','',import_match.group(1))
++				bindings_match=self.re_import_bindings.match(import_match_str)
++				if bindings_match:
++					import_match_str=bindings_match.group(1)
++				matches=import_match_str.split(',')
++				for match in matches:
++					alias_match=self.re_import_alias.match(match)
++					if alias_match:
++						match=alias_match.group(1)
++					lst.append(match)
++		return lst
++	def start(self,node):
++		self.waiting=[node]
++		while self.waiting:
++			nd=self.waiting.pop(0)
++			self.iter(nd)
++	def iter(self,node):
++		path=node.abspath()
++		code="".join(filter_comments(path))
++		names=self.get_strings(code)
++		for x in names:
++			if x in self.allnames:continue
++			self.allnames.append(x)
++			self.tryfind(x)
++def scan(self):
++	env=self.env
++	gruik=d_parser(env,self.generator.includes_nodes)
++	node=self.inputs[0]
++	gruik.start(node)
++	nodes=gruik.nodes
++	names=gruik.names
++	if Logs.verbose:
++		Logs.debug('deps: deps for %s: %r; unresolved %r'%(str(node),nodes,names))
++	return(nodes,names)
+--- /dev/null
++++ ardour3/waflib/Tools/errcheck.py
+@@ -0,0 +1,161 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++typos={'feature':'features','sources':'source','targets':'target','include':'includes','export_include':'export_includes','define':'defines','importpath':'includes','installpath':'install_path',}
++meths_typos=['__call__','program','shlib','stlib','objects']
++from waflib import Logs,Build,Node,Task,TaskGen,ConfigSet,Errors,Utils
++import waflib.Tools.ccroot
++def check_same_targets(self):
++	mp=Utils.defaultdict(list)
++	uids={}
++	def check_task(tsk):
++		if not isinstance(tsk,Task.Task):
++			return
++		for node in tsk.outputs:
++			mp[node].append(tsk)
++		try:
++			uids[tsk.uid()].append(tsk)
++		except:
++			uids[tsk.uid()]=[tsk]
++	for g in self.groups:
++		for tg in g:
++			try:
++				for tsk in tg.tasks:
++					check_task(tsk)
++			except AttributeError:
++				check_task(tg)
++	dupe=False
++	for(k,v)in mp.items():
++		if len(v)>1:
++			dupe=True
++			msg='* Node %r is created by more than once%s. The task generators are:'%(k,Logs.verbose==1 and" (full message on 'waf -v -v')"or"")
++			Logs.error(msg)
++			for x in v:
++				if Logs.verbose>1:
++					Logs.error('  %d. %r'%(1+v.index(x),x.generator))
++				else:
++					Logs.error('  %d. %r in %r'%(1+v.index(x),x.generator.name,getattr(x.generator,'path',None)))
++	if not dupe:
++		for(k,v)in uids.items():
++			if len(v)>1:
++				Logs.error('* Several tasks use the same identifier. Please check the information on\n   http://waf.googlecode.com/git/docs/apidocs/Task.html#waflib.Task.Task.uid')
++				for tsk in v:
++					Logs.error('  - object %r (%r) defined in %r'%(tsk.__class__.__name__,tsk,tsk.generator))
++def check_invalid_constraints(self):
++	feat=set([])
++	for x in list(TaskGen.feats.values()):
++		feat.union(set(x))
++	for(x,y)in TaskGen.task_gen.prec.items():
++		feat.add(x)
++		feat.union(set(y))
++	ext=set([])
++	for x in TaskGen.task_gen.mappings.values():
++		ext.add(x.__name__)
++	invalid=ext&feat
++	if invalid:
++		Logs.error('The methods %r have invalid annotations:  @extension <-> @feature/@before_method/@after_method'%list(invalid))
++	for cls in list(Task.classes.values()):
++		for x in('before','after'):
++			for y in Utils.to_list(getattr(cls,x,[])):
++				if not Task.classes.get(y,None):
++					Logs.error('Erroneous order constraint %r=%r on task class %r'%(x,y,cls.__name__))
++		if getattr(cls,'rule',None):
++			Logs.error('Erroneous attribute "rule" on task class %r (rename to "run_str")'%cls.__name__)
++def replace(m):
++	oldcall=getattr(Build.BuildContext,m)
++	def call(self,*k,**kw):
++		ret=oldcall(self,*k,**kw)
++		for x in typos:
++			if x in kw:
++				err=True
++				Logs.error('Fix the typo %r -> %r on %r'%(x,typos[x],ret))
++		return ret
++	setattr(Build.BuildContext,m,call)
++def enhance_lib():
++	for m in meths_typos:
++		replace(m)
++	def ant_glob(self,*k,**kw):
++		if k:
++			lst=Utils.to_list(k[0])
++			for pat in lst:
++				if'..'in pat.split('/'):
++					Logs.error("In ant_glob pattern %r: '..' means 'two dots', not 'parent directory'"%k[0])
++		if kw.get('remove',True):
++			try:
++				if self.is_child_of(self.ctx.bldnode)and not kw.get('quiet',False):
++					Logs.error('Using ant_glob on the build folder (%r) is dangerous (quiet=True to disable this warning)'%self)
++			except AttributeError:
++				pass
++		return self.old_ant_glob(*k,**kw)
++	Node.Node.old_ant_glob=Node.Node.ant_glob
++	Node.Node.ant_glob=ant_glob
++	old=Task.is_before
++	def is_before(t1,t2):
++		ret=old(t1,t2)
++		if ret and old(t2,t1):
++			Logs.error('Contradictory order constraints in classes %r %r'%(t1,t2))
++		return ret
++	Task.is_before=is_before
++	def check_err_features(self):
++		lst=self.to_list(self.features)
++		if'shlib'in lst:
++			Logs.error('feature shlib -> cshlib, dshlib or cxxshlib')
++		for x in('c','cxx','d','fc'):
++			if not x in lst and lst and lst[0]in[x+y for y in('program','shlib','stlib')]:
++				Logs.error('%r features is probably missing %r'%(self,x))
++	TaskGen.feature('*')(check_err_features)
++	def check_err_order(self):
++		if not hasattr(self,'rule'):
++			for x in('before','after','ext_in','ext_out'):
++				if hasattr(self,x):
++					Logs.warn('Erroneous order constraint %r on non-rule based task generator %r'%(x,self))
++		else:
++			for x in('before','after'):
++				for y in self.to_list(getattr(self,x,[])):
++					if not Task.classes.get(y,None):
++						Logs.error('Erroneous order constraint %s=%r on %r'%(x,y,self))
++	TaskGen.feature('*')(check_err_order)
++	def check_compile(self):
++		check_invalid_constraints(self)
++		try:
++			ret=self.orig_compile()
++		finally:
++			check_same_targets(self)
++		return ret
++	Build.BuildContext.orig_compile=Build.BuildContext.compile
++	Build.BuildContext.compile=check_compile
++	def use_rec(self,name,**kw):
++		try:
++			y=self.bld.get_tgen_by_name(name)
++		except Errors.WafError:
++			pass
++		else:
++			idx=self.bld.get_group_idx(self)
++			odx=self.bld.get_group_idx(y)
++			if odx>idx:
++				msg="Invalid 'use' across build groups:"
++				if Logs.verbose>1:
++					msg+='\n  target %r\n  uses:\n  %r'%(self,y)
++				else:
++					msg+=" %r uses %r (try 'waf -v -v' for the full error)"%(self.name,name)
++				raise Errors.WafError(msg)
++		self.orig_use_rec(name,**kw)
++	TaskGen.task_gen.orig_use_rec=TaskGen.task_gen.use_rec
++	TaskGen.task_gen.use_rec=use_rec
++	def getattri(self,name,default=None):
++		if name=='append'or name=='add':
++			raise Errors.WafError('env.append and env.add do not exist: use env.append_value/env.append_unique')
++		elif name=='prepend':
++			raise Errors.WafError('env.prepend does not exist: use env.prepend_value')
++		if name in self.__slots__:
++			return object.__getattr__(self,name,default)
++		else:
++			return self[name]
++	ConfigSet.ConfigSet.__getattr__=getattri
++def options(opt):
++	enhance_lib()
++def configure(conf):
++	pass
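
errcheck.py only takes effect when the tool is loaded, since the monkey-patching happens in enhance_lib() called from its options() hook. Presumably a project enables it by loading the tool from its own options(), along these lines (a sketch, not taken from this commit):

	# wscript -- illustrative sketch only
	def options(opt):
		opt.load('errcheck')    # installs the typo/target/constraint checks defined above
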
+--- /dev/null
++++ ardour3/waflib/Tools/fc_config.py
+@@ -0,0 +1,283 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import re,shutil,os,sys,string,shlex
++from waflib.Configure import conf
++from waflib.TaskGen import feature,after_method,before_method
++from waflib import Build,Utils
++FC_FRAGMENT='        program main\n        end     program main\n'
++FC_FRAGMENT2='        PROGRAM MAIN\n        END\n'
++def fc_flags(conf):
++	v=conf.env
++	v['FC_SRC_F']=[]
++	v['FC_TGT_F']=['-c','-o']
++	v['FCINCPATH_ST']='-I%s'
++	v['FCDEFINES_ST']='-D%s'
++	if not v['LINK_FC']:v['LINK_FC']=v['FC']
++	v['FCLNK_SRC_F']=[]
++	v['FCLNK_TGT_F']=['-o']
++	v['FCFLAGS_fcshlib']=['-fpic']
++	v['LINKFLAGS_fcshlib']=['-shared']
++	v['fcshlib_PATTERN']='lib%s.so'
++	v['fcstlib_PATTERN']='lib%s.a'
++	v['FCLIB_ST']='-l%s'
++	v['FCLIBPATH_ST']='-L%s'
++	v['FCSTLIB_ST']='-l%s'
++	v['FCSTLIBPATH_ST']='-L%s'
++	v['FCSTLIB_MARKER']='-Wl,-Bstatic'
++	v['FCSHLIB_MARKER']='-Wl,-Bdynamic'
++	v['SONAME_ST']='-Wl,-h,%s'
++def check_fortran(self,*k,**kw):
++	self.check_cc(fragment=FC_FRAGMENT,compile_filename='test.f',features='fc fcprogram',msg='Compiling a simple fortran app')
++def check_fc(self,*k,**kw):
++	kw['compiler']='fc'
++	if not'compile_mode'in kw:
++		kw['compile_mode']='fc'
++	if not'type'in kw:
++		kw['type']='fcprogram'
++	if not'compile_filename'in kw:
++		kw['compile_filename']='test.f90'
++	if not'code'in kw:
++		kw['code']=FC_FRAGMENT
++	return self.check(*k,**kw)
++def fortran_modifier_darwin(conf):
++	v=conf.env
++	v['FCFLAGS_fcshlib']=['-fPIC','-compatibility_version','1','-current_version','1']
++	v['LINKFLAGS_fcshlib']=['-dynamiclib']
++	v['fcshlib_PATTERN']='lib%s.dylib'
++	v['FRAMEWORKPATH_ST']='-F%s'
++	v['FRAMEWORK_ST']='-framework %s'
++	v['LINKFLAGS_fcstlib']=[]
++	v['FCSHLIB_MARKER']=''
++	v['FCSTLIB_MARKER']=''
++	v['SONAME_ST']=''
++def fortran_modifier_win32(conf):
++	v=conf.env
++	v['fcprogram_PATTERN']=v['fcprogram_test_PATTERN']='%s.exe'
++	v['fcshlib_PATTERN']='%s.dll'
++	v['implib_PATTERN']='lib%s.dll.a'
++	v['IMPLIB_ST']='-Wl,--out-implib,%s'
++	v['FCFLAGS_fcshlib']=[]
++	v.append_value('FCFLAGS_fcshlib',['-DDLL_EXPORT'])
++	v.append_value('LINKFLAGS',['-Wl,--enable-auto-import'])
++def fortran_modifier_cygwin(conf):
++	fortran_modifier_win32(conf)
++	v=conf.env
++	v['fcshlib_PATTERN']='cyg%s.dll'
++	v.append_value('LINKFLAGS_fcshlib',['-Wl,--enable-auto-image-base'])
++	v['FCFLAGS_fcshlib']=[]
++def check_fortran_dummy_main(self,*k,**kw):
++	if not self.env.CC:
++		self.fatal('A c compiler is required for check_fortran_dummy_main')
++	lst=['MAIN__','__MAIN','_MAIN','MAIN_','MAIN']
++	lst.extend([m.lower()for m in lst])
++	lst.append('')
++	self.start_msg('Detecting whether we need a dummy main')
++	for main in lst:
++		kw['fortran_main']=main
++		try:
++			self.check_cc(fragment='int %s() { return 0; }\n'%(main or'test'),features='c fcprogram',mandatory=True)
++			if not main:
++				self.env.FC_MAIN=-1
++				self.end_msg('no')
++			else:
++				self.env.FC_MAIN=main
++				self.end_msg('yes %s'%main)
++			break
++		except self.errors.ConfigurationError:
++			pass
++	else:
++		self.end_msg('not found')
++		self.fatal('could not detect whether fortran requires a dummy main, see the config.log')
++GCC_DRIVER_LINE=re.compile('^Driving:')
++POSIX_STATIC_EXT=re.compile('\S+\.a')
++POSIX_LIB_FLAGS=re.compile('-l\S+')
++def is_link_verbose(self,txt):
++	assert isinstance(txt,str)
++	for line in txt.splitlines():
++		if not GCC_DRIVER_LINE.search(line):
++			if POSIX_STATIC_EXT.search(line)or POSIX_LIB_FLAGS.search(line):
++				return True
++	return False
++def check_fortran_verbose_flag(self,*k,**kw):
++	self.start_msg('fortran link verbose flag')
++	for x in['-v','--verbose','-verbose','-V']:
++		try:
++			self.check_cc(features='fc fcprogram_test',fragment=FC_FRAGMENT2,compile_filename='test.f',linkflags=[x],mandatory=True)
++		except self.errors.ConfigurationError:
++			pass
++		else:
++			if self.is_link_verbose(self.test_bld.err)or self.is_link_verbose(self.test_bld.out):
++				self.end_msg(x)
++				break
++	else:
++		self.end_msg('failure')
++		self.fatal('Could not obtain the fortran link verbose flag (see config.log)')
++	self.env.FC_VERBOSE_FLAG=x
++	return x
++LINKFLAGS_IGNORED=[r'-lang*',r'-lcrt[a-zA-Z0-9\.]*\.o',r'-lc$',r'-lSystem',r'-libmil',r'-LIST:*',r'-LNO:*']
++if os.name=='nt':
++	LINKFLAGS_IGNORED.extend([r'-lfrt*',r'-luser32',r'-lkernel32',r'-ladvapi32',r'-lmsvcrt',r'-lshell32',r'-lmingw',r'-lmoldname'])
++else:
++	LINKFLAGS_IGNORED.append(r'-lgcc*')
++RLINKFLAGS_IGNORED=[re.compile(f)for f in LINKFLAGS_IGNORED]
++def _match_ignore(line):
++	for i in RLINKFLAGS_IGNORED:
++		if i.match(line):
++			return True
++	return False
++def parse_fortran_link(lines):
++	final_flags=[]
++	for line in lines:
++		if not GCC_DRIVER_LINE.match(line):
++			_parse_flink_line(line,final_flags)
++	return final_flags
++SPACE_OPTS=re.compile('^-[LRuYz]$')
++NOSPACE_OPTS=re.compile('^-[RL]')
++def _parse_flink_line(line,final_flags):
++	lexer=shlex.shlex(line,posix=True)
++	lexer.whitespace_split=True
++	t=lexer.get_token()
++	tmp_flags=[]
++	while t:
++		def parse(token):
++			if _match_ignore(token):
++				pass
++			elif token.startswith('-lkernel32')and sys.platform=='cygwin':
++				tmp_flags.append(token)
++			elif SPACE_OPTS.match(token):
++				t=lexer.get_token()
++				if t.startswith('P,'):
++					t=t[2:]
++				for opt in t.split(os.pathsep):
++					tmp_flags.append('-L%s'%opt)
++			elif NOSPACE_OPTS.match(token):
++				tmp_flags.append(token)
++			elif POSIX_LIB_FLAGS.match(token):
++				tmp_flags.append(token)
++			else:
++				pass
++			t=lexer.get_token()
++			return t
++		t=parse(t)
++	final_flags.extend(tmp_flags)
++	return final_flags
++def check_fortran_clib(self,autoadd=True,*k,**kw):
++	if not self.env.FC_VERBOSE_FLAG:
++		self.fatal('env.FC_VERBOSE_FLAG is not set: execute check_fortran_verbose_flag?')
++	self.start_msg('Getting fortran runtime link flags')
++	try:
++		self.check_cc(fragment=FC_FRAGMENT2,compile_filename='test.f',features='fc fcprogram_test',linkflags=[self.env.FC_VERBOSE_FLAG])
++	except:
++		self.end_msg(False)
++		if kw.get('mandatory',True):
++			conf.fatal('Could not find the c library flags')
++	else:
++		out=self.test_bld.err
++		flags=parse_fortran_link(out.splitlines())
++		self.end_msg('ok (%s)'%' '.join(flags))
++		self.env.LINKFLAGS_CLIB=flags
++		return flags
++	return[]
++def getoutput(conf,cmd,stdin=False):
++	if stdin:
++		stdin=Utils.subprocess.PIPE
++	else:
++		stdin=None
++	env=conf.env.env or None
++	try:
++		p=Utils.subprocess.Popen(cmd,stdin=stdin,stdout=Utils.subprocess.PIPE,stderr=Utils.subprocess.PIPE,env=env)
++		if stdin:
++			p.stdin.write('\n')
++		stdout,stderr=p.communicate()
++	except:
++		conf.fatal('could not determine the compiler version %r'%cmd)
++	else:
++		if not isinstance(stdout,str):
++			stdout=stdout.decode(sys.stdout.encoding)
++		if not isinstance(stderr,str):
++			stderr=stderr.decode(sys.stdout.encoding)
++		return stdout,stderr
++ROUTINES_CODE="""\
++      subroutine foobar()
++      return
++      end
++      subroutine foo_bar()
++      return
++      end
++"""
++MAIN_CODE="""
++void %(dummy_func_nounder)s(void);
++void %(dummy_func_under)s(void);
++int %(main_func_name)s() {
++  %(dummy_func_nounder)s();
++  %(dummy_func_under)s();
++  return 0;
++}
++"""
++def link_main_routines_tg_method(self):
++	def write_test_file(task):
++		task.outputs[0].write(task.generator.code)
++	bld=self.bld
++	bld(rule=write_test_file,target='main.c',code=MAIN_CODE%self.__dict__)
++	bld(rule=write_test_file,target='test.f',code=ROUTINES_CODE)
++	bld(features='fc fcstlib',source='test.f',target='test')
++	bld(features='c fcprogram',source='main.c',target='app',use='test')
++def mangling_schemes():
++	for u in['_','']:
++		for du in['','_']:
++			for c in["lower","upper"]:
++				yield(u,du,c)
++def mangle_name(u,du,c,name):
++	return getattr(name,c)()+u+(name.find('_')!=-1 and du or'')
++def check_fortran_mangling(self,*k,**kw):
++	if not self.env.CC:
++		self.fatal('A c compiler is required for link_main_routines')
++	if not self.env.FC:
++		self.fatal('A fortran compiler is required for link_main_routines')
++	if not self.env.FC_MAIN:
++		self.fatal('Checking for mangling requires self.env.FC_MAIN (execute "check_fortran_dummy_main" first?)')
++	self.start_msg('Getting fortran mangling scheme')
++	for(u,du,c)in mangling_schemes():
++		try:
++			self.check_cc(compile_filename=[],features='link_main_routines_func',msg='nomsg',errmsg='nomsg',mandatory=True,dummy_func_nounder=mangle_name(u,du,c,"foobar"),dummy_func_under=mangle_name(u,du,c,"foo_bar"),main_func_name=self.env.FC_MAIN)
++		except self.errors.ConfigurationError:
++			pass
++		else:
++			self.end_msg("ok ('%s', '%s', '%s-case')"%(u,du,c))
++			self.env.FORTRAN_MANGLING=(u,du,c)
++			break
++	else:
++		self.end_msg(False)
++		self.fatal('mangler not found')
++	return(u,du,c)
++def set_lib_pat(self):
++	self.env['fcshlib_PATTERN']=self.env['pyext_PATTERN']
++def detect_openmp(self):
++	for x in['-fopenmp','-openmp','-mp','-xopenmp','-omp','-qsmp=omp']:
++		try:
++			self.check_fc(msg='Checking for OpenMP flag %s'%x,fragment='program main\n  call omp_get_num_threads()\nend program main',fcflags=x,linkflags=x,uselib_store='OPENMP')
++		except self.errors.ConfigurationError:
++			pass
++		else:
++			break
++	else:
++		self.fatal('Could not find OpenMP')
++
++conf(fc_flags)
++conf(check_fortran)
++conf(check_fc)
++conf(fortran_modifier_darwin)
++conf(fortran_modifier_win32)
++conf(fortran_modifier_cygwin)
++conf(check_fortran_dummy_main)
++conf(is_link_verbose)
++conf(check_fortran_verbose_flag)
++conf(check_fortran_clib)
++feature('link_main_routines_func')(link_main_routines_tg_method)
++before_method('process_source')(link_main_routines_tg_method)
++conf(check_fortran_mangling)
++feature('pyext')(set_lib_pat)
++before_method('propagate_uselib_vars','apply_link')(set_lib_pat)
++conf(detect_openmp)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/fc.py
+@@ -0,0 +1,123 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import re
++from waflib import Utils,Task,TaskGen,Logs
++from waflib.Tools import ccroot,fc_config,fc_scan
++from waflib.TaskGen import feature,before_method,after_method,extension
++from waflib.Configure import conf
++ccroot.USELIB_VARS['fc']=set(['FCFLAGS','DEFINES','INCLUDES'])
++ccroot.USELIB_VARS['fcprogram_test']=ccroot.USELIB_VARS['fcprogram']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS'])
++ccroot.USELIB_VARS['fcshlib']=set(['LIB','STLIB','LIBPATH','STLIBPATH','LINKFLAGS','RPATH','LINKDEPS'])
++ccroot.USELIB_VARS['fcstlib']=set(['ARFLAGS','LINKDEPS'])
++def dummy(self):
++	pass
++def fc_hook(self,node):
++	return self.create_compiled_task('fc',node)
++def modfile(conf,name):
++	return{'lower':name.lower()+'.mod','lower.MOD':name.upper()+'.MOD','UPPER.mod':name.upper()+'.mod','UPPER':name.upper()+'.MOD'}[conf.env.FC_MOD_CAPITALIZATION or'lower']
++def get_fortran_tasks(tsk):
++	bld=tsk.generator.bld
++	tasks=bld.get_tasks_group(bld.get_group_idx(tsk.generator))
++	return[x for x in tasks if isinstance(x,fc)and not getattr(x,'nomod',None)and not getattr(x,'mod_fortran_done',None)]
++class fc(Task.Task):
++	color='GREEN'
++	run_str='${FC} ${FCFLAGS} ${FCINCPATH_ST:INCPATHS} ${FCDEFINES_ST:DEFINES} ${_FCMODOUTFLAGS} ${FC_TGT_F}${TGT[0].abspath()} ${FC_SRC_F}${SRC[0].abspath()}'
++	vars=["FORTRANMODPATHFLAG"]
++	def scan(self):
++		tmp=fc_scan.fortran_parser(self.generator.includes_nodes)
++		tmp.task=self
++		tmp.start(self.inputs[0])
++		if Logs.verbose:
++			Logs.debug('deps: deps for %r: %r; unresolved %r'%(self.inputs,tmp.nodes,tmp.names))
++		return(tmp.nodes,tmp.names)
++	def runnable_status(self):
++		if getattr(self,'mod_fortran_done',None):
++			return super(fc,self).runnable_status()
++		bld=self.generator.bld
++		lst=get_fortran_tasks(self)
++		for tsk in lst:
++			tsk.mod_fortran_done=True
++		for tsk in lst:
++			ret=tsk.runnable_status()
++			if ret==Task.ASK_LATER:
++				for x in lst:
++					x.mod_fortran_done=None
++				return Task.ASK_LATER
++		ins=Utils.defaultdict(set)
++		outs=Utils.defaultdict(set)
++		for tsk in lst:
++			key=tsk.uid()
++			for x in bld.raw_deps[key]:
++				if x.startswith('MOD@'):
++					name=bld.modfile(x.replace('MOD@',''))
++					node=bld.srcnode.find_or_declare(name)
++					tsk.set_outputs(node)
++					outs[id(node)].add(tsk)
++		for tsk in lst:
++			key=tsk.uid()
++			for x in bld.raw_deps[key]:
++				if x.startswith('USE@'):
++					name=bld.modfile(x.replace('USE@',''))
++					node=bld.srcnode.find_resource(name)
++					if node and node not in tsk.outputs:
++						if not node in bld.node_deps[key]:
++							bld.node_deps[key].append(node)
++						ins[id(node)].add(tsk)
++		for k in ins.keys():
++			for a in ins[k]:
++				a.run_after.update(outs[k])
++				tmp=[]
++				for t in outs[k]:
++					tmp.extend(t.outputs)
++				a.dep_nodes.extend(tmp)
++				try:
++					a.dep_nodes.sort(key=lambda x:x.abspath())
++				except:
++					a.dep_nodes.sort(lambda x,y:cmp(x.abspath(),y.abspath()))
++		for tsk in lst:
++			try:
++				delattr(tsk,'cache_sig')
++			except AttributeError:
++				pass
++		return super(fc,self).runnable_status()
++class fcprogram(ccroot.link_task):
++	color='YELLOW'
++	run_str='${FC} ${LINKFLAGS} ${FCLNK_SRC_F}${SRC} ${FCLNK_TGT_F}${TGT[0].abspath()} ${RPATH_ST:RPATH} ${FCSTLIB_MARKER} ${FCSTLIBPATH_ST:STLIBPATH} ${FCSTLIB_ST:STLIB} ${FCSHLIB_MARKER} ${FCLIBPATH_ST:LIBPATH} ${FCLIB_ST:LIB}'
++	inst_to='${BINDIR}'
++	chmod=Utils.O755
++class fcshlib(fcprogram):
++	inst_to='${LIBDIR}'
++class fcprogram_test(fcprogram):
++	def can_retrieve_cache(self):
++		return False
++	def runnable_status(self):
++		ret=super(fcprogram_test,self).runnable_status()
++		if ret==Task.SKIP_ME:
++			ret=Task.RUN_ME
++		return ret
++	def exec_command(self,cmd,**kw):
++		bld=self.generator.bld
++		kw['shell']=isinstance(cmd,str)
++		kw['stdout']=kw['stderr']=Utils.subprocess.PIPE
++		kw['cwd']=bld.variant_dir
++		bld.out=bld.err=''
++		bld.to_log('command: %s\n'%cmd)
++		kw['output']=0
++		try:
++			(bld.out,bld.err)=bld.cmd_and_log(cmd,**kw)
++		except Exception ,e:
++			return-1
++		if bld.out:
++			bld.to_log("out: %s\n"%bld.out)
++		if bld.err:
++			bld.to_log("err: %s\n"%bld.err)
++class fcstlib(ccroot.stlink_task):
++	pass
++
++feature('fcprogram','fcshlib','fcstlib','fcprogram_test')(dummy)
++extension('.f','.f90','.F','.F90','.for','.FOR')(fc_hook)
++conf(modfile)
+\ No newline at end of file
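
fc.py provides the Fortran task classes and registers the usual source extensions; combined with one of the compiler tools that appear later in this patch (gfortran, g95) it can be used as sketched below, with the file names assumed:

	# wscript -- illustrative sketch only
	def configure(conf):
		conf.load('gfortran')    # or 'g95'; both tool files follow later in this diff
	def build(bld):
		bld(features='fc fcprogram', source='hello.f90', target='hello')
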
+--- /dev/null
++++ ardour3/waflib/Tools/fc_scan.py
+@@ -0,0 +1,68 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import re
++from waflib import Utils,Task,TaskGen,Logs
++from waflib.TaskGen import feature,before_method,after_method,extension
++from waflib.Configure import conf
++INC_REGEX="""(?:^|['">]\s*;)\s*INCLUDE\s+(?:\w+_)?[<"'](.+?)(?=["'>])"""
++USE_REGEX="""(?:^|;)\s*USE(?:\s+|(?:(?:\s*,\s*(?:NON_)?INTRINSIC)?\s*::))\s*(\w+)"""
++MOD_REGEX="""(?:^|;)\s*MODULE(?!\s*PROCEDURE)(?:\s+|(?:(?:\s*,\s*(?:NON_)?INTRINSIC)?\s*::))\s*(\w+)"""
++re_inc=re.compile(INC_REGEX,re.I)
++re_use=re.compile(USE_REGEX,re.I)
++re_mod=re.compile(MOD_REGEX,re.I)
++class fortran_parser(object):
++	def __init__(self,incpaths):
++		self.seen=[]
++		self.nodes=[]
++		self.names=[]
++		self.incpaths=incpaths
++	def find_deps(self,node):
++		txt=node.read()
++		incs=[]
++		uses=[]
++		mods=[]
++		for line in txt.splitlines():
++			m=re_inc.search(line)
++			if m:
++				incs.append(m.group(1))
++			m=re_use.search(line)
++			if m:
++				uses.append(m.group(1))
++			m=re_mod.search(line)
++			if m:
++				mods.append(m.group(1))
++		return(incs,uses,mods)
++	def start(self,node):
++		self.waiting=[node]
++		while self.waiting:
++			nd=self.waiting.pop(0)
++			self.iter(nd)
++	def iter(self,node):
++		path=node.abspath()
++		incs,uses,mods=self.find_deps(node)
++		for x in incs:
++			if x in self.seen:
++				continue
++			self.seen.append(x)
++			self.tryfind_header(x)
++		for x in uses:
++			name="USE@%s"%x
++			if not name in self.names:
++				self.names.append(name)
++		for x in mods:
++			name="MOD@%s"%x
++			if not name in self.names:
++				self.names.append(name)
++	def tryfind_header(self,filename):
++		found=None
++		for n in self.incpaths:
++			found=n.find_resource(filename)
++			if found:
++				self.nodes.append(found)
++				self.waiting.append(found)
++				break
++		if not found:
++			if not filename in self.names:
++				self.names.append(filename)
+--- /dev/null
++++ ardour3/waflib/Tools/flex.py
+@@ -0,0 +1,27 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import waflib.TaskGen
++def decide_ext(self,node):
++	if'cxx'in self.features:
++		return['.lex.cc']
++	return['.lex.c']
++def flexfun(tsk):
++	env=tsk.env
++	bld=tsk.generator.bld
++	wd=bld.variant_dir
++	def to_list(xx):
++		if isinstance(xx,str):return[xx]
++		return xx
++	tsk.last_cmd=lst=[]
++	lst.extend(to_list(env['FLEX']))
++	lst.extend(to_list(env['FLEXFLAGS']))
++	lst.extend([a.path_from(bld.bldnode)for a in tsk.inputs])
++	lst=[x for x in lst if x]
++	txt=bld.cmd_and_log(lst,cwd=wd,env=env.env or None,quiet=0)
++	tsk.outputs[0].write(txt)
++waflib.TaskGen.declare_chain(name='flex',rule=flexfun,ext_in='.l',decider=decide_ext,)
++def configure(conf):
++	conf.find_program('flex',var='FLEX')
++	conf.env.FLEXFLAGS=['-t']
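
flex.py declares a '.l' to C chain, so lexer sources can be listed directly next to the C files and the generated .lex.c output is then compiled by the c tool. A sketch with assumed file names:

	# wscript -- illustrative sketch only
	def configure(conf):
		conf.load('gcc')
		conf.load('flex')    # finds the flex binary and sets FLEXFLAGS=['-t']
	def build(bld):
		bld(features='c cprogram', source='scanner.l main.c', target='app')
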
+--- /dev/null
++++ ardour3/waflib/Tools/g95.py
+@@ -0,0 +1,55 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import re
++from waflib import Utils
++from waflib.Tools import fc,fc_config,fc_scan
++from waflib.Configure import conf
++def find_g95(conf):
++	fc=conf.find_program('g95',var='FC')
++	fc=conf.cmd_to_list(fc)
++	conf.get_g95_version(fc)
++	conf.env.FC_NAME='G95'
++def g95_flags(conf):
++	v=conf.env
++	v['FCFLAGS_fcshlib']=['-fPIC']
++	v['FORTRANMODFLAG']=['-fmod=','']
++	v['FCFLAGS_DEBUG']=['-Werror']
++def g95_modifier_win32(conf):
++	fc_config.fortran_modifier_win32(conf)
++def g95_modifier_cygwin(conf):
++	fc_config.fortran_modifier_cygwin(conf)
++def g95_modifier_darwin(conf):
++	fc_config.fortran_modifier_darwin(conf)
++def g95_modifier_platform(conf):
++	dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform()
++	g95_modifier_func=getattr(conf,'g95_modifier_'+dest_os,None)
++	if g95_modifier_func:
++		g95_modifier_func()
++def get_g95_version(conf,fc):
++	version_re=re.compile(r"g95\s*(?P<major>\d*)\.(?P<minor>\d*)").search
++	cmd=fc+['--version']
++	out,err=fc_config.getoutput(conf,cmd,stdin=False)
++	if out:
++		match=version_re(out)
++	else:
++		match=version_re(err)
++	if not match:
++		conf.fatal('cannot determine g95 version')
++	k=match.groupdict()
++	conf.env['FC_VERSION']=(k['major'],k['minor'])
++def configure(conf):
++	conf.find_g95()
++	conf.find_ar()
++	conf.fc_flags()
++	conf.g95_flags()
++	conf.g95_modifier_platform()
++
++conf(find_g95)
++conf(g95_flags)
++conf(g95_modifier_win32)
++conf(g95_modifier_cygwin)
++conf(g95_modifier_darwin)
++conf(g95_modifier_platform)
++conf(get_g95_version)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/gas.py
+@@ -0,0 +1,11 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import waflib.Tools.asm
++from waflib.Tools import ar
++def configure(conf):
++	conf.find_program(['gas','as','gcc'],var='AS')
++	conf.env.AS_TGT_F=['-o']
++	conf.env.ASLNK_TGT_F=['-o']
++	conf.find_ar()
+--- /dev/null
++++ ardour3/waflib/Tools/gcc.py
+@@ -0,0 +1,98 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib import Configure,Options,Utils
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_gcc(conf):
++	cc=conf.find_program(['gcc','cc'],var='CC')
++	cc=conf.cmd_to_list(cc)
++	conf.get_cc_version(cc,gcc=True)
++	conf.env.CC_NAME='gcc'
++	conf.env.CC=cc
++def gcc_common_flags(conf):
++	v=conf.env
++	v['CC_SRC_F']=[]
++	v['CC_TGT_F']=['-c','-o']
++	if not v['LINK_CC']:v['LINK_CC']=v['CC']
++	v['CCLNK_SRC_F']=[]
++	v['CCLNK_TGT_F']=['-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['RPATH_ST']='-Wl,-rpath,%s'
++	v['SONAME_ST']='-Wl,-h,%s'
++	v['SHLIB_MARKER']='-Wl,-Bdynamic'
++	v['STLIB_MARKER']='-Wl,-Bstatic'
++	v['cprogram_PATTERN']='%s'
++	v['CFLAGS_cshlib']=['-fPIC']
++	v['LINKFLAGS_cshlib']=['-shared']
++	v['cshlib_PATTERN']='lib%s.so'
++	v['LINKFLAGS_cstlib']=['-Wl,-Bstatic']
++	v['cstlib_PATTERN']='lib%s.a'
++	v['LINKFLAGS_MACBUNDLE']=['-bundle','-undefined','dynamic_lookup']
++	v['CFLAGS_MACBUNDLE']=['-fPIC']
++	v['macbundle_PATTERN']='%s.bundle'
++def gcc_modifier_win32(conf):
++	v=conf.env
++	v['cprogram_PATTERN']='%s.exe'
++	v['cshlib_PATTERN']='%s.dll'
++	v['implib_PATTERN']='lib%s.dll.a'
++	v['IMPLIB_ST']='-Wl,--out-implib,%s'
++	v['CFLAGS_cshlib']=[]
++	v.append_value('CFLAGS_cshlib',['-DDLL_EXPORT'])
++	v.append_value('LINKFLAGS',['-Wl,--enable-auto-import'])
++def gcc_modifier_cygwin(conf):
++	gcc_modifier_win32(conf)
++	v=conf.env
++	v['cshlib_PATTERN']='cyg%s.dll'
++	v.append_value('LINKFLAGS_cshlib',['-Wl,--enable-auto-image-base'])
++	v['CFLAGS_cshlib']=[]
++def gcc_modifier_darwin(conf):
++	v=conf.env
++	v['CFLAGS_cshlib']=['-fPIC','-compatibility_version','1','-current_version','1']
++	v['LINKFLAGS_cshlib']=['-dynamiclib']
++	v['cshlib_PATTERN']='lib%s.dylib'
++	v['FRAMEWORKPATH_ST']='-F%s'
++	v['FRAMEWORK_ST']=['-framework']
++	v['ARCH_ST']=['-arch']
++	v['LINKFLAGS_cstlib']=[]
++	v['SHLIB_MARKER']=[]
++	v['STLIB_MARKER']=[]
++	v['SONAME_ST']=[]
++def gcc_modifier_aix(conf):
++	v=conf.env
++	v['LINKFLAGS_cprogram']=['-Wl,-brtl']
++	v['LINKFLAGS_cshlib']=['-shared','-Wl,-brtl,-bexpfull']
++	v['SHLIB_MARKER']=[]
++def gcc_modifier_hpux(conf):
++	v=conf.env
++	v['SHLIB_MARKER']=[]
++	v['CFLAGS_cshlib']=['-fPIC','-DPIC']
++	v['cshlib_PATTERN']='lib%s.sl'
++def gcc_modifier_platform(conf):
++	gcc_modifier_func=getattr(conf,'gcc_modifier_'+conf.env.DEST_OS,None)
++	if gcc_modifier_func:
++		gcc_modifier_func()
++def configure(conf):
++	conf.find_gcc()
++	conf.find_ar()
++	conf.gcc_common_flags()
++	conf.gcc_modifier_platform()
++	conf.cc_load_tools()
++	conf.cc_add_flags()
++	conf.link_add_flags()
++
++conf(find_gcc)
++conf(gcc_common_flags)
++conf(gcc_modifier_win32)
++conf(gcc_modifier_cygwin)
++conf(gcc_modifier_darwin)
++conf(gcc_modifier_aix)
++conf(gcc_modifier_hpux)
++conf(gcc_modifier_platform)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/gdc.py
+@@ -0,0 +1,34 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++from waflib.Tools import ar,d
++from waflib.Configure import conf
++def find_gdc(conf):
++	conf.find_program('gdc',var='D')
++def common_flags_gdc(conf):
++	v=conf.env
++	v['DFLAGS']=[]
++	v['D_SRC_F']=['-c']
++	v['D_TGT_F']='-o%s'
++	v['D_LINKER']=v['D']
++	v['DLNK_SRC_F']=''
++	v['DLNK_TGT_F']='-o%s'
++	v['DINC_ST']='-I%s'
++	v['DSHLIB_MARKER']=v['DSTLIB_MARKER']=''
++	v['DSTLIB_ST']=v['DSHLIB_ST']='-l%s'
++	v['DSTLIBPATH_ST']=v['DLIBPATH_ST']='-L%s'
++	v['LINKFLAGS_dshlib']=['-shared']
++	v['DHEADER_ext']='.di'
++	v.DFLAGS_d_with_header='-fintfc'
++	v['D_HDR_F']='-fintfc-file=%s'
++def configure(conf):
++	conf.find_gdc()
++	conf.load('ar')
++	conf.load('d')
++	conf.common_flags_gdc()
++	conf.d_platform_flags()
++
++conf(find_gdc)
++conf(common_flags_gdc)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/gfortran.py
+@@ -0,0 +1,69 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import re
++from waflib import Utils
++from waflib.Tools import fc,fc_config,fc_scan
++from waflib.Configure import conf
++def find_gfortran(conf):
++	fc=conf.find_program(['gfortran','g77'],var='FC')
++	fc=conf.cmd_to_list(fc)
++	conf.get_gfortran_version(fc)
++	conf.env.FC_NAME='GFORTRAN'
++def gfortran_flags(conf):
++	v=conf.env
++	v['FCFLAGS_fcshlib']=['-fPIC']
++	v['FORTRANMODFLAG']=['-J','']
++	v['FCFLAGS_DEBUG']=['-Werror']
++def gfortran_modifier_win32(conf):
++	fc_config.fortran_modifier_win32(conf)
++def gfortran_modifier_cygwin(conf):
++	fc_config.fortran_modifier_cygwin(conf)
++def gfortran_modifier_darwin(conf):
++	fc_config.fortran_modifier_darwin(conf)
++def gfortran_modifier_platform(conf):
++	dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform()
++	gfortran_modifier_func=getattr(conf,'gfortran_modifier_'+dest_os,None)
++	if gfortran_modifier_func:
++		gfortran_modifier_func()
++def get_gfortran_version(conf,fc):
++	version_re=re.compile(r"GNU\s*Fortran",re.I).search
++	cmd=fc+['--version']
++	out,err=fc_config.getoutput(conf,cmd,stdin=False)
++	if out:match=version_re(out)
++	else:match=version_re(err)
++	if not match:
++		conf.fatal('Could not determine the compiler type')
++	cmd=fc+['-dM','-E','-']
++	out,err=fc_config.getoutput(conf,cmd,stdin=True)
++	if out.find('__GNUC__')<0:
++		conf.fatal('Could not determine the compiler type')
++	k={}
++	out=out.split('\n')
++	import shlex
++	for line in out:
++		lst=shlex.split(line)
++		if len(lst)>2:
++			key=lst[1]
++			val=lst[2]
++			k[key]=val
++	def isD(var):
++		return var in k
++	def isT(var):
++		return var in k and k[var]!='0'
++	conf.env['FC_VERSION']=(k['__GNUC__'],k['__GNUC_MINOR__'],k['__GNUC_PATCHLEVEL__'])
++def configure(conf):
++	conf.find_gfortran()
++	conf.find_ar()
++	conf.fc_flags()
++	conf.gfortran_flags()
++	conf.gfortran_modifier_platform()
++
++conf(find_gfortran)
++conf(gfortran_flags)
++conf(gfortran_modifier_win32)
++conf(gfortran_modifier_cygwin)
++conf(gfortran_modifier_darwin)
++conf(gfortran_modifier_platform)
++conf(get_gfortran_version)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/glib2.py
+@@ -0,0 +1,174 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Task,Utils,Options,Errors,Logs
++from waflib.TaskGen import taskgen_method,before_method,after_method,feature
++def add_marshal_file(self,filename,prefix):
++	if not hasattr(self,'marshal_list'):
++		self.marshal_list=[]
++	self.meths.append('process_marshal')
++	self.marshal_list.append((filename,prefix))
++def process_marshal(self):
++	for f,prefix in getattr(self,'marshal_list',[]):
++		node=self.path.find_resource(f)
++		if not node:
++			raise Errors.WafError('file not found %r'%f)
++		h_node=node.change_ext('.h')
++		c_node=node.change_ext('.c')
++		task=self.create_task('glib_genmarshal',node,[h_node,c_node])
++		task.env.GLIB_GENMARSHAL_PREFIX=prefix
++	self.source=self.to_nodes(getattr(self,'source',[]))
++	self.source.append(c_node)
++class glib_genmarshal(Task.Task):
++	def run(self):
++		bld=self.inputs[0].__class__.ctx
++		get=self.env.get_flat
++		cmd1="%s %s --prefix=%s --header > %s"%(get('GLIB_GENMARSHAL'),self.inputs[0].srcpath(),get('GLIB_GENMARSHAL_PREFIX'),self.outputs[0].abspath())
++		ret=bld.exec_command(cmd1)
++		if ret:return ret
++		c='''#include "%s"\n'''%self.outputs[0].name
++		self.outputs[1].write(c)
++		cmd2="%s %s --prefix=%s --body >> %s"%(get('GLIB_GENMARSHAL'),self.inputs[0].srcpath(),get('GLIB_GENMARSHAL_PREFIX'),self.outputs[1].abspath())
++		return bld.exec_command(cmd2)
++	vars=['GLIB_GENMARSHAL_PREFIX','GLIB_GENMARSHAL']
++	color='BLUE'
++	ext_out=['.h']
++def add_enums_from_template(self,source='',target='',template='',comments=''):
++	if not hasattr(self,'enums_list'):
++		self.enums_list=[]
++	self.meths.append('process_enums')
++	self.enums_list.append({'source':source,'target':target,'template':template,'file-head':'','file-prod':'','file-tail':'','enum-prod':'','value-head':'','value-prod':'','value-tail':'','comments':comments})
++def add_enums(self,source='',target='',file_head='',file_prod='',file_tail='',enum_prod='',value_head='',value_prod='',value_tail='',comments=''):
++	if not hasattr(self,'enums_list'):
++		self.enums_list=[]
++	self.meths.append('process_enums')
++	self.enums_list.append({'source':source,'template':'','target':target,'file-head':file_head,'file-prod':file_prod,'file-tail':file_tail,'enum-prod':enum_prod,'value-head':value_head,'value-prod':value_prod,'value-tail':value_tail,'comments':comments})
++def process_enums(self):
++	for enum in getattr(self,'enums_list',[]):
++		task=self.create_task('glib_mkenums')
++		env=task.env
++		inputs=[]
++		source_list=self.to_list(enum['source'])
++		if not source_list:
++			raise Errors.WafError('missing source '+str(enum))
++		source_list=[self.path.find_resource(k)for k in source_list]
++		inputs+=source_list
++		env['GLIB_MKENUMS_SOURCE']=[k.abspath()for k in source_list]
++		if not enum['target']:
++			raise Errors.WafError('missing target '+str(enum))
++		tgt_node=self.path.find_or_declare(enum['target'])
++		if tgt_node.name.endswith('.c'):
++			self.source.append(tgt_node)
++		env['GLIB_MKENUMS_TARGET']=tgt_node.abspath()
++		options=[]
++		if enum['template']:
++			template_node=self.path.find_resource(enum['template'])
++			options.append('--template %s'%(template_node.abspath()))
++			inputs.append(template_node)
++		params={'file-head':'--fhead','file-prod':'--fprod','file-tail':'--ftail','enum-prod':'--eprod','value-head':'--vhead','value-prod':'--vprod','value-tail':'--vtail','comments':'--comments'}
++		for param,option in params.items():
++			if enum[param]:
++				options.append('%s %r'%(option,enum[param]))
++		env['GLIB_MKENUMS_OPTIONS']=' '.join(options)
++		task.set_inputs(inputs)
++		task.set_outputs(tgt_node)
++class glib_mkenums(Task.Task):
++	run_str='${GLIB_MKENUMS} ${GLIB_MKENUMS_OPTIONS} ${GLIB_MKENUMS_SOURCE} > ${GLIB_MKENUMS_TARGET}'
++	color='PINK'
++	ext_out=['.h']
++def add_settings_schemas(self,filename_list):
++	if not hasattr(self,'settings_schema_files'):
++		self.settings_schema_files=[]
++	if not isinstance(filename_list,list):
++		filename_list=[filename_list]
++	self.settings_schema_files.extend(filename_list)
++def add_settings_enums(self,namespace,filename_list):
++	if hasattr(self,'settings_enum_namespace'):
++		raise Errors.WafError("Tried to add gsettings enums to '%s' more than once"%self.name)
++	self.settings_enum_namespace=namespace
++	if type(filename_list)!='list':
++		filename_list=[filename_list]
++	self.settings_enum_files=filename_list
++def r_change_ext(self,ext):
++	name=self.name
++	k=name.rfind('.')
++	if k>=0:
++		name=name[:k]+ext
++	else:
++		name=name+ext
++	return self.parent.find_or_declare([name])
++def process_settings(self):
++	enums_tgt_node=[]
++	install_files=[]
++	settings_schema_files=getattr(self,'settings_schema_files',[])
++	if settings_schema_files and not self.env['GLIB_COMPILE_SCHEMAS']:
++		raise Errors.WafError("Unable to process GSettings schemas - glib-compile-schemas was not found during configure")
++	if hasattr(self,'settings_enum_files'):
++		enums_task=self.create_task('glib_mkenums')
++		source_list=self.settings_enum_files
++		source_list=[self.path.find_resource(k)for k in source_list]
++		enums_task.set_inputs(source_list)
++		enums_task.env['GLIB_MKENUMS_SOURCE']=[k.abspath()for k in source_list]
++		target=self.settings_enum_namespace+'.enums.xml'
++		tgt_node=self.path.find_or_declare(target)
++		enums_task.set_outputs(tgt_node)
++		enums_task.env['GLIB_MKENUMS_TARGET']=tgt_node.abspath()
++		enums_tgt_node=[tgt_node]
++		install_files.append(tgt_node)
++		options='--comments "<!-- @comment@ -->" --fhead "<schemalist>" --vhead "  <@type@ id=\\"%s.@EnumName@\\">" --vprod "    <value nick=\\"@valuenick@\\" value=\\"@valuenum@\\"/>" --vtail "  </@type@>" --ftail "</schemalist>" '%(self.settings_enum_namespace)
++		enums_task.env['GLIB_MKENUMS_OPTIONS']=options
++	for schema in settings_schema_files:
++		schema_task=self.create_task('glib_validate_schema')
++		schema_node=self.path.find_resource(schema)
++		if not schema_node:
++			raise Errors.WafError("Cannot find the schema file '%s'"%schema)
++		install_files.append(schema_node)
++		source_list=enums_tgt_node+[schema_node]
++		schema_task.set_inputs(source_list)
++		schema_task.env['GLIB_COMPILE_SCHEMAS_OPTIONS']=[("--schema-file="+k.abspath())for k in source_list]
++		target_node=r_change_ext(schema_node,'.xml.valid')
++		schema_task.set_outputs(target_node)
++		schema_task.env['GLIB_VALIDATE_SCHEMA_OUTPUT']=target_node.abspath()
++	def compile_schemas_callback(bld):
++		if not bld.is_install:return
++		Logs.pprint('YELLOW','Updating GSettings schema cache')
++		command=Utils.subst_vars("${GLIB_COMPILE_SCHEMAS} ${GSETTINGSSCHEMADIR}",bld.env)
++		ret=self.bld.exec_command(command)
++	if self.bld.is_install:
++		if not self.env['GSETTINGSSCHEMADIR']:
++			raise Errors.WafError('GSETTINGSSCHEMADIR not defined (should have been set up automatically during configure)')
++		if install_files:
++			self.bld.install_files(self.env['GSETTINGSSCHEMADIR'],install_files)
++			if not hasattr(self.bld,'_compile_schemas_registered'):
++				self.bld.add_post_fun(compile_schemas_callback)
++				self.bld._compile_schemas_registered=True
++class glib_validate_schema(Task.Task):
++	run_str='rm -f ${GLIB_VALIDATE_SCHEMA_OUTPUT} && ${GLIB_COMPILE_SCHEMAS} --dry-run ${GLIB_COMPILE_SCHEMAS_OPTIONS} && touch ${GLIB_VALIDATE_SCHEMA_OUTPUT}'
++	color='PINK'
++def configure(conf):
++	conf.find_program('glib-genmarshal',var='GLIB_GENMARSHAL')
++	conf.find_perl_program('glib-mkenums',var='GLIB_MKENUMS')
++	conf.find_program('glib-compile-schemas',var='GLIB_COMPILE_SCHEMAS',mandatory=False)
++	def getstr(varname):
++		return getattr(Options.options,varname,getattr(conf.env,varname,''))
++	gsettingsschemadir=getstr('GSETTINGSSCHEMADIR')
++	if not gsettingsschemadir:
++		datadir=getstr('DATADIR')
++		if not datadir:
++			prefix=conf.env['PREFIX']
++			datadir=os.path.join(prefix,'share')
++		gsettingsschemadir=os.path.join(datadir,'glib-2.0','schemas')
++	conf.env['GSETTINGSSCHEMADIR']=gsettingsschemadir
++def options(opt):
++	opt.add_option('--gsettingsschemadir',help='GSettings schema location [Default: ${datadir}/glib-2.0/schemas]',default='',dest='GSETTINGSSCHEMADIR')
++
++taskgen_method(add_marshal_file)
++before_method('process_source')(process_marshal)
++taskgen_method(add_enums_from_template)
++taskgen_method(add_enums)
++before_method('process_source')(process_enums)
++taskgen_method(add_settings_schemas)
++taskgen_method(add_settings_enums)
++feature('glib2')(process_settings)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/gnu_dirs.py
+@@ -0,0 +1,65 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Utils,Options,Context
++_options=[x.split(', ')for x in'''
++bindir, user executables, ${EXEC_PREFIX}/bin
++sbindir, system admin executables, ${EXEC_PREFIX}/sbin
++libexecdir, program executables, ${EXEC_PREFIX}/libexec
++sysconfdir, read-only single-machine data, ${PREFIX}/etc
++sharedstatedir, modifiable architecture-independent data, ${PREFIX}/com
++localstatedir, modifiable single-machine data, ${PREFIX}/var
++libdir, object code libraries, ${EXEC_PREFIX}/lib
++includedir, C header files, ${PREFIX}/include
++oldincludedir, C header files for non-gcc, /usr/include
++datarootdir, read-only arch.-independent data root, ${PREFIX}/share
++datadir, read-only architecture-independent data, ${DATAROOTDIR}
++infodir, info documentation, ${DATAROOTDIR}/info
++localedir, locale-dependent data, ${DATAROOTDIR}/locale
++mandir, man documentation, ${DATAROOTDIR}/man
++docdir, documentation root, ${DATAROOTDIR}/doc/${PACKAGE}
++htmldir, html documentation, ${DOCDIR}
++dvidir, dvi documentation, ${DOCDIR}
++pdfdir, pdf documentation, ${DOCDIR}
++psdir, ps documentation, ${DOCDIR}
++'''.split('\n')if x]
++def configure(conf):
++	def get_param(varname,default):
++		return getattr(Options.options,varname,'')or default
++	env=conf.env
++	conf.env.LIBDIR=conf.env.BINDIR=[]
++	env['EXEC_PREFIX']=get_param('EXEC_PREFIX',env['PREFIX'])
++	env['PACKAGE']=getattr(Context.g_module,'APPNAME',None)or env['PACKAGE']
++	complete=False
++	iter=0
++	while not complete and iter<len(_options)+1:
++		iter+=1
++		complete=True
++		for name,help,default in _options:
++			name=name.upper()
++			if not env[name]:
++				try:
++					env[name]=Utils.subst_vars(get_param(name,default).replace('/',os.sep),env)
++				except TypeError:
++					complete=False
++	if not complete:
++		lst=[name for name,_,_ in _options if not env[name.upper()]]
++		raise conf.errors.WafError('Variable substitution failure %r'%lst)
++def options(opt):
++	inst_dir=opt.add_option_group('Installation directories','By default, "waf install" will put the files in\
++ "/usr/local/bin", "/usr/local/lib" etc. An installation prefix other\
++ than "/usr/local" can be given using "--prefix", for example "--prefix=$HOME"')
++	for k in('--prefix','--destdir'):
++		option=opt.parser.get_option(k)
++		if option:
++			opt.parser.remove_option(k)
++			inst_dir.add_option(option)
++	inst_dir.add_option('--exec-prefix',help='installation prefix [Default: ${PREFIX}]',default='',dest='EXEC_PREFIX')
++	dirs_options=opt.add_option_group('Pre-defined installation directories','')
++	for name,help,default in _options:
++		option_name='--'+name
++		str_default=default
++		str_help='%s [Default: %s]'%(help,str_default)
++		dirs_options.add_option(option_name,help=str_help,default='',dest=name.upper())
+--- /dev/null
++++ ardour3/waflib/Tools/gxx.py
+@@ -0,0 +1,98 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib import Configure,Options,Utils
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_gxx(conf):
++	cxx=conf.find_program(['g++','c++'],var='CXX')
++	cxx=conf.cmd_to_list(cxx)
++	conf.get_cc_version(cxx,gcc=True)
++	conf.env.CXX_NAME='gcc'
++	conf.env.CXX=cxx
++def gxx_common_flags(conf):
++	v=conf.env
++	v['CXX_SRC_F']=[]
++	v['CXX_TGT_F']=['-c','-o']
++	if not v['LINK_CXX']:v['LINK_CXX']=v['CXX']
++	v['CXXLNK_SRC_F']=[]
++	v['CXXLNK_TGT_F']=['-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['RPATH_ST']='-Wl,-rpath,%s'
++	v['SONAME_ST']='-Wl,-h,%s'
++	v['SHLIB_MARKER']='-Wl,-Bdynamic'
++	v['STLIB_MARKER']='-Wl,-Bstatic'
++	v['cxxprogram_PATTERN']='%s'
++	v['CXXFLAGS_cxxshlib']=['-fPIC']
++	v['LINKFLAGS_cxxshlib']=['-shared']
++	v['cxxshlib_PATTERN']='lib%s.so'
++	v['LINKFLAGS_cxxstlib']=['-Wl,-Bstatic']
++	v['cxxstlib_PATTERN']='lib%s.a'
++	v['LINKFLAGS_MACBUNDLE']=['-bundle','-undefined','dynamic_lookup']
++	v['CXXFLAGS_MACBUNDLE']=['-fPIC']
++	v['macbundle_PATTERN']='%s.bundle'
++def gxx_modifier_win32(conf):
++	v=conf.env
++	v['cxxprogram_PATTERN']='%s.exe'
++	v['cxxshlib_PATTERN']='%s.dll'
++	v['implib_PATTERN']='lib%s.dll.a'
++	v['IMPLIB_ST']='-Wl,--out-implib,%s'
++	v['CXXFLAGS_cxxshlib']=[]
++	v.append_value('CXXFLAGS_cxxshlib',['-DDLL_EXPORT'])
++	v.append_value('LINKFLAGS',['-Wl,--enable-auto-import'])
++def gxx_modifier_cygwin(conf):
++	gxx_modifier_win32(conf)
++	v=conf.env
++	v['cxxshlib_PATTERN']='cyg%s.dll'
++	v.append_value('LINKFLAGS_cxxshlib',['-Wl,--enable-auto-image-base'])
++	v['CXXFLAGS_cxxshlib']=[]
++def gxx_modifier_darwin(conf):
++	v=conf.env
++	v['CXXFLAGS_cxxshlib']=['-fPIC','-compatibility_version','1','-current_version','1']
++	v['LINKFLAGS_cxxshlib']=['-dynamiclib']
++	v['cxxshlib_PATTERN']='lib%s.dylib'
++	v['FRAMEWORKPATH_ST']='-F%s'
++	v['FRAMEWORK_ST']=['-framework']
++	v['ARCH_ST']=['-arch']
++	v['LINKFLAGS_cxxstlib']=[]
++	v['SHLIB_MARKER']=[]
++	v['STLIB_MARKER']=[]
++	v['SONAME_ST']=[]
++def gxx_modifier_aix(conf):
++	v=conf.env
++	v['LINKFLAGS_cxxprogram']=['-Wl,-brtl']
++	v['LINKFLAGS_cxxshlib']=['-shared','-Wl,-brtl,-bexpfull']
++	v['SHLIB_MARKER']=[]
++def gxx_modifier_hpux(conf):
++	v=conf.env
++	v['SHLIB_MARKER']=[]
++	v['CFLAGS_cxxshlib']=['-fPIC','-DPIC']
++	v['cxxshlib_PATTERN']='lib%s.sl'
++def gxx_modifier_platform(conf):
++	gxx_modifier_func=getattr(conf,'gxx_modifier_'+conf.env.DEST_OS,None)
++	if gxx_modifier_func:
++		gxx_modifier_func()
++def configure(conf):
++	conf.find_gxx()
++	conf.find_ar()
++	conf.gxx_common_flags()
++	conf.gxx_modifier_platform()
++	conf.cxx_load_tools()
++	conf.cxx_add_flags()
++	conf.link_add_flags()
++
++conf(find_gxx)
++conf(gxx_common_flags)
++conf(gxx_modifier_win32)
++conf(gxx_modifier_cygwin)
++conf(gxx_modifier_darwin)
++conf(gxx_modifier_aix)
++conf(gxx_modifier_hpux)
++conf(gxx_modifier_platform)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/icc.py
+@@ -0,0 +1,31 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib.Tools import ccroot,ar,gcc
++from waflib.Configure import conf
++def find_icc(conf):
++	if sys.platform=='cygwin':
++		conf.fatal('The Intel compiler does not work on Cygwin')
++	v=conf.env
++	cc=None
++	if v['CC']:cc=v['CC']
++	elif'CC'in conf.environ:cc=conf.environ['CC']
++	if not cc:cc=conf.find_program('icc',var='CC')
++	if not cc:cc=conf.find_program('ICL',var='CC')
++	if not cc:conf.fatal('Intel C Compiler (icc) was not found')
++	cc=conf.cmd_to_list(cc)
++	conf.get_cc_version(cc,icc=True)
++	v['CC']=cc
++	v['CC_NAME']='icc'
++def configure(conf):
++	conf.find_icc()
++	conf.find_ar()
++	conf.gcc_common_flags()
++	conf.gcc_modifier_platform()
++	conf.cc_load_tools()
++	conf.cc_add_flags()
++	conf.link_add_flags()
++
++conf(find_icc)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/icpc.py
+@@ -0,0 +1,30 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib.Tools import ccroot,ar,gxx
++from waflib.Configure import conf
++def find_icpc(conf):
++	if sys.platform=='cygwin':
++		conf.fatal('The Intel compiler does not work on Cygwin')
++	v=conf.env
++	cxx=None
++	if v['CXX']:cxx=v['CXX']
++	elif'CXX'in conf.environ:cxx=conf.environ['CXX']
++	if not cxx:cxx=conf.find_program('icpc',var='CXX')
++	if not cxx:conf.fatal('Intel C++ Compiler (icpc) was not found')
++	cxx=conf.cmd_to_list(cxx)
++	conf.get_cc_version(cxx,icc=True)
++	v['CXX']=cxx
++	v['CXX_NAME']='icc'
++def configure(conf):
++	conf.find_icpc()
++	conf.find_ar()
++	conf.gxx_common_flags()
++	conf.gxx_modifier_platform()
++	conf.cxx_load_tools()
++	conf.cxx_add_flags()
++	conf.link_add_flags()
++
++conf(find_icpc)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/ifort.py
+@@ -0,0 +1,49 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import re
++from waflib import Utils
++from waflib.Tools import fc,fc_config,fc_scan
++from waflib.Configure import conf
++def find_ifort(conf):
++	fc=conf.find_program('ifort',var='FC')
++	fc=conf.cmd_to_list(fc)
++	conf.get_ifort_version(fc)
++	conf.env.FC_NAME='IFORT'
++def ifort_modifier_cygwin(conf):
++	raise NotImplementedError("Ifort on cygwin not yet implemented")
++def ifort_modifier_win32(conf):
++	fc_config.fortran_modifier_win32(conf)
++def ifort_modifier_darwin(conf):
++	fc_config.fortran_modifier_darwin(conf)
++def ifort_modifier_platform(conf):
++	dest_os=conf.env['DEST_OS']or Utils.unversioned_sys_platform()
++	ifort_modifier_func=getattr(conf,'ifort_modifier_'+dest_os,None)
++	if ifort_modifier_func:
++		ifort_modifier_func()
++def get_ifort_version(conf,fc):
++	version_re=re.compile(r"ifort\s*\(IFORT\)\s*(?P<major>\d*)\.(?P<minor>\d*)",re.I).search
++	cmd=fc+['--version']
++	out,err=fc_config.getoutput(conf,cmd,stdin=False)
++	if out:
++		match=version_re(out)
++	else:
++		match=version_re(err)
++	if not match:
++		conf.fatal('cannot determine ifort version.')
++	k=match.groupdict()
++	conf.env['FC_VERSION']=(k['major'],k['minor'])
++def configure(conf):
++	conf.find_ifort()
++	conf.find_program('xiar',var='AR')
++	conf.env.ARFLAGS='rcs'
++	conf.fc_flags()
++	conf.ifort_modifier_platform()
++
++conf(find_ifort)
++conf(ifort_modifier_cygwin)
++conf(ifort_modifier_win32)
++conf(ifort_modifier_darwin)
++conf(ifort_modifier_platform)
++conf(get_ifort_version)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/__init__.py
+@@ -0,0 +1,4 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
+--- /dev/null
++++ ardour3/waflib/Tools/intltool.py
+@@ -0,0 +1,78 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,re
++from waflib import Configure,TaskGen,Task,Utils,Runner,Options,Build,Logs
++import waflib.Tools.ccroot
++from waflib.TaskGen import feature,before_method
++from waflib.Logs import error
++def apply_intltool_in_f(self):
++	try:self.meths.remove('process_source')
++	except ValueError:pass
++	if not self.env.LOCALEDIR:
++		self.env.LOCALEDIR=self.env.PREFIX+'/share/locale'
++	for i in self.to_list(self.source):
++		node=self.path.find_resource(i)
++		podir=getattr(self,'podir','po')
++		podirnode=self.path.find_dir(podir)
++		if not podirnode:
++			error("could not find the podir %r"%podir)
++			continue
++		cache=getattr(self,'intlcache','.intlcache')
++		self.env['INTLCACHE']=os.path.join(self.path.bldpath(),podir,cache)
++		self.env['INTLPODIR']=podirnode.bldpath()
++		self.env['INTLFLAGS']=getattr(self,'flags',['-q','-u','-c'])
++		task=self.create_task('intltool',node,node.change_ext(''))
++		inst=getattr(self,'install_path','${LOCALEDIR}')
++		if inst:
++			self.bld.install_files(inst,task.outputs)
++def apply_intltool_po(self):
++	try:self.meths.remove('process_source')
++	except ValueError:pass
++	if not self.env.LOCALEDIR:
++		self.env.LOCALEDIR=self.env.PREFIX+'/share/locale'
++	appname=getattr(self,'appname','set_your_app_name')
++	podir=getattr(self,'podir','')
++	inst=getattr(self,'install_path','${LOCALEDIR}')
++	linguas=self.path.find_node(os.path.join(podir,'LINGUAS'))
++	if linguas:
++		file=open(linguas.abspath())
++		langs=[]
++		for line in file.readlines():
++			if not line.startswith('#'):
++				langs+=line.split()
++		file.close()
++		re_linguas=re.compile('[-a-zA-Z_@.]+')
++		for lang in langs:
++			if re_linguas.match(lang):
++				node=self.path.find_resource(os.path.join(podir,re_linguas.match(lang).group()+'.po'))
++				task=self.create_task('po',node,node.change_ext('.mo'))
++				if inst:
++					filename=task.outputs[0].name
++					(langname,ext)=os.path.splitext(filename)
++					inst_file=inst+os.sep+langname+os.sep+'LC_MESSAGES'+os.sep+appname+'.mo'
++					self.bld.install_as(inst_file,task.outputs[0],chmod=getattr(self,'chmod',Utils.O644),env=task.env)
++	else:
++		Logs.pprint('RED',"Error no LINGUAS file found in po directory")
++class po(Task.Task):
++	run_str='${MSGFMT} -o ${TGT} ${SRC}'
++	color='BLUE'
++class intltool(Task.Task):
++	run_str='${INTLTOOL} ${INTLFLAGS} ${INTLCACHE} ${INTLPODIR} ${SRC} ${TGT}'
++	color='BLUE'
++def configure(conf):
++	conf.find_program('msgfmt',var='MSGFMT')
++	conf.find_perl_program('intltool-merge',var='INTLTOOL')
++	prefix=conf.env.PREFIX
++	datadir=conf.env.DATADIR
++	if not datadir:
++		datadir=os.path.join(prefix,'share')
++	conf.define('LOCALEDIR',os.path.join(datadir,'locale').replace('\\','\\\\'))
++	conf.define('DATADIR',datadir.replace('\\','\\\\'))
++	if conf.env.CC or conf.env.CXX:
++		conf.check(header_name='locale.h')
++
++before_method('process_source')(apply_intltool_in_f)
++feature('intltool_in')(apply_intltool_in_f)
++feature('intltool_po')(apply_intltool_po)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/irixcc.py
+@@ -0,0 +1,49 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Utils
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_irixcc(conf):
++	v=conf.env
++	cc=None
++	if v['CC']:cc=v['CC']
++	elif'CC'in conf.environ:cc=conf.environ['CC']
++	if not cc:cc=conf.find_program('cc',var='CC')
++	if not cc:conf.fatal('irixcc was not found')
++	cc=conf.cmd_to_list(cc)
++	try:
++		conf.cmd_and_log(cc+['-version'])
++	except:
++		conf.fatal('%r -version could not be executed'%cc)
++	v['CC']=cc
++	v['CC_NAME']='irix'
++def irixcc_common_flags(conf):
++	v=conf.env
++	v['CC_SRC_F']=''
++	v['CC_TGT_F']=['-c','-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	if not v['LINK_CC']:v['LINK_CC']=v['CC']
++	v['CCLNK_SRC_F']=''
++	v['CCLNK_TGT_F']=['-o']
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['cprogram_PATTERN']='%s'
++	v['cshlib_PATTERN']='lib%s.so'
++	v['cstlib_PATTERN']='lib%s.a'
++def configure(conf):
++	conf.find_irixcc()
++	conf.find_cpp()
++	conf.find_ar()
++	conf.irixcc_common_flags()
++	conf.cc_load_tools()
++	conf.cc_add_flags()
++	conf.link_add_flags()
++
++conf(find_irixcc)
++conf(irixcc_common_flags)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/javaw.py
+@@ -0,0 +1,275 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++import os,re,tempfile,shutil
++from waflib.Configure import conf
++from waflib import TaskGen,Task,Utils,Options,Build,Errors,Node,Logs
++from waflib.TaskGen import feature,before_method,after_method
++from waflib.Tools import ccroot
++ccroot.USELIB_VARS['javac']=set(['CLASSPATH','JAVACFLAGS'])
++SOURCE_RE='**/*.java'
++JAR_RE='**/*'
++class_check_source='''
++public class Test {
++	public static void main(String[] argv) {
++		Class lib;
++		if (argv.length < 1) {
++			System.err.println("Missing argument");
++			System.exit(77);
++		}
++		try {
++			lib = Class.forName(argv[0]);
++		} catch (ClassNotFoundException e) {
++			System.err.println("ClassNotFoundException");
++			System.exit(1);
++		}
++		lib = null;
++		System.exit(0);
++	}
++}
++'''
++def apply_java(self):
++	Utils.def_attrs(self,jarname='',classpath='',sourcepath='.',srcdir='.',jar_mf_attributes={},jar_mf_classpath=[])
++	nodes_lst=[]
++	outdir=getattr(self,'outdir',None)
++	if outdir:
++		if not isinstance(outdir,Node.Node):
++			outdir=self.path.get_bld().make_node(self.outdir)
++	else:
++		outdir=self.path.get_bld()
++	outdir.mkdir()
++	self.outdir=outdir
++	self.env['OUTDIR']=outdir.abspath()
++	self.javac_task=tsk=self.create_task('javac')
++	tmp=[]
++	srcdir=getattr(self,'srcdir','')
++	if isinstance(srcdir,Node.Node):
++		srcdir=[srcdir]
++	for x in Utils.to_list(srcdir):
++		if isinstance(x,Node.Node):
++			y=x
++		else:
++			y=self.path.find_dir(x)
++			if not y:
++				self.bld.fatal('Could not find the folder %s from %s'%(x,self.path))
++		tmp.append(y)
++	tsk.srcdir=tmp
++	if getattr(self,'compat',None):
++		tsk.env.append_value('JAVACFLAGS',['-source',self.compat])
++	if hasattr(self,'sourcepath'):
++		fold=[isinstance(x,Node.Node)and x or self.path.find_dir(x)for x in self.to_list(self.sourcepath)]
++		names=os.pathsep.join([x.srcpath()for x in fold])
++	else:
++		names=[x.srcpath()for x in tsk.srcdir]
++	if names:
++		tsk.env.append_value('JAVACFLAGS',['-sourcepath',names])
++def use_javac_files(self):
++	lst=[]
++	self.uselib=self.to_list(getattr(self,'uselib',[]))
++	names=self.to_list(getattr(self,'use',[]))
++	get=self.bld.get_tgen_by_name
++	for x in names:
++		try:
++			y=get(x)
++		except:
++			self.uselib.append(x)
++		else:
++			y.post()
++			lst.append(y.jar_task.outputs[0].abspath())
++			self.javac_task.set_run_after(y.jar_task)
++	if lst:
++		self.env.append_value('CLASSPATH',lst)
++def set_classpath(self):
++	self.env.append_value('CLASSPATH',getattr(self,'classpath',[]))
++	for x in self.tasks:
++		x.env.CLASSPATH=os.pathsep.join(self.env.CLASSPATH)+os.pathsep
++def jar_files(self):
++	destfile=getattr(self,'destfile','test.jar')
++	jaropts=getattr(self,'jaropts',[])
++	manifest=getattr(self,'manifest',None)
++	basedir=getattr(self,'basedir',None)
++	if basedir:
++		if not isinstance(self.basedir,Node.Node):
++			basedir=self.path.get_bld().make_node(basedir)
++	else:
++		basedir=self.path.get_bld()
++	if not basedir:
++		self.bld.fatal('Could not find the basedir %r for %r'%(self.basedir,self))
++	self.jar_task=tsk=self.create_task('jar_create')
++	if manifest:
++		jarcreate=getattr(self,'jarcreate','cfm')
++		node=self.path.find_node(manifest)
++		tsk.dep_nodes.append(node)
++		jaropts.insert(0,node.abspath())
++	else:
++		jarcreate=getattr(self,'jarcreate','cf')
++	if not isinstance(destfile,Node.Node):
++		destfile=self.path.find_or_declare(destfile)
++	if not destfile:
++		self.bld.fatal('invalid destfile %r for %r'%(destfile,self))
++	tsk.set_outputs(destfile)
++	tsk.basedir=basedir
++	jaropts.append('-C')
++	jaropts.append(basedir.bldpath())
++	jaropts.append('.')
++	tsk.env['JAROPTS']=jaropts
++	tsk.env['JARCREATE']=jarcreate
++	if getattr(self,'javac_task',None):
++		tsk.set_run_after(self.javac_task)
++def use_jar_files(self):
++	lst=[]
++	self.uselib=self.to_list(getattr(self,'uselib',[]))
++	names=self.to_list(getattr(self,'use',[]))
++	get=self.bld.get_tgen_by_name
++	for x in names:
++		try:
++			y=get(x)
++		except:
++			self.uselib.append(x)
++		else:
++			y.post()
++			self.jar_task.run_after.update(y.tasks)
++class jar_create(Task.Task):
++	color='GREEN'
++	run_str='${JAR} ${JARCREATE} ${TGT} ${JAROPTS}'
++	def runnable_status(self):
++		for t in self.run_after:
++			if not t.hasrun:
++				return Task.ASK_LATER
++		if not self.inputs:
++			global JAR_RE
++			try:
++				self.inputs=[x for x in self.basedir.ant_glob(JAR_RE,remove=False)if id(x)!=id(self.outputs[0])]
++			except:
++				raise Errors.WafError('Could not find the basedir %r for %r'%(self.basedir,self))
++		return super(jar_create,self).runnable_status()
++class javac(Task.Task):
++	color='BLUE'
++	nocache=True
++	vars=['CLASSPATH','JAVACFLAGS','JAVAC','OUTDIR']
++	def runnable_status(self):
++		for t in self.run_after:
++			if not t.hasrun:
++				return Task.ASK_LATER
++		if not self.inputs:
++			global SOURCE_RE
++			self.inputs=[]
++			for x in self.srcdir:
++				self.inputs.extend(x.ant_glob(SOURCE_RE,remove=False))
++		return super(javac,self).runnable_status()
++	def run(self):
++		env=self.env
++		gen=self.generator
++		bld=gen.bld
++		wd=bld.bldnode.abspath()
++		def to_list(xx):
++			if isinstance(xx,str):return[xx]
++			return xx
++		cmd=[]
++		cmd.extend(to_list(env['JAVAC']))
++		cmd.extend(['-classpath'])
++		cmd.extend(to_list(env['CLASSPATH']))
++		cmd.extend(['-d'])
++		cmd.extend(to_list(env['OUTDIR']))
++		cmd.extend(to_list(env['JAVACFLAGS']))
++		files=[a.path_from(bld.bldnode)for a in self.inputs]
++		tmp=None
++		try:
++			if len(str(files))+len(str(cmd))>8192:
++				(fd,tmp)=tempfile.mkstemp(dir=bld.bldnode.abspath())
++				try:
++					os.write(fd,'\n'.join(files))
++				finally:
++					if tmp:
++						os.close(fd)
++				if Logs.verbose:
++					Logs.debug('runner: %r'%(cmd+files))
++				cmd.append('@'+tmp)
++			else:
++				cmd+=files
++			ret=self.exec_command(cmd,cwd=wd,env=env.env or None)
++		finally:
++			if tmp:
++				os.unlink(tmp)
++		return ret
++	def post_run(self):
++		for n in self.generator.outdir.ant_glob('**/*.class'):
++			n.sig=Utils.h_file(n.abspath())
++		self.generator.bld.task_sigs[self.uid()]=self.cache_sig
++def configure(self):
++	java_path=self.environ['PATH'].split(os.pathsep)
++	v=self.env
++	if'JAVA_HOME'in self.environ:
++		java_path=[os.path.join(self.environ['JAVA_HOME'],'bin')]+java_path
++		self.env['JAVA_HOME']=[self.environ['JAVA_HOME']]
++	for x in'javac java jar'.split():
++		self.find_program(x,var=x.upper(),path_list=java_path)
++		self.env[x.upper()]=self.cmd_to_list(self.env[x.upper()])
++	if'CLASSPATH'in self.environ:
++		v['CLASSPATH']=self.environ['CLASSPATH']
++	if not v['JAR']:self.fatal('jar is required for making java packages')
++	if not v['JAVAC']:self.fatal('javac is required for compiling java classes')
++	v['JARCREATE']='cf'
++	v['JAVACFLAGS']=[]
++def check_java_class(self,classname,with_classpath=None):
++	javatestdir='.waf-javatest'
++	classpath=javatestdir
++	if self.env['CLASSPATH']:
++		classpath+=os.pathsep+self.env['CLASSPATH']
++	if isinstance(with_classpath,str):
++		classpath+=os.pathsep+with_classpath
++	shutil.rmtree(javatestdir,True)
++	os.mkdir(javatestdir)
++	java_file=open(os.path.join(javatestdir,'Test.java'),'w')
++	java_file.write(class_check_source)
++	java_file.close()
++	self.exec_command(self.env['JAVAC']+[os.path.join(javatestdir,'Test.java')],shell=False)
++	cmd=self.env['JAVA']+['-cp',classpath,'Test',classname]
++	self.to_log("%s\n"%str(cmd))
++	found=self.exec_command(cmd,shell=False)
++	self.msg('Checking for java class %s'%classname,not found)
++	shutil.rmtree(javatestdir,True)
++	return found
++def check_jni_headers(conf):
++	if not conf.env.CC_NAME and not conf.env.CXX_NAME:
++		conf.fatal('load a compiler first (gcc, g++, ..)')
++	if not conf.env.JAVA_HOME:
++		conf.fatal('set JAVA_HOME in the system environment')
++	javaHome=conf.env['JAVA_HOME'][0]
++	dir=conf.root.find_dir(conf.env.JAVA_HOME[0]+'/include')
++	if dir is None:
++		conf.fatal('JAVA_HOME does not seem to be set properly')
++	f=dir.ant_glob('**/(jni|jni_md).h')
++	incDirs=[x.parent.abspath()for x in f]
++	dir=conf.root.find_dir(conf.env.JAVA_HOME[0])
++	f=dir.ant_glob('**/*jvm.(so|dll|dylib)')
++	libDirs=[x.parent.abspath()for x in f]or[javaHome]
++	f=dir.ant_glob('**/*jvm.(lib)')
++	if f:
++		libDirs=[[x,y.parent.abspath()]for x in libDirs for y in f]
++	for d in libDirs:
++		try:
++			conf.check(header_name='jni.h',define_name='HAVE_JNI_H',lib='jvm',libpath=d,includes=incDirs,uselib_store='JAVA',uselib='JAVA')
++		except:
++			pass
++		else:
++			break
++	else:
++		conf.fatal('could not find lib jvm in %r (see config.log)'%libDirs)
++
++feature('javac')(apply_java)
++before_method('process_source')(apply_java)
++feature('javac')(use_javac_files)
++after_method('apply_java')(use_javac_files)
++feature('javac')(set_classpath)
++after_method('apply_java','propagate_uselib_vars','use_javac_files')(set_classpath)
++feature('jar')(jar_files)
++after_method('apply_java','use_javac_files')(jar_files)
++before_method('process_source')(jar_files)
++feature('jar')(use_jar_files)
++after_method('jar_files')(use_jar_files)
++conf(check_java_class)
++conf(check_jni_headers)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/kde4.py
+@@ -0,0 +1,49 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,re
++from waflib import Options,TaskGen,Task,Utils
++from waflib.TaskGen import feature,after_method
++def apply_msgfmt(self):
++	for lang in self.to_list(self.langs):
++		node=self.path.find_resource(lang+'.po')
++		task=self.create_task('msgfmt',node,node.change_ext('.mo'))
++		langname=lang.split('/')
++		langname=langname[-1]
++		inst=getattr(self,'install_path','${KDE4_LOCALE_INSTALL_DIR}')
++		self.bld.install_as(inst+os.sep+langname+os.sep+'LC_MESSAGES'+os.sep+getattr(self,'appname','set_your_appname')+'.mo',task.outputs[0],chmod=getattr(self,'chmod',Utils.O644))
++class msgfmt(Task.Task):
++	color='BLUE'
++	run_str='${MSGFMT} ${SRC} -o ${TGT}'
++def configure(self):
++	kdeconfig=self.find_program('kde4-config')
++	prefix=self.cmd_and_log('%s --prefix'%kdeconfig).strip()
++	fname='%s/share/apps/cmake/modules/KDELibsDependencies.cmake'%prefix
++	try:os.stat(fname)
++	except OSError:
++		fname='%s/share/kde4/apps/cmake/modules/KDELibsDependencies.cmake'%prefix
++		try:os.stat(fname)
++		except OSError:self.fatal('could not open %s'%fname)
++	try:
++		txt=Utils.readf(fname)
++	except(OSError,IOError):
++		self.fatal('could not read %s'%fname)
++	txt=txt.replace('\\\n','\n')
++	fu=re.compile('#(.*)\n')
++	txt=fu.sub('',txt)
++	setregexp=re.compile('([sS][eE][tT]\s*\()\s*([^\s]+)\s+\"([^"]+)\"\)')
++	found=setregexp.findall(txt)
++	for(_,key,val)in found:
++		self.env[key]=val
++	self.env['LIB_KDECORE']=['kdecore']
++	self.env['LIB_KDEUI']=['kdeui']
++	self.env['LIB_KIO']=['kio']
++	self.env['LIB_KHTML']=['khtml']
++	self.env['LIB_KPARTS']=['kparts']
++	self.env['LIBPATH_KDECORE']=[self.env['KDE4_LIB_INSTALL_DIR']]
++	self.env['INCLUDES_KDECORE']=[self.env['KDE4_INCLUDE_INSTALL_DIR']]
++	self.env.append_value('INCLUDES_KDECORE',[self.env['KDE4_INCLUDE_INSTALL_DIR']+os.sep+'KDE'])
++	self.find_program('msgfmt',var='MSGFMT')
++
++feature('msgfmt')(apply_msgfmt)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/lua.py
+@@ -0,0 +1,19 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib.TaskGen import extension
++from waflib import Task,Utils
++def add_lua(self,node):
++	tsk=self.create_task('luac',node,node.change_ext('.luac'))
++	inst_to=getattr(self,'install_path',self.env.LUADIR and'${LUADIR}'or None)
++	if inst_to:
++		self.bld.install_files(inst_to,tsk.outputs)
++	return tsk
++class luac(Task.Task):
++	run_str='${LUAC} -s -o ${TGT} ${SRC}'
++	color='PINK'
++def configure(conf):
++	conf.find_program('luac',var='LUAC')
++
++extension('.lua')(add_lua)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/msvc.py
+@@ -0,0 +1,654 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,re,tempfile
++try:
++	import _winreg
++except:
++	try:
++		import winreg as _winreg
++	except:
++		_winreg=None
++from waflib import Utils,TaskGen,Runner,Configure,Task,Options
++from waflib.Logs import debug,info,warn,error
++from waflib.TaskGen import after_method,before_method,feature
++from waflib.Configure import conf
++from waflib.Tools import ccroot,c,cxx,ar,winres
++g_msvc_systemlibs='''
++aclui activeds ad1 adptif adsiid advapi32 asycfilt authz bhsupp bits bufferoverflowu cabinet
++cap certadm certidl ciuuid clusapi comctl32 comdlg32 comsupp comsuppd comsuppw comsuppwd comsvcs
++credui crypt32 cryptnet cryptui d3d8thk daouuid dbgeng dbghelp dciman32 ddao35 ddao35d
++ddao35u ddao35ud delayimp dhcpcsvc dhcpsapi dlcapi dnsapi dsprop dsuiext dtchelp
++faultrep fcachdll fci fdi framedyd framedyn gdi32 gdiplus glauxglu32 gpedit gpmuuid
++gtrts32w gtrtst32hlink htmlhelp httpapi icm32 icmui imagehlp imm32 iphlpapi iprop
++kernel32 ksguid ksproxy ksuser libcmt libcmtd libcpmt libcpmtd loadperf lz32 mapi
++mapi32 mgmtapi minidump mmc mobsync mpr mprapi mqoa mqrt msacm32 mscms mscoree
++msdasc msimg32 msrating mstask msvcmrt msvcurt msvcurtd mswsock msxml2 mtx mtxdm
++netapi32 nmapinmsupp npptools ntdsapi ntdsbcli ntmsapi ntquery odbc32 odbcbcp
++odbccp32 oldnames ole32 oleacc oleaut32 oledb oledlgolepro32 opends60 opengl32
++osptk parser pdh penter pgobootrun pgort powrprof psapi ptrustm ptrustmd ptrustu
++ptrustud qosname rasapi32 rasdlg rassapi resutils riched20 rpcndr rpcns4 rpcrt4 rtm
++rtutils runtmchk scarddlg scrnsave scrnsavw secur32 sensapi setupapi sfc shell32
++shfolder shlwapi sisbkup snmpapi sporder srclient sti strsafe svcguid tapi32 thunk32
++traffic unicows url urlmon user32 userenv usp10 uuid uxtheme vcomp vcompd vdmdbg
++version vfw32 wbemuuid  webpost wiaguid wininet winmm winscard winspool winstrm
++wintrust wldap32 wmiutils wow32 ws2_32 wsnmp32 wsock32 wst wtsapi32 xaswitch xolehlp
++'''.split()
++all_msvc_platforms=[('x64','amd64'),('x86','x86'),('ia64','ia64'),('x86_amd64','amd64'),('x86_ia64','ia64')]
++all_wince_platforms=[('armv4','arm'),('armv4i','arm'),('mipsii','mips'),('mipsii_fp','mips'),('mipsiv','mips'),('mipsiv_fp','mips'),('sh4','sh'),('x86','cex86')]
++all_icl_platforms=[('intel64','amd64'),('em64t','amd64'),('ia32','x86'),('Itanium','ia64')]
++def options(opt):
++	opt.add_option('--msvc_version',type='string',help='msvc version, eg: "msvc 10.0,msvc 9.0"',default='')
++	opt.add_option('--msvc_targets',type='string',help='msvc targets, eg: "x64,arm"',default='')
++def setup_msvc(conf,versions):
++	platforms=getattr(Options.options,'msvc_targets','').split(',')
++	if platforms==['']:
++		platforms=Utils.to_list(conf.env['MSVC_TARGETS'])or[i for i,j in all_msvc_platforms+all_icl_platforms+all_wince_platforms]
++	desired_versions=getattr(Options.options,'msvc_version','').split(',')
++	if desired_versions==['']:
++		desired_versions=conf.env['MSVC_VERSIONS']or[v for v,_ in versions][::-1]
++	versiondict=dict(versions)
++	for version in desired_versions:
++		try:
++			targets=dict(versiondict[version])
++			for target in platforms:
++				try:
++					arch,(p1,p2,p3)=targets[target]
++					compiler,revision=version.rsplit(' ',1)
++					return compiler,revision,p1,p2,p3
++				except KeyError:continue
++		except KeyError:continue
++	conf.fatal('msvc: Impossible to find a valid architecture for building (in setup_msvc)')
++def get_msvc_version(conf,compiler,version,target,vcvars):
++	debug('msvc: get_msvc_version: %r %r %r',compiler,version,target)
++	batfile=conf.bldnode.make_node('waf-print-msvc.bat')
++	batfile.write("""@echo off
++set INCLUDE=
++set LIB=
++call "%s" %s
++echo PATH=%%PATH%%
++echo INCLUDE=%%INCLUDE%%
++echo LIB=%%LIB%%
++"""%(vcvars,target))
++	sout=conf.cmd_and_log(['cmd','/E:on','/V:on','/C',batfile.abspath()])
++	lines=sout.splitlines()
++	if not lines[0]:lines=lines[1:]
++	for x in('Setting environment','Setting SDK environment','Intel(R) C++ Compiler','Intel Parallel Studio'):
++		if lines[0].find(x)!=-1:
++			break
++	else:
++		debug('msvc: get_msvc_version: %r %r %r -> not found',compiler,version,target)
++		conf.fatal('msvc: Impossible to find a valid architecture for building (in get_msvc_version)')
++	for line in lines[1:]:
++		if line.startswith('PATH='):
++			path=line[5:]
++			MSVC_PATH=path.split(';')
++		elif line.startswith('INCLUDE='):
++			MSVC_INCDIR=[i for i in line[8:].split(';')if i]
++		elif line.startswith('LIB='):
++			MSVC_LIBDIR=[i for i in line[4:].split(';')if i]
++	env={}
++	env.update(os.environ)
++	env.update(PATH=path)
++	compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler)
++	cxx=conf.find_program(compiler_name,path_list=MSVC_PATH)
++	cxx=conf.cmd_to_list(cxx)
++	if'CL'in env:
++		del(env['CL'])
++	try:
++		try:
++			conf.cmd_and_log(cxx+['/help'],env=env)
++		except Exception ,e:
++			debug('msvc: get_msvc_version: %r %r %r -> failure'%(compiler,version,target))
++			debug(str(e))
++			conf.fatal('msvc: cannot run the compiler (in get_msvc_version)')
++		else:
++			debug('msvc: get_msvc_version: %r %r %r -> OK',compiler,version,target)
++	finally:
++		conf.env[compiler_name]=''
++	return(MSVC_PATH,MSVC_INCDIR,MSVC_LIBDIR)
++def gather_wsdk_versions(conf,versions):
++	version_pattern=re.compile('^v..?.?\...?.?')
++	try:
++		all_versions=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Microsoft\\Microsoft SDKs\\Windows')
++	except WindowsError:
++		try:
++			all_versions=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Microsoft\\Microsoft SDKs\\Windows')
++		except WindowsError:
++			return
++	index=0
++	while 1:
++		try:
++			version=_winreg.EnumKey(all_versions,index)
++	if not isinstance(filename_list,list):
++			break
++		index=index+1
++		if not version_pattern.match(version):
++			continue
++		try:
++			msvc_version=_winreg.OpenKey(all_versions,version)
++			path,type=_winreg.QueryValueEx(msvc_version,'InstallationFolder')
++		except WindowsError:
++			continue
++		if os.path.isfile(os.path.join(path,'bin','SetEnv.cmd')):
++			targets=[]
++			for target,arch in all_msvc_platforms:
++				try:
++					targets.append((target,(arch,conf.get_msvc_version('wsdk',version,'/'+target,os.path.join(path,'bin','SetEnv.cmd')))))
++				except conf.errors.ConfigurationError:
++					pass
++			versions.append(('wsdk '+version[1:],targets))
++def gather_wince_supported_platforms():
++	supported_wince_platforms=[]
++	try:
++		ce_sdk=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Microsoft\\Windows CE Tools\\SDKs')
++	except WindowsError:
++		try:
++			ce_sdk=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Microsoft\\Windows CE Tools\\SDKs')
++		except WindowsError:
++			ce_sdk=''
++	if not ce_sdk:
++		return supported_wince_platforms
++	ce_index=0
++	while 1:
++		try:
++			sdk_device=_winreg.EnumKey(ce_sdk,ce_index)
++		except WindowsError:
++			break
++		ce_index=ce_index+1
++		sdk=_winreg.OpenKey(ce_sdk,sdk_device)
++		try:
++			path,type=_winreg.QueryValueEx(sdk,'SDKRootDir')
++		except WindowsError:
++			try:
++				path,type=_winreg.QueryValueEx(sdk,'SDKInformation')
++				path,xml=os.path.split(path)
++			except WindowsError:
++				continue
++		path=str(path)
++		path,device=os.path.split(path)
++		if not device:
++			path,device=os.path.split(path)
++		for arch,compiler in all_wince_platforms:
++			platforms=[]
++			if os.path.isdir(os.path.join(path,device,'Lib',arch)):
++				platforms.append((arch,compiler,os.path.join(path,device,'Include',arch),os.path.join(path,device,'Lib',arch)))
++			if platforms:
++				supported_wince_platforms.append((device,platforms))
++	return supported_wince_platforms
++def gather_msvc_detected_versions():
++	version_pattern=re.compile('^(\d\d?\.\d\d?)(Exp)?$')
++	detected_versions=[]
++	for vcver,vcvar in[('VCExpress','Exp'),('VisualStudio','')]:
++		try:
++			prefix='SOFTWARE\\Wow6432node\\Microsoft\\'+vcver
++			all_versions=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,prefix)
++		except WindowsError:
++			try:
++				prefix='SOFTWARE\\Microsoft\\'+vcver
++				all_versions=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,prefix)
++			except WindowsError:
++				continue
++		index=0
++		while 1:
++			try:
++				version=_winreg.EnumKey(all_versions,index)
++			except WindowsError:
++				break
++			index=index+1
++			match=version_pattern.match(version)
++			if not match:
++				continue
++			else:
++				versionnumber=float(match.group(1))
++			detected_versions.append((versionnumber,version+vcvar,prefix+"\\"+version))
++	def fun(tup):
++		return tup[0]
++	try:
++		detected_versions.sort(key=fun)
++	except:
++		detected_versions.sort(lambda x,y:cmp(x[0],y[0]))
++	return detected_versions
++def gather_msvc_targets(conf,versions,version,vc_path):
++	targets=[]
++	if os.path.isfile(os.path.join(vc_path,'vcvarsall.bat')):
++		for target,realtarget in all_msvc_platforms[::-1]:
++			try:
++				targets.append((target,(realtarget,conf.get_msvc_version('msvc',version,target,os.path.join(vc_path,'vcvarsall.bat')))))
++			except conf.errors.ConfigurationError:
++				pass
++	elif os.path.isfile(os.path.join(vc_path,'Common7','Tools','vsvars32.bat')):
++		try:
++			targets.append(('x86',('x86',conf.get_msvc_version('msvc',version,'x86',os.path.join(vc_path,'Common7','Tools','vsvars32.bat')))))
++		except conf.errors.ConfigurationError:
++			pass
++	elif os.path.isfile(os.path.join(vc_path,'Bin','vcvars32.bat')):
++		try:
++			targets.append(('x86',('x86',conf.get_msvc_version('msvc',version,'',os.path.join(vc_path,'Bin','vcvars32.bat')))))
++		except conf.errors.ConfigurationError:
++			pass
++	versions.append(('msvc '+version,targets))
++def gather_wince_targets(conf,versions,version,vc_path,vsvars,supported_platforms):
++	for device,platforms in supported_platforms:
++		cetargets=[]
++		for platform,compiler,include,lib in platforms:
++			winCEpath=os.path.join(vc_path,'ce')
++			if not os.path.isdir(winCEpath):
++				continue
++			try:
++				common_bindirs,_1,_2=conf.get_msvc_version('msvc',version,'x86',vsvars)
++			except conf.errors.ConfigurationError:
++				continue
++			if os.path.isdir(os.path.join(winCEpath,'lib',platform)):
++				bindirs=[os.path.join(winCEpath,'bin',compiler),os.path.join(winCEpath,'bin','x86_'+compiler)]+common_bindirs
++				incdirs=[os.path.join(winCEpath,'include'),os.path.join(winCEpath,'atlmfc','include'),include]
++				libdirs=[os.path.join(winCEpath,'lib',platform),os.path.join(winCEpath,'atlmfc','lib',platform),lib]
++				cetargets.append((platform,(platform,(bindirs,incdirs,libdirs))))
++		if cetargets:
++			versions.append((device+' '+version,cetargets))
++def gather_msvc_versions(conf,versions):
++	vc_paths=[]
++	for(v,version,reg)in gather_msvc_detected_versions():
++		try:
++			try:
++				msvc_version=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,reg+"\\Setup\\VC")
++			except WindowsError:
++				msvc_version=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,reg+"\\Setup\\Microsoft Visual C++")
++			path,type=_winreg.QueryValueEx(msvc_version,'ProductDir')
++			vc_paths.append((version,os.path.abspath(str(path))))
++		except WindowsError:
++			continue
++	wince_supported_platforms=gather_wince_supported_platforms()
++	for version,vc_path in vc_paths:
++		vs_path=os.path.dirname(vc_path)
++		vsvars=os.path.join(vs_path,'Common7','Tools','vsvars32.bat')
++		if wince_supported_platforms and os.path.isfile(vsvars):
++			conf.gather_wince_targets(versions,version,vc_path,vsvars,wince_supported_platforms)
++	for version,vc_path in vc_paths:
++		vs_path=os.path.dirname(vc_path)
++		conf.gather_msvc_targets(versions,version,vc_path)
++def gather_icl_versions(conf,versions):
++	version_pattern=re.compile('^...?.?\....?.?')
++	try:
++		all_versions=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Wow6432node\\Intel\\Compilers\\C++')
++	except WindowsError:
++		try:
++			all_versions=_winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE,'SOFTWARE\\Intel\\Compilers\\C++')
++		except WindowsError:
++			return
++	index=0
++	while 1:
++		try:
++			version=_winreg.EnumKey(all_versions,index)
++		except WindowsError:
++			break
++		index=index+1
++		if not version_pattern.match(version):
++			continue
++		targets=[]
++		for target,arch in all_icl_platforms:
++			try:
++				if target=='intel64':targetDir='EM64T_NATIVE'
++				else:targetDir=target
++				_winreg.OpenKey(all_versions,version+'\\'+targetDir)
++				icl_version=_winreg.OpenKey(all_versions,version)
++				path,type=_winreg.QueryValueEx(icl_version,'ProductDir')
++				if os.path.isfile(os.path.join(path,'bin','iclvars.bat')):
++					try:
++						targets.append((target,(arch,conf.get_msvc_version('intel',version,target,os.path.join(path,'bin','iclvars.bat')))))
++					except conf.errors.ConfigurationError:
++						pass
++			except WindowsError:
++				pass
++		for target,arch in all_icl_platforms:
++			try:
++				icl_version=_winreg.OpenKey(all_versions,version+'\\'+target)
++				path,type=_winreg.QueryValueEx(icl_version,'ProductDir')
++				if os.path.isfile(os.path.join(path,'bin','iclvars.bat')):
++					try:
++						targets.append((target,(arch,conf.get_msvc_version('intel',version,target,os.path.join(path,'bin','iclvars.bat')))))
++					except conf.errors.ConfigurationError:
++						pass
++			except WindowsError:
++				continue
++		major=version[0:2]
++		versions.append(('intel '+major,targets))
++def get_msvc_versions(conf):
++	if not conf.env['MSVC_INSTALLED_VERSIONS']:
++		lst=[]
++		conf.gather_icl_versions(lst)
++		conf.gather_wsdk_versions(lst)
++		conf.gather_msvc_versions(lst)
++		conf.env['MSVC_INSTALLED_VERSIONS']=lst
++	return conf.env['MSVC_INSTALLED_VERSIONS']
++def print_all_msvc_detected(conf):
++	for version,targets in conf.env['MSVC_INSTALLED_VERSIONS']:
++		info(version)
++		for target,l in targets:
++			info("\t"+target)
++def detect_msvc(conf):
++	versions=get_msvc_versions(conf)
++	return setup_msvc(conf,versions)
++def find_lt_names_msvc(self,libname,is_static=False):
++	lt_names=['lib%s.la'%libname,'%s.la'%libname,]
++	for path in self.env['LIBPATH']:
++		for la in lt_names:
++			laf=os.path.join(path,la)
++			dll=None
++			if os.path.exists(laf):
++				ltdict=Utils.read_la_file(laf)
++				lt_libdir=None
++				if ltdict.get('libdir',''):
++					lt_libdir=ltdict['libdir']
++				if not is_static and ltdict.get('library_names',''):
++					dllnames=ltdict['library_names'].split()
++					dll=dllnames[0].lower()
++					dll=re.sub('\.dll$','',dll)
++					return(lt_libdir,dll,False)
++				elif ltdict.get('old_library',''):
++					olib=ltdict['old_library']
++					if os.path.exists(os.path.join(path,olib)):
++						return(path,olib,True)
++					elif lt_libdir!=''and os.path.exists(os.path.join(lt_libdir,olib)):
++						return(lt_libdir,olib,True)
++					else:
++						return(None,olib,True)
++				else:
++					raise self.errors.WafError('invalid libtool object file: %s'%laf)
++	return(None,None,None)
++def libname_msvc(self,libname,is_static=False):
++	lib=libname.lower()
++	lib=re.sub('\.lib$','',lib)
++	if lib in g_msvc_systemlibs:
++		return lib
++	lib=re.sub('^lib','',lib)
++	if lib=='m':
++		return None
++	(lt_path,lt_libname,lt_static)=self.find_lt_names_msvc(lib,is_static)
++	if lt_path!=None and lt_libname!=None:
++		if lt_static==True:
++			return os.path.join(lt_path,lt_libname)
++	if lt_path!=None:
++		_libpaths=[lt_path]+self.env['LIBPATH']
++	else:
++		_libpaths=self.env['LIBPATH']
++	static_libs=['lib%ss.lib'%lib,'lib%s.lib'%lib,'%ss.lib'%lib,'%s.lib'%lib,]
++	dynamic_libs=['lib%s.dll.lib'%lib,'lib%s.dll.a'%lib,'%s.dll.lib'%lib,'%s.dll.a'%lib,'lib%s_d.lib'%lib,'%s_d.lib'%lib,'%s.lib'%lib,]
++	libnames=static_libs
++	if not is_static:
++		libnames=dynamic_libs+static_libs
++	for path in _libpaths:
++		for libn in libnames:
++			if os.path.exists(os.path.join(path,libn)):
++				debug('msvc: lib found: %s'%os.path.join(path,libn))
++				return re.sub('\.lib$','',libn)
++	self.fatal("The library %r could not be found"%libname)
++	return re.sub('\.lib$','',libname)
++def check_lib_msvc(self,libname,is_static=False,uselib_store=None):
++	libn=self.libname_msvc(libname,is_static)
++	if not uselib_store:
++		uselib_store=libname.upper()
++	if False and is_static:
++		self.env['STLIB_'+uselib_store]=[libn]
++	else:
++		self.env['LIB_'+uselib_store]=[libn]
++def check_libs_msvc(self,libnames,is_static=False):
++	for libname in Utils.to_list(libnames):
++		self.check_lib_msvc(libname,is_static)
++def configure(conf):
++	conf.autodetect()
++	conf.find_msvc()
++	conf.msvc_common_flags()
++	conf.cc_load_tools()
++	conf.cxx_load_tools()
++	conf.cc_add_flags()
++	conf.cxx_add_flags()
++	conf.link_add_flags()
++	conf.visual_studio_add_flags()
++def no_autodetect(conf):
++	conf.env.NO_MSVC_DETECT=1
++	configure(conf)
++def autodetect(conf):
++	v=conf.env
++	if v.NO_MSVC_DETECT:
++		return
++	compiler,version,path,includes,libdirs=conf.detect_msvc()
++	v['PATH']=path
++	v['INCLUDES']=includes
++	v['LIBPATH']=libdirs
++	v['MSVC_COMPILER']=compiler
++	try:
++		v['MSVC_VERSION']=float(version)
++	except:
++		v['MSVC_VERSION']=float(version[:-3])
++def _get_prog_names(conf,compiler):
++	if compiler=='intel':
++		compiler_name='ICL'
++		linker_name='XILINK'
++		lib_name='XILIB'
++	else:
++		compiler_name='CL'
++		linker_name='LINK'
++		lib_name='LIB'
++	return compiler_name,linker_name,lib_name
++def find_msvc(conf):
++	if sys.platform=='cygwin':
++		conf.fatal('MSVC module does not work under cygwin Python!')
++	v=conf.env
++	path=v['PATH']
++	compiler=v['MSVC_COMPILER']
++	version=v['MSVC_VERSION']
++	compiler_name,linker_name,lib_name=_get_prog_names(conf,compiler)
++	v.MSVC_MANIFEST=(compiler=='msvc'and version>=8)or(compiler=='wsdk'and version>=6)or(compiler=='intel'and version>=11)
++	cxx=None
++	if v['CXX']:cxx=v['CXX']
++	elif'CXX'in conf.environ:cxx=conf.environ['CXX']
++	cxx=conf.find_program(compiler_name,var='CXX',path_list=path)
++	cxx=conf.cmd_to_list(cxx)
++	env=dict(conf.environ)
++	if path:env.update(PATH=';'.join(path))
++	if not conf.cmd_and_log(cxx+['/nologo','/help'],env=env):
++		conf.fatal('the msvc compiler could not be identified')
++	v['CC']=v['CXX']=cxx
++	v['CC_NAME']=v['CXX_NAME']='msvc'
++	if not v['LINK_CXX']:
++		link=conf.find_program(linker_name,path_list=path)
++		if link:v['LINK_CXX']=link
++		else:conf.fatal('%s was not found (linker)'%linker_name)
++		v['LINK']=link
++	if not v['LINK_CC']:
++		v['LINK_CC']=v['LINK_CXX']
++	if not v['AR']:
++		stliblink=conf.find_program(lib_name,path_list=path,var='AR')
++		if not stliblink:return
++		v['ARFLAGS']=['/NOLOGO']
++	if v.MSVC_MANIFEST:
++		mt=conf.find_program('MT',path_list=path,var='MT')
++		v['MTFLAGS']=['/NOLOGO']
++	conf.load('winres')
++	if not conf.env['WINRC']:
++		warn('Resource compiler not found. Compiling resource file is disabled')
++def visual_studio_add_flags(self):
++	v=self.env
++	try:v.prepend_value('INCLUDES',self.environ['INCLUDE'].split(';'))
++	except:pass
++	try:v.prepend_value('LIBPATH',self.environ['LIB'].split(';'))
++	except:pass
++def msvc_common_flags(conf):
++	v=conf.env
++	v['DEST_BINFMT']='pe'
++	v.append_value('CFLAGS',['/nologo'])
++	v.append_value('CXXFLAGS',['/nologo'])
++	v['DEFINES_ST']='/D%s'
++	v['CC_SRC_F']=''
++	v['CC_TGT_F']=['/c','/Fo']
++	if v['MSVC_VERSION']>=8:
++		v['CC_TGT_F']=['/FC']+v['CC_TGT_F']
++	v['CXX_SRC_F']=''
++	v['CXX_TGT_F']=['/c','/Fo']
++	if v['MSVC_VERSION']>=8:
++		v['CXX_TGT_F']=['/FC']+v['CXX_TGT_F']
++	v['CPPPATH_ST']='/I%s'
++	v['AR_TGT_F']=v['CCLNK_TGT_F']=v['CXXLNK_TGT_F']='/OUT:'
++	v['CFLAGS_CONSOLE']=v['CXXFLAGS_CONSOLE']=['/SUBSYSTEM:CONSOLE']
++	v['CFLAGS_NATIVE']=v['CXXFLAGS_NATIVE']=['/SUBSYSTEM:NATIVE']
++	v['CFLAGS_POSIX']=v['CXXFLAGS_POSIX']=['/SUBSYSTEM:POSIX']
++	v['CFLAGS_WINDOWS']=v['CXXFLAGS_WINDOWS']=['/SUBSYSTEM:WINDOWS']
++	v['CFLAGS_WINDOWSCE']=v['CXXFLAGS_WINDOWSCE']=['/SUBSYSTEM:WINDOWSCE']
++	v['CFLAGS_CRT_MULTITHREADED']=v['CXXFLAGS_CRT_MULTITHREADED']=['/MT']
++	v['CFLAGS_CRT_MULTITHREADED_DLL']=v['CXXFLAGS_CRT_MULTITHREADED_DLL']=['/MD']
++	v['CFLAGS_CRT_MULTITHREADED_DBG']=v['CXXFLAGS_CRT_MULTITHREADED_DBG']=['/MTd']
++	v['CFLAGS_CRT_MULTITHREADED_DLL_DBG']=v['CXXFLAGS_CRT_MULTITHREADED_DLL_DBG']=['/MDd']
++	v['LIB_ST']='%s.lib'
++	v['LIBPATH_ST']='/LIBPATH:%s'
++	v['STLIB_ST']='lib%s.lib'
++	v['STLIBPATH_ST']='/LIBPATH:%s'
++	v.append_value('LINKFLAGS',['/NOLOGO'])
++	if v['MSVC_MANIFEST']:
++		v.append_value('LINKFLAGS',['/MANIFEST'])
++	v['CFLAGS_cshlib']=[]
++	v['CXXFLAGS_cxxshlib']=[]
++	v['LINKFLAGS_cshlib']=v['LINKFLAGS_cxxshlib']=['/DLL']
++	v['cshlib_PATTERN']=v['cxxshlib_PATTERN']='%s.dll'
++	v['implib_PATTERN']='%s.lib'
++	v['IMPLIB_ST']='/IMPLIB:%s'
++	v['LINKFLAGS_cstlib']=[]
++	v['cstlib_PATTERN']=v['cxxstlib_PATTERN']='lib%s.lib'
++	v['cprogram_PATTERN']=v['cxxprogram_PATTERN']='%s.exe'
++def apply_flags_msvc(self):
++	if self.env.CC_NAME!='msvc'or not getattr(self,'link_task',None):
++		return
++	is_static=isinstance(self.link_task,ccroot.stlink_task)
++	subsystem=getattr(self,'subsystem','')
++	if subsystem:
++		subsystem='/subsystem:%s'%subsystem
++		flags=is_static and'ARFLAGS'or'LINKFLAGS'
++		self.env.append_value(flags,subsystem)
++	if not is_static:
++		for f in self.env.LINKFLAGS:
++			d=f.lower()
++			if d[1:]=='debug':
++				pdbnode=self.link_task.outputs[0].change_ext('.pdb')
++				self.link_task.outputs.append(pdbnode)
++				try:
++					self.install_task.source.append(pdbnode)
++				except AttributeError:
++					pass
++				break
++def apply_manifest(self):
++	if self.env.CC_NAME=='msvc'and self.env.MSVC_MANIFEST and getattr(self,'link_task',None):
++		out_node=self.link_task.outputs[0]
++		man_node=out_node.parent.find_or_declare(out_node.name+'.manifest')
++		self.link_task.outputs.append(man_node)
++		self.link_task.do_manifest=True
++def exec_mf(self):
++	env=self.env
++	mtool=env['MT']
++	if not mtool:
++		return 0
++	self.do_manifest=False
++	outfile=self.outputs[0].abspath()
++	manifest=None
++	for out_node in self.outputs:
++		if out_node.name.endswith('.manifest'):
++			manifest=out_node.abspath()
++			break
++	if manifest is None:
++		return 0
++	mode=''
++	if'cprogram'in self.generator.features or'cxxprogram'in self.generator.features:
++		mode='1'
++	elif'cshlib'in self.generator.features or'cxxshlib'in self.generator.features:
++		mode='2'
++	debug('msvc: embedding manifest in mode %r'%mode)
++	lst=[]
++	lst.append(env['MT'])
++	lst.extend(Utils.to_list(env['MTFLAGS']))
++	lst.extend(['-manifest',manifest])
++	lst.append('-outputresource:%s;%s'%(outfile,mode))
++	lst=[lst]
++	return self.exec_command(*lst)
++def quote_response_command(self,flag):
++	if flag.find(' ')>-1:
++		for x in('/LIBPATH:','/IMPLIB:','/OUT:','/I'):
++			if flag.startswith(x):
++				flag='%s"%s"'%(x,flag[len(x):])
++				break
++		else:
++			flag='"%s"'%flag
++	return flag
++def exec_response_command(self,cmd,**kw):
++	try:
++		tmp=None
++		if sys.platform.startswith('win')and isinstance(cmd,list)and len(' '.join(cmd))>=8192:
++			program=cmd[0]
++			cmd=[self.quote_response_command(x)for x in cmd]
++			(fd,tmp)=tempfile.mkstemp()
++			os.write(fd,'\r\n'.join(i.replace('\\','\\\\')for i in cmd[1:]))
++			os.close(fd)
++			cmd=[program,'@'+tmp]
++		ret=self.generator.bld.exec_command(cmd,**kw)
++	finally:
++		if tmp:
++			try:
++				os.remove(tmp)
++			except:
++				pass
++	return ret
++def exec_command_msvc(self,*k,**kw):
++	if self.env['CC_NAME']=='msvc':
++		if isinstance(k[0],list):
++			lst=[]
++			carry=''
++			for a in k[0]:
++				if a=='/Fo'or a=='/doc'or a[-1]==':':
++					carry=a
++				else:
++					lst.append(carry+a)
++					carry=''
++			k=[lst]
++		if self.env['PATH']:
++			env=dict(os.environ)
++			env.update(PATH=';'.join(self.env['PATH']))
++			kw['env']=env
++	bld=self.generator.bld
++	try:
++		if not kw.get('cwd',None):
++			kw['cwd']=bld.cwd
++	except AttributeError:
++		bld.cwd=kw['cwd']=bld.variant_dir
++	ret=self.exec_response_command(k[0],**kw)
++	if not ret and getattr(self,'do_manifest',None):
++		ret=self.exec_mf()
++	return ret
++for k in'c cxx cprogram cxxprogram cshlib cxxshlib cstlib cxxstlib'.split():
++	cls=Task.classes.get(k,None)
++	if cls:
++		cls.exec_command=exec_command_msvc
++		cls.exec_response_command=exec_response_command
++		cls.quote_response_command=quote_response_command
++		cls.exec_mf=exec_mf
++
++conf(get_msvc_version)
++conf(gather_wsdk_versions)
++conf(gather_msvc_targets)
++conf(gather_wince_targets)
++conf(gather_msvc_versions)
++conf(gather_icl_versions)
++conf(get_msvc_versions)
++conf(print_all_msvc_detected)
++conf(detect_msvc)
++conf(find_lt_names_msvc)
++conf(libname_msvc)
++conf(check_lib_msvc)
++conf(check_libs_msvc)
++conf(no_autodetect)
++conf(autodetect)
++conf(find_msvc)
++conf(visual_studio_add_flags)
++conf(msvc_common_flags)
++after_method('apply_link')(apply_flags_msvc)
++feature('c','cxx')(apply_flags_msvc)
++feature('cprogram','cshlib','cxxprogram','cxxshlib')(apply_manifest)
++after_method('apply_link')(apply_manifest)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/nasm.py
+@@ -0,0 +1,14 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import waflib.Tools.asm
++from waflib.TaskGen import feature
++def apply_nasm_vars(self):
++	self.env.append_value('ASFLAGS',self.to_list(getattr(self,'nasm_flags',[])))
++def configure(conf):
++	nasm=conf.find_program(['nasm','yasm'],var='AS')
++	conf.env.AS_TGT_F=['-o']
++	conf.env.ASLNK_TGT_F=['-o']
++
++feature('asm')(apply_nasm_vars)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/perl.py
+@@ -0,0 +1,81 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Task,Options,Utils
++from waflib.Configure import conf
++from waflib.TaskGen import extension,feature,before_method
++def init_perlext(self):
++	self.uselib=self.to_list(getattr(self,'uselib',[]))
++	if not'PERLEXT'in self.uselib:self.uselib.append('PERLEXT')
++	self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['perlext_PATTERN']
++def xsubpp_file(self,node):
++	outnode=node.change_ext('.c')
++	self.create_task('xsubpp',node,outnode)
++	self.source.append(outnode)
++class xsubpp(Task.Task):
++	run_str='${PERL} ${XSUBPP} -noprototypes -typemap ${EXTUTILS_TYPEMAP} ${SRC} > ${TGT}'
++	color='BLUE'
++	ext_out=['.h']
++def check_perl_version(self,minver=None):
++	res=True
++	if minver:
++		cver='.'.join(map(str,minver))
++	else:
++		cver=''
++	self.start_msg('Checking for minimum perl version %s'%cver)
++	perl=getattr(Options.options,'perlbinary',None)
++	if not perl:
++		perl=self.find_program('perl',var='PERL')
++	if not perl:
++		self.end_msg("Perl not found",color="YELLOW")
++		return False
++	self.env['PERL']=perl
++	version=self.cmd_and_log([perl,"-e",'printf \"%vd\", $^V'])
++	if not version:
++		res=False
++		version="Unknown"
++	elif not minver is None:
++		ver=tuple(map(int,version.split(".")))
++		if ver<minver:
++			res=False
++	self.end_msg(version,color=res and"GREEN"or"YELLOW")
++	return res
++def check_perl_module(self,module):
++	cmd=[self.env['PERL'],'-e','use %s'%module]
++	self.start_msg('perl module %s'%module)
++	try:
++		r=self.cmd_and_log(cmd)
++	except:
++		self.end_msg(False)
++		return None
++	self.end_msg(r or True)
++	return r
++def check_perl_ext_devel(self):
++	env=self.env
++	perl=env.PERL
++	if not perl:
++		self.fatal('find perl first')
++	def read_out(cmd):
++		return Utils.to_list(self.cmd_and_log(perl+cmd))
++	env['LINKFLAGS_PERLEXT']=read_out(" -MConfig -e'print $Config{lddlflags}'")
++	env['INCLUDES_PERLEXT']=read_out(" -MConfig -e'print \"$Config{archlib}/CORE\"'")
++	env['CFLAGS_PERLEXT']=read_out(" -MConfig -e'print \"$Config{ccflags} $Config{cccdlflags}\"'")
++	env['XSUBPP']=read_out(" -MConfig -e'print \"$Config{privlib}/ExtUtils/xsubpp$Config{exe_ext}\"'")
++	env['EXTUTILS_TYPEMAP']=read_out(" -MConfig -e'print \"$Config{privlib}/ExtUtils/typemap\"'")
++	if not getattr(Options.options,'perlarchdir',None):
++		env['ARCHDIR_PERL']=self.cmd_and_log(perl+" -MConfig -e'print $Config{sitearch}'")
++	else:
++		env['ARCHDIR_PERL']=getattr(Options.options,'perlarchdir')
++	env['perlext_PATTERN']='%s.'+self.cmd_and_log(perl+" -MConfig -e'print $Config{dlext}'")
++def options(opt):
++	opt.add_option('--with-perl-binary',type='string',dest='perlbinary',help='Specify alternate perl binary',default=None)
++	opt.add_option('--with-perl-archdir',type='string',dest='perlarchdir',help='Specify directory where to install arch specific files',default=None)
++
++before_method('apply_incpaths','apply_link','propagate_uselib_vars')(init_perlext)
++feature('perlext')(init_perlext)
++extension('.xs')(xsubpp_file)
++conf(check_perl_version)
++conf(check_perl_module)
++conf(check_perl_ext_devel)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/python.py
+@@ -0,0 +1,336 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib import Utils,Options,Errors
++from waflib.Logs import debug,warn,info,error
++from waflib.TaskGen import extension,before_method,after_method,feature
++from waflib.Configure import conf
++FRAG='''
++#include <Python.h>
++#ifdef __cplusplus
++extern "C" {
++#endif
++	void Py_Initialize(void);
++	void Py_Finalize(void);
++#ifdef __cplusplus
++}
++#endif
++int main()
++{
++   Py_Initialize();
++   Py_Finalize();
++   return 0;
++}
++'''
++INST='''
++import sys, py_compile
++py_compile.compile(sys.argv[1], sys.argv[2], sys.argv[3])
++'''
++DISTUTILS_IMP=['from distutils.sysconfig import get_config_var, get_python_lib']
++def process_py(self,node):
++	try:
++		if not self.bld.is_install:
++			return
++	except:
++		return
++	try:
++		if not self.install_path:
++			return
++	except AttributeError:
++		self.install_path='${PYTHONDIR}'
++	def inst_py(ctx):
++		install_from=getattr(self,'install_from',None)
++		if install_from:
++			install_from=self.path.find_dir(install_from)
++		install_pyfile(self,node,install_from)
++	self.bld.add_post_fun(inst_py)
++def install_pyfile(self,node,install_from=None):
++	from_node=install_from or node.parent
++	tsk=self.bld.install_as(self.install_path+'/'+node.path_from(from_node),node,postpone=False)
++	path=tsk.get_install_path()
++	if self.bld.is_install<0:
++		info("+ removing byte compiled python files")
++		for x in'co':
++			try:
++				os.remove(path+x)
++			except OSError:
++				pass
++	if self.bld.is_install>0:
++		try:
++			st1=os.stat(path)
++		except:
++			error('The python file is missing, this should not happen')
++		for x in['c','o']:
++			do_inst=self.env['PY'+x.upper()]
++			try:
++				st2=os.stat(path+x)
++			except OSError:
++				pass
++			else:
++				if st1.st_mtime<=st2.st_mtime:
++					do_inst=False
++			if do_inst:
++				lst=(x=='o')and[self.env['PYFLAGS_OPT']]or[]
++				(a,b,c)=(path,path+x,tsk.get_install_path(destdir=False)+x)
++				argv=self.env['PYTHON']+lst+['-c',INST,a,b,c]
++				info('+ byte compiling %r'%(path+x))
++				env=self.env.env or None
++				ret=Utils.subprocess.Popen(argv,env=env).wait()
++				if ret:
++					raise Errors.WafError('py%s compilation failed %r'%(x,path))
++def feature_py(self):
++	pass
++def init_pyext(self):
++	try:
++		if not self.install_path:
++			return
++	except AttributeError:
++		self.install_path='${PYTHONARCHDIR}'
++	self.uselib=self.to_list(getattr(self,'uselib',[]))
++	if not'PYEXT'in self.uselib:
++		self.uselib.append('PYEXT')
++	self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['macbundle_PATTERN']=self.env['pyext_PATTERN']
++def set_bundle(self):
++	if Utils.unversioned_sys_platform()=='darwin':
++		self.mac_bundle=True
++def init_pyembed(self):
++	self.uselib=self.to_list(getattr(self,'uselib',[]))
++	if not'PYEMBED'in self.uselib:
++		self.uselib.append('PYEMBED')
++def get_python_variables(self,variables,imports=None):
++	if not imports:
++		try:
++			imports=self.python_imports
++		except AttributeError:
++			imports=DISTUTILS_IMP
++	program=list(imports)
++	program.append('')
++	for v in variables:
++		program.append("print(repr(%s))"%v)
++	os_env=dict(os.environ)
++	try:
++		del os_env['MACOSX_DEPLOYMENT_TARGET']
++	except KeyError:
++		pass
++	try:
++		out=self.cmd_and_log(self.env.PYTHON+['-c','\n'.join(program)],env=os_env)
++	except Errors.WafError:
++		self.fatal('The distutils module is unusable: install "python-devel"?')
++	return_values=[]
++	for s in out.split('\n'):
++		s=s.strip()
++		if not s:
++			continue
++		if s=='None':
++			return_values.append(None)
++		elif s[0]=="'"and s[-1]=="'":
++			return_values.append(s[1:-1])
++		elif s[0].isdigit():
++			return_values.append(int(s))
++		else:break
++	return return_values
++def check_python_headers(conf):
++	if not conf.env['CC_NAME']and not conf.env['CXX_NAME']:
++		conf.fatal('load a compiler first (gcc, g++, ..)')
++	if not conf.env['PYTHON_VERSION']:
++		conf.check_python_version()
++	env=conf.env
++	pybin=conf.env.PYTHON
++	if not pybin:
++		conf.fatal('could not find the python executable')
++	v='prefix SO LDFLAGS LIBDIR LIBPL INCLUDEPY Py_ENABLE_SHARED MACOSX_DEPLOYMENT_TARGET LDSHARED CFLAGS'.split()
++	try:
++		lst=conf.get_python_variables(["get_config_var('%s') or ''"%x for x in v])
++	except RuntimeError:
++		conf.fatal("Python development headers not found (-v for details).")
++	vals=['%s = %r'%(x,y)for(x,y)in zip(v,lst)]
++	conf.to_log("Configuration returned from %r:\n%r\n"%(pybin,'\n'.join(vals)))
++	dct=dict(zip(v,lst))
++	x='MACOSX_DEPLOYMENT_TARGET'
++	if dct[x]:
++		conf.env[x]=conf.environ[x]=dct[x]
++	env['pyext_PATTERN']='%s'+dct['SO']
++	all_flags=dct['LDFLAGS']+' '+dct['CFLAGS']
++	conf.parse_flags(all_flags,'PYEMBED')
++	all_flags=dct['LDFLAGS']+' '+dct['LDSHARED']+' '+dct['CFLAGS']
++	conf.parse_flags(all_flags,'PYEXT')
++	result=None
++	for name in('python'+env['PYTHON_VERSION'],'python'+env['PYTHON_VERSION'].replace('.','')):
++		if not result and env['LIBPATH_PYEMBED']:
++			path=env['LIBPATH_PYEMBED']
++			conf.to_log("\n\n# Trying default LIBPATH_PYEMBED: %r\n"%path)
++			result=conf.check(lib=name,uselib='PYEMBED',libpath=path,mandatory=False,msg='Checking for library %s in LIBPATH_PYEMBED'%name)
++		if not result and dct['LIBDIR']:
++			path=[dct['LIBDIR']]
++			conf.to_log("\n\n# try again with -L$python_LIBDIR: %r\n"%path)
++			result=conf.check(lib=name,uselib='PYEMBED',libpath=path,mandatory=False,msg='Checking for library %s in LIBDIR'%name)
++		if not result and dct['LIBPL']:
++			path=[dct['LIBPL']]
++			conf.to_log("\n\n# try again with -L$python_LIBPL (some systems don't install the python library in $prefix/lib)\n")
++			result=conf.check(lib=name,uselib='PYEMBED',libpath=path,mandatory=False,msg='Checking for library %s in python_LIBPL'%name)
++		if not result:
++			path=[os.path.join(dct['prefix'],"libs")]
++			conf.to_log("\n\n# try again with -L$prefix/libs, and pythonXY name rather than pythonX.Y (win32)\n")
++			result=conf.check(lib=name,uselib='PYEMBED',libpath=path,mandatory=False,msg='Checking for library %s in $prefix/libs'%name)
++		if result:
++			break
++	if result:
++		env['LIBPATH_PYEMBED']=path
++		env.append_value('LIB_PYEMBED',[name])
++	else:
++		conf.to_log("\n\n### LIB NOT FOUND\n")
++	if(Utils.is_win32 or sys.platform.startswith('os2')or dct['Py_ENABLE_SHARED']):
++		env['LIBPATH_PYEXT']=env['LIBPATH_PYEMBED']
++		env['LIB_PYEXT']=env['LIB_PYEMBED']
++	num='.'.join(env['PYTHON_VERSION'].split('.')[:2])
++	conf.find_program(['python%s-config'%num,'python-config-%s'%num,'python%sm-config'%num],var='PYTHON_CONFIG',mandatory=False)
++	includes=[]
++	if conf.env.PYTHON_CONFIG:
++		for incstr in conf.cmd_and_log([conf.env.PYTHON_CONFIG,'--includes']).strip().split():
++			if(incstr.startswith('-I')or incstr.startswith('/I')):
++				incstr=incstr[2:]
++			if incstr not in includes:
++				includes.append(incstr)
++		conf.to_log("Include path for Python extensions (found via python-config --includes): %r\n"%(includes,))
++		env['INCLUDES_PYEXT']=includes
++		env['INCLUDES_PYEMBED']=includes
++	else:
++		conf.to_log("Include path for Python extensions ""(found via distutils module): %r\n"%(dct['INCLUDEPY'],))
++		env['INCLUDES_PYEXT']=[dct['INCLUDEPY']]
++		env['INCLUDES_PYEMBED']=[dct['INCLUDEPY']]
++	if env['CC_NAME']=='gcc':
++		env.append_value('CFLAGS_PYEMBED',['-fno-strict-aliasing'])
++		env.append_value('CFLAGS_PYEXT',['-fno-strict-aliasing'])
++	if env['CXX_NAME']=='gcc':
++		env.append_value('CXXFLAGS_PYEMBED',['-fno-strict-aliasing'])
++		env.append_value('CXXFLAGS_PYEXT',['-fno-strict-aliasing'])
++	if env.CC_NAME=="msvc":
++		from distutils.msvccompiler import MSVCCompiler
++		dist_compiler=MSVCCompiler()
++		dist_compiler.initialize()
++		env.append_value('CFLAGS_PYEXT',dist_compiler.compile_options)
++		env.append_value('CXXFLAGS_PYEXT',dist_compiler.compile_options)
++		env.append_value('LINKFLAGS_PYEXT',dist_compiler.ldflags_shared)
++	try:
++		conf.check(header_name='Python.h',define_name='HAVE_PYTHON_H',uselib='PYEMBED',fragment=FRAG,errmsg='Could not find the python development headers')
++	except conf.errors.ConfigurationError:
++		conf.check_cfg(path=conf.env.PYTHON_CONFIG,package='',uselib_store='PYEMBED',args=['--cflags','--libs'])
++		conf.check(header_name='Python.h',define_name='HAVE_PYTHON_H',msg='Getting the python flags from python-config',uselib='PYEMBED',fragment=FRAG,errmsg='Could not find the python development headers elsewhere')
++def check_python_version(conf,minver=None):
++	assert minver is None or isinstance(minver,tuple)
++	pybin=conf.env['PYTHON']
++	if not pybin:
++		conf.fatal('could not find the python executable')
++	cmd=pybin+['-c','import sys\nfor x in sys.version_info: print(str(x))']
++	debug('python: Running python command %r'%cmd)
++	lines=conf.cmd_and_log(cmd).split()
++	assert len(lines)==5,"found %i lines, expected 5: %r"%(len(lines),lines)
++	pyver_tuple=(int(lines[0]),int(lines[1]),int(lines[2]),lines[3],int(lines[4]))
++	result=(minver is None)or(pyver_tuple>=minver)
++	if result:
++		pyver='.'.join([str(x)for x in pyver_tuple[:2]])
++		conf.env['PYTHON_VERSION']=pyver
++		if'PYTHONDIR'in conf.environ:
++			pydir=conf.environ['PYTHONDIR']
++		else:
++			if Utils.is_win32:
++				(python_LIBDEST,pydir)=conf.get_python_variables(["get_config_var('LIBDEST') or ''","get_python_lib(standard_lib=0, prefix=%r) or ''"%conf.env['PREFIX']])
++			else:
++				python_LIBDEST=None
++				(pydir,)=conf.get_python_variables(["get_python_lib(standard_lib=0, prefix=%r) or ''"%conf.env['PREFIX']])
++			if python_LIBDEST is None:
++				if conf.env['LIBDIR']:
++					python_LIBDEST=os.path.join(conf.env['LIBDIR'],"python"+pyver)
++				else:
++					python_LIBDEST=os.path.join(conf.env['PREFIX'],"lib","python"+pyver)
++		if'PYTHONARCHDIR'in conf.environ:
++			pyarchdir=conf.environ['PYTHONARCHDIR']
++		else:
++			(pyarchdir,)=conf.get_python_variables(["get_python_lib(plat_specific=1, standard_lib=0, prefix=%r) or ''"%conf.env['PREFIX']])
++			if not pyarchdir:
++				pyarchdir=pydir
++		if hasattr(conf,'define'):
++			conf.define('PYTHONDIR',pydir)
++			conf.define('PYTHONARCHDIR',pyarchdir)
++		conf.env['PYTHONDIR']=pydir
++		conf.env['PYTHONARCHDIR']=pyarchdir
++	pyver_full='.'.join(map(str,pyver_tuple[:3]))
++	if minver is None:
++		conf.msg('Checking for python version',pyver_full)
++	else:
++		minver_str='.'.join(map(str,minver))
++		conf.msg('Checking for python version',pyver_tuple,">= %s"%(minver_str,)and'GREEN'or'YELLOW')
++	if not result:
++		conf.fatal('The python version is too old, expecting %r'%(minver,))
++PYTHON_MODULE_TEMPLATE='''
++import %s as current_module
++version = getattr(current_module, '__version__', None)
++if version is not None:
++    print(str(version))
++else:
++    print('unknown version')
++'''
++def check_python_module(conf,module_name,condition=''):
++	msg='Python module %s'%module_name
++	if condition:
++		msg='%s (%s)'%(msg,condition)
++	conf.start_msg(msg)
++	try:
++		ret=conf.cmd_and_log(conf.env['PYTHON']+['-c',PYTHON_MODULE_TEMPLATE%module_name])
++	except Exception:
++		conf.end_msg(False)
++		conf.fatal('Could not find the python module %r'%module_name)
++	ret=ret.strip()
++	if condition:
++		conf.end_msg(ret)
++		if ret=='unknown version':
++			conf.fatal('Could not check the %s version'%module_name)
++		from distutils.version import LooseVersion
++		def num(*k):
++			if isinstance(k[0],int):
++				return LooseVersion('.'.join([str(x)for x in k]))
++			else:
++				return LooseVersion(k[0])
++		d={'num':num,'ver':LooseVersion(ret)}
++		ev=eval(condition,{},d)
++		if not ev:
++			conf.fatal('The %s version does not satisfy the requirements'%module_name)
++	else:
++		if ret=='unknown version':
++			conf.end_msg(True)
++		else:
++			conf.end_msg(ret)
++def configure(conf):
++	try:
++		conf.find_program('python',var='PYTHON')
++	except conf.errors.ConfigurationError:
++		warn("could not find a python executable, setting to sys.executable '%s'"%sys.executable)
++		conf.env.PYTHON=sys.executable
++	if conf.env.PYTHON!=sys.executable:
++		warn("python executable '%s' different from sys.executable '%s'"%(conf.env.PYTHON,sys.executable))
++	conf.env.PYTHON=conf.cmd_to_list(conf.env.PYTHON)
++	v=conf.env
++	v['PYCMD']='"import sys, py_compile;py_compile.compile(sys.argv[1], sys.argv[2])"'
++	v['PYFLAGS']=''
++	v['PYFLAGS_OPT']='-O'
++	v['PYC']=getattr(Options.options,'pyc',1)
++	v['PYO']=getattr(Options.options,'pyo',1)
++def options(opt):
++	opt.add_option('--nopyc',action='store_false',default=1,help='Do not install bytecode compiled .pyc files (configuration) [Default:install]',dest='pyc')
++	opt.add_option('--nopyo',action='store_false',default=1,help='Do not install optimised compiled .pyo files (configuration) [Default:install]',dest='pyo')
++
++extension('.py')(process_py)
++feature('py')(feature_py)
++feature('pyext')(init_pyext)
++before_method('propagate_uselib_vars','apply_link')(init_pyext)
++after_method('apply_bundle')(init_pyext)
++feature('pyext')(set_bundle)
++before_method('apply_link','apply_bundle')(set_bundle)
++before_method('propagate_uselib_vars')(init_pyembed)
++feature('pyembed')(init_pyembed)
++conf(get_python_variables)
++conf(check_python_headers)
++conf(check_python_version)
++conf(check_python_module)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/qt4.py
+@@ -0,0 +1,434 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import sys
++if sys.hexversion < 0x020400f0: from sets import Set as set
++try:
++	from xml.sax import make_parser
++	from xml.sax.handler import ContentHandler
++except ImportError:
++	has_xml=False
++	ContentHandler=object
++else:
++	has_xml=True
++import os,sys
++from waflib.Tools import c_preproc,cxx
++from waflib import Task,Utils,Options,Errors
++from waflib.TaskGen import feature,after_method,extension
++from waflib.Configure import conf
++from waflib import Logs
++MOC_H=['.h','.hpp','.hxx','.hh']
++EXT_RCC=['.qrc']
++EXT_UI=['.ui']
++EXT_QT4=['.cpp','.cc','.cxx','.C']
++QT4_LIBS="QtCore QtGui QtUiTools QtNetwork QtOpenGL QtSql QtSvg QtTest QtXml QtXmlPatterns QtWebKit Qt3Support QtHelp QtScript QtDeclarative"
++class qxx(cxx.cxx):
++	def __init__(self,*k,**kw):
++		Task.Task.__init__(self,*k,**kw)
++		self.moc_done=0
++	def scan(self):
++		(nodes,names)=c_preproc.scan(self)
++		for x in nodes:
++			if x.name.endswith('.moc'):
++				nodes.remove(x)
++				names.append(x.path_from(self.inputs[0].parent.get_bld()))
++		return(nodes,names)
++	def runnable_status(self):
++		if self.moc_done:
++			return Task.Task.runnable_status(self)
++		else:
++			for t in self.run_after:
++				if not t.hasrun:
++					return Task.ASK_LATER
++			self.add_moc_tasks()
++			return Task.Task.runnable_status(self)
++	def add_moc_tasks(self):
++		node=self.inputs[0]
++		bld=self.generator.bld
++		try:
++			self.signature()
++		except KeyError:
++			pass
++		else:
++			delattr(self,'cache_sig')
++		moctasks=[]
++		mocfiles=[]
++		try:
++			tmp_lst=bld.raw_deps[self.uid()]
++			bld.raw_deps[self.uid()]=[]
++		except KeyError:
++			tmp_lst=[]
++		for d in tmp_lst:
++			if not d.endswith('.moc'):
++				continue
++			if d in mocfiles:
++				Logs.error("paranoia owns")
++				continue
++			mocfiles.append(d)
++			h_node=None
++			try:ext=Options.options.qt_header_ext.split()
++			except AttributeError:pass
++			if not ext:ext=MOC_H
++			base2=d[:-4]
++			for x in[node.parent]+self.generator.includes_nodes:
++				for e in ext:
++					h_node=x.find_node(base2+e)
++					if h_node:
++						break
++				if h_node:
++					m_node=h_node.change_ext('.moc')
++					break
++			else:
++				for k in EXT_QT4:
++					if base2.endswith(k):
++						for x in[node.parent]+self.generator.includes_nodes:
++							h_node=x.find_node(base2)
++							if h_node:
++								break
++					if h_node:
++						m_node=h_node.change_ext(k+'.moc')
++						break
++			if not h_node:
++				raise Errors.WafError('no header found for %r which is a moc file'%d)
++			bld.node_deps[(self.inputs[0].parent.abspath(),m_node.name)]=h_node
++			task=Task.classes['moc'](env=self.env,generator=self.generator)
++			task.set_inputs(h_node)
++			task.set_outputs(m_node)
++			gen=bld.producer
++			gen.outstanding.insert(0,task)
++			gen.total+=1
++			moctasks.append(task)
++		tmp_lst=bld.raw_deps[self.uid()]=mocfiles
++		lst=bld.node_deps.get(self.uid(),())
++		for d in lst:
++			name=d.name
++			if name.endswith('.moc'):
++				task=Task.classes['moc'](env=self.env,generator=self.generator)
++				task.set_inputs(bld.node_deps[(self.inputs[0].parent.abspath(),name)])
++				task.set_outputs(d)
++				gen=bld.producer
++				gen.outstanding.insert(0,task)
++				gen.total+=1
++				moctasks.append(task)
++		self.run_after.update(set(moctasks))
++		self.moc_done=1
++	run=Task.classes['cxx'].__dict__['run']
++class trans_update(Task.Task):
++	run_str='${QT_LUPDATE} ${SRC} -ts ${TGT}'
++	color='BLUE'
++Task.update_outputs(trans_update)
++class XMLHandler(ContentHandler):
++	def __init__(self):
++		self.buf=[]
++		self.files=[]
++	def startElement(self,name,attrs):
++		if name=='file':
++			self.buf=[]
++	def endElement(self,name):
++		if name=='file':
++			self.files.append(str(''.join(self.buf)))
++	def characters(self,cars):
++		self.buf.append(cars)
++def create_rcc_task(self,node):
++	rcnode=node.change_ext('_rc.cpp')
++	rcctask=self.create_task('rcc',node,rcnode)
++	cpptask=self.create_task('cxx',rcnode,rcnode.change_ext('.o'))
++	try:
++		self.compiled_tasks.append(cpptask)
++	except AttributeError:
++		self.compiled_tasks=[cpptask]
++	return cpptask
++def create_uic_task(self,node):
++	uictask=self.create_task('ui4',node)
++	uictask.outputs=[self.path.find_or_declare(self.env['ui_PATTERN']%node.name[:-3])]
++def add_lang(self,node):
++	self.lang=self.to_list(getattr(self,'lang',[]))+[node]
++def apply_qt4(self):
++	if getattr(self,'lang',None):
++		qmtasks=[]
++		for x in self.to_list(self.lang):
++			if isinstance(x,str):
++				x=self.path.find_resource(x+'.ts')
++			qmtasks.append(self.create_task('ts2qm',x,x.change_ext('.qm')))
++		if getattr(self,'update',None)and Options.options.trans_qt4:
++			cxxnodes=[a.inputs[0]for a in self.compiled_tasks]+[a.inputs[0]for a in self.tasks if getattr(a,'inputs',None)and a.inputs[0].name.endswith('.ui')]
++			for x in qmtasks:
++				self.create_task('trans_update',cxxnodes,x.inputs)
++		if getattr(self,'langname',None):
++			qmnodes=[x.outputs[0]for x in qmtasks]
++			rcnode=self.langname
++			if isinstance(rcnode,str):
++				rcnode=self.path.find_or_declare(rcnode+'.qrc')
++			t=self.create_task('qm2rcc',qmnodes,rcnode)
++			k=create_rcc_task(self,t.outputs[0])
++			self.link_task.inputs.append(k.outputs[0])
++	lst=[]
++	for flag in self.to_list(self.env['CXXFLAGS']):
++		if len(flag)<2:continue
++		f=flag[0:2]
++		if f in['-D','-I','/D','/I']:
++			if(f[0]=='/'):
++				lst.append('-'+flag[1:])
++			else:
++				lst.append(flag)
++	self.env['MOC_FLAGS']=lst
++def cxx_hook(self,node):
++	return self.create_compiled_task('qxx',node)
++class rcc(Task.Task):
++	color='BLUE'
++	run_str='${QT_RCC} -name ${SRC[0].name} ${SRC[0].abspath()} ${RCC_ST} -o ${TGT}'
++	ext_out=['.h']
++	def scan(self):
++		node=self.inputs[0]
++		if not has_xml:
++			Logs.error('no xml support was found, the rcc dependencies will be incomplete!')
++			return([],[])
++		parser=make_parser()
++		curHandler=XMLHandler()
++		parser.setContentHandler(curHandler)
++		fi=open(self.inputs[0].abspath())
++		parser.parse(fi)
++		fi.close()
++		nodes=[]
++		names=[]
++		root=self.inputs[0].parent
++		for x in curHandler.files:
++			nd=root.find_resource(x)
++			if nd:nodes.append(nd)
++			else:names.append(x)
++		return(nodes,names)
++class moc(Task.Task):
++	color='BLUE'
++	run_str='${QT_MOC} ${MOC_FLAGS} ${MOCCPPPATH_ST:INCPATHS} ${MOCDEFINES_ST:DEFINES} ${SRC} ${MOC_ST} ${TGT}'
++class ui4(Task.Task):
++	color='BLUE'
++	run_str='${QT_UIC} ${SRC} -o ${TGT}'
++	ext_out=['.h']
++class ts2qm(Task.Task):
++	color='BLUE'
++	run_str='${QT_LRELEASE} ${QT_LRELEASE_FLAGS} ${SRC} -qm ${TGT}'
++class qm2rcc(Task.Task):
++	color='BLUE'
++	after='ts2qm'
++	def run(self):
++		txt='\n'.join(['<file>%s</file>'%k.path_from(self.outputs[0].parent)for k in self.inputs])
++		code='<!DOCTYPE RCC><RCC version="1.0">\n<qresource>\n%s\n</qresource>\n</RCC>'%txt
++		self.outputs[0].write(code)
++def configure(self):
++	self.find_qt4_binaries()
++	self.set_qt4_libs_to_check()
++	self.find_qt4_libraries()
++	self.add_qt4_rpath()
++	self.simplify_qt4_libs()
++def find_qt4_binaries(self):
++	env=self.env
++	opt=Options.options
++	qtdir=getattr(opt,'qtdir','')
++	qtbin=getattr(opt,'qtbin','')
++	paths=[]
++	if qtdir:
++		qtbin=os.path.join(qtdir,'bin')
++	if not qtdir:
++		qtdir=self.environ.get('QT4_ROOT','')
++		qtbin=os.path.join(qtdir,'bin')
++	if qtbin:
++		paths=[qtbin]
++	if not qtdir:
++		paths=os.environ.get('PATH','').split(os.pathsep)
++		paths.append('/usr/share/qt4/bin/')
++		try:
++			lst=Utils.listdir('/usr/local/Trolltech/')
++		except OSError:
++			pass
++		else:
++			if lst:
++				lst.sort()
++				lst.reverse()
++				qtdir='/usr/local/Trolltech/%s/'%lst[0]
++				qtbin=os.path.join(qtdir,'bin')
++				paths.append(qtbin)
++	cand=None
++	prev_ver=['4','0','0']
++	for qmk in['qmake-qt4','qmake4','qmake']:
++		try:
++			qmake=self.find_program(qmk,path_list=paths)
++		except self.errors.ConfigurationError:
++			pass
++		else:
++			try:
++				version=self.cmd_and_log([qmake,'-query','QT_VERSION']).strip()
++			except self.errors.ConfigurationError:
++				pass
++			else:
++				if version:
++					new_ver=version.split('.')
++					if new_ver>prev_ver:
++						cand=qmake
++						prev_ver=new_ver
++	if cand:
++		self.env.QMAKE=cand
++	else:
++		self.fatal('Could not find qmake for qt4')
++	qtbin=self.cmd_and_log([self.env.QMAKE,'-query','QT_INSTALL_BINS']).strip()+os.sep
++	def find_bin(lst,var):
++		for f in lst:
++			try:
++				ret=self.find_program(f,path_list=paths)
++			except self.errors.ConfigurationError:
++				pass
++			else:
++				env[var]=ret
++				break
++	find_bin(['uic-qt3','uic3'],'QT_UIC3')
++	find_bin(['uic-qt4','uic'],'QT_UIC')
++	if not env['QT_UIC']:
++		self.fatal('cannot find the uic compiler for qt4')
++	try:
++		uicver=self.cmd_and_log(env['QT_UIC']+" -version 2>&1").strip()
++	except self.errors.ConfigurationError:
++		self.fatal('this uic compiler is for qt3, add uic for qt4 to your path')
++	uicver=uicver.replace('Qt User Interface Compiler ','').replace('User Interface Compiler for Qt','')
++	self.msg('Checking for uic version','%s'%uicver)
++	if uicver.find(' 3.')!=-1:
++		self.fatal('this uic compiler is for qt3, add uic for qt4 to your path')
++	find_bin(['moc-qt4','moc'],'QT_MOC')
++	find_bin(['rcc'],'QT_RCC')
++	find_bin(['lrelease-qt4','lrelease'],'QT_LRELEASE')
++	find_bin(['lupdate-qt4','lupdate'],'QT_LUPDATE')
++	env['UIC3_ST']='%s -o %s'
++	env['UIC_ST']='%s -o %s'
++	env['MOC_ST']='-o'
++	env['ui_PATTERN']='ui_%s.h'
++	env['QT_LRELEASE_FLAGS']=['-silent']
++	env.MOCCPPPATH_ST='-I%s'
++	env.MOCDEFINES_ST='-D%s'
++def find_qt4_libraries(self):
++	qtlibs=getattr(Options.options,'qtlibs','')
++	if not qtlibs:
++		try:
++			qtlibs=self.cmd_and_log([self.env.QMAKE,'-query','QT_INSTALL_LIBS']).strip()
++		except Errors.WafError:
++			qtdir=self.cmd_and_log([self.env.QMAKE,'-query','QT_INSTALL_PREFIX']).strip()+os.sep
++			qtlibs=os.path.join(qtdir,'lib')
++	self.msg('Found the Qt4 libraries in',qtlibs)
++	qtincludes=self.cmd_and_log([self.env.QMAKE,'-query','QT_INSTALL_HEADERS']).strip()
++	env=self.env
++	if not'PKG_CONFIG_PATH'in os.environ:
++		os.environ['PKG_CONFIG_PATH']='%s:%s/pkgconfig:/usr/lib/qt4/lib/pkgconfig:/opt/qt4/lib/pkgconfig:/usr/lib/qt4/lib:/opt/qt4/lib'%(qtlibs,qtlibs)
++	try:
++		self.check_cfg(atleast_pkgconfig_version='0.1')
++	except self.errors.ConfigurationError:
++		for i in self.qt4_vars:
++			uselib=i.upper()
++			if Utils.unversioned_sys_platform()=="darwin":
++				frameworkName=i+".framework"
++				qtDynamicLib=os.path.join(qtlibs,frameworkName,i)
++				if os.path.exists(qtDynamicLib):
++					env.append_unique('FRAMEWORK_'+uselib,i)
++					self.msg('Checking for %s'%i,qtDynamicLib,'GREEN')
++				else:
++					self.msg('Checking for %s'%i,False,'YELLOW')
++				env.append_unique('INCLUDES_'+uselib,os.path.join(qtlibs,frameworkName,'Headers'))
++			elif sys.platform!="win32":
++				qtDynamicLib=os.path.join(qtlibs,"lib"+i+".so")
++				qtStaticLib=os.path.join(qtlibs,"lib"+i+".a")
++				if os.path.exists(qtDynamicLib):
++					env.append_unique('LIB_'+uselib,i)
++					self.msg('Checking for %s'%i,qtDynamicLib,'GREEN')
++				elif os.path.exists(qtStaticLib):
++					env.append_unique('LIB_'+uselib,i)
++					self.msg('Checking for %s'%i,qtStaticLib,'GREEN')
++				else:
++					self.msg('Checking for %s'%i,False,'YELLOW')
++				env.append_unique('LIBPATH_'+uselib,qtlibs)
++				env.append_unique('INCLUDES_'+uselib,qtincludes)
++				env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i))
++			else:
++				for k in("lib%s.a","lib%s4.a","%s.lib","%s4.lib"):
++					lib=os.path.join(qtlibs,k%i)
++					if os.path.exists(lib):
++						env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')])
++						self.msg('Checking for %s'%i,lib,'GREEN')
++						break
++				else:
++					self.msg('Checking for %s'%i,False,'YELLOW')
++				env.append_unique('LIBPATH_'+uselib,qtlibs)
++				env.append_unique('INCLUDES_'+uselib,qtincludes)
++				env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i))
++				uselib=i.upper()+"_debug"
++				for k in("lib%sd.a","lib%sd4.a","%sd.lib","%sd4.lib"):
++					lib=os.path.join(qtlibs,k%i)
++					if os.path.exists(lib):
++						env.append_unique('LIB_'+uselib,i+k[k.find("%s")+2:k.find('.')])
++						self.msg('Checking for %s'%i,lib,'GREEN')
++						break
++				else:
++					self.msg('Checking for %s'%i,False,'YELLOW')
++				env.append_unique('LIBPATH_'+uselib,qtlibs)
++				env.append_unique('INCLUDES_'+uselib,qtincludes)
++				env.append_unique('INCLUDES_'+uselib,os.path.join(qtincludes,i))
++	else:
++		for i in self.qt4_vars_debug+self.qt4_vars:
++			self.check_cfg(package=i,args='--cflags --libs',mandatory=False)
++def simplify_qt4_libs(self):
++	env=self.env
++	def process_lib(vars_,coreval):
++		for d in vars_:
++			var=d.upper()
++			if var=='QTCORE':
++				continue
++			value=env['LIBPATH_'+var]
++			if value:
++				core=env[coreval]
++				accu=[]
++				for lib in value:
++					if lib in core:
++						continue
++					accu.append(lib)
++				env['LIBPATH_'+var]=accu
++	process_lib(self.qt4_vars,'LIBPATH_QTCORE')
++	process_lib(self.qt4_vars_debug,'LIBPATH_QTCORE_DEBUG')
++def add_qt4_rpath(self):
++	env=self.env
++	if Options.options.want_rpath:
++		def process_rpath(vars_,coreval):
++			for d in vars_:
++				var=d.upper()
++				value=env['LIBPATH_'+var]
++				if value:
++					core=env[coreval]
++					accu=[]
++					for lib in value:
++						if var!='QTCORE':
++							if lib in core:
++								continue
++						accu.append('-Wl,--rpath='+lib)
++					env['RPATH_'+var]=accu
++		process_rpath(self.qt4_vars,'LIBPATH_QTCORE')
++		process_rpath(self.qt4_vars_debug,'LIBPATH_QTCORE_DEBUG')
++def set_qt4_libs_to_check(self):
++	if not hasattr(self,'qt4_vars'):
++		self.qt4_vars=QT4_LIBS
++	self.qt4_vars=Utils.to_list(self.qt4_vars)
++	if not hasattr(self,'qt4_vars_debug'):
++		self.qt4_vars_debug=[a+'_debug'for a in self.qt4_vars]
++	self.qt4_vars_debug=Utils.to_list(self.qt4_vars_debug)
++def options(opt):
++	opt.add_option('--want-rpath',action='store_true',default=False,dest='want_rpath',help='enable the rpath for qt libraries')
++	opt.add_option('--header-ext',type='string',default='',help='header extension for moc files',dest='qt_header_ext')
++	for i in'qtdir qtbin qtlibs'.split():
++		opt.add_option('--'+i,type='string',default='',dest=i)
++	opt.add_option('--translate',action="store_true",help="collect translation strings",dest="trans_qt4",default=False)
++
++extension(*EXT_RCC)(create_rcc_task)
++extension(*EXT_UI)(create_uic_task)
++extension('.ts')(add_lang)
++feature('qt4')(apply_qt4)
++after_method('apply_link')(apply_qt4)
++extension(*EXT_QT4)(cxx_hook)
++conf(find_qt4_binaries)
++conf(find_qt4_libraries)
++conf(simplify_qt4_libs)
++conf(add_qt4_rpath)
++conf(set_qt4_libs_to_check)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/ruby.py
+@@ -0,0 +1,104 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Task,Options,Utils
++from waflib.TaskGen import before_method,feature,after_method,Task,extension
++from waflib.Configure import conf
++def init_rubyext(self):
++	self.install_path='${ARCHDIR_RUBY}'
++	self.uselib=self.to_list(getattr(self,'uselib',''))
++	if not'RUBY'in self.uselib:
++		self.uselib.append('RUBY')
++	if not'RUBYEXT'in self.uselib:
++		self.uselib.append('RUBYEXT')
++def apply_ruby_so_name(self):
++	self.env['cshlib_PATTERN']=self.env['cxxshlib_PATTERN']=self.env['rubyext_PATTERN']
++def check_ruby_version(self,minver=()):
++	if Options.options.rubybinary:
++		self.env.RUBY=Options.options.rubybinary
++	else:
++		self.find_program('ruby',var='RUBY')
++	ruby=self.env.RUBY
++	try:
++		version=self.cmd_and_log([ruby,'-e','puts defined?(VERSION) ? VERSION : RUBY_VERSION']).strip()
++	except:
++		self.fatal('could not determine ruby version')
++	self.env.RUBY_VERSION=version
++	try:
++		ver=tuple(map(int,version.split(".")))
++	except:
++		self.fatal('unsupported ruby version %r'%version)
++	cver=''
++	if minver:
++		if ver<minver:
++			self.fatal('ruby is too old %r'%ver)
++		cver='.'.join([str(x)for x in minver])
++	else:
++		cver=ver
++	self.msg('Checking for ruby version %s'%str(minver or''),cver)
++def check_ruby_ext_devel(self):
++	if not self.env.RUBY:
++		self.fatal('ruby detection is required first')
++	if not self.env.CC_NAME and not self.env.CXX_NAME:
++		self.fatal('load a c/c++ compiler first')
++	version=tuple(map(int,self.env.RUBY_VERSION.split(".")))
++	def read_out(cmd):
++		return Utils.to_list(self.cmd_and_log([self.env.RUBY,'-rrbconfig','-e',cmd]))
++	def read_config(key):
++		return read_out('puts Config::CONFIG[%r]'%key)
++	ruby=self.env['RUBY']
++	archdir=read_config('archdir')
++	cpppath=archdir
++	if version>=(1,9,0):
++		ruby_hdrdir=read_config('rubyhdrdir')
++		cpppath+=ruby_hdrdir
++		cpppath+=[os.path.join(ruby_hdrdir[0],read_config('arch')[0])]
++	self.check(header_name='ruby.h',includes=cpppath,errmsg='could not find ruby header file')
++	self.env.LIBPATH_RUBYEXT=read_config('libdir')
++	self.env.LIBPATH_RUBYEXT+=archdir
++	self.env.INCLUDES_RUBYEXT=cpppath
++	self.env.CFLAGS_RUBYEXT=read_config('CCDLFLAGS')
++	self.env.rubyext_PATTERN='%s.'+read_config('DLEXT')[0]
++	flags=read_config('LDSHARED')
++	while flags and flags[0][0]!='-':
++		flags=flags[1:]
++	if len(flags)>1 and flags[1]=="ppc":
++		flags=flags[2:]
++	self.env.LINKFLAGS_RUBYEXT=flags
++	self.env.LINKFLAGS_RUBYEXT+=read_config('LIBS')
++	self.env.LINKFLAGS_RUBYEXT+=read_config('LIBRUBYARG_SHARED')
++	if Options.options.rubyarchdir:
++		self.env.ARCHDIR_RUBY=Options.options.rubyarchdir
++	else:
++		self.env.ARCHDIR_RUBY=read_config('sitearchdir')[0]
++	if Options.options.rubylibdir:
++		self.env.LIBDIR_RUBY=Options.options.rubylibdir
++	else:
++		self.env.LIBDIR_RUBY=read_config('sitelibdir')[0]
++def check_ruby_module(self,module_name):
++	self.start_msg('Ruby module %s'%module_name)
++	try:
++		self.cmd_and_log([self.env['RUBY'],'-e','require \'%s\';puts 1'%module_name])
++	except:
++		self.end_msg(False)
++		self.fatal('Could not find the ruby module %r'%module_name)
++	self.end_msg(True)
++def process(self,node):
++	tsk=self.create_task('run_ruby',node)
++class run_ruby(Task.Task):
++	run_str='${RUBY} ${RBFLAGS} -I ${SRC[0].parent.abspath()} ${SRC}'
++def options(opt):
++	opt.add_option('--with-ruby-archdir',type='string',dest='rubyarchdir',help='Specify directory where to install arch specific files')
++	opt.add_option('--with-ruby-libdir',type='string',dest='rubylibdir',help='Specify alternate ruby library path')
++	opt.add_option('--with-ruby-binary',type='string',dest='rubybinary',help='Specify alternate ruby binary')
++
++feature('rubyext')(init_rubyext)
++before_method('apply_incpaths','apply_lib_vars','apply_bundle','apply_link')(init_rubyext)
++feature('rubyext')(apply_ruby_so_name)
++before_method('apply_link','propagate_uselib')(apply_ruby_so_name)
++conf(check_ruby_version)
++conf(check_ruby_ext_devel)
++conf(check_ruby_module)
++extension('.rb')(process)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/suncc.py
+@@ -0,0 +1,54 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Utils
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_scc(conf):
++	v=conf.env
++	cc=None
++	if v['CC']:cc=v['CC']
++	elif'CC'in conf.environ:cc=conf.environ['CC']
++	if not cc:cc=conf.find_program('cc',var='CC')
++	if not cc:conf.fatal('Could not find a Sun C compiler')
++	cc=conf.cmd_to_list(cc)
++	try:
++		conf.cmd_and_log(cc+['-flags'])
++	except:
++		conf.fatal('%r is not a Sun compiler'%cc)
++	v['CC']=cc
++	v['CC_NAME']='sun'
++def scc_common_flags(conf):
++	v=conf.env
++	v['CC_SRC_F']=[]
++	v['CC_TGT_F']=['-c','-o']
++	if not v['LINK_CC']:v['LINK_CC']=v['CC']
++	v['CCLNK_SRC_F']=''
++	v['CCLNK_TGT_F']=['-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['SONAME_ST']='-Wl,-h,%s'
++	v['SHLIB_MARKER']='-Bdynamic'
++	v['STLIB_MARKER']='-Bstatic'
++	v['cprogram_PATTERN']='%s'
++	v['CFLAGS_cshlib']=['-Kpic','-DPIC']
++	v['LINKFLAGS_cshlib']=['-G']
++	v['cshlib_PATTERN']='lib%s.so'
++	v['LINKFLAGS_cstlib']=['-Bstatic']
++	v['cstlib_PATTERN']='lib%s.a'
++def configure(conf):
++	conf.find_scc()
++	conf.find_ar()
++	conf.scc_common_flags()
++	conf.cc_load_tools()
++	conf.cc_add_flags()
++	conf.link_add_flags()
++
++conf(find_scc)
++conf(scc_common_flags)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/suncxx.py
+@@ -0,0 +1,55 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os
++from waflib import Utils
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_sxx(conf):
++	v=conf.env
++	cc=None
++	if v['CXX']:cc=v['CXX']
++	elif'CXX'in conf.environ:cc=conf.environ['CXX']
++	if not cc:cc=conf.find_program('CC',var='CXX')
++	if not cc:cc=conf.find_program('c++',var='CXX')
++	if not cc:conf.fatal('Could not find a Sun C++ compiler')
++	cc=conf.cmd_to_list(cc)
++	try:
++		conf.cmd_and_log(cc+['-flags'])
++	except:
++		conf.fatal('%r is not a Sun compiler'%cc)
++	v['CXX']=cc
++	v['CXX_NAME']='sun'
++def sxx_common_flags(conf):
++	v=conf.env
++	v['CXX_SRC_F']=[]
++	v['CXX_TGT_F']=['-c','-o']
++	if not v['LINK_CXX']:v['LINK_CXX']=v['CXX']
++	v['CXXLNK_SRC_F']=[]
++	v['CXXLNK_TGT_F']=['-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['SONAME_ST']='-Wl,-h,%s'
++	v['SHLIB_MARKER']='-Bdynamic'
++	v['STLIB_MARKER']='-Bstatic'
++	v['cxxprogram_PATTERN']='%s'
++	v['CXXFLAGS_cxxshlib']=['-Kpic','-DPIC']
++	v['LINKFLAGS_cxxshlib']=['-G']
++	v['cxxshlib_PATTERN']='lib%s.so'
++	v['LINKFLAGS_cxxstlib']=['-Bstatic']
++	v['cxxstlib_PATTERN']='lib%s.a'
++def configure(conf):
++	conf.find_sxx()
++	conf.find_ar()
++	conf.sxx_common_flags()
++	conf.cxx_load_tools()
++	conf.cxx_add_flags()
++	conf.link_add_flags()
++
++conf(find_sxx)
++conf(sxx_common_flags)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/tex.py
+@@ -0,0 +1,242 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,re
++from waflib import Utils,Task,Errors
++from waflib.TaskGen import feature,before_method
++from waflib.Logs import error,warn,debug
++re_bibunit=re.compile(r'\\(?P<type>putbib)\[(?P<file>[^\[\]]*)\]',re.M)
++def bibunitscan(self):
++	node=self.inputs[0]
++	nodes=[]
++	if not node:return nodes
++	code=Utils.readf(node.abspath())
++	for match in re_bibunit.finditer(code):
++		path=match.group('file')
++		if path:
++			for k in['','.bib']:
++				debug('tex: trying %s%s'%(path,k))
++				fi=node.parent.find_resource(path+k)
++				if fi:
++					nodes.append(fi)
++			else:
++				debug('tex: could not find %s'%path)
++	debug("tex: found the following bibunit files: %s"%nodes)
++	return nodes
++exts_deps_tex=['','.ltx','.tex','.bib','.pdf','.png','.eps','.ps']
++exts_tex=['.ltx','.tex']
++re_tex=re.compile(r'\\(?P<type>include|bibliography|putbib|includegraphics|input|import|bringin|lstinputlisting)(\[[^\[\]]*\])?{(?P<file>[^{}]*)}',re.M)
++g_bibtex_re=re.compile('bibdata',re.M)
++class tex(Task.Task):
++	bibtex_fun,_=Task.compile_fun('${BIBTEX} ${BIBTEXFLAGS} ${SRCFILE}',shell=False)
++	bibtex_fun.__doc__="""
++	Execute the program **bibtex**
++	"""
++	makeindex_fun,_=Task.compile_fun('${MAKEINDEX} ${MAKEINDEXFLAGS} ${SRCFILE}',shell=False)
++	makeindex_fun.__doc__="""
++	Execute the program **makeindex**
++	"""
++	def scan_aux(self,node):
++		nodes=[node]
++		re_aux=re.compile(r'\\@input{(?P<file>[^{}]*)}',re.M)
++		def parse_node(node):
++			code=node.read()
++			for match in re_aux.finditer(code):
++				path=match.group('file')
++				found=node.parent.find_or_declare(path)
++				if found and found not in nodes:
++					debug('tex: found aux node '+found.abspath())
++					nodes.append(found)
++					parse_node(found)
++		parse_node(node)
++		return nodes
++	def scan(self):
++		node=self.inputs[0]
++		nodes=[]
++		names=[]
++		seen=[]
++		if not node:return(nodes,names)
++		def parse_node(node):
++			if node in seen:
++				return
++			seen.append(node)
++			code=node.read()
++			global re_tex
++			for match in re_tex.finditer(code):
++				for path in match.group('file').split(','):
++					if path:
++						add_name=True
++						found=None
++						for k in exts_deps_tex:
++							debug('tex: trying %s%s'%(path,k))
++							found=node.parent.find_resource(path+k)
++							if found and not found in self.outputs:
++								nodes.append(found)
++								add_name=False
++								for ext in exts_tex:
++									if found.name.endswith(ext):
++										parse_node(found)
++										break
++						if add_name:
++							names.append(path)
++		parse_node(node)
++		for x in nodes:
++			x.parent.get_bld().mkdir()
++		debug("tex: found the following : %s and names %s"%(nodes,names))
++		return(nodes,names)
++	def check_status(self,msg,retcode):
++		if retcode!=0:
++			raise Errors.WafError("%r command exit status %r"%(msg,retcode))
++	def bibfile(self):
++		need_bibtex=False
++		try:
++			for aux_node in self.aux_nodes:
++				ct=aux_node.read()
++				if g_bibtex_re.findall(ct):
++					need_bibtex=True
++					break
++		except(OSError,IOError):
++			error('error bibtex scan')
++		else:
++			if need_bibtex:
++				warn('calling bibtex')
++				self.env.env={}
++				self.env.env.update(os.environ)
++				self.env.env.update({'BIBINPUTS':self.TEXINPUTS,'BSTINPUTS':self.TEXINPUTS})
++				self.env.SRCFILE=self.aux_nodes[0].name[:-4]
++				self.check_status('error when calling bibtex',self.bibtex_fun())
++	def bibunits(self):
++		try:
++			bibunits=bibunitscan(self)
++		except FSError:
++			error('error bibunitscan')
++		else:
++			if bibunits:
++				fn=['bu'+str(i)for i in xrange(1,len(bibunits)+1)]
++				if fn:
++					warn('calling bibtex on bibunits')
++				for f in fn:
++					self.env.env={'BIBINPUTS':self.TEXINPUTS,'BSTINPUTS':self.TEXINPUTS}
++					self.env.SRCFILE=f
++					self.check_status('error when calling bibtex',self.bibtex_fun())
++	def makeindex(self):
++		try:
++			idx_path=self.idx_node.abspath()
++			os.stat(idx_path)
++		except OSError:
++			warn('index file %s absent, not calling makeindex'%idx_path)
++		else:
++			warn('calling makeindex')
++			self.env.SRCFILE=self.idx_node.name
++			self.env.env={}
++			self.check_status('error when calling makeindex %s'%idx_path,self.makeindex_fun())
++	def run(self):
++		env=self.env
++		if not env['PROMPT_LATEX']:
++			env.append_value('LATEXFLAGS','-interaction=batchmode')
++			env.append_value('PDFLATEXFLAGS','-interaction=batchmode')
++			env.append_value('XELATEXFLAGS','-interaction=batchmode')
++		fun=self.texfun
++		node=self.inputs[0]
++		srcfile=node.abspath()
++		texinputs=self.env.TEXINPUTS or''
++		self.TEXINPUTS=node.parent.get_bld().abspath()+os.pathsep+node.parent.get_src().abspath()+os.pathsep+texinputs+os.pathsep
++		self.aux_node=node.change_ext('.aux')
++		self.cwd=self.inputs[0].parent.get_bld().abspath()
++		warn('first pass on %s'%self.__class__.__name__)
++		self.env.env={}
++		self.env.env.update(os.environ)
++		self.env.env.update({'TEXINPUTS':self.TEXINPUTS})
++		self.env.SRCFILE=srcfile
++		self.check_status('error when calling latex',fun())
++		self.aux_nodes=self.scan_aux(node.change_ext('.aux'))
++		self.idx_node=node.change_ext('.idx')
++		self.bibfile()
++		self.bibunits()
++		self.makeindex()
++		hash=''
++		for i in range(10):
++			prev_hash=hash
++			try:
++				hashes=[Utils.h_file(x.abspath())for x in self.aux_nodes]
++				hash=Utils.h_list(hashes)
++			except(OSError,IOError):
++				error('could not read aux.h')
++				pass
++			if hash and hash==prev_hash:
++				break
++			warn('calling %s'%self.__class__.__name__)
++			self.env.env={}
++			self.env.env.update(os.environ)
++			self.env.env.update({'TEXINPUTS':self.TEXINPUTS})
++			self.env.SRCFILE=srcfile
++			self.check_status('error when calling %s'%self.__class__.__name__,fun())
++class latex(tex):
++	texfun,vars=Task.compile_fun('${LATEX} ${LATEXFLAGS} ${SRCFILE}',shell=False)
++class pdflatex(tex):
++	texfun,vars=Task.compile_fun('${PDFLATEX} ${PDFLATEXFLAGS} ${SRCFILE}',shell=False)
++class xelatex(tex):
++	texfun,vars=Task.compile_fun('${XELATEX} ${XELATEXFLAGS} ${SRCFILE}',shell=False)
++class dvips(Task.Task):
++	run_str='${DVIPS} ${DVIPSFLAGS} ${SRC} -o ${TGT}'
++	color='BLUE'
++	after=['latex','pdflatex','xelatex']
++class dvipdf(Task.Task):
++	run_str='${DVIPDF} ${DVIPDFFLAGS} ${SRC} ${TGT}'
++	color='BLUE'
++	after=['latex','pdflatex','xelatex']
++class pdf2ps(Task.Task):
++	run_str='${PDF2PS} ${PDF2PSFLAGS} ${SRC} ${TGT}'
++	color='BLUE'
++	after=['latex','pdflatex','xelatex']
++def apply_tex(self):
++	if not getattr(self,'type',None)in['latex','pdflatex','xelatex']:
++		self.type='pdflatex'
++	tree=self.bld
++	outs=Utils.to_list(getattr(self,'outs',[]))
++	self.env['PROMPT_LATEX']=getattr(self,'prompt',1)
++	deps_lst=[]
++	if getattr(self,'deps',None):
++		deps=self.to_list(self.deps)
++		for filename in deps:
++			n=self.path.find_resource(filename)
++			if not n in deps_lst:deps_lst.append(n)
++	for node in self.to_nodes(self.source):
++		if self.type=='latex':
++			task=self.create_task('latex',node,node.change_ext('.dvi'))
++		elif self.type=='pdflatex':
++			task=self.create_task('pdflatex',node,node.change_ext('.pdf'))
++		elif self.type=='xelatex':
++			task=self.create_task('xelatex',node,node.change_ext('.pdf'))
++		task.env=self.env
++		if deps_lst:
++			try:
++				lst=tree.node_deps[task.uid()]
++				for n in deps_lst:
++					if not n in lst:
++						lst.append(n)
++			except KeyError:
++				tree.node_deps[task.uid()]=deps_lst
++		if self.type=='latex':
++			if'ps'in outs:
++				tsk=self.create_task('dvips',task.outputs,node.change_ext('.ps'))
++				tsk.env.env={'TEXINPUTS':node.parent.abspath()+os.pathsep+self.path.abspath()+os.pathsep+self.path.get_bld().abspath()}
++			if'pdf'in outs:
++				tsk=self.create_task('dvipdf',task.outputs,node.change_ext('.pdf'))
++				tsk.env.env={'TEXINPUTS':node.parent.abspath()+os.pathsep+self.path.abspath()+os.pathsep+self.path.get_bld().abspath()}
++		elif self.type=='pdflatex':
++			if'ps'in outs:
++				self.create_task('pdf2ps',task.outputs,node.change_ext('.ps'))
++	self.source=[]
++def configure(self):
++	v=self.env
++	for p in'tex latex pdflatex xelatex bibtex dvips dvipdf ps2pdf makeindex pdf2ps'.split():
++		try:
++			self.find_program(p,var=p.upper())
++		except self.errors.ConfigurationError:
++			pass
++	v['DVIPSFLAGS']='-Ppdf'
++
++feature('tex')(apply_tex)
++before_method('process_source')(apply_tex)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/vala.py
+@@ -0,0 +1,216 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os.path,shutil,re
++from waflib import Context,Task,Utils,Logs,Options,Errors
++from waflib.TaskGen import extension
++from waflib.Configure import conf
++class valac(Task.Task):
++	vars=["VALAC","VALAC_VERSION","VALAFLAGS"]
++	ext_out=['.h']
++	def run(self):
++		env=self.env
++		cmd=[env['VALAC'],'-C','--quiet']
++		cmd.extend(Utils.to_list(env['VALAFLAGS']))
++		if self.threading:
++			cmd.append('--thread')
++		if self.profile:
++			cmd.append('--profile=%s'%self.profile)
++		if self.target_glib:
++			cmd.append('--target-glib=%s'%self.target_glib)
++		if self.is_lib:
++			cmd.append('--library='+self.target)
++			for x in self.outputs:
++				if x.name.endswith('.h'):
++					cmd.append('--header='+x.name)
++			if self.gir:
++				cmd.append('--gir=%s.gir'%self.gir)
++		for vapi_dir in self.vapi_dirs:
++			cmd.append('--vapidir=%s'%vapi_dir)
++		for package in self.packages:
++			cmd.append('--pkg=%s'%package)
++		for package in self.packages_private:
++			cmd.append('--pkg=%s'%package)
++		for define in self.vala_defines:
++			cmd.append('--define=%s'%define)
++		cmd.extend([a.abspath()for a in self.inputs])
++		ret=self.exec_command(cmd,cwd=self.outputs[0].parent.abspath())
++		if ret:
++			return ret
++		for x in self.outputs:
++			if id(x.parent)!=id(self.outputs[0].parent):
++				shutil.move(self.outputs[0].parent.abspath()+os.sep+x.name,x.abspath())
++		if self.packages and getattr(self,'deps_node',None):
++			self.deps_node.write('\n'.join(self.packages))
++		return ret
++def vala_file(self,node):
++	valatask=getattr(self,"valatask",None)
++	if not valatask:
++		def _get_api_version():
++			api_version='1.0'
++			if hasattr(Context.g_module,'API_VERSION'):
++				version=Context.g_module.API_VERSION.split(".")
++				if version[0]=="0":
++					api_version="0."+version[1]
++				else:
++					api_version=version[0]+".0"
++			return api_version
++		valatask=self.create_task('valac')
++		self.valatask=valatask
++		self.includes=Utils.to_list(getattr(self,'includes',[]))
++		self.uselib=self.to_list(getattr(self,'uselib',[]))
++		valatask.packages=[]
++		valatask.packages_private=Utils.to_list(getattr(self,'packages_private',[]))
++		valatask.vapi_dirs=[]
++		valatask.target=self.target
++		valatask.threading=False
++		valatask.install_path=getattr(self,'install_path','')
++		valatask.profile=getattr(self,'profile','gobject')
++		valatask.vala_defines=getattr(self,'vala_defines',[])
++		valatask.target_glib=None
++		valatask.gir=getattr(self,'gir',None)
++		valatask.gir_path=getattr(self,'gir_path','${DATAROOTDIR}/gir-1.0')
++		valatask.vapi_path=getattr(self,'vapi_path','${DATAROOTDIR}/vala/vapi')
++		valatask.pkg_name=getattr(self,'pkg_name',self.env['PACKAGE'])
++		valatask.header_path=getattr(self,'header_path','${INCLUDEDIR}/%s-%s'%(valatask.pkg_name,_get_api_version()))
++		valatask.install_binding=getattr(self,'install_binding',True)
++		valatask.is_lib=False
++		if not'cprogram'in self.features:
++			valatask.is_lib=True
++		packages=Utils.to_list(getattr(self,'packages',[]))
++		vapi_dirs=Utils.to_list(getattr(self,'vapi_dirs',[]))
++		includes=[]
++		if hasattr(self,'use'):
++			local_packages=Utils.to_list(self.use)[:]
++			seen=[]
++			while len(local_packages)>0:
++				package=local_packages.pop()
++				if package in seen:
++					continue
++				seen.append(package)
++				try:
++					package_obj=self.bld.get_tgen_by_name(package)
++				except Errors.WafError:
++					continue
++				package_name=package_obj.target
++				package_node=package_obj.path
++				package_dir=package_node.path_from(self.path)
++				for task in package_obj.tasks:
++					for output in task.outputs:
++						if output.name==package_name+".vapi":
++							valatask.set_run_after(task)
++							if package_name not in packages:
++								packages.append(package_name)
++							if package_dir not in vapi_dirs:
++								vapi_dirs.append(package_dir)
++							if package_dir not in includes:
++								includes.append(package_dir)
++				if hasattr(package_obj,'use'):
++					lst=self.to_list(package_obj.use)
++					lst.reverse()
++					local_packages=[pkg for pkg in lst if pkg not in seen]+local_packages
++		valatask.packages=packages
++		for vapi_dir in vapi_dirs:
++			try:
++				valatask.vapi_dirs.append(self.path.find_dir(vapi_dir).abspath())
++				valatask.vapi_dirs.append(self.path.find_dir(vapi_dir).get_bld().abspath())
++			except AttributeError:
++				Logs.warn("Unable to locate Vala API directory: '%s'"%vapi_dir)
++		self.includes.append(self.bld.srcnode.abspath())
++		self.includes.append(self.bld.bldnode.abspath())
++		for include in includes:
++			try:
++				self.includes.append(self.path.find_dir(include).abspath())
++				self.includes.append(self.path.find_dir(include).get_bld().abspath())
++			except AttributeError:
++				Logs.warn("Unable to locate include directory: '%s'"%include)
++		if valatask.profile=='gobject':
++			if hasattr(self,'target_glib'):
++				Logs.warn('target_glib on vala tasks is not supported --vala-target-glib=MAJOR.MINOR from the vala tool options')
++			if getattr(Options.options,'vala_target_glib',None):
++				valatask.target_glib=Options.options.vala_target_glib
++			if not'GOBJECT'in self.uselib:
++				self.uselib.append('GOBJECT')
++		if hasattr(self,'threading'):
++			if valatask.profile=='gobject':
++				valatask.threading=self.threading
++				if not'GTHREAD'in self.uselib:
++					self.uselib.append('GTHREAD')
++			else:
++				Logs.warn("Profile %s does not have threading support"%valatask.profile)
++		if valatask.is_lib:
++			valatask.outputs.append(self.path.find_or_declare('%s.h'%self.target))
++			valatask.outputs.append(self.path.find_or_declare('%s.vapi'%self.target))
++			if valatask.gir:
++				valatask.outputs.append(self.path.find_or_declare('%s.gir'%self.gir))
++			if valatask.packages:
++				d=self.path.find_or_declare('%s.deps'%self.target)
++				valatask.outputs.append(d)
++				valatask.deps_node=d
++	valatask.inputs.append(node)
++	c_node=node.change_ext('.c')
++	valatask.outputs.append(c_node)
++	self.source.append(c_node)
++	if valatask.is_lib and valatask.install_binding:
++		headers_list=[o for o in valatask.outputs if o.suffix()==".h"]
++		try:
++			self.install_vheader.source=headers_list
++		except AttributeError:
++			self.install_vheader=self.bld.install_files(valatask.header_path,headers_list,self.env)
++		vapi_list=[o for o in valatask.outputs if(o.suffix()in(".vapi",".deps"))]
++		try:
++			self.install_vapi.source=vapi_list
++		except AttributeError:
++			self.install_vapi=self.bld.install_files(valatask.vapi_path,vapi_list,self.env)
++		gir_list=[o for o in valatask.outputs if o.suffix()==".gir"]
++		try:
++			self.install_gir.source=gir_list
++		except AttributeError:
++			self.install_gir=self.bld.install_files(valatask.gir_path,gir_list,self.env)
++valac=Task.update_outputs(valac)
++def find_valac(self,valac_name,min_version):
++	valac=self.find_program(valac_name,var='VALAC')
++	try:
++		output=self.cmd_and_log(valac+' --version')
++	except Exception:
++		valac_version=None
++	else:
++		ver=re.search(r'\d+.\d+.\d+',output).group(0).split('.')
++		valac_version=tuple([int(x)for x in ver])
++	self.msg('Checking for %s version >= %r'%(valac_name,min_version),valac_version,valac_version and valac_version>=min_version)
++	if valac and valac_version<min_version:
++		self.fatal("%s version %r is too old, need >= %r"%(valac_name,valac_version,min_version))
++	self.env['VALAC_VERSION']=valac_version
++	return valac
++def check_vala(self,min_version=(0,8,0),branch=None):
++	if not branch:
++		branch=min_version[:2]
++	try:
++		find_valac(self,'valac-%d.%d'%(branch[0],branch[1]),min_version)
++	except self.errors.ConfigurationError:
++		find_valac(self,'valac',min_version)
++def check_vala_deps(self):
++	if not self.env['HAVE_GOBJECT']:
++		pkg_args={'package':'gobject-2.0','uselib_store':'GOBJECT','args':'--cflags --libs'}
++		if getattr(Options.options,'vala_target_glib',None):
++			pkg_args['atleast_version']=Options.options.vala_target_glib
++		self.check_cfg(**pkg_args)
++	if not self.env['HAVE_GTHREAD']:
++		pkg_args={'package':'gthread-2.0','uselib_store':'GTHREAD','args':'--cflags --libs'}
++		if getattr(Options.options,'vala_target_glib',None):
++			pkg_args['atleast_version']=Options.options.vala_target_glib
++		self.check_cfg(**pkg_args)
++def configure(self):
++	self.load('gnu_dirs')
++	self.check_vala_deps()
++	self.check_vala()
++def options(opt):
++	opt.load('gnu_dirs')
++	valaopts=opt.add_option_group('Vala Compiler Options')
++	valaopts.add_option('--vala-target-glib',default=None,dest='vala_target_glib',metavar='MAJOR.MINOR',help='Target version of glib for Vala GObject code generation')
++
++extension('.vala','.gs')(vala_file)
++conf(find_valac)
++conf(check_vala)
++conf(check_vala_deps)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/waf_unit_test.py
+@@ -0,0 +1,79 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys
++from waflib.TaskGen import feature,after_method
++from waflib import Utils,Task,Logs,Options
++testlock=Utils.threading.Lock()
++def make_test(self):
++	if getattr(self,'link_task',None):
++		self.create_task('utest',self.link_task.outputs)
++class utest(Task.Task):
++	color='PINK'
++	after=['vnum','inst']
++	vars=[]
++	def runnable_status(self):
++		ret=super(utest,self).runnable_status()
++		if ret==Task.SKIP_ME:
++			if getattr(Options.options,'all_tests',False):
++				return Task.RUN_ME
++		return ret
++	def run(self):
++		filename=self.inputs[0].abspath()
++		self.ut_exec=getattr(self,'ut_exec',[filename])
++		if getattr(self.generator,'ut_fun',None):
++			self.generator.ut_fun(self)
++		try:
++			fu=getattr(self.generator.bld,'all_test_paths')
++		except AttributeError:
++			fu=os.environ.copy()
++			self.generator.bld.all_test_paths=fu
++			lst=[]
++			for g in self.generator.bld.groups:
++				for tg in g:
++					if getattr(tg,'link_task',None):
++						lst.append(tg.link_task.outputs[0].parent.abspath())
++			def add_path(dct,path,var):
++				dct[var]=os.pathsep.join(Utils.to_list(path)+[os.environ.get(var,'')])
++			if Utils.is_win32:
++				add_path(fu,lst,'PATH')
++			elif Utils.unversioned_sys_platform()=='darwin':
++				add_path(fu,lst,'DYLD_LIBRARY_PATH')
++				add_path(fu,lst,'LD_LIBRARY_PATH')
++			else:
++				add_path(fu,lst,'LD_LIBRARY_PATH')
++		cwd=getattr(self.generator,'ut_cwd','')or self.inputs[0].parent.abspath()
++		proc=Utils.subprocess.Popen(self.ut_exec,cwd=cwd,env=fu,stderr=Utils.subprocess.PIPE,stdout=Utils.subprocess.PIPE)
++		(stdout,stderr)=proc.communicate()
++		tup=(filename,proc.returncode,stdout,stderr)
++		self.generator.utest_result=tup
++		testlock.acquire()
++		try:
++			bld=self.generator.bld
++			Logs.debug("ut: %r",tup)
++			try:
++				bld.utest_results.append(tup)
++			except AttributeError:
++				bld.utest_results=[tup]
++		finally:
++			testlock.release()
++def summary(bld):
++	lst=getattr(bld,'utest_results',[])
++	if lst:
++		Logs.pprint('CYAN','execution summary')
++		total=len(lst)
++		tfail=len([x for x in lst if x[1]])
++		Logs.pprint('CYAN','  tests that pass %d/%d'%(total-tfail,total))
++		for(f,code,out,err)in lst:
++			if not code:
++				Logs.pprint('CYAN','    %s'%f)
++		Logs.pprint('CYAN','  tests that fail %d/%d'%(tfail,total))
++		for(f,code,out,err)in lst:
++			if code:
++				Logs.pprint('CYAN','    %s'%f)
++def options(opt):
++	opt.add_option('--alltests',action='store_true',default=False,help='Exec all unit tests',dest='all_tests')
++
++feature('test')(make_test)
++after_method('apply_link')(make_test)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/winres.py
+@@ -0,0 +1,34 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib import Task
++from waflib.TaskGen import extension
++def rc_file(self,node):
++	obj_ext='.rc.o'
++	if self.env['WINRC_TGT_F']=='/fo':
++		obj_ext='.res'
++	rctask=self.create_task('winrc',node,node.change_ext(obj_ext))
++	try:
++		self.compiled_tasks.append(rctask)
++	except AttributeError:
++		self.compiled_tasks=[rctask]
++class winrc(Task.Task):
++	run_str='${WINRC} ${WINRCFLAGS} ${CPPPATH_ST:INCPATHS} ${DEFINES_ST:DEFINES} ${WINRC_TGT_F} ${TGT} ${WINRC_SRC_F} ${SRC}'
++	color='BLUE'
++def configure(conf):
++	v=conf.env
++	v['WINRC_TGT_F']='-o'
++	v['WINRC_SRC_F']='-i'
++	if not conf.env.WINRC:
++		if v.CC_NAME=='msvc':
++			conf.find_program('RC',var='WINRC',path_list=v['PATH'])
++			v['WINRC_TGT_F']='/fo'
++			v['WINRC_SRC_F']=''
++		else:
++			conf.find_program('windres',var='WINRC',path_list=v['PATH'])
++	if not conf.env.WINRC:
++		conf.fatal('winrc was not found!')
++	v['WINRCFLAGS']=[]
++
++extension('.rc')(rc_file)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/xlc.py
+@@ -0,0 +1,46 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_xlc(conf):
++	cc=conf.find_program(['xlc_r','xlc'],var='CC')
++	cc=conf.cmd_to_list(cc)
++	conf.get_xlc_version(cc)
++	conf.env.CC_NAME='xlc'
++	conf.env.CC=cc
++def xlc_common_flags(conf):
++	v=conf.env
++	v['CC_SRC_F']=[]
++	v['CC_TGT_F']=['-c','-o']
++	if not v['LINK_CC']:v['LINK_CC']=v['CC']
++	v['CCLNK_SRC_F']=[]
++	v['CCLNK_TGT_F']=['-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['RPATH_ST']='-Wl,-rpath,%s'
++	v['SONAME_ST']=[]
++	v['SHLIB_MARKER']=[]
++	v['STLIB_MARKER']=[]
++	v['LINKFLAGS_cprogram']=['-Wl,-brtl']
++	v['cprogram_PATTERN']='%s'
++	v['CFLAGS_cshlib']=['-fPIC']
++	v['LINKFLAGS_cshlib']=['-G','-Wl,-brtl,-bexpfull']
++	v['cshlib_PATTERN']='lib%s.so'
++	v['LINKFLAGS_cstlib']=[]
++	v['cstlib_PATTERN']='lib%s.a'
++def configure(conf):
++	conf.find_xlc()
++	conf.find_ar()
++	conf.xlc_common_flags()
++	conf.cc_load_tools()
++	conf.cc_add_flags()
++	conf.link_add_flags()
++
++conf(find_xlc)
++conf(xlc_common_flags)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Tools/xlcxx.py
+@@ -0,0 +1,46 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++from waflib.Tools import ccroot,ar
++from waflib.Configure import conf
++def find_xlcxx(conf):
++	cxx=conf.find_program(['xlc++_r','xlc++'],var='CXX')
++	cxx=conf.cmd_to_list(cxx)
++	conf.get_xlc_version(cxx)
++	conf.env.CXX_NAME='xlc++'
++	conf.env.CXX=cxx
++def xlcxx_common_flags(conf):
++	v=conf.env
++	v['CXX_SRC_F']=[]
++	v['CXX_TGT_F']=['-c','-o']
++	if not v['LINK_CXX']:v['LINK_CXX']=v['CXX']
++	v['CXXLNK_SRC_F']=[]
++	v['CXXLNK_TGT_F']=['-o']
++	v['CPPPATH_ST']='-I%s'
++	v['DEFINES_ST']='-D%s'
++	v['LIB_ST']='-l%s'
++	v['LIBPATH_ST']='-L%s'
++	v['STLIB_ST']='-l%s'
++	v['STLIBPATH_ST']='-L%s'
++	v['RPATH_ST']='-Wl,-rpath,%s'
++	v['SONAME_ST']=[]
++	v['SHLIB_MARKER']=[]
++	v['STLIB_MARKER']=[]
++	v['LINKFLAGS_cxxprogram']=['-Wl,-brtl']
++	v['cxxprogram_PATTERN']='%s'
++	v['CXXFLAGS_cxxshlib']=['-fPIC']
++	v['LINKFLAGS_cxxshlib']=['-G','-Wl,-brtl,-bexpfull']
++	v['cxxshlib_PATTERN']='lib%s.so'
++	v['LINKFLAGS_cxxstlib']=[]
++	v['cxxstlib_PATTERN']='lib%s.a'
++def configure(conf):
++	conf.find_xlcxx()
++	conf.find_ar()
++	conf.xlcxx_common_flags()
++	conf.cxx_load_tools()
++	conf.cxx_add_flags()
++	conf.link_add_flags()
++
++conf(find_xlcxx)
++conf(xlcxx_common_flags)
+\ No newline at end of file
+--- /dev/null
++++ ardour3/waflib/Utils.py
+@@ -0,0 +1,336 @@
++#! /usr/bin/env python
++# encoding: utf-8
++# WARNING! Do not edit! http://waf.googlecode.com/git/docs/wafbook/single.html#_obtaining_the_waf_file
++
++import os,sys,errno,traceback,inspect,re,shutil,datetime,gc
++try:
++	import subprocess
++except:
++	try:
++		import waflib.extras.subprocess as subprocess
++	except:
++		print("The subprocess module is missing (python2.3?):\n try calling 'waf update --files=subprocess'\n or add a copy of subprocess.py to the python libraries")
++try:
++	from collections import deque
++except ImportError:
++	class deque(list):
++		def popleft(self):
++			return self.pop(0)
++try:
++	import _winreg as winreg
++except:
++	try:
++		import winreg
++	except:
++		winreg=None
++from waflib import Errors
++try:
++	from collections import UserDict
++except:
++	from UserDict import UserDict
++try:
++	from hashlib import md5
++except:
++	try:
++		from md5 import md5
++	except:
++		pass
++try:
++	import threading
++except:
++	class threading(object):
++		pass
++	class Lock(object):
++		def acquire(self):
++			pass
++		def release(self):
++			pass
++	threading.Lock=threading.Thread=Lock
++else:
++	run_old=threading.Thread.run
++	def run(*args,**kwargs):
++		try:
++			run_old(*args,**kwargs)
++		except(KeyboardInterrupt,SystemExit):
++			raise
++		except:
++			sys.excepthook(*sys.exc_info())
++	threading.Thread.run=run
++SIG_NIL='iluvcuteoverload'
++O644=420
++O755=493
++rot_chr=['\\','|','/','-']
++rot_idx=0
++try:
++	from collections import defaultdict
++except ImportError:
++	class defaultdict(dict):
++		def __init__(self,default_factory):
++			super(defaultdict,self).__init__()
++			self.default_factory=default_factory
++		def __getitem__(self,key):
++			try:
++				return super(defaultdict,self).__getitem__(key)
++			except KeyError:
++				value=self.default_factory()
++				self[key]=value
++				return value
++is_win32=sys.platform in('win32','cli')
++indicator='\x1b[K%s%s%s\r'
++if is_win32 and'NOCOLOR'in os.environ:
++	indicator='%s%s%s\r'
++def readf(fname,m='r'):
++	f=open(fname,m)
++	try:
++		txt=f.read()
++	finally:
++		f.close()
++	return txt
++def h_file(filename):
++	f=open(filename,'rb')
++	m=md5()
++	try:
++		while filename:
++			filename=f.read(100000)
++			m.update(filename)
++	finally:
++		f.close()
++	return m.digest()
++try:
++	x=''.encode('hex')
++except:
++	import binascii
++	def to_hex(s):
++		ret=binascii.hexlify(s)
++		if not isinstance(ret,str):
++			ret=ret.decode('utf-8')
++		return ret
++else:
++	def to_hex(s):
++		return s.encode('hex')
++to_hex.__doc__="""
++Return the hexadecimal representation of a string
++
++:param s: string to convert
++:type s: string
++"""
++listdir=os.listdir
++if is_win32:
++	def listdir_win32(s):
++		if not s:
++			try:
++				import ctypes
++			except:
++				return[x+':\\'for x in list('ABCDEFGHIJKLMNOPQRSTUVWXYZ')]
++			else:
++				dlen=4
++				maxdrives=26
++				buf=ctypes.create_string_buffer(maxdrives*dlen)
++				ndrives=ctypes.windll.kernel32.GetLogicalDriveStringsA(maxdrives,ctypes.byref(buf))
++				return[buf.raw[4*i:4*i+3].decode('ascii')for i in range(int(ndrives/dlen))]
++		if len(s)==2 and s[1]==":":
++			s+=os.sep
++		if not os.path.isdir(s):
++			e=OSError()
++			e.errno=errno.ENOENT
++			raise e
++		return os.listdir(s)
++	listdir=listdir_win32
++def num2ver(ver):
++	if isinstance(ver,str):
++		ver=tuple(ver.split('.'))
++	if isinstance(ver,tuple):
++		ret=0
++		for i in range(4):
++			if i<len(ver):
++				ret+=256**(3-i)*int(ver[i])
++		return ret
++	return ver
++def ex_stack():
++	exc_type,exc_value,tb=sys.exc_info()
++	exc_lines=traceback.format_exception(exc_type,exc_value,tb)
++	return''.join(exc_lines)
++def to_list(sth):
++	if isinstance(sth,str):
++		return sth.split()
++	else:
++		return sth
++re_nl=re.compile('\r*\n',re.M)
++def str_to_dict(txt):
++	tbl={}
++	lines=re_nl.split(txt)
++	for x in lines:
++		x=x.strip()
++		if not x or x.startswith('#')or x.find('=')<0:
++			continue
++		tmp=x.split('=')
++		tbl[tmp[0].strip()]='='.join(tmp[1:]).strip()
++	return tbl
++def split_path(path):
++	return path.split('/')
++def split_path_cygwin(path):
++	if path.startswith('//'):
++		ret=path.split('/')[2:]
++		ret[0]='/'+ret[0]
++		return ret
++	return path.split('/')
++re_sp=re.compile('[/\\\\]')
++def split_path_win32(path):
++	if path.startswith('\\\\'):
++		ret=re.split(re_sp,path)[2:]
++		ret[0]='\\'+ret[0]
++		return ret
++	return re.split(re_sp,path)
++if sys.platform=='cygwin':
++	split_path=split_path_cygwin
++elif is_win32:
++	split_path=split_path_win32
++split_path.__doc__="""
++Split a path by / or \\. This function is not like os.path.split
++
++:type  path: string
++:param path: path to split
++:return:     list of strings
++"""
++def check_dir(path):
++	if not os.path.isdir(path):
++		try:
++			os.makedirs(path)
++		except OSError ,e:
++			if not os.path.isdir(path):
++				raise Errors.WafError('Cannot create the folder %r'%path,ex=e)
++def def_attrs(cls,**kw):
++	for k,v in kw.items():
++		if not hasattr(cls,k):
++			setattr(cls,k,v)
++def quote_define_name(s):
++	fu=re.compile("[^a-zA-Z0-9]").sub("_",s)
++	fu=fu.upper()
++	return fu
++def h_list(lst):
++	m=md5()
++	m.update(str(lst))
++	return m.digest()
++def h_fun(fun):
++	try:
++		return fun.code
++	except AttributeError:
++		try:
++			h=inspect.getsource(fun)
++		except IOError:
++			h="nocode"
++		try:
++			fun.code=h
++		except AttributeError:
++			pass
++		return h
++reg_subst=re.compile(r"(\\\\)|(\$\$)|\$\{([^}]+)\}")
++def subst_vars(expr,params):
++	def repl_var(m):
++		if m.group(1):
++			return'\\'
++		if m.group(2):
++			return'$'
++		try:
++			return params.get_flat(m.group(3))
++		except AttributeError:
++			return params[m.group(3)]
++	return reg_subst.sub(repl_var,expr)
++def destos_to_binfmt(key):
++	if key=='darwin':
++		return'mac-o'
++	elif key in('win32','cygwin','uwin','msys'):
++		return'pe'
++	return'elf'
++def unversioned_sys_platform():
++	s=sys.platform
++	if s=='java':
++		from java.lang import System
++		s=System.getProperty('os.name')
++		if s=='Mac OS X':
++			return'darwin'
++		elif s.startswith('Windows '):
++			return'win32'
++		elif s=='OS/2':
++			return'os2'
++		elif s=='HP-UX':
++			return'hpux'
++		elif s in('SunOS','Solaris'):
++			return'sunos'
++		else:s=s.lower()
++	if s=='powerpc':
++		return'darwin'
++	if s=='win32'or s.endswith('os2')and s!='sunos2':return s
++	return re.split('\d+$',s)[0]
++def nada(*k,**kw):
++	pass
++class Timer(object):
++	def __init__(self):
++		self.start_time=datetime.datetime.utcnow()
++	def __str__(self):
++		delta=datetime.datetime.utcnow()-self.start_time
++		days=int(delta.days)
++		hours=delta.seconds//3600
++		minutes=(delta.seconds-hours*3600)//60
++		seconds=delta.seconds-hours*3600-minutes*60+float(delta.microseconds)/1000/1000
++		result=''
++		if days:
++			result+='%dd'%days
++		if days or hours:
++			result+='%dh'%hours
++		if days or hours or minutes:
++			result+='%dm'%minutes
++		return'%s%.3fs'%(result,seconds)
++if is_win32:
++	old=shutil.copy2
++	def copy2(src,dst):
++		old(src,dst)
++		shutil.copystat(src,dst)
++	setattr(shutil,'copy2',copy2)
++if os.name=='java':
++	try:
++		gc.disable()
++		gc.enable()
++	except NotImplementedError:
++		gc.disable=gc.enable
++def read_la_file(path):
++	sp=re.compile(r'^([^=]+)=\'(.*)\'$')
++	dc={}
++	for line in readf(path).splitlines():
++		try:
++			_,left,right,_=sp.split(line.strip())
++			dc[left]=right
++		except ValueError:
++			pass
++	return dc
++def nogc(fun):
++	def f(*k,**kw):
++		try:
++			gc.disable()
++			ret=fun(*k,**kw)
++		finally:
++			gc.enable()
++		return ret
++	f.__doc__=fun.__doc__
++	return f
++def run_once(fun):
++	cache={}
++	def wrap(k):
++		try:
++			return cache[k]
++		except KeyError:
++			ret=fun(k)
++			cache[k]=ret
++			return ret
++	wrap.__cache__=cache
++	return wrap
++def get_registry_app_path(key,filename):
++	if not winreg:
++		return None
++	try:
++		result=winreg.QueryValue(key,"Software\\Microsoft\\Windows\\CurrentVersion\\App Paths\\%s.exe"%filename[0])
++	except WindowsError:
++		pass
++	else:
++		if os.path.isfile(result):
++			return result
diff --git a/debian/patches/wscript.patch b/debian/patches/wscript.patch
new file mode 100644
index 0000000..716529c
--- /dev/null
+++ b/debian/patches/wscript.patch
@@ -0,0 +1,41 @@
+From: Adrian Knoth <adi at drcomp.erfurt.thur.de>
+Forwarded: Not-Needed
+Last-Update: 2015-04-19
+Description: Always rely on the version from the Debian changelog
+ We generate libs/ardour/revision.cc from debian/rules.
+ Adapt the wscript to parse our custom format. To avoid
+ confusion with any existing .git directory on a packager's
+ machine, disable the git version check and pretend it's a
+ tarball.
+
+--- ardour3.orig/wscript
++++ ardour3/wscript
+@@ -142,24 +142,17 @@ def fetch_tarball_revision ():
+         remove_punctuation_map = dict((ord(char), None) for char in '";')
+         return content[1].decode('utf-8').strip().split(' ')[7].translate (remove_punctuation_map)
+ 
+-if os.path.isdir (os.path.join(os.getcwd(), '.git')):
+-    rev = fetch_git_revision ()
+-else:
+-    rev = fetch_tarball_revision ()
++rev = fetch_tarball_revision ()
+ 
+ #
+ # rev is now of the form MAJOR.MINOR[-rcX]-rev-commit
+ # or, if right at the same rev as a release, MAJOR.MINOR[-rcX]
+ #
+ 
+-parts = rev.split ('.', 1)
++parts = rev.split ('.')
+ MAJOR = parts[0]
+-other = parts[1].split('-', 1)
+-MINOR = other[0]
+-if len(other) > 1:
+-    MICRO = other[1].rsplit('-',1)[0].replace('-','.')
+-else:
+-    MICRO = '0'
++MINOR = parts[1].split('~', 1)[0]
++MICRO = '0'
+ 
+ V = MAJOR + '.' + MINOR + '.' + MICRO
+ VERSION = V
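
For illustration only (not part of the patch above): a minimal standalone sketch
of what the simplified version parsing does, assuming a revision string of the
kind debian/rules writes into libs/ardour/revision.cc — the example value
"4.0~rc1-1" is hypothetical:

	# sketch of the patched wscript logic, run outside the build
	rev = "4.0~rc1-1"                   # hypothetical input read from revision.cc
	parts = rev.split('.')              # ['4', '0~rc1-1']
	MAJOR = parts[0]                    # '4'
	MINOR = parts[1].split('~', 1)[0]   # '0'  (anything after the first '~' is dropped)
	MICRO = '0'                         # hard-coded by the patch
	V = MAJOR + '.' + MINOR + '.' + MICRO
	print(V)                            # prints 4.0.0

In other words, only MAJOR.MINOR are taken from the Debian version and any
~rc/~dfsg suffix is ignored, which matches the "pretend it's a tarball"
approach described in the patch header.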
diff --git a/debian/rules b/debian/rules
index 31a548f..1826796 100755
--- a/debian/rules
+++ b/debian/rules
@@ -5,181 +5,163 @@
 # This software may be used and distributed according to the terms
 # of the GNU General Public License, incorporated herein by reference.
 
-include /usr/share/cdbs/1/class/langcore.mk
-include /usr/share/cdbs/1/rules/buildcore.mk
-include /usr/share/cdbs/1/rules/utils.mk
+# Path to the debian directory
+DEBIAN_DIR := $(shell echo ${MAKEFILE_LIST} | awk '{print $$1}' | xargs dirname )
+UPSTREAM_VERSION ?=$(shell uscan --dehs | sed -n 's/.*<upstream-version>\(.*\)<\/upstream-version>.*/\1/p')
+DFSG = dfsg1
+PKG = $(shell dpkg-parsechangelog | sed -ne 's/^Source: //p')
+
+-include /usr/share/cdbs/1/rules/upstream-tarball.mk
+-include /usr/share/cdbs/1/rules/utils.mk
 include /usr/share/cdbs/1/rules/debhelper.mk
+include /usr/share/cdbs/1/class/makefile.mk
+
 DEB_CLEAN_EXCLUDE=debian/tmp
-# temporary hack for unclean upstream codebase
-DEB_CLEAN_EXCLUDE += libs/clearlooks-newer/support.h.orig
 DEB_DESTDIR = $(CURDIR)/debian/tmp/
 
-NJOBS = $(DEB_PARALLEL_JOBS:%=-j%)
-
-ALTIVEC_OPT_FLAGS := -O3 -mcpu=7400 -maltivec -mabi=altivec -mhard-float -mpowerpc-gfxopt
-I686_OPT_FLAGS := -O3 -march=i686 -mmmx
-
-DEB_SCONS_EXTRA_FLAGS := \
-	PREFIX=/usr \
-	NLS=yes \
-	FREEDESKTOP=no \
-	$(NJOBS) \
-	SYSLIBS=yes \
-	FREESOUND=yes \
-	VST=0
-
-DEB_SCONS_NOOPT_FLAGS := DEBUG=no FPU_OPTIMIZATION=no
-ifneq (,$(findstring amd64,$(DEB_BUILD_ARCH)))
-DEB_SCONS_NOOPT_FLAGS := DEBUG=no FPU_OPTIMIZATION=yes
-endif
-
-ifneq (,$(findstring i386,$(DEB_BUILD_ARCH)))
-DEB_SCONS_NOOPT_FLAGS += DIST_TARGET=i386
-endif
-
-
-DEB_DH_STRIP_ARGS := --dbg-package=ardour-dbg
-
-
-DEB_SCONS_ENVVARS :=
-DEB_SCONS_INVOKE = $(DEB_SCONS_ENVVARS) scons 
-
-# For an out-of-tree build, we just cp -al all the needed files.
-# It seems these are enough
-BUILD_FILES = ardour.rc.in gtk2_ardour icons libs SConstruct templates tools vst
+# ignore oddly packaged bzip2 archive to not upset dpkg
+DEB_COPYRIGHT_CHECK_IGNORE_REGEX = ^\./waf|\./debian/(changelog|copyright(|_hints|_newhints))$
 
-common-build-arch:: debian/stamp-scons-build
-debian/stamp-scons-build: libs/ardour/svn_revision.cc
-	mkdir -p $(DEB_DESTDIR)/generic
-	mkdir -p build-generic
-	cp -alf $(BUILD_FILES) build-generic
-	cd build-generic && $(DEB_SCONS_INVOKE) 'ARCH=$(CFLAGS)' \
-		DESTDIR=$(DEB_DESTDIR)/generic \
-		$(DEB_SCONS_EXTRA_FLAGS) $(DEB_SCONS_NOOPT_FLAGS)
+DIST_TARGET = none
 ifneq (,$(findstring i386,$(DEB_BUILD_ARCH)))
-	mkdir -p $(DEB_DESTDIR)/i686
-	mkdir -p build-i686
-	cp -alf $(BUILD_FILES) build-i686
-	cd build-i686 && $(DEB_SCONS_INVOKE) 'ARCH=$(I686_OPT_FLAGS)' \
-		DESTDIR=$(DEB_DESTDIR)/i686 \
-		DEBUG=no $(DEB_SCONS_EXTRA_FLAGS) \
-		FPU_OPTIMIZATION=yes DIST_TARGET=i686
+	DIST_TARGET = i686
 endif
-ifneq (,$(findstring powerpc,$(DEB_BUILD_ARCH)))
-	mkdir -p $(DEB_DESTDIR)/altivec
-	mkdir -p build-altivec
-	cp -alf $(BUILD_FILES) build-altivec
-	cd build-altivec && $(DEB_SCONS_INVOKE) 'ARCH=$(ALTIVEC_OPT_FLAGS)' \
-		DESTDIR=$(DEB_DESTDIR)/altivec \
-		DEBUG=no $(DEB_SCONS_EXTRA_FLAGS)
-endif
-	touch $@
-
-libs/ardour/svn_revision.cc:
-	echo '#include "ardour/svn_revision.h"' > $@
-	echo "namespace ARDOUR { extern const char* svn_revision = \"${DEB_VERSION}\"; }" >> $@
-
-
-install/ardour::
-	cd build-generic && $(DEB_SCONS_INVOKE) 'ARCH=$(CFLAGS)' \
-		DESTDIR=$(DEB_DESTDIR)/generic \
-		$(DEB_SCONS_EXTRA_FLAGS) $(DEB_SCONS_NOOPT_FLAGS) \
-		install
-
-ifneq (,$(findstring i386,$(DEB_BUILD_ARCH)))
-install/ardour-i686::
-	cd build-i686 && $(DEB_SCONS_INVOKE) 'ARCH=$(I686_OPT_FLAGS)' \
-		DESTDIR=$(DEB_DESTDIR)/i686 \
-		DEBUG=no $(DEB_SCONS_EXTRA_FLAGS) \
-		FPU_OPTIMIZATION=yes DIST_TARGET=i686 \
-		install
-endif
-
-ifneq (,$(findstring powerpc,$(DEB_BUILD_ARCH)))
-install/ardour-altivec::
-	cd build-altivec && $(DEB_SCONS_INVOKE) 'ARCH=$(ALTIVEC_OPT_FLAGS)' \
-		DESTDIR=$(DEB_DESTDIR)/altivec \
-		DEBUG=no $(DEB_SCONS_EXTRA_FLAGS) install
+ifneq (,$(findstring amd64,$(DEB_BUILD_ARCH)))
+	DIST_TARGET = x86_64
 endif
 
-# this is bad but the only easy way to have ardour.rc generated from
-# ardour.rc.in
-common-install-indep:: debian/stamp-scons-build
-common-install-arch:: debian/stamp-scons-build
-
-clean:: scons-clean
-scons-clean::
-	$(MAKE) -f debian/rules reverse-config
-	rm -rf build-generic build-i686 build-altivec
-	rm -rf $(DEB_DESTDIR) debian/stamp-scons-build
-
-	rm -rf debian/ardour-dbg
-	rm -f gtk2_ardour/*.mo
-	rm -f libs/ardour/svn_revision.cc
+LD_LIBRARY_PATH += :$(DEB_DESTDIR)/usr/lib/ardour4/
+
+waf-configure-options = --lv2 \
+	--lxvst \
+	--freedesktop \
+	--configdir=/etc/ \
+	--noconfirm \
+	--prefix=/usr/ \
+	--with-backends=jack,alsa \
+	--no-phone-home \
+	--use-external-libs \
+	--dist-target=$(DIST_TARGET) \
+	--optimize \
+
+
+DEB_MAKE_PARALLEL = -j$(if $(DEB_PARALLEL_JOBS),$(DEB_PARALLEL_JOBS),1)
+DEB_MAKE_EXTRA_ARGS = -v --destdir=$(CURDIR)/debian/tmp $(DEB_MAKE_PARALLEL)
+DEB_MAKE_BUILD_TARGET = build i18n_mo
+DEB_MAKE_ENVVARS = CFLAGS="$(or $(CFLAGS_$(cdbs_curpkg)),$(CFLAGS))" CXXFLAGS="$(or $(CXXFLAGS_$(cdbs_curpkg)),$(CXXFLAGS))" CPPFLAGS="$(or $(CPPFLAGS_$(cdbs_curpkg)),$(CPPFLAGS))" LDFLAGS="$(or $(LDFLAGS_$(cdbs_curpkg)),$(LDFLAGS))"
+DEB_MAKE_INVOKE = $(DEB_MAKE_ENVVARS) $(CURDIR)/waf-light $(DEB_MAKE_EXTRA_ARGS)
+DEB_MAKE_INSTALL_TARGET = install
+
+
+clean::
+	rm -f autowaf.pyc
+	rm -f .lock-wscript .lock-waf_linux2_build
+	find waflib -name "*.pyc" -delete || true
+	find . -name "*.mo" -delete || true
+	rm -rf build
+	rm -f \
+		gtk2_ardour/version.cc\
+		gtk2_ardour/version.h\
+		libs/ardour/ardour/version.h\
+		libs/ardour/config_text.cc\
+		libs/ardour/svn_revision.cc\
+		libs/ardour/version.cc\
+		libs/gtkmm2ext/gtkmm2ext/version.h\
+		libs/gtkmm2ext/version.cc\
+		libs/midi++2/midi++/version.h\
+		libs/midi++2/version.cc\
+		libs/pbd/pbd/version.h\
+		libs/pbd/version.cc
+	rm -rf debian/tmp
+
+
+common-configure-arch common-configure-indep:: common-configure-impl
+common-configure-impl:: libs/ardour/revision.cc debian/stamp-waf-configure
+
+
+libs/ardour/revision.cc:
+	echo '#include "ardour/revision.h"' > $@
+	echo "namespace ARDOUR { const char* revision = \"${DEB_VERSION}\"; }" >> $@
+
+debian/stamp-waf-configure:
+	chmod +x ./waf-light
+	$(DEB_MAKE_INVOKE) configure $(waf-configure-options)
+	touch $@
+clean::
+	rm -f debian/stamp-waf-configure
+	rm -f libs/ardour/revision.cc
 
 # Needed at build time
 # (separated in build tools, core, Glib/GTK and audio dependencies)
 CDBS_BUILD_DEPENDS += , gettext,\
-					  intltool,\
-					  scons
-CDBS_BUILD_DEPENDS += , libboost-dev,\
-					  libcurl4-gnutls-dev,\
-					  libfftw3-dev,\
-					  libraptor1-dev (>= 1.4.21-5),\
-					  liblrdf0-dev (>= 0.4.0-4),\
-					  libsigc++-2.0-dev,\
+					  intltool
+CDBS_BUILD_DEPENDS += , libboost-dev (>= 1.49.0),\
+					  libcurl4-gnutls-dev (>= 7.25.0),\
+					  libfftw3-dev (>= 3.3.1),\
+					  libraptor2-dev (>= 2.0.9),\
+					  librdf0-dev (>= 1.0.15),\
+					  liblrdf0-dev (>= 0.4.0), \
+					  libserd-dev (>= 0.18.2~),\
+					  libsord-dev (>= 0.12.0~),\
+					  libsuil-dev (>= 0.6.10~),\
+					  liblilv-dev,\
+					  libsratom-dev (>= 0.4.2~),\
+					  libsigc++-2.0-dev (>= 2.2.10),\
 					  libusb-dev,\
+					  uuid-dev,\
 					  libxml2-dev (>= 2.5.7),\
-					  librasqal3-dev | librasqal2-dev (>= 0.9.14)
-CDBS_BUILD_DEPENDS += , libcairomm-1.0-dev (>= 1.2.4),\
-					  libglade2-dev,\
-					  libglademm-2.4-dev,\
-					  libglib2.0-dev,\
-					  libgnomecanvas2-dev,\
-					  libgnomecanvasmm-2.6-dev,\
-					  libgtkmm-2.4-dev,\
-					  libpango1.0-dev
+					  librasqal3-dev (>= 0.9.28),\
+					  libcwiid-dev
+CDBS_BUILD_DEPENDS += , libcairomm-1.0-dev (>= 1.10.0),\
+					  libgnomecanvas2-dev (>= 2.30.3),\
+					  libgnomecanvasmm-2.6-dev (>= 2.26.0),\
+					  libgtkmm-2.4-dev (>= 2.24.2),\
+					  libpangomm-1.4-dev (>= 2.28.4)
 CDBS_BUILD_DEPENDS += , ladspa-sdk (>= 1.1-2),\
-					  libasound2-dev (>= 0.9.4) [linux-any],\
-					  liboss-salsa-dev [!linux-any],\
-					  libaubio-dev (>= 0.4.0),\
+					  libasound2-dev (>= 0.9.4),\
+					  libaubio-dev (>= 0.3.2),\
 					  libjack-dev,\
-					  liblo-dev,\
-					  libsuil-dev,\
-					  libsamplerate0-dev,\
-					  libsndfile1-dev,\
+					  liblo-dev (>= 0.26~),\
+					  libltc-dev, \
+					  librubberband-dev, \
+					  libsamplerate0-dev (>= 0.1.8),\
+					  libsndfile1-dev (>= 1.0.25),\
 					  libsoundtouch-dev (>= 1.5.0),\
-					  lv2-dev,\
-					  liblilv-dev,\
+					  libtagc0-dev, \
+					  lv2-dev (>= 1.2.0),\
 					  vamp-plugin-sdk (>=2.1)
+CDBS_BUILD_DEPENDS += , python-setuptools,\
+					  python-isodate,\
+					  libpcre3-dev,\
+					  python-rdflib
+
 
 # Needed always/often/sometimes at runtime
-CDBS_DEPENDS_ALL = python, python-twisted, python-gtk2, jackd
+CDBS_DEPENDS_ALL = jackd
 CDBS_RECOMMENDS_ALL = iceweasel | www-browser
 CDBS_SUGGESTS_ALL = jamin, qjackctl
 
 # Ensure only one variant is installed at a time
 CDBS_PROVIDES_ardour-altivec = ardour
 CDBS_PROVIDES_ardour-i686 = ardour
-CDBS_CONFLICTS_ALL = ardour
-CDBS_REPLACES_ALL = ardour
-
-# Transitional quirk: ardour-gtk renamed to ardour
-# TODO: drop after Squeeze (was introduced before Lenny)
-CDBS_CONFLICTS_ardour += , ardour-gtk
-CDBS_REPLACES_ardour += , ardour-gtk
-
-# Transitional quirk: ardour-gtk-altivec renamed to ardour-altivec
-# TODO: drop after Squeeze (was introduced before Lenny)
-CDBS_CONFLICTS_ardour-altivec += , ardour-gtk-altivec
-CDBS_REPLACES_ardour-altivec += , ardour-gtk-altivec
-
-# Transitional quirk: ardour-gtk-i686 renamed to ardour-i686
-# TODO: drop after Squeeze (was introduced before Lenny)
-CDBS_CONFLICTS_ardour-i686 += , ardour-gtk-i686
-CDBS_REPLACES_ardour-i686 += , ardour-gtk-i686
 
 # Quirk for derivatives using different packaging name
 ifeq (Ubuntu,$(shell dpkg-vendor --query Vendor))
 CDBS_RECOMMENDS_ALL = firefox | www-browser
 endif
+
+get-orig-source:
+	uscan --noconf --force-download --rename --download-current-version --destdir=.
+	tar -xf $(PKG)_$(UPSTREAM_VERSION).orig.tar.gz
+	mv ardour-$(UPSTREAM_VERSION) $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/waf
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/.git
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/.gitignore
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/MSVCvst_scan
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/MSVCardour3
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/icons/win32/resource
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)/icons/win32/msvc_resources.rc.in
+	XZ_OPT=-9 tar cJf ../$(PKG)_$(UPSTREAM_VERSION)~$(DFSG).orig.tar.xz $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)
+	rm -rf $(PKG)-$(UPSTREAM_VERSION)~$(DFSG)
+	rm -rf $(PKG)_$(UPSTREAM_VERSION).orig.tar.gz
+
diff --git a/debian/watch b/debian/watch
index 7c384b9..6f5c2bb 100644
--- a/debian/watch
+++ b/debian/watch
@@ -1,4 +1,3 @@
 version=3
-# Homepage  Pattern  [Version  [Action]]
-http://www.ardour.org/source_downloads http://releases.ardour.org/ardour-(2.[\d\.]+)\.tar\.bz2
-
+opts=filenamemangle=s/.+\/v?(\d\S*)\.tar\.gz/ardour-$1\.tar\.gz/,uversionmangle=s/-rc/~rc/,dversionmangle=s/~dfsg.*// \
+  https://github.com/Ardour/ardour/tags .*/v?(\d\S*)\.tar\.gz

-- 
ardour Debian packaging


