[liborcus] 03/04: New upstream version 0.13.1
From: Rene Engelhard <rene at moszumanska.debian.org>
Date: Sat Nov 18 19:52:15 UTC 2017
This is an automated email from the git hooks/post-receive script.
rene pushed a commit to branch experimental
in repository liborcus.
commit 8899494ac2bbdb7ea888a8533e7680147c18c76e
Author: Rene Engelhard <rene at rene-engelhard.de>
Date: Sat Nov 18 20:48:27 2017 +0100
New upstream version 0.13.1
---
AUTHORS | 1 +
CHANGELOG | 33 +-
ChangeLog | 154 ++++
Makefile.am | 4 +-
Makefile.in | 6 +-
README.md | 2 +
config.guess | 233 ++----
config.sub | 60 +-
configure | 25 +-
configure.ac | 3 +-
doc/cli/orcus_csv.rst | 22 +-
doc/cli/orcus_gnumeric.rst | 15 +-
doc/cli/orcus_ods.rst | 15 +-
doc/cli/orcus_xls_xml.rst | 15 +-
doc/cli/orcus_xlsx.rst | 15 +-
doc/conf.py | 6 +-
doc/cpp/model/json.rst | 27 +-
doc/overview/json.rst | 367 ++++++++-
doc_example/Makefile.am | 37 +
doc_example/Makefile.in | 1114 +++++++++++++++++++++++++++
doc_example/json_doc_1.cpp | 56 ++
doc_example/json_doc_2.cpp | 195 +++++
doc_example/json_parser_1.cpp | 88 +++
include/orcus/css_parser.hpp | 10 +-
include/orcus/json_document_tree.hpp | 26 +-
include/orcus/orcus_xml.hpp | 3 +
install-sh | 373 +++++----
py-compile | 170 ----
slickedit/orcus.vpj | 5 +
src/liborcus/json_document_tree.cpp | 2 +-
src/liborcus/orcus_xml.cpp | 17 +
src/spreadsheet/sheet.cpp | 41 +-
test-driver | 15 +-
test/css/basic8.css | 2 +-
test/xml-mapped/content-basic/flat/data.txt | 22 -
test/xml-mapped/fuel-economy/flat/data.txt | 6 -
36 files changed, 2486 insertions(+), 699 deletions(-)
diff --git a/AUTHORS b/AUTHORS
index 2ce6d6c..6a5ee05 100644
--- a/AUTHORS
+++ b/AUTHORS
@@ -6,6 +6,7 @@ Tomas Chvatal <tchvatal at suse.cz>
Fridrich Štrba <fridrich.strba at bluewin.ch>
Caolán McNamara <caolanm at redhat.com>
Stephan Bergmann <sbergman at redhat.com>
+Miklos Vajna <vmiklos at collabora.co.uk>
Eike Rathke <erack at redhat.com>
Maks Naumov <maksqwe1 at ukr.net>
Dmitry Roshchin <dmitry at roshchin.org>
diff --git a/CHANGELOG b/CHANGELOG
index 079bacf..b6060ac 100644
--- a/CHANGELOG
+++ b/CHANGELOG
@@ -1,4 +1,35 @@
-orcus 0.13.0 (not released yet)
+orcus 0.13.1
+
+ * use a more efficient way to set format ranges in spreadsheet model.
+
+ * support single quoted strings in the css parser.
+
+orcus 0.13.0
+
+ * fix incorrect parsing of XML 1.0 documents that don't include
+ header declarations.
+
+ * fix incorrect parsing of XML elements and attributes whose names
+ start with an underscore.
+
+ * orcus-csv: add an option to split content into multiple sheets in
+ case it doesn't fit in one sheet.
+
+ * add csv dump mode for all spreadsheet document based filter
+ commands.
+
+ * orcus-ods: suppress debug outputs unless the debug flag is set.
+
+ * orcus-xlsx: correctly import boolean cell values.
+
+ * add experimental cmake-based build support, primarily for Windows.
+
+ * add initial support for importing select sheet view settings in
+ xlsx and xls-xml.
+
+ * add API for directly constructing json document trees.
+
+ * support import of cell formats for xls-xml.
* support single-quoted attribute values in the sax xml parser.
diff --git a/ChangeLog b/ChangeLog
index 7bdeca0..1f7504c 100644
--- a/ChangeLog
+++ b/ChangeLog
@@ -1,3 +1,157 @@
+2017-11-15 Kohei Yoshida <kohei.yoshida at gmail.com> [8d9fa37666f313871b90382b4a6c495f199d22d7]
+
+ Update the changelog.
+
+
+2017-11-15 Kohei Yoshida <kohei.yoshida at gmail.com> [2ab7da8f77bfeab29bcadadd0177790aa26b091d]
+
+ Fix 'make distcheck'
+
+
+2017-11-15 Kohei Yoshida <kohei.yoshida at gmail.com> [8d3f4917148f388b028e19dd48db406797d06a07]
+
+ Up the version to 0.13.1.
+
+
+2017-09-29 Markus Mohrhard <markus.mohrhard at googlemail.com> [fac81d2299c16c288e9ae19ec08d8d076c5f1681]
+
+ Fixes #46. Use efficient way to set format ranges in spreadsheet model
+
+ (cherry picked from commit bd9d921c671777cb0942aba210d4d03940747961)
+
+2017-09-07 Kohei Yoshida <kohei.yoshida at gmail.com> [eafd86f475a7de9d9d43df472f453e068cb06141]
+
+ Reflect the latest patch from Miklos.
+
+
+2017-09-07 Miklos Vajna <vmiklos at collabora.co.uk> [cc8356089eb7ee226521162a23c32d48407833bb]
+
+ css parser: handle single quotes for property values
+
+ And modify one of the testcases where there are already two
+ double-quoted strings, so that one of them is single-quoted. This way
+ make check fails without the code change.
+
+2017-08-30 Kohei Yoshida <kohei.yoshida at gmail.com> [4d24143521d469b90541521b37d95862619ddcc2]
+
+ Add class descriptions.
+
+
+2017-08-30 Kohei Yoshida <kohei.yoshida at gmail.com> [715c91c6fc1ab38c5e9bd3fe354a85bbc4856c38]
+
+ doc: add example for json array's push_back() method.
+
+
+2017-08-29 Kohei Yoshida <kohei.yoshida at gmail.com> [891b21809ef02a91decec00e6b83b1de6f1f6ce6]
+
+ Add some class descriptions.
+
+
+2017-08-29 Kohei Yoshida <kohei.yoshida at gmail.com> [d048e01866588056fb3882116feec219982f1846]
+
+ doc: add the rest of the code examples and annotate them.
+
+
+2017-08-28 Kohei Yoshida <kohei.yoshida at gmail.com> [25196a7de1046ad68332c120d32d2ed3edce1835]
+
+ doc: this flows better.
+
+
+2017-08-28 Kohei Yoshida <kohei.yoshida at gmail.com> [4951f1b0a9bf526cb5b55a51412d5e6716a4c2f1]
+
+ doc: add more examples.
+
+
+2017-08-28 Kohei Yoshida <kohei.yoshida at gmail.com> [35801917a0985161aa3fac0a2366b2527df11511]
+
+ doc: add more code snippets and narratives.
+
+
+2017-08-26 Kohei Yoshida <kohei.yoshida at gmail.com> [60b505a3440950a80c0020f6572d9ca791d0ff95]
+
+ doc: edit and add some content.
+
+
+2017-08-26 Kohei Yoshida <kohei.yoshida at gmail.com> [d1e03f528ead6e57370cf4c222e044d548350bdd]
+
+ doc: add more example code blocks.
+
+
+2017-08-24 Kohei Yoshida <kohei.yoshida at gmail.com> [e1c57ee747a090c67d26eb21041aa928fd35a2b2]
+
+ Add these files to slickedit project.
+
+
+2017-08-24 Kohei Yoshida <kohei.yoshida at gmail.com> [206fd9ba51ad0815ea54c28ba53a610575a814fe]
+
+ doc: add another example code for direct json doc tree initialization.
+
+ Not finished yet.
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [3dcd866b0a20d7294772359df061b1035d63c5bf]
+
+ orcus-xml: rename read_content to read_stream.
+
+ To be consistent with the equivalent methods in the other orcus-*
+ classes.
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [543cf40fe554fa2b883df8cea2194b93a916c003]
+
+ doc: update the CLI options.
+
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [20bfdc6870d68095d7e3779d3e8acbb8871cabce]
+
+ doc: update the class names here as well.
+
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [eedf17c9f20aef4cd602202d95ee36d23bc52a3d]
+
+ doc: update the json example codes.
+
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [e829380c28ee48eb9935b9b8e2c0e1a056e38eaf]
+
+ Make the doc example code a part of the automatic test suite.
+
+ These codes now get run as part of 'make check'.
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [5f33e788ebbe5bcc40ffdb420a4aec68727c2b4d]
+
+ Add more classes to the doc.
+
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [337473cb0724a3a80590809261d87fc4c5bc81b5]
+
+ Avoid decltype and use real type of nullptr.
+
+ Doxygen also trips on decltype.
+
+2017-08-23 Kohei Yoshida <kohei.yoshida at gmail.com> [1e2e43a90a47d7cc0306ee8809f56d8e6bde01c7]
+
+ Fix the doc a bit.
+
+
+2017-08-22 Markus Mohrhard <markus.mohrhard at googlemail.com> [c2c8948110f630c48ed22b0660623472048811ed]
+
+ orcus_xml: add a read_content method
+
+
+2017-08-15 Kohei Yoshida <kohei.yoshida at gmail.com> [91b1b9cf81b01e5d1fba02b2c2031b5d86c51ae6]
+
+ Update the changelog post the 0.13.0 release.
+
+
+2017-08-15 Kohei Yoshida <kohei.yoshida at gmail.com> [941a31ca3644c8519146d652f1e24b9c68d60e06]
+
+ Fix the release date.
+
+
+2017-08-15 Kohei Yoshida <kohei.yoshida at gmail.com> [cd34695b55d598c6573923b638534944e037fc1b]
+
+ Add URL's to the 0.13.0 source packages.
+
+
2017-08-15 Kohei Yoshida <kohei.yoshida at gmail.com> [c3b3b36e4a1a3dc97a12f062a483f96bbeaf0259]
Update the authors list.
diff --git a/Makefile.am b/Makefile.am
index a4f1bee..edcef91 100644
--- a/Makefile.am
+++ b/Makefile.am
@@ -1,4 +1,4 @@
-SUBDIRS = src include parser_handlers benchmark
+SUBDIRS = src include parser_handlers benchmark doc_example
ACLOCAL_AMFLAGS = -I m4
pcfiles = liborcus- at ORCUS_API_VERSION@.pc
@@ -221,7 +221,6 @@ test_data = \
test/xml-mapped/attribute-single-element/input.xml \
test/xml-mapped/attribute-single-element/map.xml \
test/xml-mapped/content-basic/check.txt \
- test/xml-mapped/content-basic/flat/data.txt \
test/xml-mapped/content-basic/input.xml \
test/xml-mapped/content-basic/map.xml \
test/xml-mapped/content-namespace-2/check.txt \
@@ -231,7 +230,6 @@ test_data = \
test/xml-mapped/content-namespace/input.xml \
test/xml-mapped/content-namespace/map.xml \
test/xml-mapped/fuel-economy/check.txt \
- test/xml-mapped/fuel-economy/flat/data.txt \
test/xml-mapped/fuel-economy/input.xml \
test/xml-mapped/fuel-economy/map.xml \
test/xml-structure/attribute-1/check.txt \
diff --git a/Makefile.in b/Makefile.in
index 46d27be..f6baf74 100644
--- a/Makefile.in
+++ b/Makefile.in
@@ -196,7 +196,7 @@ am__DIST_COMMON = $(srcdir)/Makefile.in $(srcdir)/VERSION.in \
$(srcdir)/config.h.in \
$(srcdir)/liborcus-spreadsheet-model.pc.in \
$(srcdir)/liborcus.pc.in AUTHORS compile config.guess \
- config.sub depcomp install-sh ltmain.sh missing py-compile
+ config.sub depcomp install-sh ltmain.sh missing
DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
distdir = $(PACKAGE)-$(VERSION)
top_distdir = $(distdir)
@@ -407,7 +407,7 @@ target_alias = @target_alias@
top_build_prefix = @top_build_prefix@
top_builddir = @top_builddir@
top_srcdir = @top_srcdir@
-SUBDIRS = src include parser_handlers benchmark
+SUBDIRS = src include parser_handlers benchmark doc_example
ACLOCAL_AMFLAGS = -I m4
pcfiles = liborcus- at ORCUS_API_VERSION@.pc $(am__append_1)
pkgconfig_DATA = $(pcfiles)
@@ -624,7 +624,6 @@ test_data = \
test/xml-mapped/attribute-single-element/input.xml \
test/xml-mapped/attribute-single-element/map.xml \
test/xml-mapped/content-basic/check.txt \
- test/xml-mapped/content-basic/flat/data.txt \
test/xml-mapped/content-basic/input.xml \
test/xml-mapped/content-basic/map.xml \
test/xml-mapped/content-namespace-2/check.txt \
@@ -634,7 +633,6 @@ test_data = \
test/xml-mapped/content-namespace/input.xml \
test/xml-mapped/content-namespace/map.xml \
test/xml-mapped/fuel-economy/check.txt \
- test/xml-mapped/fuel-economy/flat/data.txt \
test/xml-mapped/fuel-economy/input.xml \
test/xml-mapped/fuel-economy/map.xml \
test/xml-structure/attribute-1/check.txt \
diff --git a/README.md b/README.md
index 08b6324..5f17403 100644
--- a/README.md
+++ b/README.md
@@ -36,6 +36,8 @@ callbacks from the parser as the file is being parsed.
| Version | API Version | Release Date | Download | Checksum | File Size (bytes) |
|---------|-------------|--------------|----------|----------|-------------------|
+| 0.13.0 | 0.13 | 2017-08-15 | [liborcus-0.13.0.tar.xz](http://kohei.us/files/orcus/src/liborcus-0.13.0.tar.xz) | sha256sum: 08c779722471d49f38de30dad538dbf3ae1c26eb9aeb7f5eb5ca64516513e6d7 | 1812468 |
+| | | | [liborcus-0.13.0.tar.gz](http://kohei.us/files/orcus/src/liborcus-0.13.0.tar.gz) | sha256sum: 1b03e1970aca31ecceae2d6412c4ead23d727c7c655efc26cf49d4ed83ba36e2 | 2309677 |
| 0.12.1 | 0.12 | 2016-09-18 | [liborcus-0.12.1.tar.xz](http://kohei.us/files/orcus/src/liborcus-0.12.1.tar.xz) | sha256sum: d1b936c66944d23e1b2582d0e7129e44670052510d03f19fef644e9814ae2b9c | 1673880 |
| | | | [liborcus-0.12.1.tar.gz](http://kohei.us/files/orcus/src/liborcus-0.12.1.tar.gz) | sha256sum: 676b1fedd721f64489650f5e76d7f98b750439914d87cae505b8163d08447908 | 2117890 |
| 0.12.0 | 0.12 | 2016-08-21 | [liborcus-0.12.0.tar.xz](http://kohei.us/files/orcus/src/liborcus-0.12.0.tar.xz) | sha256sum: a0b904c4c501a4428cacf1178b2a0c4c8dc89fcade8d0310f4826a32495750df | 1672940 |
diff --git a/config.guess b/config.guess
index b79252d..1659250 100755
--- a/config.guess
+++ b/config.guess
@@ -1,8 +1,8 @@
#! /bin/sh
# Attempt to guess a canonical system name.
-# Copyright 1992-2013 Free Software Foundation, Inc.
+# Copyright 1992-2015 Free Software Foundation, Inc.
-timestamp='2013-06-10'
+timestamp='2015-08-20'
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
@@ -24,12 +24,12 @@ timestamp='2013-06-10'
# program. This Exception is an additional permission under section 7
# of the GNU General Public License, version 3 ("GPLv3").
#
-# Originally written by Per Bothner.
+# Originally written by Per Bothner; maintained since 2000 by Ben Elliston.
#
# You can get the latest version of this script from:
# http://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD
#
-# Please send patches with a ChangeLog entry to config-patches at gnu.org.
+# Please send patches to <config-patches at gnu.org>.
me=`echo "$0" | sed -e 's,.*/,,'`
@@ -50,7 +50,7 @@ version="\
GNU config.guess ($timestamp)
Originally written by Per Bothner.
-Copyright 1992-2013 Free Software Foundation, Inc.
+Copyright 1992-2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE."
@@ -149,7 +149,7 @@ Linux|GNU|GNU/*)
LIBC=gnu
#endif
EOF
- eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC'`
+ eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^LIBC' | sed 's, ,,g'`
;;
esac
@@ -168,20 +168,27 @@ case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in
# Note: NetBSD doesn't particularly care about the vendor
# portion of the name. We always set it to "unknown".
sysctl="sysctl -n hw.machine_arch"
- UNAME_MACHINE_ARCH=`(/sbin/$sysctl 2>/dev/null || \
- /usr/sbin/$sysctl 2>/dev/null || echo unknown)`
+ UNAME_MACHINE_ARCH=`(uname -p 2>/dev/null || \
+ /sbin/$sysctl 2>/dev/null || \
+ /usr/sbin/$sysctl 2>/dev/null || \
+ echo unknown)`
case "${UNAME_MACHINE_ARCH}" in
armeb) machine=armeb-unknown ;;
arm*) machine=arm-unknown ;;
sh3el) machine=shl-unknown ;;
sh3eb) machine=sh-unknown ;;
sh5el) machine=sh5le-unknown ;;
+ earmv*)
+ arch=`echo ${UNAME_MACHINE_ARCH} | sed -e 's,^e\(armv[0-9]\).*$,\1,'`
+ endian=`echo ${UNAME_MACHINE_ARCH} | sed -ne 's,^.*\(eb\)$,\1,p'`
+ machine=${arch}${endian}-unknown
+ ;;
*) machine=${UNAME_MACHINE_ARCH}-unknown ;;
esac
# The Operating System including object format, if it has switched
# to ELF recently, or will in the future.
case "${UNAME_MACHINE_ARCH}" in
- arm*|i386|m68k|ns32k|sh3*|sparc|vax)
+ arm*|earm*|i386|m68k|ns32k|sh3*|sparc|vax)
eval $set_cc_for_build
if echo __ELF__ | $CC_FOR_BUILD -E - 2>/dev/null \
| grep -q __ELF__
@@ -197,6 +204,13 @@ case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in
os=netbsd
;;
esac
+ # Determine ABI tags.
+ case "${UNAME_MACHINE_ARCH}" in
+ earm*)
+ expr='s/^earmv[0-9]/-eabi/;s/eb$//'
+ abi=`echo ${UNAME_MACHINE_ARCH} | sed -e "$expr"`
+ ;;
+ esac
# The OS release
# Debian GNU/NetBSD machines have a different userland, and
# thus, need a distinct triplet. However, they do not need
@@ -207,13 +221,13 @@ case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in
release='-gnu'
;;
*)
- release=`echo ${UNAME_RELEASE}|sed -e 's/[-_].*/\./'`
+ release=`echo ${UNAME_RELEASE} | sed -e 's/[-_].*//' | cut -d. -f1,2`
;;
esac
# Since CPU_TYPE-MANUFACTURER-KERNEL-OPERATING_SYSTEM:
# contains redundant information, the shorter form:
# CPU_TYPE-MANUFACTURER-OPERATING_SYSTEM is used.
- echo "${machine}-${os}${release}"
+ echo "${machine}-${os}${release}${abi}"
exit ;;
*:Bitrig:*:*)
UNAME_MACHINE_ARCH=`arch | sed 's/Bitrig.//'`
@@ -235,6 +249,9 @@ case "${UNAME_MACHINE}:${UNAME_SYSTEM}:${UNAME_RELEASE}:${UNAME_VERSION}" in
*:MirBSD:*:*)
echo ${UNAME_MACHINE}-unknown-mirbsd${UNAME_RELEASE}
exit ;;
+ *:Sortix:*:*)
+ echo ${UNAME_MACHINE}-unknown-sortix
+ exit ;;
alpha:OSF1:*:*)
case $UNAME_RELEASE in
*4.0)
@@ -579,8 +596,9 @@ EOF
else
IBM_ARCH=powerpc
fi
- if [ -x /usr/bin/oslevel ] ; then
- IBM_REV=`/usr/bin/oslevel`
+ if [ -x /usr/bin/lslpp ] ; then
+ IBM_REV=`/usr/bin/lslpp -Lqc bos.rte.libc |
+ awk -F: '{ print $3 }' | sed s/[0-9]*$/0/`
else
IBM_REV=${UNAME_VERSION}.${UNAME_RELEASE}
fi
@@ -826,7 +844,7 @@ EOF
*:MINGW*:*)
echo ${UNAME_MACHINE}-pc-mingw32
exit ;;
- i*:MSYS*:*)
+ *:MSYS*:*)
echo ${UNAME_MACHINE}-pc-msys
exit ;;
i*:windows32*:*)
@@ -932,6 +950,9 @@ EOF
crisv32:Linux:*:*)
echo ${UNAME_MACHINE}-axis-linux-${LIBC}
exit ;;
+ e2k:Linux:*:*)
+ echo ${UNAME_MACHINE}-unknown-linux-${LIBC}
+ exit ;;
frv:Linux:*:*)
echo ${UNAME_MACHINE}-unknown-linux-${LIBC}
exit ;;
@@ -969,10 +990,10 @@ EOF
eval `$CC_FOR_BUILD -E $dummy.c 2>/dev/null | grep '^CPU'`
test x"${CPU}" != x && { echo "${CPU}-unknown-linux-${LIBC}"; exit; }
;;
- or1k:Linux:*:*)
- echo ${UNAME_MACHINE}-unknown-linux-${LIBC}
+ openrisc*:Linux:*:*)
+ echo or1k-unknown-linux-${LIBC}
exit ;;
- or32:Linux:*:*)
+ or32:Linux:*:* | or1k*:Linux:*:*)
echo ${UNAME_MACHINE}-unknown-linux-${LIBC}
exit ;;
padre:Linux:*:*)
@@ -1020,7 +1041,7 @@ EOF
echo ${UNAME_MACHINE}-dec-linux-${LIBC}
exit ;;
x86_64:Linux:*:*)
- echo ${UNAME_MACHINE}-unknown-linux-${LIBC}
+ echo ${UNAME_MACHINE}-pc-linux-${LIBC}
exit ;;
xtensa*:Linux:*:*)
echo ${UNAME_MACHINE}-unknown-linux-${LIBC}
@@ -1260,16 +1281,26 @@ EOF
if test "$UNAME_PROCESSOR" = unknown ; then
UNAME_PROCESSOR=powerpc
fi
- if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then
- if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \
- (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \
- grep IS_64BIT_ARCH >/dev/null
- then
- case $UNAME_PROCESSOR in
- i386) UNAME_PROCESSOR=x86_64 ;;
- powerpc) UNAME_PROCESSOR=powerpc64 ;;
- esac
+ if test `echo "$UNAME_RELEASE" | sed -e 's/\..*//'` -le 10 ; then
+ if [ "$CC_FOR_BUILD" != 'no_compiler_found' ]; then
+ if (echo '#ifdef __LP64__'; echo IS_64BIT_ARCH; echo '#endif') | \
+ (CCOPTS= $CC_FOR_BUILD -E - 2>/dev/null) | \
+ grep IS_64BIT_ARCH >/dev/null
+ then
+ case $UNAME_PROCESSOR in
+ i386) UNAME_PROCESSOR=x86_64 ;;
+ powerpc) UNAME_PROCESSOR=powerpc64 ;;
+ esac
+ fi
fi
+ elif test "$UNAME_PROCESSOR" = i386 ; then
+ # Avoid executing cc on OS X 10.9, as it ships with a stub
+ # that puts up a graphical alert prompting to install
+ # developer tools. Any system running Mac OS X 10.7 or
+ # later (Darwin 11 and later) is required to have a 64-bit
+ # processor. This is not true of the ARM version of Darwin
+ # that Apple uses in portable devices.
+ UNAME_PROCESSOR=x86_64
fi
echo ${UNAME_PROCESSOR}-apple-darwin${UNAME_RELEASE}
exit ;;
@@ -1361,154 +1392,6 @@ EOF
exit ;;
esac
-eval $set_cc_for_build
-cat >$dummy.c <<EOF
-#ifdef _SEQUENT_
-# include <sys/types.h>
-# include <sys/utsname.h>
-#endif
-main ()
-{
-#if defined (sony)
-#if defined (MIPSEB)
- /* BFD wants "bsd" instead of "newsos". Perhaps BFD should be changed,
- I don't know.... */
- printf ("mips-sony-bsd\n"); exit (0);
-#else
-#include <sys/param.h>
- printf ("m68k-sony-newsos%s\n",
-#ifdef NEWSOS4
- "4"
-#else
- ""
-#endif
- ); exit (0);
-#endif
-#endif
-
-#if defined (__arm) && defined (__acorn) && defined (__unix)
- printf ("arm-acorn-riscix\n"); exit (0);
-#endif
-
-#if defined (hp300) && !defined (hpux)
- printf ("m68k-hp-bsd\n"); exit (0);
-#endif
-
-#if defined (NeXT)
-#if !defined (__ARCHITECTURE__)
-#define __ARCHITECTURE__ "m68k"
-#endif
- int version;
- version=`(hostinfo | sed -n 's/.*NeXT Mach \([0-9]*\).*/\1/p') 2>/dev/null`;
- if (version < 4)
- printf ("%s-next-nextstep%d\n", __ARCHITECTURE__, version);
- else
- printf ("%s-next-openstep%d\n", __ARCHITECTURE__, version);
- exit (0);
-#endif
-
-#if defined (MULTIMAX) || defined (n16)
-#if defined (UMAXV)
- printf ("ns32k-encore-sysv\n"); exit (0);
-#else
-#if defined (CMU)
- printf ("ns32k-encore-mach\n"); exit (0);
-#else
- printf ("ns32k-encore-bsd\n"); exit (0);
-#endif
-#endif
-#endif
-
-#if defined (__386BSD__)
- printf ("i386-pc-bsd\n"); exit (0);
-#endif
-
-#if defined (sequent)
-#if defined (i386)
- printf ("i386-sequent-dynix\n"); exit (0);
-#endif
-#if defined (ns32000)
- printf ("ns32k-sequent-dynix\n"); exit (0);
-#endif
-#endif
-
-#if defined (_SEQUENT_)
- struct utsname un;
-
- uname(&un);
-
- if (strncmp(un.version, "V2", 2) == 0) {
- printf ("i386-sequent-ptx2\n"); exit (0);
- }
- if (strncmp(un.version, "V1", 2) == 0) { /* XXX is V1 correct? */
- printf ("i386-sequent-ptx1\n"); exit (0);
- }
- printf ("i386-sequent-ptx\n"); exit (0);
-
-#endif
-
-#if defined (vax)
-# if !defined (ultrix)
-# include <sys/param.h>
-# if defined (BSD)
-# if BSD == 43
- printf ("vax-dec-bsd4.3\n"); exit (0);
-# else
-# if BSD == 199006
- printf ("vax-dec-bsd4.3reno\n"); exit (0);
-# else
- printf ("vax-dec-bsd\n"); exit (0);
-# endif
-# endif
-# else
- printf ("vax-dec-bsd\n"); exit (0);
-# endif
-# else
- printf ("vax-dec-ultrix\n"); exit (0);
-# endif
-#endif
-
-#if defined (alliant) && defined (i860)
- printf ("i860-alliant-bsd\n"); exit (0);
-#endif
-
- exit (1);
-}
-EOF
-
-$CC_FOR_BUILD -o $dummy $dummy.c 2>/dev/null && SYSTEM_NAME=`$dummy` &&
- { echo "$SYSTEM_NAME"; exit; }
-
-# Apollos put the system type in the environment.
-
-test -d /usr/apollo && { echo ${ISP}-apollo-${SYSTYPE}; exit; }
-
-# Convex versions that predate uname can use getsysinfo(1)
-
-if [ -x /usr/convex/getsysinfo ]
-then
- case `getsysinfo -f cpu_type` in
- c1*)
- echo c1-convex-bsd
- exit ;;
- c2*)
- if getsysinfo -f scalar_acc
- then echo c32-convex-bsd
- else echo c2-convex-bsd
- fi
- exit ;;
- c34*)
- echo c34-convex-bsd
- exit ;;
- c38*)
- echo c38-convex-bsd
- exit ;;
- c4*)
- echo c4-convex-bsd
- exit ;;
- esac
-fi
-
cat >&2 <<EOF
$0: unable to guess system type
diff --git a/config.sub b/config.sub
index 9633db7..1acc966 100755
--- a/config.sub
+++ b/config.sub
@@ -1,8 +1,8 @@
#! /bin/sh
# Configuration validation subroutine script.
-# Copyright 1992-2013 Free Software Foundation, Inc.
+# Copyright 1992-2015 Free Software Foundation, Inc.
-timestamp='2013-08-10'
+timestamp='2015-08-20'
# This file is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by
@@ -25,7 +25,7 @@ timestamp='2013-08-10'
# of the GNU General Public License, version 3 ("GPLv3").
-# Please send patches with a ChangeLog entry to config-patches at gnu.org.
+# Please send patches to <config-patches at gnu.org>.
#
# Configuration subroutine to validate and canonicalize a configuration type.
# Supply the specified configuration type as an argument.
@@ -68,7 +68,7 @@ Report bugs and patches to <config-patches at gnu.org>."
version="\
GNU config.sub ($timestamp)
-Copyright 1992-2013 Free Software Foundation, Inc.
+Copyright 1992-2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE."
@@ -117,7 +117,7 @@ maybe_os=`echo $1 | sed 's/^\(.*\)-\([^-]*-[^-]*\)$/\2/'`
case $maybe_os in
nto-qnx* | linux-gnu* | linux-android* | linux-dietlibc | linux-newlib* | \
linux-musl* | linux-uclibc* | uclinux-uclibc* | uclinux-gnu* | kfreebsd*-gnu* | \
- knetbsd*-gnu* | netbsd*-gnu* | \
+ knetbsd*-gnu* | netbsd*-gnu* | netbsd*-eabi* | \
kopensolaris*-gnu* | \
storm-chaos* | os2-emx* | rtmk-nova*)
os=-$maybe_os
@@ -255,16 +255,18 @@ case $basic_machine in
| arc | arceb \
| arm | arm[bl]e | arme[lb] | armv[2-8] | armv[3-8][lb] | armv7[arm] \
| avr | avr32 \
+ | ba \
| be32 | be64 \
| bfin \
| c4x | c8051 | clipper \
| d10v | d30v | dlx | dsp16xx \
- | epiphany \
- | fido | fr30 | frv \
+ | e2k | epiphany \
+ | fido | fr30 | frv | ft32 \
| h8300 | h8500 | hppa | hppa1.[01] | hppa2.0 | hppa2.0[nw] | hppa64 \
| hexagon \
| i370 | i860 | i960 | ia64 \
| ip2k | iq2000 \
+ | k1om \
| le32 | le64 \
| lm32 \
| m32c | m32r | m32rle | m68000 | m68k | m88k \
@@ -282,8 +284,10 @@ case $basic_machine in
| mips64vr5900 | mips64vr5900el \
| mipsisa32 | mipsisa32el \
| mipsisa32r2 | mipsisa32r2el \
+ | mipsisa32r6 | mipsisa32r6el \
| mipsisa64 | mipsisa64el \
| mipsisa64r2 | mipsisa64r2el \
+ | mipsisa64r6 | mipsisa64r6el \
| mipsisa64sb1 | mipsisa64sb1el \
| mipsisa64sr71k | mipsisa64sr71kel \
| mipsr5900 | mipsr5900el \
@@ -295,14 +299,14 @@ case $basic_machine in
| nds32 | nds32le | nds32be \
| nios | nios2 | nios2eb | nios2el \
| ns16k | ns32k \
- | open8 \
- | or1k | or32 \
+ | open8 | or1k | or1knd | or32 \
| pdp10 | pdp11 | pj | pjl \
| powerpc | powerpc64 | powerpc64le | powerpcle \
| pyramid \
+ | riscv32 | riscv64 \
| rl78 | rx \
| score \
- | sh | sh[1234] | sh[24]a | sh[24]aeb | sh[23]e | sh[34]eb | sheb | shbe | shle | sh[1234]le | sh3ele \
+ | sh | sh[1234] | sh[24]a | sh[24]aeb | sh[23]e | sh[234]eb | sheb | shbe | shle | sh[1234]le | sh3ele \
| sh64 | sh64le \
| sparc | sparc64 | sparc64b | sparc64v | sparc86x | sparclet | sparclite \
| sparcv8 | sparcv9 | sparcv9b | sparcv9v \
@@ -310,6 +314,7 @@ case $basic_machine in
| tahoe | tic4x | tic54x | tic55x | tic6x | tic80 | tron \
| ubicom32 \
| v850 | v850e | v850e1 | v850e2 | v850es | v850e2v3 \
+ | visium \
| we32k \
| x86 | xc16x | xstormy16 | xtensa \
| z8k | z80)
@@ -324,7 +329,10 @@ case $basic_machine in
c6x)
basic_machine=tic6x-unknown
;;
- m6811 | m68hc11 | m6812 | m68hc12 | m68hcs12x | picochip)
+ leon|leon[3-9])
+ basic_machine=sparc-$basic_machine
+ ;;
+ m6811 | m68hc11 | m6812 | m68hc12 | m68hcs12x | nvptx | picochip)
basic_machine=$basic_machine-unknown
os=-none
;;
@@ -369,18 +377,20 @@ case $basic_machine in
| alphapca5[67]-* | alpha64pca5[67]-* | arc-* | arceb-* \
| arm-* | armbe-* | armle-* | armeb-* | armv*-* \
| avr-* | avr32-* \
+ | ba-* \
| be32-* | be64-* \
| bfin-* | bs2000-* \
| c[123]* | c30-* | [cjt]90-* | c4x-* \
| c8051-* | clipper-* | craynv-* | cydra-* \
| d10v-* | d30v-* | dlx-* \
- | elxsi-* \
+ | e2k-* | elxsi-* \
| f30[01]-* | f700-* | fido-* | fr30-* | frv-* | fx80-* \
| h8300-* | h8500-* \
| hppa-* | hppa1.[01]-* | hppa2.0-* | hppa2.0[nw]-* | hppa64-* \
| hexagon-* \
| i*86-* | i860-* | i960-* | ia64-* \
| ip2k-* | iq2000-* \
+ | k1om-* \
| le32-* | le64-* \
| lm32-* \
| m32c-* | m32r-* | m32rle-* \
@@ -400,8 +410,10 @@ case $basic_machine in
| mips64vr5900-* | mips64vr5900el-* \
| mipsisa32-* | mipsisa32el-* \
| mipsisa32r2-* | mipsisa32r2el-* \
+ | mipsisa32r6-* | mipsisa32r6el-* \
| mipsisa64-* | mipsisa64el-* \
| mipsisa64r2-* | mipsisa64r2el-* \
+ | mipsisa64r6-* | mipsisa64r6el-* \
| mipsisa64sb1-* | mipsisa64sb1el-* \
| mipsisa64sr71k-* | mipsisa64sr71kel-* \
| mipsr5900-* | mipsr5900el-* \
@@ -413,16 +425,18 @@ case $basic_machine in
| nios-* | nios2-* | nios2eb-* | nios2el-* \
| none-* | np1-* | ns16k-* | ns32k-* \
| open8-* \
+ | or1k*-* \
| orion-* \
| pdp10-* | pdp11-* | pj-* | pjl-* | pn-* | power-* \
| powerpc-* | powerpc64-* | powerpc64le-* | powerpcle-* \
| pyramid-* \
+ | riscv32-* | riscv64-* \
| rl78-* | romp-* | rs6000-* | rx-* \
| sh-* | sh[1234]-* | sh[24]a-* | sh[24]aeb-* | sh[23]e-* | sh[34]eb-* | sheb-* | shbe-* \
| shle-* | sh[1234]le-* | sh3ele-* | sh64-* | sh64le-* \
| sparc-* | sparc64-* | sparc64b-* | sparc64v-* | sparc86x-* | sparclet-* \
| sparclite-* \
- | sparcv8-* | sparcv9-* | sparcv9b-* | sparcv9v-* | sv1-* | sx?-* \
+ | sparcv8-* | sparcv9-* | sparcv9b-* | sparcv9v-* | sv1-* | sx*-* \
| tahoe-* \
| tic30-* | tic4x-* | tic54x-* | tic55x-* | tic6x-* | tic80-* \
| tile*-* \
@@ -430,6 +444,7 @@ case $basic_machine in
| ubicom32-* \
| v850-* | v850e-* | v850e1-* | v850es-* | v850e2-* | v850e2v3-* \
| vax-* \
+ | visium-* \
| we32k-* \
| x86-* | x86_64-* | xc16x-* | xps100-* \
| xstormy16-* | xtensa*-* \
@@ -506,6 +521,9 @@ case $basic_machine in
basic_machine=i386-pc
os=-aros
;;
+ asmjs)
+ basic_machine=asmjs-unknown
+ ;;
aux)
basic_machine=m68k-apple
os=-aux
@@ -767,6 +785,9 @@ case $basic_machine in
basic_machine=m68k-isi
os=-sysv
;;
+ leon-*|leon[3-9]-*)
+ basic_machine=sparc-`echo $basic_machine | sed 's/-.*//'`
+ ;;
m68knommu)
basic_machine=m68k-unknown
os=-linux
@@ -822,6 +843,10 @@ case $basic_machine in
basic_machine=powerpc-unknown
os=-morphos
;;
+ moxiebox)
+ basic_machine=moxie-unknown
+ os=-moxiebox
+ ;;
msdos)
basic_machine=i386-pc
os=-msdos
@@ -1354,7 +1379,7 @@ case $os in
| -hpux* | -unos* | -osf* | -luna* | -dgux* | -auroraux* | -solaris* \
| -sym* | -kopensolaris* | -plan9* \
| -amigaos* | -amigados* | -msdos* | -newsos* | -unicos* | -aof* \
- | -aos* | -aros* \
+ | -aos* | -aros* | -cloudabi* | -sortix* \
| -nindy* | -vxsim* | -vxworks* | -ebmon* | -hms* | -mvs* \
| -clix* | -riscos* | -uniplus* | -iris* | -rtu* | -xenix* \
| -hiux* | -386bsd* | -knetbsd* | -mirbsd* | -netbsd* \
@@ -1367,14 +1392,14 @@ case $os in
| -cygwin* | -msys* | -pe* | -psos* | -moss* | -proelf* | -rtems* \
| -mingw32* | -mingw64* | -linux-gnu* | -linux-android* \
| -linux-newlib* | -linux-musl* | -linux-uclibc* \
- | -uxpv* | -beos* | -mpeix* | -udk* \
+ | -uxpv* | -beos* | -mpeix* | -udk* | -moxiebox* \
| -interix* | -uwin* | -mks* | -rhapsody* | -darwin* | -opened* \
| -openstep* | -oskit* | -conix* | -pw32* | -nonstopux* \
| -storm-chaos* | -tops10* | -tenex* | -tops20* | -its* \
| -os2* | -vos* | -palmos* | -uclinux* | -nucleus* \
| -morphos* | -superux* | -rtmk* | -rtmk-nova* | -windiss* \
| -powermax* | -dnix* | -nx6 | -nx7 | -sei* | -dragonfly* \
- | -skyos* | -haiku* | -rdos* | -toppers* | -drops* | -es*)
+ | -skyos* | -haiku* | -rdos* | -toppers* | -drops* | -es* | -tirtos*)
# Remember, each alternative MUST END IN *, to match a version number.
;;
-qnx*)
@@ -1592,9 +1617,6 @@ case $basic_machine in
mips*-*)
os=-elf
;;
- or1k-*)
- os=-elf
- ;;
or32-*)
os=-coff
;;
diff --git a/configure b/configure
index 6c47f67..953c606 100755
--- a/configure
+++ b/configure
@@ -1,6 +1,6 @@
#! /bin/sh
# Guess values for system-dependent variables and create Makefiles.
-# Generated by GNU Autoconf 2.69 for liborcus 0.13.0.
+# Generated by GNU Autoconf 2.69 for liborcus 0.13.1.
#
#
# Copyright (C) 1992-1996, 1998-2012 Free Software Foundation, Inc.
@@ -587,8 +587,8 @@ MAKEFLAGS=
# Identity of this package.
PACKAGE_NAME='liborcus'
PACKAGE_TARNAME='liborcus'
-PACKAGE_VERSION='0.13.0'
-PACKAGE_STRING='liborcus 0.13.0'
+PACKAGE_VERSION='0.13.1'
+PACKAGE_STRING='liborcus 0.13.1'
PACKAGE_BUGREPORT=''
PACKAGE_URL=''
@@ -1420,7 +1420,7 @@ if test "$ac_init_help" = "long"; then
# Omit some internal or obsolete options to make the list less imposing.
# This message is too long to be a string in the A/UX 3.1 sh.
cat <<_ACEOF
-\`configure' configures liborcus 0.13.0 to adapt to many kinds of systems.
+\`configure' configures liborcus 0.13.1 to adapt to many kinds of systems.
Usage: $0 [OPTION]... [VAR=VALUE]...
@@ -1491,7 +1491,7 @@ fi
if test -n "$ac_init_help"; then
case $ac_init_help in
- short | recursive ) echo "Configuration of liborcus 0.13.0:";;
+ short | recursive ) echo "Configuration of liborcus 0.13.1:";;
esac
cat <<\_ACEOF
@@ -1642,7 +1642,7 @@ fi
test -n "$ac_init_help" && exit $ac_status
if $ac_init_version; then
cat <<\_ACEOF
-liborcus configure 0.13.0
+liborcus configure 0.13.1
generated by GNU Autoconf 2.69
Copyright (C) 2012 Free Software Foundation, Inc.
@@ -2291,7 +2291,7 @@ cat >config.log <<_ACEOF
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
-It was created by liborcus $as_me 0.13.0, which was
+It was created by liborcus $as_me 0.13.1, which was
generated by GNU Autoconf 2.69. Invocation command line was
$ $0 $@
@@ -3158,7 +3158,7 @@ fi
# Define the identity of the package.
PACKAGE='liborcus'
- VERSION='0.13.0'
+ VERSION='0.13.1'
cat >>confdefs.h <<_ACEOF
@@ -17532,7 +17532,7 @@ IXION_REQUIRED_API_VERSION=0.13
ORCUS_API_VERSION=0.13
ORCUS_MAJOR_VERSION=0
ORCUS_MINOR_VERSION=13
-ORCUS_MICRO_VERSION=0
+ORCUS_MICRO_VERSION=1
@@ -20847,7 +20847,7 @@ else
fi
-ac_config_files="$ac_config_files Makefile liborcus-$ORCUS_API_VERSION.pc:liborcus.pc.in liborcus-spreadsheet-model-$ORCUS_API_VERSION.pc:liborcus-spreadsheet-model.pc.in VERSION include/Makefile include/orcus/Makefile include/orcus/detail/Makefile include/orcus/mso/Makefile include/orcus/spreadsheet/Makefile src/Makefile src/liborcus/Makefile src/liborcus/constants.inl src/mso/Makefile src/parser/Makefile src/python/Makefile src/spreadsheet/Makefile parser_handlers/Makefile benchmark/Makefile"
+ac_config_files="$ac_config_files Makefile liborcus-$ORCUS_API_VERSION.pc:liborcus.pc.in liborcus-spreadsheet-model-$ORCUS_API_VERSION.pc:liborcus-spreadsheet-model.pc.in VERSION include/Makefile include/orcus/Makefile include/orcus/detail/Makefile include/orcus/mso/Makefile include/orcus/spreadsheet/Makefile src/Makefile src/liborcus/Makefile src/liborcus/constants.inl src/mso/Makefile src/parser/Makefile src/python/Makefile src/spreadsheet/Makefile parser_handlers/Makefile benchmark/Ma [...]
cat >confcache <<\_ACEOF
# This file is a shell script that caches the results of configure
@@ -21423,7 +21423,7 @@ cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1
# report actual input values of CONFIG_FILES etc. instead of their
# values after options handling.
ac_log="
-This file was extended by liborcus $as_me 0.13.0, which was
+This file was extended by liborcus $as_me 0.13.1, which was
generated by GNU Autoconf 2.69. Invocation command line was
CONFIG_FILES = $CONFIG_FILES
@@ -21489,7 +21489,7 @@ _ACEOF
cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1
ac_cs_config="`$as_echo "$ac_configure_args" | sed 's/^ //; s/[\\""\`\$]/\\\\&/g'`"
ac_cs_version="\\
-liborcus config.status 0.13.0
+liborcus config.status 0.13.1
configured by $0, generated by GNU Autoconf 2.69,
with options \\"\$ac_cs_config\\"
@@ -22022,6 +22022,7 @@ do
"src/spreadsheet/Makefile") CONFIG_FILES="$CONFIG_FILES src/spreadsheet/Makefile" ;;
"parser_handlers/Makefile") CONFIG_FILES="$CONFIG_FILES parser_handlers/Makefile" ;;
"benchmark/Makefile") CONFIG_FILES="$CONFIG_FILES benchmark/Makefile" ;;
+ "doc_example/Makefile") CONFIG_FILES="$CONFIG_FILES doc_example/Makefile" ;;
*) as_fn_error $? "invalid argument: \`$ac_config_target'" "$LINENO" 5;;
esac
diff --git a/configure.ac b/configure.ac
index 1aa9cdc..62aeb61 100644
--- a/configure.ac
+++ b/configure.ac
@@ -8,7 +8,7 @@ AC_PREREQ([2.65])
# ===================
m4_define([orcus_major_version], [0])
m4_define([orcus_minor_version], [13])
-m4_define([orcus_micro_version], [0])
+m4_define([orcus_micro_version], [1])
m4_define([orcus_version], [orcus_major_version.orcus_minor_version.orcus_micro_version])
# ===============
@@ -270,6 +270,7 @@ AC_CONFIG_FILES([Makefile
src/spreadsheet/Makefile
parser_handlers/Makefile
benchmark/Makefile
+ doc_example/Makefile
])
AC_OUTPUT
diff --git a/doc/cli/orcus_csv.rst b/doc/cli/orcus_csv.rst
index 236a816..542bc0a 100644
--- a/doc/cli/orcus_csv.rst
+++ b/doc/cli/orcus_csv.rst
@@ -19,8 +19,8 @@ Allowed options
Turn on a debug mode to generate run-time debug output.
**--dump-check**
- Dump the the content to stdout in a special format used for content
- verification in unit tests.
+ Dump the content to stdout in a special format used for content verification
+ in automated tests.
**-o [ --output ] arg**
Output directory path, or output file when --dump-check option is used.
@@ -28,7 +28,19 @@ Allowed options
**-f [ --output-format ] arg**
Specify the format of output file. Supported format types are:
- - flat text format (flat)
- - HTML format (html)
- - no output (none)
+ - csv - CSV format
+ - flat - flat text format
+ - html - HTML format
+ - json - JSON format
+ - none - no output
+
+**--row-size arg**
+ Specify the maximum number of rows in each sheet.
+
+**--row-header arg**
+ Specify the number of header rows to repeat if the source content gets split
+ into multiple sheets.
+
+**--split**
+ Specify whether or not to split the data into multiple sheets in case it
+ won't fit in a single sheet.
diff --git a/doc/cli/orcus_gnumeric.rst b/doc/cli/orcus_gnumeric.rst
index 1e6df37..3ba34c8 100644
--- a/doc/cli/orcus_gnumeric.rst
+++ b/doc/cli/orcus_gnumeric.rst
@@ -19,8 +19,8 @@ Allowed options
Turn on a debug mode to generate run-time debug output.
**--dump-check**
- Dump the the content to stdout in a special format used for content
- verification in unit tests.
+ Dump the content to stdout in a special format used for content verification
+ in automated tests.
**-o [ --output ] arg**
Output directory path, or output file when --dump-check option is used.
@@ -28,7 +28,12 @@ Allowed options
**-f [ --output-format ] arg**
Specify the format of output file. Supported format types are:
- - flat text format (flat)
- - HTML format (html)
- - no output (none)
+ - csv - CSV format
+ - flat - flat text format
+ - html - HTML format
+ - json - JSON format
+ - none - no output
+
+**--row-size arg**
+ Specify the maximum number of rows in each sheet.
diff --git a/doc/cli/orcus_ods.rst b/doc/cli/orcus_ods.rst
index cce628c..a0bc776 100644
--- a/doc/cli/orcus_ods.rst
+++ b/doc/cli/orcus_ods.rst
@@ -19,8 +19,8 @@ Allowed options
Turn on a debug mode to generate run-time debug output.
**--dump-check**
- Dump the the content to stdout in a special format used for content
- verification in unit tests.
+ Dump the content to stdout in a special format used for content verification
+ in automated tests.
**-o [ --output ] arg**
Output directory path, or output file when --dump-check option is used.
@@ -28,7 +28,12 @@ Allowed options
**-f [ --output-format ] arg**
Specify the format of output file. Supported format types are:
- - flat text format (flat)
- - HTML format (html)
- - no output (none)
+ - csv - CSV format
+ - flat - flat text format
+ - html - HTML format
+ - json - JSON format
+ - none - no output
+
+**--row-size arg**
+ Specify the maximum number of rows in each sheet.
diff --git a/doc/cli/orcus_xls_xml.rst b/doc/cli/orcus_xls_xml.rst
index d1c5051..2f37078 100644
--- a/doc/cli/orcus_xls_xml.rst
+++ b/doc/cli/orcus_xls_xml.rst
@@ -19,8 +19,8 @@ Allowed options
Turn on a debug mode to generate run-time debug output.
**--dump-check**
- Dump the the content to stdout in a special format used for content
- verification in unit tests.
+ Dump the content to stdout in a special format used for content verification
+ in automated tests.
**-o [ --output ] arg**
Output directory path, or output file when --dump-check option is used.
@@ -28,7 +28,12 @@ Allowed options
**-f [ --output-format ] arg**
Specify the format of output file. Supported format types are:
- - flat text format (flat)
- - HTML format (html)
- - no output (none)
+ - csv - CSV format
+ - flat - flat text format
+ - html - HTML format
+ - json - JSON format
+ - none - no output
+
+**--row-size arg**
+ Specify the maximum number of rows in each sheet.
diff --git a/doc/cli/orcus_xlsx.rst b/doc/cli/orcus_xlsx.rst
index dcdbc28..513b66a 100644
--- a/doc/cli/orcus_xlsx.rst
+++ b/doc/cli/orcus_xlsx.rst
@@ -19,8 +19,8 @@ Allowed options
Turn on a debug mode to generate run-time debug output.
**--dump-check**
- Dump the the content to stdout in a special format used for content
- verification in unit tests.
+ Dump the content to stdout in a special format used for content verification
+ in automated tests.
**-o [ --output ] arg**
Output directory path, or output file when --dump-check option is used.
@@ -28,7 +28,12 @@ Allowed options
**-f [ --output-format ] arg**
Specify the format of output file. Supported format types are:
- - flat text format (flat)
- - HTML format (html)
- - no output (none)
+ - csv - CSV format
+ - flat - flat text format
+ - html - HTML format
+ - json - JSON format
+ - none - no output
+
+**--row-size arg**
+ Specify the maximum number of rows in each sheet.
diff --git a/doc/conf.py b/doc/conf.py
index 4843360..c567d8d 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -46,16 +46,16 @@ master_doc = 'index'
# General information about the project.
project = 'Orcus'
-copyright = '2016, Kohei Yoshida'
+copyright = '2017, Kohei Yoshida'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
-version = '0.12'
+version = '0.13'
# The full version, including alpha/beta/rc tags.
-release = '0.12.0'
+release = '0.13.0'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
diff --git a/doc/cpp/model/json.rst b/doc/cpp/model/json.rst
index 203de1f..d2385b6 100644
--- a/doc/cpp/model/json.rst
+++ b/doc/cpp/model/json.rst
@@ -4,15 +4,34 @@ JSON Document Tree
Document tree
`````````````
-
-.. doxygenclass:: orcus::json_document_tree
+.. doxygenclass:: orcus::json::document_tree
:members:
.. doxygenstruct:: orcus::json_config
:members:
-.. doxygenclass:: orcus::json::detail::node
+.. doxygenclass:: orcus::json::const_node
+ :members:
+
+.. doxygenclass:: orcus::json::node
+ :members:
+
+.. doxygenclass:: orcus::json::array
+ :members:
+
+.. doxygenclass:: orcus::json::object
:members:
-.. doxygenenum:: orcus::json::detail::node_t
+.. doxygenclass:: orcus::json::detail::init::node
+ :members:
+
+.. doxygenenum:: orcus::json::node_t
:project: orcus
+
+Exceptions
+``````````
+.. doxygenclass:: orcus::json::document_error
+ :members:
+
+.. doxygenclass:: orcus::json::key_value_error
+ :members:
diff --git a/doc/overview/json.rst b/doc/overview/json.rst
index beac0aa..dde0ab7 100644
--- a/doc/overview/json.rst
+++ b/doc/overview/json.rst
@@ -5,31 +5,31 @@ JSON
====
The JSON part of orcus consists of a low-level parser class that handles
-parsing of JSON strings and a document class that stores parsed JSON
-structures.
-
-There are two approaches to process JSON strings using the orcus library. One
-approach is to utilize the :cpp:class:`~orcus::json_document_tree` class to
-load and populate the JSON structure tree via its
-:cpp:func:`~orcus::json_document_tree::load()` method and traverse the tree
-through its :cpp:func:`~orcus::json_document_tree::get_document_root()` method.
+parsing of JSON strings, and a high-level document class that stores parsed
+JSON structures as a node tree.
+
+There are two approaches to processing JSON strings using the orcus library.
+One approach is to utilize the :cpp:class:`~orcus::json::document_tree` class
+to load and populate the JSON structure tree via its
+:cpp:func:`~orcus::json::document_tree::load()` method and traverse the tree
+through its :cpp:func:`~orcus::json::document_tree::get_document_root()` method.
This approach is ideal if you want a quick way to parse and access the content
of a JSON document with minimal effort.
-The other approach is to use the low-level :cpp:class:`~orcus::json_parser`
+Another approach is to use the low-level :cpp:class:`~orcus::json_parser`
class directly by providing your own handler class to receive callbacks from
the parser. This method requires a bit more effort on your part to provide
and populate your own data structure, but if you already have a data structure
to store the content of JSON, then this approach is ideal. The
-:cpp:class:`~orcus::json_document_tree` class internally uses
+:cpp:class:`~orcus::json::document_tree` class internally uses
:cpp:class:`~orcus::json_parser` to parse JSON contents.
-Populating a document tree
---------------------------
+Populating a document tree from JSON string
+-------------------------------------------
The following code snippet shows an example of how to populate an instance of
-:cpp:class:`~orcus::json_document_tree` from a JSON string, and navigate its
+:cpp:class:`~orcus::json::document_tree` from a JSON string, and navigate its
content tree afterward.
::
@@ -51,31 +51,26 @@ content tree afterward.
int main()
{
- using node = orcus::json_document_tree::node;
+ using node = orcus::json::node;
orcus::json_config config; // Use default configuration.
- orcus::json_document_tree doc;
-
- // Load JSON string into a document tree.
+ orcus::json::document_tree doc;
doc.load(json_string, config);
// Root is an object containing three key-value pairs.
node root = doc.get_document_root();
- vector<orcus::pstring> keys = root.keys();
-
- for (auto it = keys.begin(), ite = keys.end(); it != ite; ++it)
+ for (const orcus::pstring& key : root.keys())
{
- orcus::pstring key = *it;
node value = root.child(key);
switch (value.type())
{
- case orcus::json_node_t::string:
+ case orcus::json::node_t::string:
// string value
cout << key << ": " << value.string_value() << endl;
break;
- case orcus::json_node_t::array:
+ case orcus::json::node_t::array:
{
// array value
cout << key << ":" << endl;
@@ -113,6 +108,13 @@ Using the low-level parser
The following code snippet shows how to use the low-level :cpp:class:`~orcus::json_parser`
class by providing your own handler class and passing it as a template argument::
+ #include <orcus/json_parser.hpp>
+ #include <orcus/pstring.hpp>
+ #include <cstring>
+ #include <iostream>
+
+ using namespace std;
+
class json_parser_handler
{
public:
@@ -213,3 +215,322 @@ Executing this code will generate the following output:
number: 12.3
end object
end parse
+
+
+Building a document tree directly
+---------------------------------
+
+You can also create and populate a JSON document tree directly without needing
+to parse a JSON string. This approach is ideal if you want to create a JSON
+tree from scratch and export it as a string. The following series of code
+snippets demonstrates how to build JSON document trees directly and
+export their contents as JSON strings.
+
+The first example shows how to initialize the tree with a simple array::
+
+ orcus::json::document_tree doc = {
+ 1.0, 2.0, "string value", false, nullptr
+ };
+
+ std::cout << doc.dump() << std::endl;
+
+You can simply specify the content of the array via an initializer list and
+assign it to the document. The :cpp:func:`~orcus::json::document_tree::dump()`
+method then turns the content into a single string instance, which looks like
+the following:
+
+.. code-block:: text
+
+ [
+ 1,
+ 2,
+ "string value",
+ false,
+ null
+ ]
+
+If you need to build an array of arrays, write something like the following::
+
+ orcus::json::document_tree doc = {
+ { true, false, nullptr },
+ { 1.1, 2.2, "text" }
+ };
+
+ std::cout << doc.dump() << std::endl;
+
+This will create an array of two nested child arrays with three values each.
+Dumping the content of the tree as a JSON string will produce something like
+the following:
+
+.. code-block:: text
+
+ [
+ [
+ true,
+ false,
+ null
+ ],
+ [
+ 1.1,
+ 2.2,
+ "text"
+ ]
+ ]
+
+Creating an object can be done by nesting one or more key-value pairs, each of
+which is surrounded by a pair of curly braces, inside another pair of curly
+braces. For example, the following code::
+
+ orcus::json::document_tree doc = {
+ { "key1", 1.2 },
+ { "key2", "some text" },
+ };
+
+ std::cout << doc.dump() << std::endl;
+
+produces the following output:
+
+.. code-block:: text
+
+ {
+ "key1": 1.2,
+ "key2": "some text"
+ }
+
+indicating that the tree consists of a single object having two key-value
+pairs.
+
+You may notice that this syntax is identical to the syntax for
+creating an array of arrays as shown above. In fact, in order for this to be
+an object, each of the inner sequences must have exactly two values, and its
+first value must be a string value. Failing that, it will be interpreted as
+an array of arrays.
+
+As with arrays, nesting of objects is also supported. The following code::
+
+ orcus::json::document_tree doc = {
+ { "parent1", {
+ { "child1", true },
+ { "child2", false },
+ { "child3", 123.4 },
+ }
+ },
+ { "parent2", "not-nested" },
+ };
+
+ std::cout << doc.dump() << std::endl;
+
+creates a root object having two key-value pairs, one of which contains
+another object having three key-value pairs, as evident in the following output
+generated by this code:
+
+.. code-block:: text
+
+ {
+ "parent1": {
+ "child1": true,
+ "child2": false,
+ "child3": 123.4
+ },
+ "parent2": "not-nested"
+ }
+
+There is one caveat that you need to be aware of because of this special
+object creation syntax. When you have a nested array that contains exactly
+two values and the first value is a string value, you must explicitly declare
+that as an array by using an :cpp:class:`~orcus::json::array` class instance.
+For instance, this code::
+
+ orcus::json::document_tree doc = {
+ { "array", { "one", 987.0 } }
+ };
+
+is intended to be an object containing an array. However, because the
+supposed inner array contains exactly two values and its first value is a
+string value, it could equally be interpreted as a key-value pair of the
+outer object. The construction is therefore ambiguous, and a
+:cpp:class:`~orcus::json::key_value_error` exception gets thrown as a result.
+
+To work around this ambiguity, you need to declare the inner array to be
+explicit by using an :cpp:class:`~orcus::json::array` instance::
+
+ using namespace orcus;
+
+ json::document_tree doc = {
+ { "array", json::array({ "one", 987.0 }) }
+ };
+
+This code now correctly generates a root object containing one key-value pair
+whose value is an array:
+
+.. code-block:: text
+
+ {
+ "array": [
+ "one",
+ 987
+ ]
+ }
+
+A similar ambiguity issue arises when you want to construct a tree consisting
+only of an empty root object. You may be tempted to write something like
+this::
+
+ using namespace orcus;
+
+ json::document_tree doc = {};
+
+However, this will leave the tree entirely unpopulated, i.e. the
+tree will not even have a root node! If you continue on and try to get a root
+node from this tree, you'll get a :cpp:class:`~orcus::json::document_error`
+thrown as a result. If you inspect the error message stored in the exception::
+
+ try
+ {
+ auto root = doc.get_document_root();
+ }
+ catch (const json::document_error& e)
+ {
+ std::cout << e.what() << std::endl;
+ }
+
+you will get
+
+.. code-block:: text
+
+ json::document_error: document tree is empty
+
+giving you further proof that the tree is indeed empty! The solution here is
+to directly assign an instance of :cpp:class:`~orcus::json::object` to the
+document tree, which will initialize the tree with an empty root object. The
+following code::
+
+ using namespace orcus;
+
+ json::document_tree doc = json::object();
+
+ std::cout << doc.dump() << std::endl;
+
+will therefore generate
+
+.. code-block:: text
+
+ {
+ }
+
+You can also use the :cpp:class:`~orcus::json::object` class instances to
+indicate empty objects anywhere in the tree. For instance, this code::
+
+ using namespace orcus;
+
+ json::document_tree doc = {
+ json::object(),
+ json::object(),
+ json::object()
+ };
+
+is intended to create an array containing three empty objects as its elements,
+and that's exactly what it does:
+
+.. code-block:: text
+
+ [
+ {
+ },
+ {
+ },
+ {
+ }
+ ]
+
+So far all the examples have shown how to initialize the document tree as the
+tree itself is being constructed. The next example shows how to add new
+key-value pairs to existing objects after the document tree instance has
+been initialized.
+
+::
+
+ using namespace orcus;
+
+ // Initialize the tree with an empty object.
+ json::document_tree doc = json::object();
+
+ // Get the root object, and assign three key-value pairs.
+ json::node root = doc.get_document_root();
+ root["child1"] = 1.0;
+ root["child2"] = "string";
+ root["child3"] = { true, false }; // implicit array
+
+ // You can also create a key-value pair whose value is another object.
+ root["child object"] = {
+ { "key1", 100.0 },
+ { "key2", 200.0 }
+ };
+
+ root["child array"] = json::array({ 1.1, 1.2, true }); // explicit array
+
+This code first initializes the tree with an empty object, then retrieves the
+empty root object and assigns several key-value pairs to it. When converting
+the tree content to a string and inspecting it, you'll see something like the
+following:
+
+.. code-block:: text
+
+ {
+ "child array": [
+ 1.1,
+ 1.2,
+ true
+ ],
+ "child1": 1,
+ "child3": [
+ true,
+ false
+ ],
+ "child2": "string",
+ "child object": {
+ "key1": 100,
+ "key2": 200
+ }
+ }
+
+The next example shows how to append values to an existing array after the
+tree has been constructed. Let's take a look at the code::
+
+ using namespace orcus;
+
+ // Initialize the tree with an empty array root.
+ json::document_tree doc = json::array();
+
+ // Get the root array.
+ json::node root = doc.get_document_root();
+
+ // Append values to the array.
+ root.push_back(-1.2);
+ root.push_back("string");
+ root.push_back(true);
+ root.push_back(nullptr);
+
+ // You can append an object to the array via push_back() as well.
+ root.push_back({{"key1", 1.1}, {"key2", 1.2}});
+
+Like the previous example, this code first initializes the tree, but this time
+with an empty array as its root, retrieves the root array, then appends
+several values to it via its :cpp:func:`~orcus::json::node::push_back` method.
+
+When you dump the content of this tree as a JSON string you'll get something
+like this:
+
+.. code-block:: text
+
+ [
+ -1.2,
+ "string",
+ true,
+ null,
+ {
+ "key1": 1.1,
+ "key2": 1.2
+ }
+ ]
+
diff --git a/doc_example/Makefile.am b/doc_example/Makefile.am
new file mode 100644
index 0000000..3e3ed7f
--- /dev/null
+++ b/doc_example/Makefile.am
@@ -0,0 +1,37 @@
+
+AM_CPPFLAGS = -I$(top_srcdir)/include
+
+bin_PROGRAMS =
+
+EXTRA_PROGRAMS = \
+ json-doc-1 \
+ json-doc-2 \
+ json-parser-1
+
+json_doc_1_SOURCES = \
+ json_doc_1.cpp
+
+json_doc_1_LDADD = \
+ ../src/liborcus/liborcus- at ORCUS_API_VERSION@.la \
+ ../src/parser/liborcus-parser- at ORCUS_API_VERSION@.la
+
+json_doc_2_SOURCES = \
+ json_doc_2.cpp
+
+json_doc_2_LDADD = \
+ ../src/liborcus/liborcus- at ORCUS_API_VERSION@.la \
+ ../src/parser/liborcus-parser- at ORCUS_API_VERSION@.la
+
+json_parser_1_SOURCES = \
+ json_parser_1.cpp
+
+json_parser_1_LDADD = \
+ ../src/parser/liborcus-parser- at ORCUS_API_VERSION@.la
+
+TESTS = \
+ json-doc-1 \
+ json-doc-2 \
+ json-parser-1
+
+distclean-local:
+ rm -rf $(TESTS)
diff --git a/doc_example/Makefile.in b/doc_example/Makefile.in
new file mode 100644
index 0000000..1a4b151
--- /dev/null
+++ b/doc_example/Makefile.in
@@ -0,0 +1,1114 @@
+# Makefile.in generated by automake 1.15 from Makefile.am.
+# @configure_input@
+
+# Copyright (C) 1994-2014 Free Software Foundation, Inc.
+
+# This Makefile.in is free software; the Free Software Foundation
+# gives unlimited permission to copy and/or distribute it,
+# with or without modifications, as long as this notice is preserved.
+
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY, to the extent permitted by law; without
+# even the implied warranty of MERCHANTABILITY or FITNESS FOR A
+# PARTICULAR PURPOSE.
+
+ at SET_MAKE@
+
+VPATH = @srcdir@
+am__is_gnu_make = { \
+ if test -z '$(MAKELEVEL)'; then \
+ false; \
+ elif test -n '$(MAKE_HOST)'; then \
+ true; \
+ elif test -n '$(MAKE_VERSION)' && test -n '$(CURDIR)'; then \
+ true; \
+ else \
+ false; \
+ fi; \
+}
+am__make_running_with_option = \
+ case $${target_option-} in \
+ ?) ;; \
+ *) echo "am__make_running_with_option: internal error: invalid" \
+ "target option '$${target_option-}' specified" >&2; \
+ exit 1;; \
+ esac; \
+ has_opt=no; \
+ sane_makeflags=$$MAKEFLAGS; \
+ if $(am__is_gnu_make); then \
+ sane_makeflags=$$MFLAGS; \
+ else \
+ case $$MAKEFLAGS in \
+ *\\[\ \ ]*) \
+ bs=\\; \
+ sane_makeflags=`printf '%s\n' "$$MAKEFLAGS" \
+ | sed "s/$$bs$$bs[$$bs $$bs ]*//g"`;; \
+ esac; \
+ fi; \
+ skip_next=no; \
+ strip_trailopt () \
+ { \
+ flg=`printf '%s\n' "$$flg" | sed "s/$$1.*$$//"`; \
+ }; \
+ for flg in $$sane_makeflags; do \
+ test $$skip_next = yes && { skip_next=no; continue; }; \
+ case $$flg in \
+ *=*|--*) continue;; \
+ -*I) strip_trailopt 'I'; skip_next=yes;; \
+ -*I?*) strip_trailopt 'I';; \
+ -*O) strip_trailopt 'O'; skip_next=yes;; \
+ -*O?*) strip_trailopt 'O';; \
+ -*l) strip_trailopt 'l'; skip_next=yes;; \
+ -*l?*) strip_trailopt 'l';; \
+ -[dEDm]) skip_next=yes;; \
+ -[JT]) skip_next=yes;; \
+ esac; \
+ case $$flg in \
+ *$$target_option*) has_opt=yes; break;; \
+ esac; \
+ done; \
+ test $$has_opt = yes
+am__make_dryrun = (target_option=n; $(am__make_running_with_option))
+am__make_keepgoing = (target_option=k; $(am__make_running_with_option))
+pkgdatadir = $(datadir)/@PACKAGE@
+pkgincludedir = $(includedir)/@PACKAGE@
+pkglibdir = $(libdir)/@PACKAGE@
+pkglibexecdir = $(libexecdir)/@PACKAGE@
+am__cd = CDPATH="$${ZSH_VERSION+.}$(PATH_SEPARATOR)" && cd
+install_sh_DATA = $(install_sh) -c -m 644
+install_sh_PROGRAM = $(install_sh) -c
+install_sh_SCRIPT = $(install_sh) -c
+INSTALL_HEADER = $(INSTALL_DATA)
+transform = $(program_transform_name)
+NORMAL_INSTALL = :
+PRE_INSTALL = :
+POST_INSTALL = :
+NORMAL_UNINSTALL = :
+PRE_UNINSTALL = :
+POST_UNINSTALL = :
+build_triplet = @build@
+host_triplet = @host@
+bin_PROGRAMS =
+EXTRA_PROGRAMS = json-doc-1$(EXEEXT) json-doc-2$(EXEEXT) \
+ json-parser-1$(EXEEXT)
+TESTS = json-doc-1$(EXEEXT) json-doc-2$(EXEEXT) json-parser-1$(EXEEXT)
+subdir = doc_example
+ACLOCAL_M4 = $(top_srcdir)/aclocal.m4
+am__aclocal_m4_deps = $(top_srcdir)/m4/ax_cxx_compile_stdcxx_11.m4 \
+ $(top_srcdir)/m4/boost.m4 $(top_srcdir)/m4/libtool.m4 \
+ $(top_srcdir)/m4/ltoptions.m4 $(top_srcdir)/m4/ltsugar.m4 \
+ $(top_srcdir)/m4/ltversion.m4 $(top_srcdir)/m4/lt~obsolete.m4 \
+ $(top_srcdir)/configure.ac
+am__configure_deps = $(am__aclocal_m4_deps) $(CONFIGURE_DEPENDENCIES) \
+ $(ACLOCAL_M4)
+DIST_COMMON = $(srcdir)/Makefile.am $(am__DIST_COMMON)
+mkinstalldirs = $(install_sh) -d
+CONFIG_HEADER = $(top_builddir)/config.h
+CONFIG_CLEAN_FILES =
+CONFIG_CLEAN_VPATH_FILES =
+am__installdirs = "$(DESTDIR)$(bindir)"
+PROGRAMS = $(bin_PROGRAMS)
+am_json_doc_1_OBJECTS = json_doc_1.$(OBJEXT)
+json_doc_1_OBJECTS = $(am_json_doc_1_OBJECTS)
+json_doc_1_DEPENDENCIES = \
+ ../src/liborcus/liborcus- at ORCUS_API_VERSION@.la \
+ ../src/parser/liborcus-parser- at ORCUS_API_VERSION@.la
+AM_V_lt = $(am__v_lt_ at AM_V@)
+am__v_lt_ = $(am__v_lt_ at AM_DEFAULT_V@)
+am__v_lt_0 = --silent
+am__v_lt_1 =
+am_json_doc_2_OBJECTS = json_doc_2.$(OBJEXT)
+json_doc_2_OBJECTS = $(am_json_doc_2_OBJECTS)
+json_doc_2_DEPENDENCIES = \
+ ../src/liborcus/liborcus- at ORCUS_API_VERSION@.la \
+ ../src/parser/liborcus-parser- at ORCUS_API_VERSION@.la
+am_json_parser_1_OBJECTS = json_parser_1.$(OBJEXT)
+json_parser_1_OBJECTS = $(am_json_parser_1_OBJECTS)
+json_parser_1_DEPENDENCIES = \
+ ../src/parser/liborcus-parser- at ORCUS_API_VERSION@.la
+AM_V_P = $(am__v_P_ at AM_V@)
+am__v_P_ = $(am__v_P_ at AM_DEFAULT_V@)
+am__v_P_0 = false
+am__v_P_1 = :
+AM_V_GEN = $(am__v_GEN_ at AM_V@)
+am__v_GEN_ = $(am__v_GEN_ at AM_DEFAULT_V@)
+am__v_GEN_0 = @echo " GEN " $@;
+am__v_GEN_1 =
+AM_V_at = $(am__v_at_ at AM_V@)
+am__v_at_ = $(am__v_at_ at AM_DEFAULT_V@)
+am__v_at_0 = @
+am__v_at_1 =
+DEFAULT_INCLUDES = -I. at am__isrc@ -I$(top_builddir)
+depcomp = $(SHELL) $(top_srcdir)/depcomp
+am__depfiles_maybe = depfiles
+am__mv = mv -f
+CXXCOMPILE = $(CXX) $(DEFS) $(DEFAULT_INCLUDES) $(INCLUDES) \
+ $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_CXXFLAGS) $(CXXFLAGS)
+LTCXXCOMPILE = $(LIBTOOL) $(AM_V_lt) --tag=CXX $(AM_LIBTOOLFLAGS) \
+ $(LIBTOOLFLAGS) --mode=compile $(CXX) $(DEFS) \
+ $(DEFAULT_INCLUDES) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) \
+ $(AM_CXXFLAGS) $(CXXFLAGS)
+AM_V_CXX = $(am__v_CXX_ at AM_V@)
+am__v_CXX_ = $(am__v_CXX_ at AM_DEFAULT_V@)
+am__v_CXX_0 = @echo " CXX " $@;
+am__v_CXX_1 =
+CXXLD = $(CXX)
+CXXLINK = $(LIBTOOL) $(AM_V_lt) --tag=CXX $(AM_LIBTOOLFLAGS) \
+ $(LIBTOOLFLAGS) --mode=link $(CXXLD) $(AM_CXXFLAGS) \
+ $(CXXFLAGS) $(AM_LDFLAGS) $(LDFLAGS) -o $@
+AM_V_CXXLD = $(am__v_CXXLD_ at AM_V@)
+am__v_CXXLD_ = $(am__v_CXXLD_ at AM_DEFAULT_V@)
+am__v_CXXLD_0 = @echo " CXXLD " $@;
+am__v_CXXLD_1 =
+SOURCES = $(json_doc_1_SOURCES) $(json_doc_2_SOURCES) \
+ $(json_parser_1_SOURCES)
+DIST_SOURCES = $(json_doc_1_SOURCES) $(json_doc_2_SOURCES) \
+ $(json_parser_1_SOURCES)
+am__can_run_installinfo = \
+ case $$AM_UPDATE_INFO_DIR in \
+ n|no|NO) false;; \
+ *) (install-info --version) >/dev/null 2>&1;; \
+ esac
+am__tagged_files = $(HEADERS) $(SOURCES) $(TAGS_FILES) $(LISP)
+# Read a list of newline-separated strings from the standard input,
+# and print each of them once, without duplicates. Input order is
+# *not* preserved.
+am__uniquify_input = $(AWK) '\
+ BEGIN { nonempty = 0; } \
+ { items[$$0] = 1; nonempty = 1; } \
+ END { if (nonempty) { for (i in items) print i; }; } \
+'
+# Make sure the list of sources is unique. This is necessary because,
+# e.g., the same source file might be shared among _SOURCES variables
+# for different programs/libraries.
+am__define_uniq_tagged_files = \
+ list='$(am__tagged_files)'; \
+ unique=`for i in $$list; do \
+ if test -f "$$i"; then echo $$i; else echo $(srcdir)/$$i; fi; \
+ done | $(am__uniquify_input)`
+ETAGS = etags
+CTAGS = ctags
+am__tty_colors_dummy = \
+ mgn= red= grn= lgn= blu= brg= std=; \
+ am__color_tests=no
+am__tty_colors = { \
+ $(am__tty_colors_dummy); \
+ if test "X$(AM_COLOR_TESTS)" = Xno; then \
+ am__color_tests=no; \
+ elif test "X$(AM_COLOR_TESTS)" = Xalways; then \
+ am__color_tests=yes; \
+ elif test "X$$TERM" != Xdumb && { test -t 1; } 2>/dev/null; then \
+ am__color_tests=yes; \
+ fi; \
+ if test $$am__color_tests = yes; then \
+ red='[0;31m'; \
+ grn='[0;32m'; \
+ lgn='[1;32m'; \
+ blu='[1;34m'; \
+ mgn='[0;35m'; \
+ brg='[1m'; \
+ std='[m'; \
+ fi; \
+}
+am__vpath_adj_setup = srcdirstrip=`echo "$(srcdir)" | sed 's|.|.|g'`;
+am__vpath_adj = case $$p in \
+ $(srcdir)/*) f=`echo "$$p" | sed "s|^$$srcdirstrip/||"`;; \
+ *) f=$$p;; \
+ esac;
+am__strip_dir = f=`echo $$p | sed -e 's|^.*/||'`;
+am__install_max = 40
+am__nobase_strip_setup = \
+ srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*|]/\\\\&/g'`
+am__nobase_strip = \
+ for p in $$list; do echo "$$p"; done | sed -e "s|$$srcdirstrip/||"
+am__nobase_list = $(am__nobase_strip_setup); \
+ for p in $$list; do echo "$$p $$p"; done | \
+ sed "s| $$srcdirstrip/| |;"' / .*\//!s/ .*/ ./; s,\( .*\)/[^/]*$$,\1,' | \
+ $(AWK) 'BEGIN { files["."] = "" } { files[$$2] = files[$$2] " " $$1; \
+ if (++n[$$2] == $(am__install_max)) \
+ { print $$2, files[$$2]; n[$$2] = 0; files[$$2] = "" } } \
+ END { for (dir in files) print dir, files[dir] }'
+am__base_list = \
+ sed '$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;$$!N;s/\n/ /g' | \
+ sed '$$!N;$$!N;$$!N;$$!N;s/\n/ /g'
+am__uninstall_files_from_dir = { \
+ test -z "$$files" \
+ || { test ! -d "$$dir" && test ! -f "$$dir" && test ! -r "$$dir"; } \
+ || { echo " ( cd '$$dir' && rm -f" $$files ")"; \
+ $(am__cd) "$$dir" && rm -f $$files; }; \
+ }
+am__recheck_rx = ^[ ]*:recheck:[ ]*
+am__global_test_result_rx = ^[ ]*:global-test-result:[ ]*
+am__copy_in_global_log_rx = ^[ ]*:copy-in-global-log:[ ]*
+# A command that, given a newline-separated list of test names on the
+# standard input, print the name of the tests that are to be re-run
+# upon "make recheck".
+am__list_recheck_tests = $(AWK) '{ \
+ recheck = 1; \
+ while ((rc = (getline line < ($$0 ".trs"))) != 0) \
+ { \
+ if (rc < 0) \
+ { \
+ if ((getline line2 < ($$0 ".log")) < 0) \
+ recheck = 0; \
+ break; \
+ } \
+ else if (line ~ /$(am__recheck_rx)[nN][Oo]/) \
+ { \
+ recheck = 0; \
+ break; \
+ } \
+ else if (line ~ /$(am__recheck_rx)[yY][eE][sS]/) \
+ { \
+ break; \
+ } \
+ }; \
+ if (recheck) \
+ print $$0; \
+ close ($$0 ".trs"); \
+ close ($$0 ".log"); \
+}'
+# A command that, given a newline-separated list of test names on the
+# standard input, create the global log from their .trs and .log files.
+am__create_global_log = $(AWK) ' \
+function fatal(msg) \
+{ \
+ print "fatal: making $@: " msg | "cat >&2"; \
+ exit 1; \
+} \
+function rst_section(header) \
+{ \
+ print header; \
+ len = length(header); \
+ for (i = 1; i <= len; i = i + 1) \
+ printf "="; \
+ printf "\n\n"; \
+} \
+{ \
+ copy_in_global_log = 1; \
+ global_test_result = "RUN"; \
+ while ((rc = (getline line < ($$0 ".trs"))) != 0) \
+ { \
+ if (rc < 0) \
+ fatal("failed to read from " $$0 ".trs"); \
+ if (line ~ /$(am__global_test_result_rx)/) \
+ { \
+ sub("$(am__global_test_result_rx)", "", line); \
+ sub("[ ]*$$", "", line); \
+ global_test_result = line; \
+ } \
+ else if (line ~ /$(am__copy_in_global_log_rx)[nN][oO]/) \
+ copy_in_global_log = 0; \
+ }; \
+ if (copy_in_global_log) \
+ { \
+ rst_section(global_test_result ": " $$0); \
+ while ((rc = (getline line < ($$0 ".log"))) != 0) \
+ { \
+ if (rc < 0) \
+ fatal("failed to read from " $$0 ".log"); \
+ print line; \
+ }; \
+ printf "\n"; \
+ }; \
+ close ($$0 ".trs"); \
+ close ($$0 ".log"); \
+}'
+# Restructured Text title.
+am__rst_title = { sed 's/.*/ & /;h;s/./=/g;p;x;s/ *$$//;p;g' && echo; }
+# Solaris 10 'make', and several other traditional 'make' implementations,
+# pass "-e" to $(SHELL), and POSIX 2008 even requires this. Work around it
+# by disabling -e (using the XSI extension "set +e") if it's set.
+am__sh_e_setup = case $$- in *e*) set +e;; esac
+# Default flags passed to test drivers.
+am__common_driver_flags = \
+ --color-tests "$$am__color_tests" \
+ --enable-hard-errors "$$am__enable_hard_errors" \
+ --expect-failure "$$am__expect_failure"
+# To be inserted before the command running the test. Creates the
+# directory for the log if needed. Stores in $dir the directory
+# containing $f, in $tst the test, in $log the log. Executes the
+# developer-defined test setup AM_TESTS_ENVIRONMENT (if any), and
+# passes TESTS_ENVIRONMENT. Set up options for the wrapper that
+# will run the test scripts (or their associated LOG_COMPILER, if
+# they have one).
+am__check_pre = \
+$(am__sh_e_setup); \
+$(am__vpath_adj_setup) $(am__vpath_adj) \
+$(am__tty_colors); \
+srcdir=$(srcdir); export srcdir; \
+case "$@" in \
+ */*) am__odir=`echo "./$@" | sed 's|/[^/]*$$||'`;; \
+ *) am__odir=.;; \
+esac; \
+test "x$$am__odir" = x"." || test -d "$$am__odir" \
+ || $(MKDIR_P) "$$am__odir" || exit $$?; \
+if test -f "./$$f"; then dir=./; \
+elif test -f "$$f"; then dir=; \
+else dir="$(srcdir)/"; fi; \
+tst=$$dir$$f; log='$@'; \
+if test -n '$(DISABLE_HARD_ERRORS)'; then \
+ am__enable_hard_errors=no; \
+else \
+ am__enable_hard_errors=yes; \
+fi; \
+case " $(XFAIL_TESTS) " in \
+ *[\ \ ]$$f[\ \ ]* | *[\ \ ]$$dir$$f[\ \ ]*) \
+ am__expect_failure=yes;; \
+ *) \
+ am__expect_failure=no;; \
+esac; \
+$(AM_TESTS_ENVIRONMENT) $(TESTS_ENVIRONMENT)
+# A shell command to get the names of the test scripts with any registered
+# extension removed (i.e., equivalently, the names of the test logs, with
+# the '.log' extension removed). The result is saved in the shell variable
+# '$bases'. This honors runtime overriding of TESTS and TEST_LOGS. Sadly,
+# we cannot use something simpler, involving e.g., "$(TEST_LOGS:.log=)",
+# since that might cause problems with VPATH rewrites for suffix-less tests.
+# See also 'test-harness-vpath-rewrite.sh' and 'test-trs-basic.sh'.
+am__set_TESTS_bases = \
+ bases='$(TEST_LOGS)'; \
+ bases=`for i in $$bases; do echo $$i; done | sed 's/\.log$$//'`; \
+ bases=`echo $$bases`
+RECHECK_LOGS = $(TEST_LOGS)
+AM_RECURSIVE_TARGETS = check recheck
+TEST_SUITE_LOG = test-suite.log
+TEST_EXTENSIONS = @EXEEXT@ .test
+LOG_DRIVER = $(SHELL) $(top_srcdir)/test-driver
+LOG_COMPILE = $(LOG_COMPILER) $(AM_LOG_FLAGS) $(LOG_FLAGS)
+am__set_b = \
+ case '$@' in \
+ */*) \
+ case '$*' in \
+ */*) b='$*';; \
+ *) b=`echo '$@' | sed 's/\.log$$//'`; \
+ esac;; \
+ *) \
+ b='$*';; \
+ esac
+am__test_logs1 = $(TESTS:=.log)
+am__test_logs2 = $(am__test_logs1:@EXEEXT@.log=.log)
+TEST_LOGS = $(am__test_logs2:.test.log=.log)
+TEST_LOG_DRIVER = $(SHELL) $(top_srcdir)/test-driver
+TEST_LOG_COMPILE = $(TEST_LOG_COMPILER) $(AM_TEST_LOG_FLAGS) \
+ $(TEST_LOG_FLAGS)
+am__DIST_COMMON = $(srcdir)/Makefile.in $(top_srcdir)/depcomp \
+ $(top_srcdir)/test-driver
+DISTFILES = $(DIST_COMMON) $(DIST_SOURCES) $(TEXINFOS) $(EXTRA_DIST)
+ACLOCAL = @ACLOCAL@
+AMTAR = @AMTAR@
+AM_DEFAULT_VERBOSITY = @AM_DEFAULT_VERBOSITY@
+AR = @AR@
+AS = @AS@
+AUTOCONF = @AUTOCONF@
+AUTOHEADER = @AUTOHEADER@
+AUTOMAKE = @AUTOMAKE@
+AWK = @AWK@
+BOOST_CPPFLAGS = @BOOST_CPPFLAGS@
+BOOST_DATE_TIME_LDFLAGS = @BOOST_DATE_TIME_LDFLAGS@
+BOOST_DATE_TIME_LDPATH = @BOOST_DATE_TIME_LDPATH@
+BOOST_DATE_TIME_LIBS = @BOOST_DATE_TIME_LIBS@
+BOOST_FILESYSTEM_LDFLAGS = @BOOST_FILESYSTEM_LDFLAGS@
+BOOST_FILESYSTEM_LDPATH = @BOOST_FILESYSTEM_LDPATH@
+BOOST_FILESYSTEM_LIBS = @BOOST_FILESYSTEM_LIBS@
+BOOST_IOSTREAMS_LDFLAGS = @BOOST_IOSTREAMS_LDFLAGS@
+BOOST_IOSTREAMS_LDPATH = @BOOST_IOSTREAMS_LDPATH@
+BOOST_IOSTREAMS_LIBS = @BOOST_IOSTREAMS_LIBS@
+BOOST_LDPATH = @BOOST_LDPATH@
+BOOST_PROGRAM_OPTIONS_LDFLAGS = @BOOST_PROGRAM_OPTIONS_LDFLAGS@
+BOOST_PROGRAM_OPTIONS_LDPATH = @BOOST_PROGRAM_OPTIONS_LDPATH@
+BOOST_PROGRAM_OPTIONS_LIBS = @BOOST_PROGRAM_OPTIONS_LIBS@
+BOOST_ROOT = @BOOST_ROOT@
+BOOST_SYSTEM_LDFLAGS = @BOOST_SYSTEM_LDFLAGS@
+BOOST_SYSTEM_LDPATH = @BOOST_SYSTEM_LDPATH@
+BOOST_SYSTEM_LIBS = @BOOST_SYSTEM_LIBS@
+CC = @CC@
+CCDEPMODE = @CCDEPMODE@
+CFLAGS = @CFLAGS@
+CPP = @CPP@
+CPPFLAGS = @CPPFLAGS@
+CXX = @CXX@
+CXXCPP = @CXXCPP@
+CXXDEPMODE = @CXXDEPMODE@
+CXXFLAGS = @CXXFLAGS@
+CYGPATH_W = @CYGPATH_W@
+DEFS = @DEFS@
+DEPDIR = @DEPDIR@
+DISTCHECK_CONFIGURE_FLAGS = @DISTCHECK_CONFIGURE_FLAGS@
+DLLTOOL = @DLLTOOL@
+DSYMUTIL = @DSYMUTIL@
+DUMPBIN = @DUMPBIN@
+ECHO_C = @ECHO_C@
+ECHO_N = @ECHO_N@
+ECHO_T = @ECHO_T@
+EGREP = @EGREP@
+EXEEXT = @EXEEXT@
+FGREP = @FGREP@
+GREP = @GREP@
+HAVE_CXX11 = @HAVE_CXX11@
+INSTALL = @INSTALL@
+INSTALL_DATA = @INSTALL_DATA@
+INSTALL_PROGRAM = @INSTALL_PROGRAM@
+INSTALL_SCRIPT = @INSTALL_SCRIPT@
+INSTALL_STRIP_PROGRAM = @INSTALL_STRIP_PROGRAM@
+IXION_REQUIRED_API_VERSION = @IXION_REQUIRED_API_VERSION@
+LD = @LD@
+LDFLAGS = @LDFLAGS@
+LIBIXION_CFLAGS = @LIBIXION_CFLAGS@
+LIBIXION_LIBS = @LIBIXION_LIBS@
+LIBOBJS = @LIBOBJS@
+LIBS = @LIBS@
+LIBTOOL = @LIBTOOL@
+LIPO = @LIPO@
+LN_S = @LN_S@
+LTLIBOBJS = @LTLIBOBJS@
+LT_SYS_LIBRARY_PATH = @LT_SYS_LIBRARY_PATH@
+MAKEINFO = @MAKEINFO@
+MANIFEST_TOOL = @MANIFEST_TOOL@
+MDDS_CFLAGS = @MDDS_CFLAGS@
+MDDS_LIBS = @MDDS_LIBS@
+MKDIR_P = @MKDIR_P@
+NM = @NM@
+NMEDIT = @NMEDIT@
+OBJDUMP = @OBJDUMP@
+OBJEXT = @OBJEXT@
+ORCUS_API_VERSION = @ORCUS_API_VERSION@
+ORCUS_MAJOR_VERSION = @ORCUS_MAJOR_VERSION@
+ORCUS_MICRO_VERSION = @ORCUS_MICRO_VERSION@
+ORCUS_MINOR_VERSION = @ORCUS_MINOR_VERSION@
+OTOOL = @OTOOL@
+OTOOL64 = @OTOOL64@
+PACKAGE = @PACKAGE@
+PACKAGE_BUGREPORT = @PACKAGE_BUGREPORT@
+PACKAGE_NAME = @PACKAGE_NAME@
+PACKAGE_STRING = @PACKAGE_STRING@
+PACKAGE_TARNAME = @PACKAGE_TARNAME@
+PACKAGE_URL = @PACKAGE_URL@
+PACKAGE_VERSION = @PACKAGE_VERSION@
+PATH_SEPARATOR = @PATH_SEPARATOR@
+PKG_CONFIG = @PKG_CONFIG@
+PKG_CONFIG_LIBDIR = @PKG_CONFIG_LIBDIR@
+PKG_CONFIG_PATH = @PKG_CONFIG_PATH@
+POW_LIB = @POW_LIB@
+PYTHON = @PYTHON@
+PYTHON_CFLAGS = @PYTHON_CFLAGS@
+PYTHON_EXEC_PREFIX = @PYTHON_EXEC_PREFIX@
+PYTHON_LIBS = @PYTHON_LIBS@
+PYTHON_PLATFORM = @PYTHON_PLATFORM@
+PYTHON_PREFIX = @PYTHON_PREFIX@
+PYTHON_VERSION = @PYTHON_VERSION@
+RANLIB = @RANLIB@
+SED = @SED@
+SET_MAKE = @SET_MAKE@
+SHELL = @SHELL@
+STRIP = @STRIP@
+VERSION = @VERSION@
+ZLIB_CFLAGS = @ZLIB_CFLAGS@
+ZLIB_LIBS = @ZLIB_LIBS@
+abs_builddir = @abs_builddir@
+abs_srcdir = @abs_srcdir@
+abs_top_builddir = @abs_top_builddir@
+abs_top_srcdir = @abs_top_srcdir@
+ac_ct_AR = @ac_ct_AR@
+ac_ct_CC = @ac_ct_CC@
+ac_ct_CXX = @ac_ct_CXX@
+ac_ct_DUMPBIN = @ac_ct_DUMPBIN@
+am__include = @am__include@
+am__leading_dot = @am__leading_dot@
+am__quote = @am__quote@
+am__tar = @am__tar@
+am__untar = @am__untar@
+bindir = @bindir@
+build = @build@
+build_alias = @build_alias@
+build_cpu = @build_cpu@
+build_os = @build_os@
+build_vendor = @build_vendor@
+builddir = @builddir@
+datadir = @datadir@
+datarootdir = @datarootdir@
+docdir = @docdir@
+dvidir = @dvidir@
+exec_prefix = @exec_prefix@
+host = @host@
+host_alias = @host_alias@
+host_cpu = @host_cpu@
+host_os = @host_os@
+host_vendor = @host_vendor@
+htmldir = @htmldir@
+includedir = @includedir@
+infodir = @infodir@
+install_sh = @install_sh@
+libdir = @libdir@
+libexecdir = @libexecdir@
+localedir = @localedir@
+localstatedir = @localstatedir@
+mandir = @mandir@
+mkdir_p = @mkdir_p@
+oldincludedir = @oldincludedir@
+pdfdir = @pdfdir@
+pkgpyexecdir = @pkgpyexecdir@
+pkgpythondir = @pkgpythondir@
+prefix = @prefix@
+program_transform_name = @program_transform_name@
+psdir = @psdir@
+pyexecdir = @pyexecdir@
+pythondir = @pythondir@
+runstatedir = @runstatedir@
+sbindir = @sbindir@
+sharedstatedir = @sharedstatedir@
+srcdir = @srcdir@
+sysconfdir = @sysconfdir@
+target_alias = @target_alias@
+top_build_prefix = @top_build_prefix@
+top_builddir = @top_builddir@
+top_srcdir = @top_srcdir@
+AM_CPPFLAGS = -I$(top_srcdir)/include
+json_doc_1_SOURCES = \
+ json_doc_1.cpp
+
+json_doc_1_LDADD = \
+	../src/liborcus/liborcus-@ORCUS_API_VERSION@.la \
+	../src/parser/liborcus-parser-@ORCUS_API_VERSION@.la
+
+json_doc_2_SOURCES = \
+ json_doc_2.cpp
+
+json_doc_2_LDADD = \
+	../src/liborcus/liborcus-@ORCUS_API_VERSION@.la \
+	../src/parser/liborcus-parser-@ORCUS_API_VERSION@.la
+
+json_parser_1_SOURCES = \
+ json_parser_1.cpp
+
+json_parser_1_LDADD = \
+	../src/parser/liborcus-parser-@ORCUS_API_VERSION@.la
+
+all: all-am
+
+.SUFFIXES:
+.SUFFIXES: .cpp .lo .log .o .obj .test .test$(EXEEXT) .trs
+$(srcdir)/Makefile.in: $(srcdir)/Makefile.am $(am__configure_deps)
+ @for dep in $?; do \
+ case '$(am__configure_deps)' in \
+ *$$dep*) \
+ ( cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh ) \
+ && { if test -f $@; then exit 0; else break; fi; }; \
+ exit 1;; \
+ esac; \
+ done; \
+ echo ' cd $(top_srcdir) && $(AUTOMAKE) --foreign doc_example/Makefile'; \
+ $(am__cd) $(top_srcdir) && \
+ $(AUTOMAKE) --foreign doc_example/Makefile
+Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
+ @case '$?' in \
+ *config.status*) \
+ cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh;; \
+ *) \
+ echo ' cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe)'; \
+ cd $(top_builddir) && $(SHELL) ./config.status $(subdir)/$@ $(am__depfiles_maybe);; \
+ esac;
+
+$(top_builddir)/config.status: $(top_srcdir)/configure $(CONFIG_STATUS_DEPENDENCIES)
+ cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
+
+$(top_srcdir)/configure: $(am__configure_deps)
+ cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
+$(ACLOCAL_M4): $(am__aclocal_m4_deps)
+ cd $(top_builddir) && $(MAKE) $(AM_MAKEFLAGS) am--refresh
+$(am__aclocal_m4_deps):
+install-binPROGRAMS: $(bin_PROGRAMS)
+ @$(NORMAL_INSTALL)
+ @list='$(bin_PROGRAMS)'; test -n "$(bindir)" || list=; \
+ if test -n "$$list"; then \
+ echo " $(MKDIR_P) '$(DESTDIR)$(bindir)'"; \
+ $(MKDIR_P) "$(DESTDIR)$(bindir)" || exit 1; \
+ fi; \
+ for p in $$list; do echo "$$p $$p"; done | \
+ sed 's/$(EXEEXT)$$//' | \
+ while read p p1; do if test -f $$p \
+ || test -f $$p1 \
+ ; then echo "$$p"; echo "$$p"; else :; fi; \
+ done | \
+ sed -e 'p;s,.*/,,;n;h' \
+ -e 's|.*|.|' \
+ -e 'p;x;s,.*/,,;s/$(EXEEXT)$$//;$(transform);s/$$/$(EXEEXT)/' | \
+ sed 'N;N;N;s,\n, ,g' | \
+ $(AWK) 'BEGIN { files["."] = ""; dirs["."] = 1 } \
+ { d=$$3; if (dirs[d] != 1) { print "d", d; dirs[d] = 1 } \
+ if ($$2 == $$4) files[d] = files[d] " " $$1; \
+ else { print "f", $$3 "/" $$4, $$1; } } \
+ END { for (d in files) print "f", d, files[d] }' | \
+ while read type dir files; do \
+ if test "$$dir" = .; then dir=; else dir=/$$dir; fi; \
+ test -z "$$files" || { \
+ echo " $(INSTALL_PROGRAM_ENV) $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=install $(INSTALL_PROGRAM) $$files '$(DESTDIR)$(bindir)$$dir'"; \
+ $(INSTALL_PROGRAM_ENV) $(LIBTOOL) $(AM_LIBTOOLFLAGS) $(LIBTOOLFLAGS) --mode=install $(INSTALL_PROGRAM) $$files "$(DESTDIR)$(bindir)$$dir" || exit $$?; \
+ } \
+ ; done
+
+uninstall-binPROGRAMS:
+ @$(NORMAL_UNINSTALL)
+ @list='$(bin_PROGRAMS)'; test -n "$(bindir)" || list=; \
+ files=`for p in $$list; do echo "$$p"; done | \
+ sed -e 'h;s,^.*/,,;s/$(EXEEXT)$$//;$(transform)' \
+ -e 's/$$/$(EXEEXT)/' \
+ `; \
+ test -n "$$list" || exit 0; \
+ echo " ( cd '$(DESTDIR)$(bindir)' && rm -f" $$files ")"; \
+ cd "$(DESTDIR)$(bindir)" && rm -f $$files
+
+clean-binPROGRAMS:
+ @list='$(bin_PROGRAMS)'; test -n "$$list" || exit 0; \
+ echo " rm -f" $$list; \
+ rm -f $$list || exit $$?; \
+ test -n "$(EXEEXT)" || exit 0; \
+ list=`for p in $$list; do echo "$$p"; done | sed 's/$(EXEEXT)$$//'`; \
+ echo " rm -f" $$list; \
+ rm -f $$list
+
+json-doc-1$(EXEEXT): $(json_doc_1_OBJECTS) $(json_doc_1_DEPENDENCIES) $(EXTRA_json_doc_1_DEPENDENCIES)
+ @rm -f json-doc-1$(EXEEXT)
+ $(AM_V_CXXLD)$(CXXLINK) $(json_doc_1_OBJECTS) $(json_doc_1_LDADD) $(LIBS)
+
+json-doc-2$(EXEEXT): $(json_doc_2_OBJECTS) $(json_doc_2_DEPENDENCIES) $(EXTRA_json_doc_2_DEPENDENCIES)
+ @rm -f json-doc-2$(EXEEXT)
+ $(AM_V_CXXLD)$(CXXLINK) $(json_doc_2_OBJECTS) $(json_doc_2_LDADD) $(LIBS)
+
+json-parser-1$(EXEEXT): $(json_parser_1_OBJECTS) $(json_parser_1_DEPENDENCIES) $(EXTRA_json_parser_1_DEPENDENCIES)
+ @rm -f json-parser-1$(EXEEXT)
+ $(AM_V_CXXLD)$(CXXLINK) $(json_parser_1_OBJECTS) $(json_parser_1_LDADD) $(LIBS)
+
+mostlyclean-compile:
+ -rm -f *.$(OBJEXT)
+
+distclean-compile:
+ -rm -f *.tab.c
+
+@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/json_doc_1.Po@am__quote@
+@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/json_doc_2.Po@am__quote@
+@AMDEP_TRUE@@am__include@ @am__quote@./$(DEPDIR)/json_parser_1.Po@am__quote@
+
+.cpp.o:
+@am__fastdepCXX_TRUE@	$(AM_V_CXX)$(CXXCOMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $<
+@am__fastdepCXX_TRUE@	$(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@	$(AM_V_CXX)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@	DEPDIR=$(DEPDIR) $(CXXDEPMODE) $(depcomp) @AMDEPBACKSLASH@
+@am__fastdepCXX_FALSE@	$(AM_V_CXX@am__nodep@)$(CXXCOMPILE) -c -o $@ $<
+
+.cpp.obj:
+@am__fastdepCXX_TRUE@	$(AM_V_CXX)$(CXXCOMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ `$(CYGPATH_W) '$<'`
+@am__fastdepCXX_TRUE@	$(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Po
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@	$(AM_V_CXX)source='$<' object='$@' libtool=no @AMDEPBACKSLASH@
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@	DEPDIR=$(DEPDIR) $(CXXDEPMODE) $(depcomp) @AMDEPBACKSLASH@
+@am__fastdepCXX_FALSE@	$(AM_V_CXX@am__nodep@)$(CXXCOMPILE) -c -o $@ `$(CYGPATH_W) '$<'`
+
+.cpp.lo:
+@am__fastdepCXX_TRUE@	$(AM_V_CXX)$(LTCXXCOMPILE) -MT $@ -MD -MP -MF $(DEPDIR)/$*.Tpo -c -o $@ $<
+@am__fastdepCXX_TRUE@	$(AM_V_at)$(am__mv) $(DEPDIR)/$*.Tpo $(DEPDIR)/$*.Plo
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@	$(AM_V_CXX)source='$<' object='$@' libtool=yes @AMDEPBACKSLASH@
+@AMDEP_TRUE@@am__fastdepCXX_FALSE@	DEPDIR=$(DEPDIR) $(CXXDEPMODE) $(depcomp) @AMDEPBACKSLASH@
+@am__fastdepCXX_FALSE@	$(AM_V_CXX@am__nodep@)$(LTCXXCOMPILE) -c -o $@ $<
+
+mostlyclean-libtool:
+ -rm -f *.lo
+
+clean-libtool:
+ -rm -rf .libs _libs
+
+ID: $(am__tagged_files)
+ $(am__define_uniq_tagged_files); mkid -fID $$unique
+tags: tags-am
+TAGS: tags
+
+tags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files)
+ set x; \
+ here=`pwd`; \
+ $(am__define_uniq_tagged_files); \
+ shift; \
+ if test -z "$(ETAGS_ARGS)$$*$$unique"; then :; else \
+ test -n "$$unique" || unique=$$empty_fix; \
+ if test $$# -gt 0; then \
+ $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
+ "$$@" $$unique; \
+ else \
+ $(ETAGS) $(ETAGSFLAGS) $(AM_ETAGSFLAGS) $(ETAGS_ARGS) \
+ $$unique; \
+ fi; \
+ fi
+ctags: ctags-am
+
+CTAGS: ctags
+ctags-am: $(TAGS_DEPENDENCIES) $(am__tagged_files)
+ $(am__define_uniq_tagged_files); \
+ test -z "$(CTAGS_ARGS)$$unique" \
+ || $(CTAGS) $(CTAGSFLAGS) $(AM_CTAGSFLAGS) $(CTAGS_ARGS) \
+ $$unique
+
+GTAGS:
+ here=`$(am__cd) $(top_builddir) && pwd` \
+ && $(am__cd) $(top_srcdir) \
+ && gtags -i $(GTAGS_ARGS) "$$here"
+cscopelist: cscopelist-am
+
+cscopelist-am: $(am__tagged_files)
+ list='$(am__tagged_files)'; \
+ case "$(srcdir)" in \
+ [\\/]* | ?:[\\/]*) sdir="$(srcdir)" ;; \
+ *) sdir=$(subdir)/$(srcdir) ;; \
+ esac; \
+ for i in $$list; do \
+ if test -f "$$i"; then \
+ echo "$(subdir)/$$i"; \
+ else \
+ echo "$$sdir/$$i"; \
+ fi; \
+ done >> $(top_builddir)/cscope.files
+
+distclean-tags:
+ -rm -f TAGS ID GTAGS GRTAGS GSYMS GPATH tags
+
+# Recover from deleted '.trs' file; this should ensure that
+# "rm -f foo.log; make foo.trs" re-runs 'foo.test', and re-creates
+# both 'foo.log' and 'foo.trs'. Break the recipe in two subshells
+# to avoid problems with "make -n".
+.log.trs:
+ rm -f $< $@
+ $(MAKE) $(AM_MAKEFLAGS) $<
+
+# Leading 'am--fnord' is there to ensure the list of targets does not
+# expand to empty, as could happen e.g. with make check TESTS=''.
+am--fnord $(TEST_LOGS) $(TEST_LOGS:.log=.trs): $(am__force_recheck)
+am--force-recheck:
+ @:
+
+$(TEST_SUITE_LOG): $(TEST_LOGS)
+ @$(am__set_TESTS_bases); \
+ am__f_ok () { test -f "$$1" && test -r "$$1"; }; \
+ redo_bases=`for i in $$bases; do \
+ am__f_ok $$i.trs && am__f_ok $$i.log || echo $$i; \
+ done`; \
+ if test -n "$$redo_bases"; then \
+ redo_logs=`for i in $$redo_bases; do echo $$i.log; done`; \
+ redo_results=`for i in $$redo_bases; do echo $$i.trs; done`; \
+ if $(am__make_dryrun); then :; else \
+ rm -f $$redo_logs && rm -f $$redo_results || exit 1; \
+ fi; \
+ fi; \
+ if test -n "$$am__remaking_logs"; then \
+ echo "fatal: making $(TEST_SUITE_LOG): possible infinite" \
+ "recursion detected" >&2; \
+ elif test -n "$$redo_logs"; then \
+ am__remaking_logs=yes $(MAKE) $(AM_MAKEFLAGS) $$redo_logs; \
+ fi; \
+ if $(am__make_dryrun); then :; else \
+ st=0; \
+ errmsg="fatal: making $(TEST_SUITE_LOG): failed to create"; \
+ for i in $$redo_bases; do \
+ test -f $$i.trs && test -r $$i.trs \
+ || { echo "$$errmsg $$i.trs" >&2; st=1; }; \
+ test -f $$i.log && test -r $$i.log \
+ || { echo "$$errmsg $$i.log" >&2; st=1; }; \
+ done; \
+ test $$st -eq 0 || exit 1; \
+ fi
+ @$(am__sh_e_setup); $(am__tty_colors); $(am__set_TESTS_bases); \
+ ws='[ ]'; \
+ results=`for b in $$bases; do echo $$b.trs; done`; \
+ test -n "$$results" || results=/dev/null; \
+ all=` grep "^$$ws*:test-result:" $$results | wc -l`; \
+ pass=` grep "^$$ws*:test-result:$$ws*PASS" $$results | wc -l`; \
+ fail=` grep "^$$ws*:test-result:$$ws*FAIL" $$results | wc -l`; \
+ skip=` grep "^$$ws*:test-result:$$ws*SKIP" $$results | wc -l`; \
+ xfail=`grep "^$$ws*:test-result:$$ws*XFAIL" $$results | wc -l`; \
+ xpass=`grep "^$$ws*:test-result:$$ws*XPASS" $$results | wc -l`; \
+ error=`grep "^$$ws*:test-result:$$ws*ERROR" $$results | wc -l`; \
+ if test `expr $$fail + $$xpass + $$error` -eq 0; then \
+ success=true; \
+ else \
+ success=false; \
+ fi; \
+ br='==================='; br=$$br$$br$$br$$br; \
+ result_count () \
+ { \
+ if test x"$$1" = x"--maybe-color"; then \
+ maybe_colorize=yes; \
+ elif test x"$$1" = x"--no-color"; then \
+ maybe_colorize=no; \
+ else \
+ echo "$@: invalid 'result_count' usage" >&2; exit 4; \
+ fi; \
+ shift; \
+ desc=$$1 count=$$2; \
+ if test $$maybe_colorize = yes && test $$count -gt 0; then \
+ color_start=$$3 color_end=$$std; \
+ else \
+ color_start= color_end=; \
+ fi; \
+ echo "$${color_start}# $$desc $$count$${color_end}"; \
+ }; \
+ create_testsuite_report () \
+ { \
+ result_count $$1 "TOTAL:" $$all "$$brg"; \
+ result_count $$1 "PASS: " $$pass "$$grn"; \
+ result_count $$1 "SKIP: " $$skip "$$blu"; \
+ result_count $$1 "XFAIL:" $$xfail "$$lgn"; \
+ result_count $$1 "FAIL: " $$fail "$$red"; \
+ result_count $$1 "XPASS:" $$xpass "$$red"; \
+ result_count $$1 "ERROR:" $$error "$$mgn"; \
+ }; \
+ { \
+ echo "$(PACKAGE_STRING): $(subdir)/$(TEST_SUITE_LOG)" | \
+ $(am__rst_title); \
+ create_testsuite_report --no-color; \
+ echo; \
+ echo ".. contents:: :depth: 2"; \
+ echo; \
+ for b in $$bases; do echo $$b; done \
+ | $(am__create_global_log); \
+ } >$(TEST_SUITE_LOG).tmp || exit 1; \
+ mv $(TEST_SUITE_LOG).tmp $(TEST_SUITE_LOG); \
+ if $$success; then \
+ col="$$grn"; \
+ else \
+ col="$$red"; \
+ test x"$$VERBOSE" = x || cat $(TEST_SUITE_LOG); \
+ fi; \
+ echo "$${col}$$br$${std}"; \
+ echo "$${col}Testsuite summary for $(PACKAGE_STRING)$${std}"; \
+ echo "$${col}$$br$${std}"; \
+ create_testsuite_report --maybe-color; \
+ echo "$$col$$br$$std"; \
+ if $$success; then :; else \
+ echo "$${col}See $(subdir)/$(TEST_SUITE_LOG)$${std}"; \
+ if test -n "$(PACKAGE_BUGREPORT)"; then \
+ echo "$${col}Please report to $(PACKAGE_BUGREPORT)$${std}"; \
+ fi; \
+ echo "$$col$$br$$std"; \
+ fi; \
+ $$success || exit 1
+
+check-TESTS:
+ @list='$(RECHECK_LOGS)'; test -z "$$list" || rm -f $$list
+ @list='$(RECHECK_LOGS:.log=.trs)'; test -z "$$list" || rm -f $$list
+ @test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG)
+ @set +e; $(am__set_TESTS_bases); \
+ log_list=`for i in $$bases; do echo $$i.log; done`; \
+ trs_list=`for i in $$bases; do echo $$i.trs; done`; \
+ log_list=`echo $$log_list`; trs_list=`echo $$trs_list`; \
+ $(MAKE) $(AM_MAKEFLAGS) $(TEST_SUITE_LOG) TEST_LOGS="$$log_list"; \
+ exit $$?;
+recheck: all
+ @test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG)
+ @set +e; $(am__set_TESTS_bases); \
+ bases=`for i in $$bases; do echo $$i; done \
+ | $(am__list_recheck_tests)` || exit 1; \
+ log_list=`for i in $$bases; do echo $$i.log; done`; \
+ log_list=`echo $$log_list`; \
+ $(MAKE) $(AM_MAKEFLAGS) $(TEST_SUITE_LOG) \
+ am__force_recheck=am--force-recheck \
+ TEST_LOGS="$$log_list"; \
+ exit $$?
+json-doc-1.log: json-doc-1$(EXEEXT)
+ @p='json-doc-1$(EXEEXT)'; \
+ b='json-doc-1'; \
+ $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \
+ --log-file $$b.log --trs-file $$b.trs \
+ $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \
+ "$$tst" $(AM_TESTS_FD_REDIRECT)
+json-doc-2.log: json-doc-2$(EXEEXT)
+ @p='json-doc-2$(EXEEXT)'; \
+ b='json-doc-2'; \
+ $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \
+ --log-file $$b.log --trs-file $$b.trs \
+ $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \
+ "$$tst" $(AM_TESTS_FD_REDIRECT)
+json-parser-1.log: json-parser-1$(EXEEXT)
+ @p='json-parser-1$(EXEEXT)'; \
+ b='json-parser-1'; \
+ $(am__check_pre) $(LOG_DRIVER) --test-name "$$f" \
+ --log-file $$b.log --trs-file $$b.trs \
+ $(am__common_driver_flags) $(AM_LOG_DRIVER_FLAGS) $(LOG_DRIVER_FLAGS) -- $(LOG_COMPILE) \
+ "$$tst" $(AM_TESTS_FD_REDIRECT)
+.test.log:
+ @p='$<'; \
+ $(am__set_b); \
+ $(am__check_pre) $(TEST_LOG_DRIVER) --test-name "$$f" \
+ --log-file $$b.log --trs-file $$b.trs \
+ $(am__common_driver_flags) $(AM_TEST_LOG_DRIVER_FLAGS) $(TEST_LOG_DRIVER_FLAGS) -- $(TEST_LOG_COMPILE) \
+ "$$tst" $(AM_TESTS_FD_REDIRECT)
+@am__EXEEXT_TRUE@.test$(EXEEXT).log:
+@am__EXEEXT_TRUE@	@p='$<'; \
+@am__EXEEXT_TRUE@	$(am__set_b); \
+@am__EXEEXT_TRUE@	$(am__check_pre) $(TEST_LOG_DRIVER) --test-name "$$f" \
+@am__EXEEXT_TRUE@	--log-file $$b.log --trs-file $$b.trs \
+@am__EXEEXT_TRUE@	$(am__common_driver_flags) $(AM_TEST_LOG_DRIVER_FLAGS) $(TEST_LOG_DRIVER_FLAGS) -- $(TEST_LOG_COMPILE) \
+@am__EXEEXT_TRUE@	"$$tst" $(AM_TESTS_FD_REDIRECT)
+
+distdir: $(DISTFILES)
+ @srcdirstrip=`echo "$(srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \
+ topsrcdirstrip=`echo "$(top_srcdir)" | sed 's/[].[^$$\\*]/\\\\&/g'`; \
+ list='$(DISTFILES)'; \
+ dist_files=`for file in $$list; do echo $$file; done | \
+ sed -e "s|^$$srcdirstrip/||;t" \
+ -e "s|^$$topsrcdirstrip/|$(top_builddir)/|;t"`; \
+ case $$dist_files in \
+ */*) $(MKDIR_P) `echo "$$dist_files" | \
+ sed '/\//!d;s|^|$(distdir)/|;s,/[^/]*$$,,' | \
+ sort -u` ;; \
+ esac; \
+ for file in $$dist_files; do \
+ if test -f $$file || test -d $$file; then d=.; else d=$(srcdir); fi; \
+ if test -d $$d/$$file; then \
+ dir=`echo "/$$file" | sed -e 's,/[^/]*$$,,'`; \
+ if test -d "$(distdir)/$$file"; then \
+ find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \
+ fi; \
+ if test -d $(srcdir)/$$file && test $$d != $(srcdir); then \
+ cp -fpR $(srcdir)/$$file "$(distdir)$$dir" || exit 1; \
+ find "$(distdir)/$$file" -type d ! -perm -700 -exec chmod u+rwx {} \;; \
+ fi; \
+ cp -fpR $$d/$$file "$(distdir)$$dir" || exit 1; \
+ else \
+ test -f "$(distdir)/$$file" \
+ || cp -p $$d/$$file "$(distdir)/$$file" \
+ || exit 1; \
+ fi; \
+ done
+check-am: all-am
+ $(MAKE) $(AM_MAKEFLAGS) check-TESTS
+check: check-am
+all-am: Makefile $(PROGRAMS)
+installdirs:
+ for dir in "$(DESTDIR)$(bindir)"; do \
+ test -z "$$dir" || $(MKDIR_P) "$$dir"; \
+ done
+install: install-am
+install-exec: install-exec-am
+install-data: install-data-am
+uninstall: uninstall-am
+
+install-am: all-am
+ @$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
+
+installcheck: installcheck-am
+install-strip:
+ if test -z '$(STRIP)'; then \
+ $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
+ install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
+ install; \
+ else \
+ $(MAKE) $(AM_MAKEFLAGS) INSTALL_PROGRAM="$(INSTALL_STRIP_PROGRAM)" \
+ install_sh_PROGRAM="$(INSTALL_STRIP_PROGRAM)" INSTALL_STRIP_FLAG=-s \
+ "INSTALL_PROGRAM_ENV=STRIPPROG='$(STRIP)'" install; \
+ fi
+mostlyclean-generic:
+ -test -z "$(TEST_LOGS)" || rm -f $(TEST_LOGS)
+ -test -z "$(TEST_LOGS:.log=.trs)" || rm -f $(TEST_LOGS:.log=.trs)
+ -test -z "$(TEST_SUITE_LOG)" || rm -f $(TEST_SUITE_LOG)
+
+clean-generic:
+
+distclean-generic:
+ -test -z "$(CONFIG_CLEAN_FILES)" || rm -f $(CONFIG_CLEAN_FILES)
+ -test . = "$(srcdir)" || test -z "$(CONFIG_CLEAN_VPATH_FILES)" || rm -f $(CONFIG_CLEAN_VPATH_FILES)
+
+maintainer-clean-generic:
	@echo "This command is intended for maintainers to use;"
	@echo "it deletes files that may require special tools to rebuild."
+clean: clean-am
+
+clean-am: clean-binPROGRAMS clean-generic clean-libtool mostlyclean-am
+
+distclean: distclean-am
+ -rm -rf ./$(DEPDIR)
+ -rm -f Makefile
+distclean-am: clean-am distclean-compile distclean-generic \
+ distclean-local distclean-tags
+
+dvi: dvi-am
+
+dvi-am:
+
+html: html-am
+
+html-am:
+
+info: info-am
+
+info-am:
+
+install-data-am:
+
+install-dvi: install-dvi-am
+
+install-dvi-am:
+
+install-exec-am: install-binPROGRAMS
+
+install-html: install-html-am
+
+install-html-am:
+
+install-info: install-info-am
+
+install-info-am:
+
+install-man:
+
+install-pdf: install-pdf-am
+
+install-pdf-am:
+
+install-ps: install-ps-am
+
+install-ps-am:
+
+installcheck-am:
+
+maintainer-clean: maintainer-clean-am
+ -rm -rf ./$(DEPDIR)
+ -rm -f Makefile
+maintainer-clean-am: distclean-am maintainer-clean-generic
+
+mostlyclean: mostlyclean-am
+
+mostlyclean-am: mostlyclean-compile mostlyclean-generic \
+ mostlyclean-libtool
+
+pdf: pdf-am
+
+pdf-am:
+
+ps: ps-am
+
+ps-am:
+
+uninstall-am: uninstall-binPROGRAMS
+
+.MAKE: check-am install-am install-strip
+
+.PHONY: CTAGS GTAGS TAGS all all-am check check-TESTS check-am clean \
+ clean-binPROGRAMS clean-generic clean-libtool cscopelist-am \
+ ctags ctags-am distclean distclean-compile distclean-generic \
+ distclean-libtool distclean-local distclean-tags distdir dvi \
+ dvi-am html html-am info info-am install install-am \
+ install-binPROGRAMS install-data install-data-am install-dvi \
+ install-dvi-am install-exec install-exec-am install-html \
+ install-html-am install-info install-info-am install-man \
+ install-pdf install-pdf-am install-ps install-ps-am \
+ install-strip installcheck installcheck-am installdirs \
+ maintainer-clean maintainer-clean-generic mostlyclean \
+ mostlyclean-compile mostlyclean-generic mostlyclean-libtool \
+ pdf pdf-am ps ps-am recheck tags tags-am uninstall \
+ uninstall-am uninstall-binPROGRAMS
+
+.PRECIOUS: Makefile
+
+
+distclean-local:
+ rm -rf $(TESTS)
+
+# Tell versions [3.59,3.63) of GNU make to not export all variables.
+# Otherwise a system limit (for SysV at least) may be exceeded.
+.NOEXPORT:
diff --git a/doc_example/json_doc_1.cpp b/doc_example/json_doc_1.cpp
new file mode 100644
index 0000000..7824ffb
--- /dev/null
+++ b/doc_example/json_doc_1.cpp
@@ -0,0 +1,56 @@
+
+#include <orcus/json_document_tree.hpp>
+#include <orcus/config.hpp>
+#include <orcus/pstring.hpp>
+
+#include <cstdlib>
+#include <iostream>
+
+using namespace std;
+
+const char* json_string = "{"
+" \"name\": \"John Doe\","
+" \"occupation\": \"Software Engineer\","
+" \"score\": [89, 67, 90]"
+"}";
+
+int main()
+{
+ using node = orcus::json::node;
+
+ orcus::json_config config; // Use default configuration.
+
+ orcus::json::document_tree doc;
+ doc.load(json_string, config);
+
+ // Root is an object containing three key-value pairs.
+ node root = doc.get_document_root();
+
+ for (const orcus::pstring& key : root.keys())
+ {
+ node value = root.child(key);
+ switch (value.type())
+ {
+ case orcus::json::node_t::string:
+ // string value
+ cout << key << ": " << value.string_value() << endl;
+ break;
+ case orcus::json::node_t::array:
+ {
+ // array value
+ cout << key << ":" << endl;
+
+ for (size_t i = 0; i < value.child_count(); ++i)
+ {
+ node array_element = value.child(i);
+ cout << " - " << array_element.numeric_value() << endl;
+ }
+ }
+ break;
+ default:
+ ;
+ }
+ }
+
+ return EXIT_SUCCESS;
+}
diff --git a/doc_example/json_doc_2.cpp b/doc_example/json_doc_2.cpp
new file mode 100644
index 0000000..e3dcb7d
--- /dev/null
+++ b/doc_example/json_doc_2.cpp
@@ -0,0 +1,195 @@
+
+#include <orcus/json_document_tree.hpp>
+#include <orcus/config.hpp>
+
+#include <cstdlib>
+#include <iostream>
+#include <functional>
+#include <vector>
+
+void example_root_list()
+{
+ orcus::json::document_tree doc = {
+ 1.0, 2.0, "string value", false, nullptr
+ };
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_list_nested()
+{
+ orcus::json::document_tree doc = {
+ { true, false, nullptr },
+ { 1.1, 2.2, "text" }
+ };
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_list_object()
+{
+ orcus::json::document_tree doc = {
+ { "key1", 1.2 },
+ { "key2", "some text" },
+ };
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_list_object_2()
+{
+ orcus::json::document_tree doc = {
+ { "parent1", {
+ { "child1", true },
+ { "child2", false },
+ { "child3", 123.4 },
+ }
+ },
+ { "parent2", "not-nested" },
+ };
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_array_ambiguous()
+{
+ orcus::json::document_tree doc = {
+ { "array", { "one", 987.0 } }
+ };
+}
+
+void example_array_explicit()
+{
+ using namespace orcus;
+
+ json::document_tree doc = {
+ { "array", json::array({ "one", 987.0 }) }
+ };
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_object_ambiguous()
+{
+ using namespace orcus;
+
+ json::document_tree doc = {};
+
+ try
+ {
+ auto root = doc.get_document_root();
+ }
+ catch (const json::document_error& e)
+ {
+ std::cout << e.what() << std::endl;
+ }
+}
+
+void example_object_explicit_1()
+{
+ using namespace orcus;
+
+ json::document_tree doc = json::object();
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_object_explicit_2()
+{
+ using namespace orcus;
+
+ json::document_tree doc = {
+ json::object(),
+ json::object(),
+ json::object()
+ };
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_root_object_add_child()
+{
+ using namespace orcus;
+
+ // Initialize the tree with an empty object.
+ json::document_tree doc = json::object();
+
+ // Get the root object, and assign three key-value pairs.
+ json::node root = doc.get_document_root();
+ root["child1"] = 1.0;
+ root["child2"] = "string";
+ root["child3"] = { true, false }; // implicit array
+
+ // You can also create a key-value pair whose value is another object.
+ root["child object"] = {
+ { "key1", 100.0 },
+ { "key2", 200.0 }
+ };
+
+ root["child array"] = json::array({ 1.1, 1.2, true }); // explicit array
+
+ std::cout << doc.dump() << std::endl;
+}
+
+void example_root_array_add_child()
+{
+ using namespace orcus;
+
+ // Initialize the tree with an empty array root.
+ json::document_tree doc = json::array();
+
+ // Get the root array.
+ json::node root = doc.get_document_root();
+
+ // Append values to the array.
+ root.push_back(-1.2);
+ root.push_back("string");
+ root.push_back(true);
+ root.push_back(nullptr);
+
+ // You can append an object to the array via push_back() as well.
+ root.push_back({{"key1", 1.1}, {"key2", 1.2}});
+
+ std::cout << doc.dump() << std::endl;
+}
+
+int main()
+{
+ using func_type = std::function<void()>;
+
+ std::vector<func_type> funcs = {
+ example_root_list,
+ example_list_nested,
+ example_list_object,
+ example_list_object_2,
+ example_array_explicit,
+ example_object_ambiguous,
+ example_object_explicit_1,
+ example_object_explicit_2,
+ example_root_object_add_child,
+ example_root_array_add_child,
+ };
+
+ for (func_type f : funcs)
+ {
+ std::cout << "--" << std::endl;
+ f();
+ }
+
+ std::vector<func_type> funcs_exc = {
+ example_array_ambiguous,
+ };
+
+ for (func_type f : funcs_exc)
+ {
+ try
+ {
+ f();
+ }
+ catch (orcus::json::key_value_error&)
+ {
+ // expected
+ }
+ }
+
+ return EXIT_SUCCESS;
+}
diff --git a/doc_example/json_parser_1.cpp b/doc_example/json_parser_1.cpp
new file mode 100644
index 0000000..cf851d9
--- /dev/null
+++ b/doc_example/json_parser_1.cpp
@@ -0,0 +1,88 @@
+
+#include <orcus/json_parser.hpp>
+#include <orcus/pstring.hpp>
+#include <cstring>
+#include <iostream>
+
+using namespace std;
+
+class json_parser_handler
+{
+public:
+ void begin_parse()
+ {
+ cout << "begin parse" << endl;
+ }
+
+ void end_parse()
+ {
+ cout << "end parse" << endl;
+ }
+
+ void begin_array()
+ {
+ cout << "begin array" << endl;
+ }
+
+ void end_array()
+ {
+ cout << "end array" << endl;
+ }
+
+ void begin_object()
+ {
+ cout << "begin object" << endl;
+ }
+
+ void object_key(const char* p, size_t len, bool transient)
+ {
+ cout << "object key: " << orcus::pstring(p, len) << endl;
+ }
+
+ void end_object()
+ {
+ cout << "end object" << endl;
+ }
+
+ void boolean_true()
+ {
+ cout << "true" << endl;
+ }
+
+ void boolean_false()
+ {
+ cout << "false" << endl;
+ }
+
+ void null()
+ {
+ cout << "null" << endl;
+ }
+
+ void string(const char* p, size_t len, bool transient)
+ {
+ cout << "string: " << orcus::pstring(p, len) << endl;
+ }
+
+ void number(double val)
+ {
+ cout << "number: " << val << endl;
+ }
+};
+
+int main()
+{
+ const char* test_code = "{\"key1\": [1,2,3,4,5], \"key2\": 12.3}";
+ size_t n = strlen(test_code);
+
+ cout << "JSON string: " << test_code << endl;
+
+ // Instantiate the parser with our own handler.
+ json_parser_handler hdl;
+ orcus::json_parser<json_parser_handler> parser(test_code, n, hdl);
+
+ // Parse the string.
+ parser.parse();
+
+ return EXIT_SUCCESS;
+}
diff --git a/include/orcus/css_parser.hpp b/include/orcus/css_parser.hpp
index 2a31ae0..c2ffc9d 100644
--- a/include/orcus/css_parser.hpp
+++ b/include/orcus/css_parser.hpp
@@ -42,7 +42,7 @@ private:
void simple_selector_name();
void property_name();
void property();
- void quoted_value();
+ void quoted_value(char c);
void value();
void function_value(const char* p, size_t len);
void function_rgb(bool alpha);
@@ -314,12 +314,12 @@ void css_parser<_Handler>::property()
}
template<typename _Handler>
-void css_parser<_Handler>::quoted_value()
+void css_parser<_Handler>::quoted_value(char c)
{
// Parse until the end quote is reached.
const char* p = nullptr;
size_t len = 0;
- literal(p, len, '"');
+ literal(p, len, c);
next();
skip_blanks();
@@ -335,9 +335,9 @@ void css_parser<_Handler>::value()
{
assert(has_char());
char c = cur_char();
- if (c == '"')
+ if (c == '"' || c == '\'')
{
- quoted_value();
+ quoted_value(c);
return;
}
diff --git a/include/orcus/json_document_tree.hpp b/include/orcus/json_document_tree.hpp
index 77695fc..a972b8c 100644
--- a/include/orcus/json_document_tree.hpp
+++ b/include/orcus/json_document_tree.hpp
@@ -27,6 +27,9 @@ struct json_value;
struct json_value_store;
class document_tree;
+/**
+ * Exception related to JSON document tree construction.
+ */
class ORCUS_DLLPUBLIC document_error : public general_error
{
public:
@@ -34,6 +37,11 @@ public:
virtual ~document_error() throw();
};
+/**
+ * Exception that gets thrown due to ambiguity when you specify a braced
+ * list that can be interpreted either as a key-value pair inside an object
+ * or as values of an array.
+ */
class ORCUS_DLLPUBLIC key_value_error : public document_error
{
public:
@@ -75,8 +83,8 @@ enum class node_t : int
namespace detail { namespace init { class node; }}
/**
- * Each node instance represents a JSON value object stored in the document
- * tree.
+ * Each node instance represents a JSON value stored in the document tree.
+ * It's immutable.
*/
class ORCUS_DLLPUBLIC const_node
{
@@ -203,6 +211,10 @@ public:
uintptr_t identity() const;
};
+/**
+ * Each node instance represents a JSON value stored in the document tree.
+ * This class allows mutable operations.
+ */
class ORCUS_DLLPUBLIC node : public const_node
{
friend class document_tree;
@@ -268,6 +280,10 @@ public:
void push_back(const detail::init::node& v);
};
+/**
+ * This class represents a JSON array, to be used to explicitly create an
+ * array instance during initialization.
+ */
class ORCUS_DLLPUBLIC array
{
friend class detail::init::node;
@@ -282,6 +298,10 @@ public:
~array();
};
+/**
+ * This class represents a JSON object, primarily to be used to create an
+ * empty object instance.
+ */
class ORCUS_DLLPUBLIC object
{
public:
@@ -310,7 +330,7 @@ public:
node(double v);
node(int v);
node(bool b);
- node(decltype(nullptr));
+ node(std::nullptr_t);
node(const char* p);
node(std::initializer_list<detail::init::node> vs);
node(json::array array);
diff --git a/include/orcus/orcus_xml.hpp b/include/orcus/orcus_xml.hpp
index 7cf2e52..ffdec34 100644
--- a/include/orcus/orcus_xml.hpp
+++ b/include/orcus/orcus_xml.hpp
@@ -27,6 +27,8 @@ class ORCUS_DLLPUBLIC orcus_xml
orcus_xml(const orcus_xml&); // disabled
orcus_xml& operator= (const orcus_xml&); // disabled
+ void read_impl();
+
public:
orcus_xml(xmlns_repository& ns_repo, spreadsheet::iface::import_factory* im_fact, spreadsheet::iface::export_factory* ex_fact);
~orcus_xml();
@@ -42,6 +44,7 @@ public:
void append_sheet(const pstring& name);
void read_file(const char* filepath);
+ void read_stream(const char* p, size_t n);
void write_file(const char* filepath);
private:
diff --git a/install-sh b/install-sh
index 377bb86..59990a1 100755
--- a/install-sh
+++ b/install-sh
@@ -1,7 +1,7 @@
#!/bin/sh
# install - install a program, script, or datafile
-scriptversion=2011-11-20.07; # UTC
+scriptversion=2014-09-12.12; # UTC
# This originates from X11R5 (mit/util/scripts/install.sh), which was
# later released in X11R6 (xc/config/util/install.sh) with the
@@ -41,19 +41,15 @@ scriptversion=2011-11-20.07; # UTC
# This script is compatible with the BSD install script, but was written
# from scratch.
+tab=' '
nl='
'
-IFS=" "" $nl"
+IFS=" $tab$nl"
-# set DOITPROG to echo to test this script
+# Set DOITPROG to "echo" to test this script.
-# Don't use :- since 4.3BSD and earlier shells don't like it.
doit=${DOITPROG-}
-if test -z "$doit"; then
- doit_exec=exec
-else
- doit_exec=$doit
-fi
+doit_exec=${doit:-exec}
# Put in absolute file names if you don't have them in your path;
# or use environment vars.
@@ -68,17 +64,6 @@ mvprog=${MVPROG-mv}
rmprog=${RMPROG-rm}
stripprog=${STRIPPROG-strip}
-posix_glob='?'
-initialize_posix_glob='
- test "$posix_glob" != "?" || {
- if (set -f) 2>/dev/null; then
- posix_glob=
- else
- posix_glob=:
- fi
- }
-'
-
posix_mkdir=
# Desired mode of installed file.
@@ -97,7 +82,7 @@ dir_arg=
dst_arg=
copy_on_change=false
-no_target_directory=
+is_target_a_directory=possibly
usage="\
Usage: $0 [OPTION]... [-T] SRCFILE DSTFILE
@@ -137,46 +122,57 @@ while test $# -ne 0; do
-d) dir_arg=true;;
-g) chgrpcmd="$chgrpprog $2"
- shift;;
+ shift;;
--help) echo "$usage"; exit $?;;
-m) mode=$2
- case $mode in
- *' '* | *' '* | *'
-'* | *'*'* | *'?'* | *'['*)
- echo "$0: invalid mode: $mode" >&2
- exit 1;;
- esac
- shift;;
+ case $mode in
+ *' '* | *"$tab"* | *"$nl"* | *'*'* | *'?'* | *'['*)
+ echo "$0: invalid mode: $mode" >&2
+ exit 1;;
+ esac
+ shift;;
-o) chowncmd="$chownprog $2"
- shift;;
+ shift;;
-s) stripcmd=$stripprog;;
- -t) dst_arg=$2
- # Protect names problematic for 'test' and other utilities.
- case $dst_arg in
- -* | [=\(\)!]) dst_arg=./$dst_arg;;
- esac
- shift;;
+ -t)
+ is_target_a_directory=always
+ dst_arg=$2
+ # Protect names problematic for 'test' and other utilities.
+ case $dst_arg in
+ -* | [=\(\)!]) dst_arg=./$dst_arg;;
+ esac
+ shift;;
- -T) no_target_directory=true;;
+ -T) is_target_a_directory=never;;
--version) echo "$0 $scriptversion"; exit $?;;
- --) shift
- break;;
+ --) shift
+ break;;
- -*) echo "$0: invalid option: $1" >&2
- exit 1;;
+ -*) echo "$0: invalid option: $1" >&2
+ exit 1;;
*) break;;
esac
shift
done
+# We allow the use of options -d and -T together, by making -d
+# take the precedence; this is for compatibility with GNU install.
+
+if test -n "$dir_arg"; then
+ if test -n "$dst_arg"; then
+ echo "$0: target directory not allowed when installing a directory." >&2
+ exit 1
+ fi
+fi
+
if test $# -ne 0 && test -z "$dir_arg$dst_arg"; then
# When -d is used, all remaining arguments are directories to create.
# When -t is used, the destination is already specified.
@@ -208,6 +204,15 @@ if test $# -eq 0; then
fi
if test -z "$dir_arg"; then
+ if test $# -gt 1 || test "$is_target_a_directory" = always; then
+ if test ! -d "$dst_arg"; then
+ echo "$0: $dst_arg: Is not a directory." >&2
+ exit 1
+ fi
+ fi
+fi
+
+if test -z "$dir_arg"; then
do_exit='(exit $ret); exit $ret'
trap "ret=129; $do_exit" 1
trap "ret=130; $do_exit" 2
@@ -223,16 +228,16 @@ if test -z "$dir_arg"; then
*[0-7])
if test -z "$stripcmd"; then
- u_plus_rw=
+ u_plus_rw=
else
- u_plus_rw='% 200'
+ u_plus_rw='% 200'
fi
cp_umask=`expr '(' 777 - $mode % 1000 ')' $u_plus_rw`;;
*)
if test -z "$stripcmd"; then
- u_plus_rw=
+ u_plus_rw=
else
- u_plus_rw=,u+rw
+ u_plus_rw=,u+rw
fi
cp_umask=$mode$u_plus_rw;;
esac
@@ -269,41 +274,15 @@ do
# If destination is a directory, append the input filename; won't work
# if double slashes aren't ignored.
if test -d "$dst"; then
- if test -n "$no_target_directory"; then
- echo "$0: $dst_arg: Is a directory" >&2
- exit 1
+ if test "$is_target_a_directory" = never; then
+ echo "$0: $dst_arg: Is a directory" >&2
+ exit 1
fi
dstdir=$dst
dst=$dstdir/`basename "$src"`
dstdir_status=0
else
- # Prefer dirname, but fall back on a substitute if dirname fails.
- dstdir=`
- (dirname "$dst") 2>/dev/null ||
- expr X"$dst" : 'X\(.*[^/]\)//*[^/][^/]*/*$' \| \
- X"$dst" : 'X\(//\)[^/]' \| \
- X"$dst" : 'X\(//\)$' \| \
- X"$dst" : 'X\(/\)' \| . 2>/dev/null ||
- echo X"$dst" |
- sed '/^X\(.*[^/]\)\/\/*[^/][^/]*\/*$/{
- s//\1/
- q
- }
- /^X\(\/\/\)[^/].*/{
- s//\1/
- q
- }
- /^X\(\/\/\)$/{
- s//\1/
- q
- }
- /^X\(\/\).*/{
- s//\1/
- q
- }
- s/.*/./; q'
- `
-
+ dstdir=`dirname "$dst"`
test -d "$dstdir"
dstdir_status=$?
fi
@@ -314,74 +293,81 @@ do
if test $dstdir_status != 0; then
case $posix_mkdir in
'')
- # Create intermediate dirs using mode 755 as modified by the umask.
- # This is like FreeBSD 'install' as of 1997-10-28.
- umask=`umask`
- case $stripcmd.$umask in
- # Optimize common cases.
- *[2367][2367]) mkdir_umask=$umask;;
- .*0[02][02] | .[02][02] | .[02]) mkdir_umask=22;;
-
- *[0-7])
- mkdir_umask=`expr $umask + 22 \
- - $umask % 100 % 40 + $umask % 20 \
- - $umask % 10 % 4 + $umask % 2
- `;;
- *) mkdir_umask=$umask,go-w;;
- esac
-
- # With -d, create the new directory with the user-specified mode.
- # Otherwise, rely on $mkdir_umask.
- if test -n "$dir_arg"; then
- mkdir_mode=-m$mode
- else
- mkdir_mode=
- fi
-
- posix_mkdir=false
- case $umask in
- *[123567][0-7][0-7])
- # POSIX mkdir -p sets u+wx bits regardless of umask, which
- # is incompatible with FreeBSD 'install' when (umask & 300) != 0.
- ;;
- *)
- tmpdir=${TMPDIR-/tmp}/ins$RANDOM-$$
- trap 'ret=$?; rmdir "$tmpdir/d" "$tmpdir" 2>/dev/null; exit $ret' 0
-
- if (umask $mkdir_umask &&
- exec $mkdirprog $mkdir_mode -p -- "$tmpdir/d") >/dev/null 2>&1
- then
- if test -z "$dir_arg" || {
- # Check for POSIX incompatibilities with -m.
- # HP-UX 11.23 and IRIX 6.5 mkdir -m -p sets group- or
- # other-writable bit of parent directory when it shouldn't.
- # FreeBSD 6.1 mkdir -m -p sets mode of existing directory.
- ls_ld_tmpdir=`ls -ld "$tmpdir"`
- case $ls_ld_tmpdir in
- d????-?r-*) different_mode=700;;
- d????-?--*) different_mode=755;;
- *) false;;
- esac &&
- $mkdirprog -m$different_mode -p -- "$tmpdir" && {
- ls_ld_tmpdir_1=`ls -ld "$tmpdir"`
- test "$ls_ld_tmpdir" = "$ls_ld_tmpdir_1"
- }
- }
- then posix_mkdir=:
- fi
- rmdir "$tmpdir/d" "$tmpdir"
- else
- # Remove any dirs left behind by ancient mkdir implementations.
- rmdir ./$mkdir_mode ./-p ./-- 2>/dev/null
- fi
- trap '' 0;;
- esac;;
+ # Create intermediate dirs using mode 755 as modified by the umask.
+ # This is like FreeBSD 'install' as of 1997-10-28.
+ umask=`umask`
+ case $stripcmd.$umask in
+ # Optimize common cases.
+ *[2367][2367]) mkdir_umask=$umask;;
+ .*0[02][02] | .[02][02] | .[02]) mkdir_umask=22;;
+
+ *[0-7])
+ mkdir_umask=`expr $umask + 22 \
+ - $umask % 100 % 40 + $umask % 20 \
+ - $umask % 10 % 4 + $umask % 2
+ `;;
+ *) mkdir_umask=$umask,go-w;;
+ esac
+
+ # With -d, create the new directory with the user-specified mode.
+ # Otherwise, rely on $mkdir_umask.
+ if test -n "$dir_arg"; then
+ mkdir_mode=-m$mode
+ else
+ mkdir_mode=
+ fi
+
+ posix_mkdir=false
+ case $umask in
+ *[123567][0-7][0-7])
+ # POSIX mkdir -p sets u+wx bits regardless of umask, which
+ # is incompatible with FreeBSD 'install' when (umask & 300) != 0.
+ ;;
+ *)
+ # $RANDOM is not portable (e.g. dash); use it when possible to
+ # lower collision chance
+ tmpdir=${TMPDIR-/tmp}/ins$RANDOM-$$
+ trap 'ret=$?; rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir" 2>/dev/null; exit $ret' 0
+
+ # Because "mkdir -p" follows symlinks and we may be working in /tmp,
+ # create the $tmpdir first (and fail if unsuccessful) to make sure
+ # that nobody tries to guess the $tmpdir name.
+ if (umask $mkdir_umask &&
+ $mkdirprog $mkdir_mode "$tmpdir" &&
+ exec $mkdirprog $mkdir_mode -p -- "$tmpdir/a/b") >/dev/null 2>&1
+ then
+ if test -z "$dir_arg" || {
+ # Check for POSIX incompatibilities with -m.
+ # HP-UX 11.23 and IRIX 6.5 mkdir -m -p sets group- or
+ # other-writable bit of parent directory when it shouldn't.
+ # FreeBSD 6.1 mkdir -m -p sets mode of existing directory.
+ test_tmpdir="$tmpdir/a"
+ ls_ld_tmpdir=`ls -ld "$test_tmpdir"`
+ case $ls_ld_tmpdir in
+ d????-?r-*) different_mode=700;;
+ d????-?--*) different_mode=755;;
+ *) false;;
+ esac &&
+ $mkdirprog -m$different_mode -p -- "$test_tmpdir" && {
+ ls_ld_tmpdir_1=`ls -ld "$test_tmpdir"`
+ test "$ls_ld_tmpdir" = "$ls_ld_tmpdir_1"
+ }
+ }
+ then posix_mkdir=:
+ fi
+ rmdir "$tmpdir/a/b" "$tmpdir/a" "$tmpdir"
+ else
+ # Remove any dirs left behind by ancient mkdir implementations.
+ rmdir ./$mkdir_mode ./-p ./-- "$tmpdir" 2>/dev/null
+ fi
+ trap '' 0;;
+ esac;;
esac
if
$posix_mkdir && (
- umask $mkdir_umask &&
- $doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir"
+ umask $mkdir_umask &&
+ $doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir"
)
then :
else
@@ -391,53 +377,51 @@ do
# directory the slow way, step by step, checking for races as we go.
case $dstdir in
- /*) prefix='/';;
- [-=\(\)!]*) prefix='./';;
- *) prefix='';;
+ /*) prefix='/';;
+ [-=\(\)!]*) prefix='./';;
+ *) prefix='';;
esac
- eval "$initialize_posix_glob"
-
oIFS=$IFS
IFS=/
- $posix_glob set -f
+ set -f
set fnord $dstdir
shift
- $posix_glob set +f
+ set +f
IFS=$oIFS
prefixes=
for d
do
- test X"$d" = X && continue
-
- prefix=$prefix$d
- if test -d "$prefix"; then
- prefixes=
- else
- if $posix_mkdir; then
- (umask=$mkdir_umask &&
- $doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir") && break
- # Don't fail if two instances are running concurrently.
- test -d "$prefix" || exit 1
- else
- case $prefix in
- *\'*) qprefix=`echo "$prefix" | sed "s/'/'\\\\\\\\''/g"`;;
- *) qprefix=$prefix;;
- esac
- prefixes="$prefixes '$qprefix'"
- fi
- fi
- prefix=$prefix/
+ test X"$d" = X && continue
+
+ prefix=$prefix$d
+ if test -d "$prefix"; then
+ prefixes=
+ else
+ if $posix_mkdir; then
+ (umask=$mkdir_umask &&
+ $doit_exec $mkdirprog $mkdir_mode -p -- "$dstdir") && break
+ # Don't fail if two instances are running concurrently.
+ test -d "$prefix" || exit 1
+ else
+ case $prefix in
+ *\'*) qprefix=`echo "$prefix" | sed "s/'/'\\\\\\\\''/g"`;;
+ *) qprefix=$prefix;;
+ esac
+ prefixes="$prefixes '$qprefix'"
+ fi
+ fi
+ prefix=$prefix/
done
if test -n "$prefixes"; then
- # Don't fail if two instances are running concurrently.
- (umask $mkdir_umask &&
- eval "\$doit_exec \$mkdirprog $prefixes") ||
- test -d "$dstdir" || exit 1
- obsolete_mkdir_used=true
+ # Don't fail if two instances are running concurrently.
+ (umask $mkdir_umask &&
+ eval "\$doit_exec \$mkdirprog $prefixes") ||
+ test -d "$dstdir" || exit 1
+ obsolete_mkdir_used=true
fi
fi
fi
@@ -472,15 +456,12 @@ do
# If -C, don't bother to copy if it wouldn't change the file.
if $copy_on_change &&
- old=`LC_ALL=C ls -dlL "$dst" 2>/dev/null` &&
- new=`LC_ALL=C ls -dlL "$dsttmp" 2>/dev/null` &&
-
- eval "$initialize_posix_glob" &&
- $posix_glob set -f &&
+ old=`LC_ALL=C ls -dlL "$dst" 2>/dev/null` &&
+ new=`LC_ALL=C ls -dlL "$dsttmp" 2>/dev/null` &&
+ set -f &&
set X $old && old=:$2:$4:$5:$6 &&
set X $new && new=:$2:$4:$5:$6 &&
- $posix_glob set +f &&
-
+ set +f &&
test "$old" = "$new" &&
$cmpprog "$dst" "$dsttmp" >/dev/null 2>&1
then
@@ -493,24 +474,24 @@ do
# to itself, or perhaps because mv is so ancient that it does not
# support -f.
{
- # Now remove or move aside any old file at destination location.
- # We try this two ways since rm can't unlink itself on some
- # systems and the destination file might be busy for other
- # reasons. In this case, the final cleanup might fail but the new
- # file should still install successfully.
- {
- test ! -f "$dst" ||
- $doit $rmcmd -f "$dst" 2>/dev/null ||
- { $doit $mvcmd -f "$dst" "$rmtmp" 2>/dev/null &&
- { $doit $rmcmd -f "$rmtmp" 2>/dev/null; :; }
- } ||
- { echo "$0: cannot unlink or rename $dst" >&2
- (exit 1); exit 1
- }
- } &&
-
- # Now rename the file to the real destination.
- $doit $mvcmd "$dsttmp" "$dst"
+ # Now remove or move aside any old file at destination location.
+ # We try this two ways since rm can't unlink itself on some
+ # systems and the destination file might be busy for other
+ # reasons. In this case, the final cleanup might fail but the new
+ # file should still install successfully.
+ {
+ test ! -f "$dst" ||
+ $doit $rmcmd -f "$dst" 2>/dev/null ||
+ { $doit $mvcmd -f "$dst" "$rmtmp" 2>/dev/null &&
+ { $doit $rmcmd -f "$rmtmp" 2>/dev/null; :; }
+ } ||
+ { echo "$0: cannot unlink or rename $dst" >&2
+ (exit 1); exit 1
+ }
+ } &&
+
+ # Now rename the file to the real destination.
+ $doit $mvcmd "$dsttmp" "$dst"
}
fi || exit 1
diff --git a/py-compile b/py-compile
deleted file mode 100755
index 46ea866..0000000
--- a/py-compile
+++ /dev/null
@@ -1,170 +0,0 @@
-#!/bin/sh
-# py-compile - Compile a Python program
-
-scriptversion=2011-06-08.12; # UTC
-
-# Copyright (C) 2000-2013 Free Software Foundation, Inc.
-
-# This program is free software; you can redistribute it and/or modify
-# it under the terms of the GNU General Public License as published by
-# the Free Software Foundation; either version 2, or (at your option)
-# any later version.
-
-# This program is distributed in the hope that it will be useful,
-# but WITHOUT ANY WARRANTY; without even the implied warranty of
-# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
-# GNU General Public License for more details.
-
-# You should have received a copy of the GNU General Public License
-# along with this program. If not, see <http://www.gnu.org/licenses/>.
-
-# As a special exception to the GNU General Public License, if you
-# distribute this file as part of a program that contains a
-# configuration script generated by Autoconf, you may include it under
-# the same distribution terms that you use for the rest of that program.
-
-# This file is maintained in Automake, please report
-# bugs to <bug-automake at gnu.org> or send patches to
-# <automake-patches at gnu.org>.
-
-if [ -z "$PYTHON" ]; then
- PYTHON=python
-fi
-
-me=py-compile
-
-usage_error ()
-{
- echo "$me: $*" >&2
- echo "Try '$me --help' for more information." >&2
- exit 1
-}
-
-basedir=
-destdir=
-while test $# -ne 0; do
- case "$1" in
- --basedir)
- if test $# -lt 2; then
- usage_error "option '--basedir' requires an argument"
- else
- basedir=$2
- fi
- shift
- ;;
- --destdir)
- if test $# -lt 2; then
- usage_error "option '--destdir' requires an argument"
- else
- destdir=$2
- fi
- shift
- ;;
- -h|--help)
- cat <<\EOF
-Usage: py-compile [--help] [--version] [--basedir DIR] [--destdir DIR] FILES..."
-
-Byte compile some python scripts FILES. Use --destdir to specify any
-leading directory path to the FILES that you don't want to include in the
-byte compiled file. Specify --basedir for any additional path information you
-do want to be shown in the byte compiled file.
-
-Example:
- py-compile --destdir /tmp/pkg-root --basedir /usr/share/test test.py test2.py
-
-Report bugs to <bug-automake at gnu.org>.
-EOF
- exit $?
- ;;
- -v|--version)
- echo "$me $scriptversion"
- exit $?
- ;;
- --)
- shift
- break
- ;;
- -*)
- usage_error "unrecognized option '$1'"
- ;;
- *)
- break
- ;;
- esac
- shift
-done
-
-files=$*
-if test -z "$files"; then
- usage_error "no files given"
-fi
-
-# if basedir was given, then it should be prepended to filenames before
-# byte compilation.
-if [ -z "$basedir" ]; then
- pathtrans="path = file"
-else
- pathtrans="path = os.path.join('$basedir', file)"
-fi
-
-# if destdir was given, then it needs to be prepended to the filename to
-# byte compile but not go into the compiled file.
-if [ -z "$destdir" ]; then
- filetrans="filepath = path"
-else
- filetrans="filepath = os.path.normpath('$destdir' + os.sep + path)"
-fi
-
-$PYTHON -c "
-import sys, os, py_compile, imp
-
-files = '''$files'''
-
-sys.stdout.write('Byte-compiling python modules...\n')
-for file in files.split():
- $pathtrans
- $filetrans
- if not os.path.exists(filepath) or not (len(filepath) >= 3
- and filepath[-3:] == '.py'):
- continue
- sys.stdout.write(file)
- sys.stdout.flush()
- if hasattr(imp, 'get_tag'):
- py_compile.compile(filepath, imp.cache_from_source(filepath), path)
- else:
- py_compile.compile(filepath, filepath + 'c', path)
-sys.stdout.write('\n')" || exit $?
-
-# this will fail for python < 1.5, but that doesn't matter ...
-$PYTHON -O -c "
-import sys, os, py_compile, imp
-
-# pypy does not use .pyo optimization
-if hasattr(sys, 'pypy_translation_info'):
- sys.exit(0)
-
-files = '''$files'''
-sys.stdout.write('Byte-compiling python modules (optimized versions) ...\n')
-for file in files.split():
- $pathtrans
- $filetrans
- if not os.path.exists(filepath) or not (len(filepath) >= 3
- and filepath[-3:] == '.py'):
- continue
- sys.stdout.write(file)
- sys.stdout.flush()
- if hasattr(imp, 'get_tag'):
- py_compile.compile(filepath, imp.cache_from_source(filepath, False), path)
- else:
- py_compile.compile(filepath, filepath + 'o', path)
-sys.stdout.write('\n')" 2>/dev/null || :
-
-# Local Variables:
-# mode: shell-script
-# sh-indentation: 2
-# eval: (add-hook 'write-file-hooks 'time-stamp)
-# time-stamp-start: "scriptversion="
-# time-stamp-format: "%:y-%02m-%02d.%02H"
-# time-stamp-time-zone: "UTC"
-# time-stamp-end: "; # UTC"
-# End:
diff --git a/slickedit/orcus.vpj b/slickedit/orcus.vpj
index 8971e90..3f0dbfa 100644
--- a/slickedit/orcus.vpj
+++ b/slickedit/orcus.vpj
@@ -149,6 +149,11 @@
<F N="../benchmark/json_parser.cpp"/>
<F N="../benchmark/threaded_json_parser.cpp"/>
</Folder>
+ <Folder Name="../doc_example">
+ <F N="../doc_example/json_doc_1.cpp"/>
+ <F N="../doc_example/json_doc_2.cpp"/>
+ <F N="../doc_example/json_parser_1.cpp"/>
+ </Folder>
<Folder Name="../example">
<F N="../example/json.cpp"/>
<F N="../example/json_parser.cpp"/>
diff --git a/src/liborcus/json_document_tree.cpp b/src/liborcus/json_document_tree.cpp
index 1a7554b..7d5f46b 100644
--- a/src/liborcus/json_document_tree.cpp
+++ b/src/liborcus/json_document_tree.cpp
@@ -1026,7 +1026,7 @@ struct node::impl
node::node(double v) : mp_impl(orcus::make_unique<impl>(v)) {}
node::node(int v) : mp_impl(orcus::make_unique<impl>(v)) {}
node::node(bool b) : mp_impl(orcus::make_unique<impl>(b)) {}
-node::node(decltype(nullptr)) : mp_impl(orcus::make_unique<impl>(nullptr)) {}
+node::node(std::nullptr_t) : mp_impl(orcus::make_unique<impl>(nullptr)) {}
node::node(const char* p) : mp_impl(orcus::make_unique<impl>(p)) {}
node::node(std::initializer_list<detail::init::node> vs) : mp_impl(orcus::make_unique<impl>(std::move(vs))) {}
node::node(json::array array) : mp_impl(orcus::make_unique<impl>(std::move(array))) {}
diff --git a/src/liborcus/orcus_xml.cpp b/src/liborcus/orcus_xml.cpp
index 1862603..0e85cb9 100644
--- a/src/liborcus/orcus_xml.cpp
+++ b/src/liborcus/orcus_xml.cpp
@@ -544,6 +544,23 @@ void orcus_xml::read_file(const char* filepath)
#endif
string& strm = mp_impl->m_data_strm;
strm = load_file_content(filepath);
+ read_impl();
+}
+
+void orcus_xml::read_stream(const char* p, size_t n)
+{
+#if ORCUS_DEBUG_XML
+ cout << "reading stream of size " << n << endl;
+#endif
+
+ string& strm = mp_impl->m_data_strm;
+ strm = std::string(p, n);
+ read_impl();
+}
+
+void orcus_xml::read_impl()
+{
+ string& strm = mp_impl->m_data_strm;
if (strm.empty())
return;
diff --git a/src/spreadsheet/sheet.cpp b/src/spreadsheet/sheet.cpp
index 3bf048d..aa70a7e 100644
--- a/src/spreadsheet/sheet.cpp
+++ b/src/spreadsheet/sheet.cpp
@@ -404,36 +404,33 @@ void sheet::set_date_time(row_t row, col_t col, int year, int month, int day, in
void sheet::set_format(row_t row, col_t col, size_t index)
{
- cell_format_type::iterator itr = mp_impl->m_cell_formats.find(col);
- if (itr == mp_impl->m_cell_formats.end())
- {
- std::unique_ptr<segment_row_index_type> p(new segment_row_index_type(0, mp_impl->m_row_size+1, 0));
-
- pair<cell_format_type::iterator, bool> r =
- mp_impl->m_cell_formats.insert(cell_format_type::value_type(col, p.get()));
-
- if (!r.second)
- {
- cerr << "insertion of new cell format container failed!" << endl;
- return;
- }
-
- p.release();
- itr = r.first;
- }
-
- segment_row_index_type& con = *itr->second;
- con.insert_back(row, row+1, index);
+ set_format(row, col, row, col, index);
}
void sheet::set_format(row_t row_start, col_t col_start, row_t row_end, col_t col_end, size_t index)
{
for (col_t col = col_start; col <= col_end; ++col)
{
- for (row_t row = row_start; row <= row_end; ++row)
+ cell_format_type::iterator itr = mp_impl->m_cell_formats.find(col);
+ if (itr == mp_impl->m_cell_formats.end())
{
- set_format(row, col, index);
+ std::unique_ptr<segment_row_index_type> p(new segment_row_index_type(0, mp_impl->m_row_size+1, 0));
+
+ pair<cell_format_type::iterator, bool> r =
+ mp_impl->m_cell_formats.insert(cell_format_type::value_type(col, p.get()));
+
+ if (!r.second)
+ {
+ cerr << "insertion of new cell format container failed!" << endl;
+ return;
+ }
+
+ p.release();
+ itr = r.first;
}
+
+ segment_row_index_type& con = *itr->second;
+ con.insert_back(row_start, row_end+1, index);
}
}
diff --git a/test-driver b/test-driver
index d306056..8e575b0 100755
--- a/test-driver
+++ b/test-driver
@@ -3,7 +3,7 @@
scriptversion=2013-07-13.22; # UTC
-# Copyright (C) 2011-2013 Free Software Foundation, Inc.
+# Copyright (C) 2011-2014 Free Software Foundation, Inc.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
@@ -106,11 +106,14 @@ trap "st=143; $do_exit" 15
# Test script is run here.
"$@" >$log_file 2>&1
estatus=$?
+
if test $enable_hard_errors = no && test $estatus -eq 99; then
- estatus=1
+ tweaked_estatus=1
+else
+ tweaked_estatus=$estatus
fi
-case $estatus:$expect_failure in
+case $tweaked_estatus:$expect_failure in
0:yes) col=$red res=XPASS recheck=yes gcopy=yes;;
0:*) col=$grn res=PASS recheck=no gcopy=no;;
77:*) col=$blu res=SKIP recheck=no gcopy=yes;;
@@ -119,6 +122,12 @@ case $estatus:$expect_failure in
*:*) col=$red res=FAIL recheck=yes gcopy=yes;;
esac
+# Report the test outcome and exit status in the logs, so that one can
+# know whether the test passed or failed simply by looking at the '.log'
+# file, without the need of also peeking into the corresponding '.trs'
+# file (automake bug#11814).
+echo "$res $test_name (exit status: $estatus)" >>$log_file
+
# Report outcome to console.
echo "${col}${res}${std}: $test_name"
diff --git a/test/css/basic8.css b/test/css/basic8.css
index 15ef2cf..f467bef 100644
--- a/test/css/basic8.css
+++ b/test/css/basic8.css
@@ -11,5 +11,5 @@
}
.ribbon::after::selection {
- content: "Selected orange box.";
+ content: 'Selected orange box.';
}
diff --git a/test/xml-mapped/content-basic/flat/data.txt b/test/xml-mapped/content-basic/flat/data.txt
deleted file mode 100644
index 198703e..0000000
--- a/test/xml-mapped/content-basic/flat/data.txt
+++ /dev/null
@@ -1,22 +0,0 @@
----
-Sheet name: data
-rows: 9 cols: 4
-+------------------+-----------+------------+---------+
-| TOP SECRET | | | |
-+------------------+-----------+------------+---------+
-| Simple Data File | | 2012-08-12 | |
-+------------------+-----------+------------+---------+
-| | | | |
-+------------------+-----------+------------+---------+
-| id | first | last | score |
-+------------------+-----------+------------+---------+
-| 1 [v] | Bill | Clinton | 456 [v] |
-+------------------+-----------+------------+---------+
-| 2 [v] | David | Cameron | 323 [v] |
-+------------------+-----------+------------+---------+
-| 3 [v] | Barak | Obama | 234 [v] |
-+------------------+-----------+------------+---------+
-| 4 [v] | Yoshihiko | Noda | 192 [v] |
-+------------------+-----------+------------+---------+
-| 5 [v] | Angela | Merkel | 210 [v] |
-+------------------+-----------+------------+---------+
diff --git a/test/xml-mapped/fuel-economy/flat/data.txt b/test/xml-mapped/fuel-economy/flat/data.txt
deleted file mode 100644
index 2c76ff2..0000000
--- a/test/xml-mapped/fuel-economy/flat/data.txt
+++ /dev/null
@@ -1,6 +0,0 @@
----
-Sheet name: data
-rows: 1 cols: 2
-+---------+-------------+
-| atvtype | 29.9645 [v] |
-+---------+-------------+
--
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/pkg-openoffice/liborcus.git