[h5py] 176/455: Docs updates

Ghislain Vaillant ghisvail-guest at moszumanska.debian.org
Thu Jul 2 18:19:30 UTC 2015


This is an automated email from the git hooks/post-receive script.

ghisvail-guest pushed a commit to annotated tag 1.3.0
in repository h5py.

commit 32515bac019b7fd164aec545f7f42989714dc494
Author: andrewcollette <andrew.collette at gmail.com>
Date:   Sat Dec 6 21:17:52 2008 +0000

    Docs updates
---
 docs/source/guide/build.rst |  2 ++
 docs/source/guide/hl.rst    | 20 ++++++++++++++++----
 docs/source/guide/quick.rst |  4 +---
 3 files changed, 19 insertions(+), 7 deletions(-)

diff --git a/docs/source/guide/build.rst b/docs/source/guide/build.rst
index 54a5b85..c6d7b17 100644
--- a/docs/source/guide/build.rst
+++ b/docs/source/guide/build.rst
@@ -1,3 +1,5 @@
+.. _build:
+
 ******************
 Installation guide
 ******************
diff --git a/docs/source/guide/hl.rst b/docs/source/guide/hl.rst
index 01a4598..71f1c58 100644
--- a/docs/source/guide/hl.rst
+++ b/docs/source/guide/hl.rst
@@ -450,6 +450,10 @@ Like Numpy arrays, Dataset objects have attributes named "shape" and "dtype":
 
 .. _slicing_access:
 
+Special Features
+----------------
+
+
 Slicing access
 --------------
 
@@ -551,12 +555,20 @@ features.  These are enabled by the keywords provided to
 :meth:`Group.create_dataset`.  Some of the more useful are:
 
 Compression
-    Transparent GZIP compression 
+    Transparent compression 
     (keyword *compression*)
     can substantially reduce the storage space
-    needed for the dataset.  Supply an integer between 0 and 9.  Using the
-    *shuffle* filter along with this option can improve the compression ratio
-    further.
+    needed for the dataset.  The default compression method is GZIP (DEFLATE),
+    which is universally supported by other installations of HDF5.
+    Supply an integer between 0 and 9 to enable GZIP compression at that level.
+    Using the *shuffle* filter along with this option can improve the
+    compression ratio further.
+
+Error-Detection
+    All versions of HDF5 include the *fletcher32* checksum filter, which enables
+    read-time error detection for datasets.  If part of a dataset becomes
+    corrupted, a read operation on that section will immediately fail with
+    H5Error.
 
 Resizing
     When using HDF5 1.8,
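
(For illustration, a minimal sketch of a dataset created with the keywords
described in the hunk above; the file and dataset names are hypothetical, and
it assumes the *compression*, *shuffle* and *fletcher32* keywords to
Group.create_dataset behave as documented:

    import h5py
    import numpy as np

    # Hypothetical file and dataset names, used only for illustration.
    f = h5py.File("example.hdf5", "w")

    # Enable transparent GZIP (DEFLATE) compression at level 6, the shuffle
    # filter to improve the compression ratio, and the fletcher32 checksum
    # filter for read-time error detection.
    dset = f.create_dataset("data", shape=(1000, 1000), dtype="f",
                            compression=6, shuffle=True, fletcher32=True)

    dset[...] = np.random.rand(1000, 1000)
    f.close()
)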
diff --git a/docs/source/guide/quick.rst b/docs/source/guide/quick.rst
index 35c5604..453a0d5 100644
--- a/docs/source/guide/quick.rst
+++ b/docs/source/guide/quick.rst
@@ -44,9 +44,7 @@ HDF5-aware application can understand.
 Getting data into HDF5
 ======================
 
-First, install h5py by following the `installation instructions`__.
-
-__ http://h5py.alfven.org/build.html
+First, install h5py by following the :ref:`installation instructions <build>`.
 
 The ``import *`` construct is safe when used with the main package::
 

-- 
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-science/packages/h5py.git


