[h5py] 101/455: Quick start guide
Ghislain Vaillant
ghisvail-guest at moszumanska.debian.org
Thu Jul 2 18:19:22 UTC 2015
This is an automated email from the git hooks/post-receive script.
ghisvail-guest pushed a commit to annotated tag 1.3.0
in repository h5py.
commit 3dcec4203902210cad307bb840052686bfe3e56f
Author: andrewcollette <andrew.collette at gmail.com>
Date: Wed Aug 13 03:08:42 2008 +0000
Quick start guide
---
docs/source/build.rst | 9 +-
docs/source/index.rst | 1 +
docs/source/licenses.rst | 157 ++++++++++++++++++++++
docs/source/overview.rst | 162 +++++++++++++++++++++++
docs/source/quick.rst | 332 +++++++++++++++++++++++++++++++++++++++++++++++
5 files changed, 657 insertions(+), 4 deletions(-)
diff --git a/docs/source/build.rst b/docs/source/build.rst
index aed82c5..52496c2 100644
--- a/docs/source/build.rst
+++ b/docs/source/build.rst
@@ -63,9 +63,9 @@ Installing on Windows
=====================
**It's strongly recommended that you use the pre-built .exe installer.** It
-will install h5py, a private copy of HDF5 1.8.1 with ZLIB and SZIP compression
-enabled, and the proper C runtime dependencies. You must have the following
-already installed:
+will install h5py, a private copy of HDF5 1.8.1 with ZLIB and (optionally)
+SZIP compression enabled, and the proper C runtime dependencies. You must have
+the following already installed:
- Python 2.5
- Numpy_ 1.0.3 or higher
@@ -98,7 +98,8 @@ Get HDF5 and create import file
contain ``include`` and ``dll``, among other things.
3. Open a command prompt in ``C:\hdf5\dll`` and run
``pexports hdf5dll.dll > hdf5dll.def``
-4. Create the directory ``C:\hdf5\dll2`` and move ``hdf5dll.def`` there
+4. Run ``dlltool -D hdf5dll.dll -d hdf5dll.def -l hdf5dll.dll.a``
+5. Create the directory ``C:\hdf5\dll2`` and move ``hdf5dll.dll.a`` there
Compile h5py
------------
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 5f31659..a273a01 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -35,6 +35,7 @@ Contents:
:maxdepth: 2
build
+ quick
threads
licenses
diff --git a/docs/source/licenses.rst b/docs/source/licenses.rst
new file mode 100644
index 0000000..97e946e
--- /dev/null
+++ b/docs/source/licenses.rst
@@ -0,0 +1,157 @@
+***********************
+Licenses and legal info
+***********************
+
+Copyright Notice and Statement for the h5py Project
+===================================================
+
+::
+
+ Copyright (c) 2008 Andrew Collette
+ http://h5py.alfven.org
+ All rights reserved.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+
+ a. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+
+ b. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the
+ distribution.
+
+ c. Neither the name of the author nor the names of contributors may
+ be used to endorse or promote products derived from this software
+ without specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+
+HDF5 Copyright Statement
+========================
+
+::
+
+ HDF5 (Hierarchical Data Format 5) Software Library and Utilities
+ Copyright 2006-2007 by The HDF Group (THG).
+
+ NCSA HDF5 (Hierarchical Data Format 5) Software Library and Utilities
+ Copyright 1998-2006 by the Board of Trustees of the University of Illinois.
+
+ All rights reserved.
+
+ Contributors: National Center for Supercomputing Applications (NCSA)
+ at the University of Illinois, Fortner Software, Unidata Program
+ Center (netCDF), The Independent JPEG Group (JPEG), Jean-loup Gailly
+ and Mark Adler (gzip), and Digital Equipment Corporation (DEC).
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted for any purpose (including commercial
+ purposes) provided that the following conditions are met:
+
+ 1. Redistributions of source code must retain the above copyright
+ notice, this list of conditions, and the following disclaimer.
+ 2. Redistributions in binary form must reproduce the above
+ copyright notice, this list of conditions, and the following
+ disclaimer in the documentation and/or materials provided with the
+ distribution.
+ 3. In addition, redistributions of modified forms of the source or
+ binary code must carry prominent notices stating that the original
+ code was changed and the date of the change.
+ 4. All publications or advertising materials mentioning features or
+ use of this software are asked, but not required, to acknowledge that
+ it was developed by The HDF Group and by the National Center for
+ Supercomputing Applications at the University of Illinois at
+ Urbana-Champaign and credit the contributors.
+ 5. Neither the name of The HDF Group, the name of the University,
+ nor the name of any Contributor may be used to endorse or promote
+ products derived from this software without specific prior written
+ permission from THG, the University, or the Contributor, respectively.
+
+ DISCLAIMER: THIS SOFTWARE IS PROVIDED BY THE HDF GROUP (THG) AND THE
+ CONTRIBUTORS "AS IS" WITH NO WARRANTY OF ANY KIND, EITHER EXPRESSED OR
+ IMPLIED. In no event shall THG or the Contributors be liable for any
+ damages suffered by the users arising out of the use of this software,
+ even if advised of the possibility of such damage.
+
+ Portions of HDF5 were developed with support from the University of
+ California, Lawrence Livermore National Laboratory (UC LLNL). The
+ following statement applies to those portions of the product and must
+ be retained in any redistribution of source code, binaries,
+ documentation, and/or accompanying materials:
+
+ This work was partially produced at the University of California,
+ Lawrence Livermore National Laboratory (UC LLNL) under contract
+ no. W-7405-ENG-48 (Contract 48) between the U.S. Department of Energy
+ (DOE) and The Regents of the University of California (University) for
+ the operation of UC LLNL.
+
+ DISCLAIMER: This work was prepared as an account of work sponsored by
+ an agency of the United States Government. Neither the United States
+ Government nor the University of California nor any of their
+ employees, makes any warranty, express or implied, or assumes any
+ liability or responsibility for the accuracy, completeness, or
+ usefulness of any information, apparatus, product, or process
+ disclosed, or represents that its use would not infringe privately-
+ owned rights. Reference herein to any specific commercial products,
+ process, or service by trade name, trademark, manufacturer, or
+ otherwise, does not necessarily constitute or imply its endorsement,
+ recommendation, or favoring by the United States Government or the
+ University of California. The views and opinions of authors expressed
+ herein do not necessarily state or reflect those of the United States
+ Government or the University of California, and shall not be used for
+ advertising or product endorsement purposes.
+
+PyTables Copyright Statement
+============================
+
+::
+
+ Copyright Notice and Statement for PyTables Software Library and Utilities:
+
+ Copyright (c) 2002, 2003, 2004 Francesc Altet
+ Copyright (c) 2005, 2006, 2007 Carabos Coop. V.
+ All rights reserved.
+
+ Redistribution and use in source and binary forms, with or without
+ modification, are permitted provided that the following conditions are
+ met:
+
+ a. Redistributions of source code must retain the above copyright
+ notice, this list of conditions and the following disclaimer.
+
+ b. Redistributions in binary form must reproduce the above copyright
+ notice, this list of conditions and the following disclaimer in the
+ documentation and/or other materials provided with the
+ distribution.
+
+ c. Neither the name of the Carabos Coop. V. nor the names of its
+ contributors may be used to endorse or promote products derived
+ from this software without specific prior written permission.
+
+ THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+
diff --git a/docs/source/overview.rst b/docs/source/overview.rst
new file mode 100644
index 0000000..7172c16
--- /dev/null
+++ b/docs/source/overview.rst
@@ -0,0 +1,162 @@
+********
+Overview
+********
+
+.. note::
+
+ This document assumes a basic familiarity with HDF5, including groups,
+ datasets, and the file hierarchy. For more information, the `user guide
+ distributed by the HDF Group`__ is an excellent introduction. This document
+ also assumes you have used `Numpy`_ before.
+
+__ http://hdf.ncsa.uiuc.edu/HDF5/doc/UG/index.html
+.. _Numpy: http://numpy.scipy.org
+
+High-level classes
+==================
+
+While the ``h5py.h5*`` modules provide access to the guts of the HDF5 library,
+they are not very convenient for everyday use. For example, creating a new
+dataset with chunking and compression requires creating datatype, dataspace and
+property list objects, assigning the correct values to the property list in
+the right order, and managing the group object to which the dataset will be
+attached. To make interacting with HDF5 less painful, a pure-Python
+"high-level" interface is provided to encapsulate these common operations.
+
+It consists of three classes:
+
+* File: Represents an HDF5 file on disk
+* Group: Represents an HDF5 group, containing datasets and other groups
+* Dataset: An HDF5 dataset
+
+All communication with HDF5 is done through Numpy arrays, dtype objects, and
+slicing conventions. No low-level HDF5 objects (datatype/dataspace objects,
+etc.) are exposed, and no additional Python-side abstractions are introduced.
+Objects typically have a small number of methods.
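+
+For a taste of how these classes fit together, here is a minimal sketch (the
+file and member names are made up for illustration; the calls are the ones
+described in the rest of this document)::
+
+ >>> from h5py import *
+ >>> f = File("example.hdf5", 'w')                       # File object; also acts as the root group
+ >>> grp = f.create_group("mygroup")                     # Group object
+ >>> dset = grp.create_dataset("mydata", (10,), '=f8')   # Dataset object
+ >>> import numpy
+ >>> dset[0:5] = numpy.ones((5,), '=f8')                 # read/write via Numpy-style slicing
+ >>> f.close()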
+
+Paths in HDF5 files
+-------------------
+
+HDF5 files are organized like a filesystem. Groups are analogous to
+directories, while datasets are like the files stored in them. Paths are
+always specified UNIX-style, starting at ``/`` (the "root" group). It's best
+to limit names to ASCII, but you're welcome to use spaces, quotes,
+punctuation and other symbol characters.
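+
+For example, given an open File object ``f``, paths like the following (the
+names here are hypothetical) refer to a group and to a dataset inside it::
+
+ >>> grp = f["/experiment 1"]
+ >>> dset = f["/experiment 1/results"]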
+
+
+Group objects
+-------------
+
+These represent HDF5 *groups*, directory-like objects which contain *links* to
+HDF5 objects such as datasets or other groups. To a good approximation, you
+can think of them as dictionaries which map a string name to an HDF5 object.
+As with objects stored in Python dictionaries, the same HDF5 object can be
+referred to by more than one group. Groups can even contain themselves!
+
+This dictionary metaphor is how h5py treats groups. They support the following
+Python behavior (a combined sketch appears after the method list below):
+
+* Item access (``group["name"]``)
+
+ Accessing a member by name returns the appropriate HDF5 object, usually a
+ dataset or another group. Assigning to a name stores the object in the file
+ in the appropriate way (see the docstring for details). Deleting an item
+ "unlinks" it from the group. As with Python objects in dictionaries, once no
+ group refers to an object, it's permanently gone.
+
+* Iteration (``for x in group...``, etc.)
+
+ Iterating over a group yields the *names* of its members, like a Python dict.
+ You can use the method ``iteritems()`` to get ``(name, obj)`` tuples. The
+ same restriction as for Python dictionaries applies: don't modify the group
+ while iterating over it.
+
+* Length (``len(group)``)
+
+ This is just the number of entries in the group.
+
+* Membership (``if "name" in group...``)
+
+ Test if a name appears in the group.
+
+They also support the following methods:
+
+.. method:: create_group(name)
+
+ Create a new, empty group attached to this one, called "name".
+
+.. method:: create_dataset(name, *args, **kwds)
+
+ Create a new dataset attached to this group. The arguments are passed to
+ the Dataset constructor below.
+
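+A combined sketch of the behaviors and methods above, assuming ``f`` is an
+open File object (the group and dataset names are made up for illustration)::
+
+ >>> import numpy
+ >>> grp = f.create_group("observations")
+ >>> dset = grp.create_dataset("day1", (100,), '=f8')
+ >>> grp["day2"] = numpy.ones((100,))   # assignment stores an object in the file
+ >>> len(grp)
+ 2
+ >>> "day1" in grp
+ True
+ >>> for name in grp:
+ ...     print name
+ ...
+ day1
+ day2
+ >>> del grp["day2"]                    # unlink it from the group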
+
+File objects
+------------
+
+These represent the HDF5 file itself. Since every HDF5 file contains at least
+the *root group* (called ``/``, as in a POSIX filesystem), the File object
+also serves as your entry point into the file. File objects inherit from
+Group objects; all the Group behavior and methods are available, and operate
+on the root (``/``) group. For example, to access datasets::
+
+ ds1 = file_obj["/ds1"]        # equivalently: file_obj["ds1"]
+ ds2 = file_obj["/grp1/ds2"]   # equivalently: file_obj["grp1/ds2"]
+
+
+Opening (or creating) a file
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+You use the File constructor directly to open or create an HDF5 file. Like the
+standard Python function ``open()``, the syntax is:
+
+.. method:: File(name, mode='a')
+
+ Allowed modes are:
+
+ - ``r``   Readonly, file must exist
+ - ``r+``  Read/write, file must exist
+ - ``w``   Create file, truncate if exists
+ - ``w-``  Create file, fail if exists
+ - ``a``   Read/write if exists, create otherwise (default)
+
+.. method:: close()
+
+ When you're done, as with Python files, it's important to close the file so
+ that all the data gets written.
+
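+For instance, a cautious way to create a new file (a short sketch; the file
+name is arbitrary)::
+
+ >>> f = File("results.hdf5", 'w-')   # create, but fail if it already exists
+ >>> # ... store some data ...
+ >>> f.close()                        # flush everything to disk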
+
+Python attributes
+~~~~~~~~~~~~~~~~~
+
+.. attribute:: name
+
+ Name used to open the file
+
+.. attribute:: mode
+
+ Mode character used to open the file
+
+
+Browsing a file
+~~~~~~~~~~~~~~~
+
+.. method:: browse()
+
+ Specifying the full name of an HDF5 resource can be tedious and error-prone.
+ Therefore, h5py includes a small command-line browser which can be used like
+ a UNIX shell to explore an HDF5 file and import datasets and groups into an
+ interactive session. It includes things like ``ls`` and tab-completion. To
+ open the browser, simply call browse(). Type ``help`` at the prompt for a
+ list of commands.
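+
+For example, assuming ``f`` is an open File object::
+
+ >>> f.browse()   # type "help" at the prompt for a list of commands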
+
diff --git a/docs/source/quick.rst b/docs/source/quick.rst
new file mode 100644
index 0000000..06d9c0e
--- /dev/null
+++ b/docs/source/quick.rst
@@ -0,0 +1,332 @@
+*****************
+Quick Start Guide
+*****************
+
+First, install h5py by following the `installation instructions`__.
+
+__ http://h5py.alfven.org/build.html
+
+The ``import *`` construct is safe when used with the main package::
+
+ >>> from h5py import *
+
+The rest of the examples here assume you've done this. Among other things, it
+imports the three classes ``File``, ``Group`` and ``Dataset``, which will cover
+99% of your needs.
+
+
+Storing simple data
+===================
+
+Create a new file
+-----------------
+
+Files are opened using a Python-file-like syntax::
+
+ >>> f = File("myfile.hdf5", 'w') # Create/truncate file
+ >>> f
+ File "myfile.hdf5", root members:
+ >>> type(f)
+ <class 'h5py.highlevel.File'>
+
+Create a dataset
+----------------
+
+Datasets are like Numpy arrays which reside on disk; they are identified by
+a unique name, shape, and a Numpy dtype. The easiest way to create them is
+with a method of the File object you already have::
+
+ >>> dset = f.create_dataset("MyDataset", (2,3), '=i4')
+ >>> dset
+ Dataset "MyDataset": (2L, 3L) dtype('int32')
+ >>> type(dset)
+ <class 'h5py.highlevel.Dataset'>
+
+This creates a new 2-d 6-element (2x3) dataset containing 32-bit signed integer
+data, in native byte order, located in the file at "/MyDataset".
+
+Read & write data
+-----------------
+
+You can now store data in it using the Numpy-like slicing syntax::
+
+ >>> print dset[...]
+ [[0 0 0]
+ [0 0 0]]
+ >>> import numpy
+ >>> myarr = numpy.ones((2,), '=i2') # The dtype doesn't have to exactly match
+ >>> dset[:,0] = myarr
+ >>> print dset[...]
+ [[1 0 0]
+ [1 0 0]]
+
+Closing the file
+----------------
+
+You don't need to do anything special to "close" datasets. However, you must
+remember to close the file before exiting Python, to prevent data loss. This
+will automatically close all the open HDF5 objects::
+
+ >>> dset
+ Dataset "MyDataset": (2L, 3L) dtype('int32')
+ >>> f.close()
+ >>> dset
+ Invalid dataset
+
+
+More about datasets
+===================
+
+Automatic creation
+------------------
+
+If you already have an array you want to store, you don't even need to call
+``create_dataset``. Simply assign it to a name::
+
+ >>> myarr = numpy.ones((50,75))
+ >>> f["MyDataset"] = myarr
+ >>> f["MyDataset"]
+ Dataset "MyDataset": (50L, 75L) dtype('float64')
+
+Storing compound data
+---------------------
+
+You can store "compound" data (struct-like, with named fields) via the Numpy
+facility for compound data types. For example, suppose we have data that takes
+the form of (temperature, voltage) pairs::
+
+ >>> import numpy
+ >>> mydtype = numpy.dtype([('temp','=f4'),('voltage','=f8')])
+ >>> dset = f.create_dataset("MyDataset", (20,30), mydtype)
+ >>> dset
+ Dataset "MyDataset": (20L, 30L) dtype([('temp', '<f4'), ('voltage', '<f8')])
+
+You can also access data using Numpy recarray-style indexing. The following
+are all legal slicing expressions for the above dataset (output omitted for brevity)::
+
+ >>> dset[0,0]
+ >>> dset[0,:]
+ >>> dset[...]
+ >>> dset['temp']
+ >>> dset[0,0,'temp']
+ >>> dset[8:14:2, ::2, 'voltage']
+
+Shape and data type
+-------------------
+
+Like Numpy arrays, Dataset objects have attributes named "shape" and "dtype"::
+
+ >>> dset = f.create_dataset("MyDataset", (4,5), '=c8')
+ >>> dset.dtype
+ dtype('complex64')
+ >>> dset.shape
+ (4L, 5L)
+
+These attributes are read-only.
+
+Values and 0-dimensional datasets
+---------------------------------
+
+HDF5 allows you to store "scalar" datasets. These have the shape ``()``. You
+can use the syntax ``dset[...]`` to recover the value as a 0-dimensional
+array. Also, the special attribute ``value`` will return a scalar for a 0-dim
+dataset, and a full n-dimensional array for all other cases::
+
+ >>> f["ArrayDS"] = numpy.ones((2,2))
+ >>> f["ScalarDS"] = 1.0
+ >>> f["ArrayDS"].value
+ array([[ 1., 1.],
+ [ 1., 1.]])
+ >>> f["ScalarDS"].value
+ 1.0
+
+
+Using HDF5 options
+------------------
+
+You can specify a number of HDF5 features when creating a dataset. See the
+Dataset constructor for a complete list. For example, to create a (100,100)
+dataset stored in (100,10)-sized chunks, using GZIP compression at level 6::
+
+ >>> dset = f.create_dataset("MyDataset", (100,100), chunks=(100,10), compression=6)
+
+
+Groups & multiple objects
+=========================
+
+The root group
+--------------
+
+Like a filesystem, HDF5 supports the concept of storing multiple objects in
+containers, called "groups". The File object behaves as one of these
+groups (it's actually the *root group* "``/``", again like a UNIX filesystem).
+You store objects by giving them different names::
+
+ >>> f["DS1"] = numpy.ones((2,3))
+ >>> f["DS2"] = numpy.ones((1,2))
+ >>> f
+ File "myfile.hdf5", root members: "DS1", "DS2"
+
+Beware: you need to delete an existing object before reusing its name, as HDF5 won't overwrite it automatically::
+
+ >>> f["DS3"] = numpy.ones((2,2))
+ >>> f["DS3"] = numpy.ones((2,2))
+ Traceback (most recent call last):
+ ... snip traceback ...
+ h5py.h5.DatasetError: Unable to create dataset (H5Dcreate)
+ HDF5 Error Stack:
+ 0: "Unable to create dataset" at H5Dcreate
+ 1: "Unable to name dataset" at H5D_create
+ 2: "Already exists" at H5G_insert
+ 3: "Unable to insert name" at H5G_namei
+ 4: "Unable to insert entry" at H5G_stab_insert
+ 5: "Unable to insert key" at H5B_insert
+ 6: "Can't insert leaf node" at H5B_insert_helper
+ 7: "Symbol is already present in symbol table" at H5G_node_insert
+
+Removing objects
+----------------
+
+You can "delete" (unlink) an object from a group::
+
+ >>> f["DS"] = numpy.ones((10,10))
+ >>> f["DS"]
+ Dataset "DS": (10L, 10L) dtype('float64')
+ >>> del f["DS"]
+ >>> f["DS"]
+ Traceback (most recent call last):
+ ... snip traceback ...
+ h5py.h5.ArgsError: Cannot stat object (H5Gget_objinfo)
+ HDF5 Error Stack:
+ 0: "Cannot stat object" at H5Gget_objinfo
+ 1: "Unable to stat object" at H5G_get_objinfo
+ 2: "Component not found" at H5G_namei
+ 3: "Not found" at H5G_stab_find
+ 4: "Not found" at H5G_node_found
+
+Creating subgroups
+------------------
+
+You can create subgroups by giving them names::
+
+ >>> f.create_group('subgrp')
+ Group "subgrp" (0 members)
+
+Be careful, as most versions of HDF5 don't support "automatic" (recursive)
+creation of intermediate groups. Instead of doing::
+
+ >>> f.create_group('foo/bar/baz') # WRONG
+
+you have to do::
+
+ >>> f.create_group('foo')
+ >>> f.create_group('foo/bar')
+ >>> f.create_group('foo/bar/baz')
+
+This restriction will be lifted in the future, as HDF5 1.8.X provides a feature
+that does this automatically.
+
+
+Group tricks
+------------
+
+Groups support iteration (which yields the member names), ``len()`` (which gives
+the number of members), and membership testing::
+
+ >>> g = f.create_group('subgrp')
+ >>> g["DS1"] = numpy.ones((2,2))
+ >>> g["DS2"] = numpy.ones((1,2))
+ >>> g["DS3"] = numpy.ones((10,10))
+ >>> for x in g:
+ ... print x
+ ...
+ DS1
+ DS2
+ DS3
+ >>> for x, ds in g.iteritems():
+ ... print x, ds.shape
+ ...
+ DS1 (2L, 2L)
+ DS2 (1L, 2L)
+ DS3 (10L, 10L)
+ >>> len(g)
+ 3
+ >>> "DS1" in g
+ True
+ >>> "DS4" in g
+ False
+
+Group caveats
+-------------
+
+The HDF5 file graph is not limited to a tree configuration. Like hard links in
+a file system, group "members" are actually references to shared HDF5 objects.
+This can lead to odd behavior; for example, it's perfectly legal for a group
+to contain itself. When you assign an existing HDF5 object to a name, HDF5
+will create a new reference (hard link) with that name, which points to the
+object::
+
+ >>> dset = f.create_dataset("MyDS", (1,2), '=i2')
+ >>> f["DS Alias"] = dset # creates a new hard link
+
+Recursion::
+
+ >>> f["self"] = f
+ >>> f.names
+ ("self",)
+ >>> f["self"].names
+ ("self",)
+ >>> f["self/self"].names
+ ("self",)
+
+While this has many benefits (many paths can share the same underlying data),
+be careful: it's easy to lose track of which names refer to the same object.
+
+Attributes
+==========
+
+HDF5 lets you associate small bits of data with both groups and datasets.
+A dictionary-like object which exposes this behavior is attached to every
+Group and Dataset object as the attribute ``attrs``. You can store any scalar
+or array value you like::
+
+ >>> dset = f.create_dataset("MyDS", (2,3), '=i4')
+ >>> dset.attrs
+ Attributes of "MyDS": (none)
+ >>> dset.attrs["Name"] = "My Dataset"
+ >>> dset.attrs["Frob Index"] = 4
+ >>> dset.attrs["Baz Order"] = numpy.arange(10)
+ >>> for name, value in dset.attrs.iteritems():
+ ... print name, value
+ ...
+ Name My Dataset
+ Frob Index 4
+ Baz Order [0 1 2 3 4 5 6 7 8 9]
+
+Attributes can be associated with any named HDF5 object, including the root
+group.
+
+More information
+================
+
+Everything in h5py is documented with docstrings. The `online HTML
+documentation`__ provides a cross-referenced version of this information.
+The classes described in this document live in the ``h5py.highlevel``
+module.
+
+__ http://h5py.alfven.org/docs/
+
--
Alioth's /usr/local/bin/git-commit-notice on /srv/git.debian.org/git/debian-science/packages/h5py.git