Scientists have stored their experimental data in HDF formats for many
years, most recently adopting the HDF5 format. HDF5 can store both
primary experimental data (such as an image and a set of positioner
values) and metadata about the experiment, and it is a very efficient
format for storing binary data such as images. It is this combination
of primary-data plus metadata storage that drives adoption of an
advanced file container such as HDF5.
A single image is stored as one dataset. It is also possible to store a
series of image frames as one dataset, and even more advanced
arrangements are possible.
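To illustrate the arrangement described above, here is a minimal sketch
using h5py; the file name, dataset paths, and attribute names are my
own illustrative assumptions, not anything prescribed by the thread:

```python
# Sketch: one image as one 2-D dataset, a frame series as one 3-D
# dataset, and experiment metadata stored as HDF5 attributes.
import numpy as np
import h5py

image = np.zeros((512, 512), dtype=np.uint16)        # a single detector image
frames = np.zeros((10, 512, 512), dtype=np.uint16)   # a series of 10 frames

with h5py.File("experiment.h5", "w") as f:
    # One image = one dataset
    ds = f.create_dataset("entry/image", data=image, compression="gzip")
    # A series of frames = one dataset, frame index first
    f.create_dataset("entry/frames", data=frames, compression="gzip")
    # Metadata lives alongside the primary data as attributes
    ds.attrs["exposure_time_s"] = 0.1
    f["entry"].attrs["positioner_x_mm"] = 1.25

with h5py.File("experiment.h5", "r") as f:
    print(f["entry/image"].shape)   # (512, 512)
    print(f["entry/frames"].shape)  # (10, 512, 512)
```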
There is an international group of scientists (NeXus) that formed to
define a standard for how to arrange scientific data in HDF files, to
make it easy (OK, easier) to use experiment data collected at various
X-ray, neutron, and muon science user facilities. See the website for
more details: http://www.nexusformat.org
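As a rough sketch of what such an arrangement looks like (based on the
NeXus conventions as I understand them; the website above is the
authority), groups are tagged with an NX_class attribute and an NXdata
group names the default plottable signal. All names below are
illustrative:

```python
# Sketch of a minimal NeXus-style layout in an HDF5 file with h5py.
import numpy as np
import h5py

with h5py.File("nexus_example.h5", "w") as f:
    entry = f.create_group("entry")
    entry.attrs["NX_class"] = "NXentry"      # top-level entry group
    data = entry.create_group("data")
    data.attrs["NX_class"] = "NXdata"        # holds the plottable data
    data.attrs["signal"] = "image"           # which dataset to plot by default
    data.create_dataset("image", data=np.zeros((64, 64), dtype=np.uint16))
```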
I'm not sure the choice of data file format can help in the
reconstruction itself. Rather, the file format helps in the
organization of the data, potentially reducing the number of data files
associated with a single experiment or other use case.
Pete
On 3/22/2012 12:27 PM, Emmanuel Mayssat wrote:
Just out of curiosity, why would you store images in HDF5 files?
How do you store them? 1 image = 1 huge table of pixel intensities?
It must be because it is easier to get statistics, extract a region of
interest, work with several images (datasets), etc.
Is that correct?
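The point raised above is easy to illustrate: when an image is an HDF5
dataset, h5py slicing reads only the requested region of interest from
disk, and statistics follow directly. A sketch, with file and dataset
names of my own invention:

```python
# Sketch: extract a region of interest from an image dataset and
# compute statistics on it, without reading the whole image.
import numpy as np
import h5py

with h5py.File("roi_example.h5", "w") as f:
    f.create_dataset(
        "image",
        data=np.arange(256 * 256, dtype=np.float64).reshape(256, 256),
    )

with h5py.File("roi_example.h5", "r") as f:
    roi = f["image"][100:150, 50:80]   # only this slab is read from disk
    print(roi.shape, roi.mean())
```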
For a project I worked on some time ago (but dropped due to lack of
funding), we were looking at CT-scan-type applications with 8
exposures per rotation angle and a 180-degree rotation in 0.5-degree
steps, i.e. a decent-sized data set. I am wondering if HDF5 could have
helped in the reconstruction. Then, and still today, we were/are using
mar/crayonix software.
MAR doesn't store images in HDF5 format, but can the areaDetector
package convert those files on the fly?
--
E
------------------------------------------------------------------------
*From:* Mark Rivers <[email protected]>
*To:* 'Emmanuel Mayssat' <[email protected]>; epics
<[email protected]>
*Sent:* Thursday, March 22, 2012 7:32 AM
*Subject:* RE: hdf5 (h5py) anyone?
The EPICS areaDetector
<http://cars9.uchicago.edu/software/epics/areaDetector.html> package
currently has two file writers that produce HDF5 files:
NDFileNexus
<http://cars.uchicago.edu/software/epics/NDPluginFile.html#NeXus>
creates NeXus compliant HDF5 files using the NeXus API. It was written
by John Hammonds from the APS.
NDFileHDF5
<http://cars.uchicago.edu/software/epics/NDPluginFile.html#HDF5> creates
HDF5 files using the native HDF5 API. It was written by Ulrik Pedersen
from Diamond Light Source.
Mark
--
----------------------------------------------------------
Pete R. Jemian, Ph.D. <[email protected]>
Beam line Controls and Data Acquisition, Group Leader
Advanced Photon Source, Argonne National Laboratory
Argonne, IL 60439 630 - 252 - 3189
-----------------------------------------------------------
Education is the one thing for which people
are willing to pay yet not receive.
-----------------------------------------------------------