Comments (19)
Hi,
Answering 2 first: the type property on the buffer returned by readDatasetAsBuffer should be set to one of the H5Types listed for it at http://hdf-ni.github.io/hdf5.node/ref/globals.html. The checked ones are those I've tested; the others most probably work. If there is a particular type you need that doesn't work, let me know and I'll add it. The documentation tracks the github tip and the npm publish is a bit older; I plan to republish to npm this weekend. Parts like the image interface work better now, and the other dependent projects need the upgrade.
Note that the options on the dataset interface are two-member arrays, like
var readAsBuffer = h5lt.readDatasetAsBuffer(group.id, 'Waldo', {start: [3,4], stride: [1,1], count: [2,2]});
start is the x,y offset and count is the width and height of the slab. stride I still need to look at and test some more; it is usually [1,1], yet it can make the selection skip over certain 'strides' in x and y.
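To illustrate the semantics only (this is plain JavaScript, not the module's native code; the real selection happens inside HDF5), a sketch of which elements {start, stride, count} pick out of a rank-2 array:

```javascript
// Illustrative sketch of HDF5 hyperslab selection semantics on a plain
// nested array. start is the offset, stride the step, count the extent.
function selectHyperslab(data, start, stride, count) {
  const out = [];
  for (let i = 0; i < count[0]; i++) {
    const row = [];
    for (let j = 0; j < count[1]; j++) {
      row.push(data[start[0] + i * stride[0]][start[1] + j * stride[1]]);
    }
    out.push(row);
  }
  return out;
}

const data = [
  [ 0,  1,  2,  3],
  [10, 11, 12, 13],
  [20, 21, 22, 23],
  [30, 31, 32, 33],
];

// stride [1,1]: a contiguous 2x2 block starting at row 1, column 1
selectHyperslab(data, [1, 1], [1, 1], [2, 2]); // [[11, 12], [21, 22]]

// stride [2,2]: every other row and column
selectHyperslab(data, [0, 0], [2, 2], [2, 2]); // [[0, 2], [20, 22]]
```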
I'll test and see about readDataset returning a hyperslab. I need to get more hyperslabbing flowing through https://github.com/HDF-NI/hdf5.ws
Tomorrow morning I'll have more answers
from hdf5.node.
Also I need to work up tests of higher rank hyperslabs; and soon if you have such data!
Hello @rimmartin. Thank you for getting back to me so quickly. The dataset I'm working with is of type H5T_IEEE_F32LE and has rank 1. readDatasetAsBuffer() seems to be working well with one-member arrays for my dataset, as I would expect considering the rank of the latter. I do need a little more testing to confirm this, though.
Manually converting to Float32Array works, although that's probably because I'm on a LE system myself. I think TypedArray instances on a LE system only support LE-serialized data, and likewise for instances on BE systems.
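For reference, a minimal sketch of that manual conversion, assuming readDatasetAsBuffer returns a plain Node Buffer of H5T_IEEE_F32LE data (the helper name is mine, not the module's):

```javascript
// View a Node Buffer of little-endian float32 data as a Float32Array.
// Float32Array uses the platform's native byte order, so this is only
// correct on a little-endian machine (x86, most ARM).
function bufferToFloat32Array(buf) {
  // Respect byteOffset: the Buffer may be a view into a larger ArrayBuffer.
  return new Float32Array(buf.buffer, buf.byteOffset, buf.byteLength / 4);
}

const buf = Buffer.alloc(8);
buf.writeFloatLE(1.5, 0);
buf.writeFloatLE(-2.25, 4);
const floats = bufferToFloat32Array(buf); // [1.5, -2.25] on LE systems
```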
If possible, I'd kindly request having the H5Type set even for unsupported types (perhaps in another attribute), so as to be able to add the appropriate bits of conversion logic. Full support for all types within this module would be awesome but also fairly complex, I imagine, if only because of the endianness issue.
Interesting, I never tried rank 1. I'll test it. The start, stride and count arrays do control the glue to the native hdf hyperslab, so all ranks should work. I focused on rank 2 and have some rank 3 coming up that stores data from code dependent on http://pointclouds.org/
I'm checking and testing H5Type being available, and also being able to query for this info before pulling data: querying children of groups and of the file object.
The node Buffer (https://nodejs.org/api/buffer.html) has endianness/byte-order swapping, and that's one of the reasons it got more focus for the varied set of native primitive hdf5 data types, and also for going straight to javascript buffers in browsers. I suppose an option could be added to call a swap on the native side. I have to study this some more.
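The in-place swap in question is Buffer's swap16/swap32/swap64 family (Node >= 6.3). A sketch of normalizing big-endian float32 data on the JavaScript side before viewing it as a typed array:

```javascript
// Sketch: big-endian float32 bytes (as they might arrive from a BE-written
// h5 file) swapped in place to little-endian, then viewed natively.
const buf = Buffer.alloc(4);
buf.writeFloatBE(3.5, 0); // simulate BE-stored data

buf.swap32(); // reverse byte order within each 32-bit word, in place

buf.readFloatLE(0); // 3.5, on any machine
const arr = new Float32Array(buf.buffer, buf.byteOffset, 1); // [3.5] on LE systems
```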
The Typed Arrays match a subset of hdf5 types, although I suppose they can hold the data for any hdf5 type that matches the number of bits. Yet I wasn't sure how such data could be used with the math functions of the Typed Arrays.
I see mocha is not running some tests; they are not necessarily failing, just never run. So this will take more than this morning. I'll see what the weekend brings.
This is interesting and will extend this project for more users
> I'm checking and testing H5Type being available and also being able to query for this info before pulling data. Query children of groups and the file object
This alone would be awesome and of immediate help.
> I see mocha is not running some tests; they are not necessarily failing but just never run. So this will take more than this morning. I'll see what the weekend brings.
No pressure meant. I appreciate all the awesome work you're doing. Bravo!
There is now a getDataType (http://hdf-ni.github.io/hdf5.node/ref/groups.html) on groups and the file object for finding out about the type without returning the data. The type will be one of http://hdf-ni.github.io/hdf5.node/ref/globals.html#h5type. The same type will be on the type property of the buffer returned from h5lt.readDatasetAsBuffer. There were a number of dunderheaded things I needed to fix: when a dataset is asked its type (https://support.hdfgroup.org/HDF5/doc/RM/RM_H5D.html#Dataset-GetType), it returns a copy of a predefined type, which would never match in my map. I needed to search the opposite-direction map and use https://support.hdfgroup.org/HDF5/doc/RM/RM_H5T.html#Datatype-Equal to find the predefined type.
This gets us to the endianness of javascript (https://developer.mozilla.org/en-US/docs/Glossary/Endianness). For now this interface will return what is stored in the h5. It might be good to return an endianness boolean to pass along to javascript. You bring up a good point about making this more seamless for users. I'm looking at the v8 api (https://v8docs.nodesource.com/) for whether there is a way to set the endianness on typed arrays. Browser DataView objects have a mechanism for it: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView/getFloat32. It might be a question for the v8 team, although a solution may take some scheduling time with them. So what to do in the meantime...
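The DataView mechanism mentioned above already works in Node without any native-side changes, since the caller states the byte order per read. A small demonstration with the same four bytes read both ways:

```javascript
// DataView lets the caller state the byte order explicitly, regardless of
// the platform's native order.
const buf = Buffer.from([0x3f, 0x80, 0x00, 0x00]); // 1.0 as big-endian float32

// Respect byteOffset: Buffer.from may place data inside a pooled ArrayBuffer.
const view = new DataView(buf.buffer, buf.byteOffset, buf.byteLength);

const asBE = view.getFloat32(0, false); // read as big-endian: 1.0
const asLE = view.getFloat32(0, true);  // read as little-endian: a tiny denormal, not 1.0
```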
We could swap only the one endian kind, yet swapping has some CPU expense. @vincent-tr how does/would endianness affect your applications?
I need to gain more understanding. With the predefined hdf5 types we can ensure the data is the same endianness no matter the native system, yet javascript seems to be 'flexible'.
Oh man, that's awesome.
Can't seem to get the version from master to run, though. It compiles correctly but I keep getting the following during execution:
dyld: lazy symbol binding failed: Symbol not found: __ZN8NodeHDF55Int6410InitializeEN2v85LocalINS1_6ObjectEEE
Referenced from: /Users/jacopo/Code/node-hdf5/build/Release/hdf5.node
Expected in: flat namespace
dyld: Symbol not found: __ZN8NodeHDF55Int6410InitializeEN2v85LocalINS1_6ObjectEEE
Referenced from: /Users/jacopo/Code/node-hdf5/build/Release/hdf5.node
Expected in: flat namespace
Abort trap: 6
Everything works with 0.2.1 from NPM. I did upgrade to Mac OS Sierra, though...
Oh, mac. I'll try the 10.7 I have access to. In some months I might get access to a better mac. Which hdf5 version? I've been working with 1.10.0-patch1 and 1.8.17, and node v6.5.0. It should work with other versions.
I've been working with hdf5 v1.8.17 and node v6.6.0 and v4.5.0/LTS. No worries, I'll fire up a Linux VM and see if I can get it to work there.
Got it to work on a Void Linux VM. Node v6.7.0, HDF5 v1.8.17, GCC v4.9.4. Fantastic.
As swapping needs some CPU work, I'd say passing the endianness boolean would be a good compromise. It doesn't add code that would eventually have to be removed in favour of a lower-level solution, while still making it a little easier to automate the swap on our side (the devs using this module).
great! I'll be looking at mac and also more endianness
Hi,
I added a callback on readDataset with the options of the dataset returned:
const readBuffer=h5lt.readDataset(group.id, 'Two Rank', function(options) {
console.dir("options: "+JSON.stringify(options));
});
where the usual properties plus endianness will be returned. I'm thinking that if a callback is supplied, the 'reserved' properties won't be set on the returned typed array or buffer. Example callback object:
options: {"rank":2,"rows":3,"columns":2,"endian":0}
I'm adding an endianness/byte-order enumeration to the package, testing a bit, and then will commit to github.
If I may, I would suggest including the options properties within the returned Buffer, to maintain compatibility with Node.js' API style.
It'd be semantically confusing, I think, to have a further description of the synchronously returned object provided through a seemingly asynchronous flow. It would also hinder a potential move towards an actual asynchronous API.
My very IMHO 2-cents.
The native callback, I believe, is synchronous. I haven't learned how to do async from there.
What I'm looking at: the attributes on the dataset go all the way to the h5, and these could be in the way, especially as more and more are added. I'm concerned about conflicts, and was looking to return these without that potential conflict.
I understand (I think... :) ). What about returning the buffer as part of a more generic object? Something like
const results = h5lt.readDataset(group.id, 'Two Rank');
results.buffer; // The actual buffer
results.endian; // Endianness
...
Hi, customer support work tied me up for a few days :-)
I'm adding a getByteOrder on the files and groups for their children, and calling this "byte_order". See https://support.hdfgroup.org/HDF5/doc/RM/RM_H5T.html#Datatype-GetOrder for these enumerations.
I will also open a dialog with the nodejs team, and hopefully the v8 team, about endianness support in the future of javascript.
Pushed my current state. There is now an undocumented getByteOrder and a global enumeration to go with it:
module.exports.H5TOrder = {
H5T_ORDER_LE :0, /* Little-endian byte order */
H5T_ORDER_BE :1, /* Big-endian byte order */
H5T_ORDER_VAX :2, /* VAX mixed byte order */
H5T_ORDER_MIXED :3, /* Mixed byte order among members of a compound datatype (see below) */
H5T_ORDER_NONE :4, /* No particular order (fixed-length strings, object and region references) */
}
I'm also going to add the options to the making and writing, so users have unreserved attributes on their h5 data, avoiding any potential conflicts.
I will add documentation.
@rimmartin, how do I thank you?
You have! By making this project get more serious with data types and endianness :-) This has been a great thread.