elrnv / vtkio
Visualization ToolKit (VTK) file parser and writer
License: Apache License 2.0
Hello elrnv!
Thank you, all three contributors, for your work! I'm learning to use vtkio and it feels great!
After studying the examples in the readme, I already know how to construct a VTK object (with default attributes) using vtkio and write it, via a buffered writer, into a .vtk file. Here is my Rust file:
test_vtkio.rs :
use std::io::{BufWriter, Write};
use vtkio::model::*;
// The whole idea:
// 1. Construct a String to hold the vtk content;
// 2. Construct a VTK object containing the data;
// 3. Write the VTK object into the String;
// 4. Write the String into the .vtk file.
fn main() {
// Prepare a file for loading vtk content
let file_path = "./test2.vtk";
let vtk_file = std::fs::File::create(file_path).unwrap();
let mut vtk_writer = BufWriter::new(vtk_file);
// Prepare a String to hold vtk file information
let mut vtk_strings = String::new();
// Construct the content of the vtk file
let tri2 = make_triangle_legacy();
// Write the content of the vtk file into String
tri2.write_legacy_ascii(&mut vtk_strings)
.expect("Write vtk info into bytes failed!");
// Print out the written string
println!("{}", vtk_strings.as_str());
// Write the String into the .vtk file
write!(vtk_writer, "{}", vtk_strings.as_str()).expect("Write VTK error!");
}
fn make_triangle_legacy() -> Vtk {
Vtk {
version: Version { major: 4, minor: 2 },
title: String::from("Rust vtk io crate test."),
byte_order: ByteOrder::BigEndian,
file_path: None,
data: DataSet::inline(UnstructuredGridPiece {
points: IOBuffer::F64(vec![
0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0,
]),
cells: Cells {
cell_verts: VertexNumbers::Legacy {
num_cells: 2,
vertices: vec![3, 0, 1, 2, 3, 3, 2, 1],
},
types: vec![CellType::Triangle; 2],
},
data: Attributes {
..Default::default()
},
}),
}
}
And the above code will generate a vtk file: test2.vtk. Its contents are:
test2.vtk:
# vtk DataFile Version 4.2
Rust vtk io crate test.
ASCII
DATASET UNSTRUCTURED_GRID
POINTS 4 double
0 0 0 1 0 0 0 1 0 1 1 0
CELLS 2 8
3 0 1 2 3 3 2 1
CELL_TYPES 2
5
5
POINT_DATA 4
CELL_DATA 2
The test2.vtk file loads and can be visualized by ParaView just fine, albeit without any data.
However, I'm having trouble writing dataset attributes to a .vtk file in legacy style using vtkio. I don't know how to write the following point data into the test2.vtk file in legacy style:
POINT_DATA 4
VECTORS displacement float
0 0 0
0.5 0 0
0 0 0
0.5 0 0
Please teach me how to construct a VTK object containing point/cell data, thanks a lot!
The nom dependency may need updating from version 3. The same was true of quick-xml, but I see that one has already been switched to the latest git version in the release-0.7 branch.
Compiler warning:
warning: the following packages contain code that will be rejected by a future version of Rust: nom v3.2.1
For reference, this is on Rust stable-x86_64-unknown-linux-gnu, rustc 1.68.2.
Below is an example of pretty much every warning from a --future-incompat-report:
The package `nom v3.2.1` currently triggers the following future incompatibility lints:
> warning: trailing semicolon in macro used in expression position
> --> /home/tony/.cargo/registry/src/github.com-1ecc6299db9ec823/nom-3.2.1/src/macros.rs:516:35
> |
> 516 | map!(__impl $i, call!($f), $g);
> | ^
> |
> ::: /home/tony/.cargo/registry/src/github.com-1ecc6299db9ec823/nom-3.2.1/src/nom.rs:388:3
> |
> 388 | map!(i, be_u8, | x | { x as i8 })
> | --------------------------------- in this macro invocation
> |
> = warning: this was previously accepted by the compiler but is being phased out; it will become a hard error in a future release!
> = note: for more information, see issue #79813 <https://github.com/rust-lang/rust/issues/79813>
> = note: macro invocations at the end of a block are treated as expressions
> = note: to ignore the value produced by the macro, add a semicolon after the invocation of `map`
> = note: `#[allow(semicolon_in_expressions_from_macros)]` on by default
> = note: this warning originates in the macro `map` (in Nightly builds, run with -Z macro-backtrace for more info)
When I tested the import/export example, the following error occurred:
error[E0432]: unresolved imports `vtkio::export_ascii`, `vtkio::import`
--> src/main.rs:2:13
|
2 | use vtkio::{export_ascii, import};
| ^^^^^^^^^^^^ ^^^^^^ no `import` in the root
| |
| no `export_ascii` in the root
How can I solve this problem?
Hello!
Thank you for this nice piece of software, which enables exporting VTK directly from Rust.
I have a problem reading point clouds created with vtkio (PolyData containing only points, no cells).
Files are opened only if they contain non-empty cells.
Minimal example (from the docs), readable by ParaView:
use vtkio::model::*; // import model definition of a VTK file
let mut vtk_bytes = Vec::<u8>::new();
Vtk {
version: Version::new((2,0)),
byte_order: ByteOrder::BigEndian,
title: String::from("Triangle example"),
file_path: None,
data: DataSet::inline(PolyDataPiece {
points: vec![0.0f32, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, -1.0].into(),
polys: Some(VertexNumbers::Legacy {
num_cells: 1,
vertices: vec![3, 0, 1, 2]
}),
data: Attributes::new(),
..Default::default()
})
}.write_xml(&mut vtk_bytes);
Not readable by ParaView (the file does load, but no data appears on screen):
use vtkio::model::*; // import model definition of a VTK file
let mut vtk_bytes = Vec::<u8>::new();
Vtk {
version: Version::new((2,0)),
byte_order: ByteOrder::BigEndian,
title: String::from("Triangle example"),
file_path: None,
data: DataSet::inline(PolyDataPiece {
points: vec![0.0f32, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, -1.0].into(),
data: Attributes::new(),
..Default::default()
})
}.write_xml(&mut vtk_bytes);
No data for this example either:
use vtkio::model::*; // import model definition of a VTK file
let mut vtk_bytes = Vec::<u8>::new();
Vtk {
version: Version::new((2,0)),
byte_order: ByteOrder::BigEndian,
title: String::from("Triangle example"),
file_path: None,
data: DataSet::inline(PolyDataPiece {
points: vec![0.0f32, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, -1.0].into(),
polys: Some(VertexNumbers::Legacy {
num_cells: 0,
vertices: vec![]
}),
data: Attributes::new(),
..Default::default()
})
}.write_xml(&mut vtk_bytes);
Hi,
I'm using vtkio for a toy FEA project in Rust. First off, thanks a lot for creating and supporting this lib!
But it's very cumbersome to use. Take Attribute, for example: I opened a VTP file and want to read the cell array "dof". In pyvista this is a one-liner:
pv.read("file.vtp").cell_data["dof"]
With vtkio, I end up with:
let vtk_file = Vtk::import("test.vtp").unwrap();
let pieces = match vtk_file.data {
    DataSet::PolyData { pieces, .. } => pieces,
    _ => panic!("Only PolyData is supported"),
};
let dataset = match pieces.into_iter().next().unwrap() {
    Piece::Inline(dataset) => dataset,
    _ => panic!("Only inline data is supported"),
};
let mut dof = None;
for attr in dataset.data.cell {
    let (name, iobuf) = match attr {
        Attribute::DataArray(d) => (d.name, d.data),
        _ => panic!("Only DataArray is supported"),
    };
    if name == "dof" {
        dof = Some(iobuf.into_vec::<u8>().unwrap());
    }
}
let dof = dof.expect("dof must be present");
Something like the following (a hypothetical API; these methods don't exist in vtkio today) would be much nicer:
let polydata = Vtk::read("test.vtp").unwrap().as_polydata().unwrap();
let cell_data = polydata.data.cell.as_map();
let dof = cell_data.get("dof").expect("dof must be present on cells");
I also don't understand the pieces vector of DataSet; aren't VTP files always a single piece? Are you trying to implement something like pyvista's MultiBlock?
Do you have any plans to simplify the API? Are you open to PRs? Are there reasons to make the API so complex?
Hey Egor!
Time passes so quickly. I'm trying my hand at importing MRI data into Houdini again, and thought to see how you are. Would you ever have time to Zoom? I'm trying to visualize the famous Dorr 2008 mouse brain, and am blocked. If you ever have time, I'd be oh so grateful. I'm in the Beijing time zone. Would still love to make a video tutorial (once I actually figure it out, lol) :-)
Hello,
I am trying to use this library to write a VTK file of an unstructured grid, used in an SPH simulation. Is there any example of writing the data to a file?
A sample of the file I want to write is:
# vtk DataFile Version 3.0
Time some
ASCII
DATASET UNSTRUCTURED_GRID
POINTS 16 float
-0.8999 -0.0003 0.0
-0.6000 -0.0003 0.0
-0.3000 -0.0003 0.0
0.0000 -0.0003 0.0
-0.8999 0.2997 0.0
-0.6000 0.2997 0.0
-0.3000 0.2997 0.0
0.0000 0.2997 0.0
-0.8999 0.5997 0.0
-0.6000 0.5997 0.0
-0.3000 0.5997 0.0
0.0000 0.5997 0.0
-0.8999 0.8997 0.0
-0.6000 0.8997 0.0
-0.3000 0.8997 0.0
0.0000 0.8997 0.0
POINT_DATA 16
SCALARS Diameter float 1
LOOKUP_TABLE default
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
0.1500
VECTORS Force float
1464120.5000 437985.7188 0.0000
406.5037 -1765.8165 0.0000
0.0000 -1765.8000 0.0000
0.0000 -1765.8000 0.0000
1464120.5000 437985.6562 0.0000
406.5037 -1765.8165 0.0000
0.0000 -1765.8000 0.0000
0.0000 -1765.8000 0.0000
1464120.5000 437985.5938 0.0000
406.5037 -1765.8165 0.0000
0.0000 -1765.8000 0.0000
0.0000 -1765.8000 0.0000
1464120.2500 437985.9688 0.0000
406.5037 -1765.8165 0.0000
0.0000 -1765.8000 0.0000
0.0000 -1765.8000 0.0000
This should improve internal debugging. Naturally, writer::Error should also implement std::fmt::Display.
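A minimal sketch of what that could look like, using a stand-in error enum (the variants are invented for illustration, not vtkio's actual ones):

```rust
use std::fmt;

// Hypothetical stand-in for vtkio's `writer::Error`.
#[derive(Debug)]
enum Error {
    Io(std::io::Error),
    UnsupportedDataSet,
}

impl fmt::Display for Error {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Error::Io(err) => write!(f, "IO error: {}", err),
            Error::UnsupportedDataSet => write!(f, "unsupported data set type"),
        }
    }
}

// With Display in place, the std Error trait comes almost for free, which
// also enables `?` conversions into Box<dyn std::error::Error>.
impl std::error::Error for Error {}

fn main() {
    let err = Error::UnsupportedDataSet;
    println!("{}", err); // Display formatting instead of Debug
}
```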
The XML-based VTK files use extensions (e.g. .vtu, .vtp, .vts) which indicate the type of object being represented. This must match the type specified in the VTKFile xml tag (UnstructuredGrid, PolyData, StructuredGrid, respectively). We should clarify this connection somewhere in the docs, and improve the error message that currently reads:
XML error: The extension of the VTK file doesn't match the type specified in the VTKFile tag
to suggest potential resolutions (change the file extension to match the VTKFile tag, or the VTKFile tag to match the extension).
Action items:
I have a somewhat urgent need for support for non-linear cells, that is, for the legacy format, the VTK_QUADRATIC_* cell types. Since we're up against a deadline, I'd probably hack something together in a local fork before wrapping it up for contribution after the deadline (February at best).
Before I dive into the code base of vtkio, however, I wanted to ask if there's a specific reason that these cell types are not currently supported, or is it simply an omission? Do they need to be treated differently? Any advice you might have would be greatly appreciated!
The original maintainer of the website hosting the VTK file documentation and examples, Bill Lorensen, has sadly passed away, and the referenced GitHub site is no longer maintained.
Many thanks to Bill, whose contributions have immensely helped the development of this library.
All references need to be updated to point to the same site hosted and maintained now by kitware.
Hi! I'm a novice to Houdini & programming in general, trying to learn to create motion design from neuroscience data. Thanks so much for creating this tool!!
I receive this error after creating a new folder called dso and unzipping the contents into it. I can't find a Cargo.toml file to modify, so perhaps that's my problem? Any help would be appreciated. Cheers!
Windows 10 (20H2)
Houdini 18.5.462 Apprentice Edition
There are different ways one could export files, and currently the API tries its best to guess what is intended by looking at the file type and the model definition.
For XML files, there are different ways to export the same Vtk model. In the current implementation, when exporting in XML, a Vtk struct is converted to a VTKFile, which represents the XML format for these files. While this contains all the information for the final export, the conversion itself assumes a number of defaults (e.g. whether to use base64 encoding).
The proposal here is to add a config struct that can be passed to the export function as well as to Vtk::try_into_xml_format, to enable custom export options. This can be combined with the compression scheme, which is already passed to the Vtk::try_into_xml_format function. One use case is to help with debugging, as suggested in #38, by enabling ASCII output.
This struct should look like:
struct XMLExportConfig {
compressor: Compressor,
compression_level: u32,
data_format: DataArrayFormat,
... // Other configuration options.
}
Apparently Paraview on Windows exports (ASCII) legacy VTK files using CRLF line endings instead of LF line endings.
In the current v0.6.0 release and the release-v0.7 branch this causes an error, e.g.:
Parse error: Alt
Files to reproduce the issue: legacy.zip
When all line endings in the file are replaced with LF, the files are parsed successfully.
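Until this is fixed in the parser, a caller-side workaround is to normalize line endings before parsing; a minimal sketch (plain preprocessing, not part of vtkio, and safe only for ASCII files, since binary legacy files may contain the bytes 0x0D 0x0A inside data):

```rust
// Replace every CRLF pair with a single LF, leaving lone '\r' and '\n'
// bytes untouched.
fn normalize_line_endings(input: &[u8]) -> Vec<u8> {
    let mut out = Vec::with_capacity(input.len());
    let mut i = 0;
    while i < input.len() {
        if input[i] == b'\r' && input.get(i + 1) == Some(&b'\n') {
            i += 1; // skip the '\r'; the '\n' is pushed on the next pass
        } else {
            out.push(input[i]);
            i += 1;
        }
    }
    out
}

fn main() {
    let crlf = b"# vtk DataFile Version 3.0\r\nTitle\r\nASCII\r\n";
    let lf = normalize_line_endings(crlf);
    println!("normalized {} bytes down to {}", crlf.len(), lf.len());
}
```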
Hi,
I am working on a particle-based simulation repo and had difficulty with VTK files. I tried to write a simple library for IO of VTK files. I am new to Rust and this sort of thing, but I will try to contribute.
Thanks for the library.
Hello. Thanks for the crate! I've hit an issue when using ParaView to view compressed VTU and VTI files generated by vtkio.
Example VTI export code:
let mut point = vec![
Attribute::DataArray(DataArray {
name: String::from("concentration"),
elem: ElementType::default(),
data: IOBuffer::F64(
cloud
.c
.index_axis(Axis(0), t_index)
.map(|x| *x)
.into_iter()
.collect(),
)}),
];
let vti_vtk = Vtk {
title: String::from("cloud_vti"),
file_path: Some(PathBuf::from(path.clone())),
byte_order: ByteOrder::LittleEndian,
version: vtkio::model::Version { major: 2, minor: 2 },
data: DataSet::ImageData {
extent: Extent::Ranges([
-x_ext..=x_ext,
0..=y_ext,
0..=vert_ext - 1,
]),
origin: [0., 0., 0.],
spacing: [
(simulation.step) as f32,
(simulation.step) as f32,
simulation.step as f32,
],
meta: None,
pieces: vec![
Piece::Inline(Box::new(ImageDataPiece {
extent: Extent::Ranges([
-x_ext..=x_ext,
0..=y_ext,
0..=vert_ext - 1,
]),
data: Attributes {
point,
cell: Vec::new(),
},
}
))
],
},
};
let mut file = File::create(path.as_str()).unwrap();
let vtk_file = vti_vtk.try_into_xml_format(vtkio::xml::Compressor::ZLib, 9).unwrap();
file.write_all(vtk_file.to_string().as_bytes()).unwrap();
If I use any type of compression (ZLib, Lzma, or Lz4), ParaView has this issue:
ERROR: In vtkXMLDataParser.cxx, line 564
vtkXMLDataParser (000001CAFC5A13B0): Error reading compression header.
ERROR: In vtkXMLDataParser.cxx, line 881
vtkXMLDataParser (000001CAFC5A13B0): ReadCompressionHeader failed. Aborting read.
ERROR: In vtkXMLStructuredDataReader.cxx, line 345
vtkXMLImageDataReader (000001CAED603050): Error reading extent -262 262 0 262 0 25 from piece 0
ERROR: In vtkXMLDataReader.cxx, line 410
vtkXMLImageDataReader (000001CAED603050): Cannot read point data array "concentration" from PointData in piece 0. The data array in the element may be too short.
If I use Compressor::None there isn't an issue and all is good, but the file size is too large for me.
I tried to use vtkio = {version = "0.6.3", git="https://github.com/elrnv/vtkio.git", branch="fix-pygmsh"} because its commit message says:
Implement compression support for individual (inline) DataArrays.
But, unfortunately, it still doesn't work.
Version of Rust: rustc 1.66.1
Version of ParaView: 5.11.0
The attached file fluid_1_91.vtu cannot be loaded using v0.6.0 or the release-0.7 branch, and results in the errors:
XML error: Deserialization error: Xml(UnexpectedEof("</Err(Utf8Error { valid_up_to: 0, error_len: Some(1) })>"))
Deserialization error: Xml(UnexpectedEof("</Err(Utf8Error { valid_up_to: 0, error_len: Some(1) })>"))
Unexpected EOF during reading </Err(Utf8Error { valid_up_to: 0, error_len: Some(1) })>.
Unexpected EOF during reading </Err(Utf8Error { valid_up_to: 0, error_len: Some(1) })>.
while Paraview is able to load the file.
The file uses a raw binary block (<AppendedData encoding="raw">...</AppendedData>) for all the data arrays, and I suspect it happens to contain characters or sequences that confuse the XML parser.
The file fluid_1_91_encoded.vtu can be read without problems after opening the original file in ParaView and re-exporting it to VTU with the option "Encode appended data", which I guess applies base64 encoding.
I don't know whether it is possible to use different options for the XML parser to support the raw binary block, or whether this is actually not allowed by the XML format and would require manual parsing.
Currently the version in the Vtk type represents the version for both the XML and Legacy formats, which have independent versioning.
The problem is that a legacy file loaded as Vtk will be written with an incorrect version in XML, and vice versa, unless the user explicitly updates the version. This is bad for usability because it puts the burden of learning about VTK versions on the user and forces people to handle these cases explicitly.
In the majority of cases, versioning should be handled automatically, and overridden only if needed.
The current proposal to resolve this is to refactor the version into an enum as follows:
enum Version {
/// Automatically handle versioning on write for both Legacy and XML formats.
Auto,
/// Loaded Legacy format with this version. Writing in XML format is handled as with the `Auto` variant.
Legacy { major: u32, minor: u32 },
/// Loaded XML format with this version. Writing in Legacy is handled as with the `Auto` variant.
XML { major: u32, minor: u32 },
}
The rules for writing should be that the minimum compatible version is used for both the Legacy and XML formats. The implementer should consult the VTK docs for this. The automatic behaviour should be clearly documented in the code and updated whenever a newer VTK feature is added.
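A self-contained sketch of how the write path could resolve an effective legacy version from such an enum (the default version numbers below are placeholders, not what vtkio would actually choose):

```rust
// Local stand-in for the proposed enum.
enum Version {
    Auto,
    Legacy { major: u32, minor: u32 },
    Xml { major: u32, minor: u32 },
}

// Placeholder minimum; the real value must come from the VTK docs.
const DEFAULT_LEGACY: (u32, u32) = (2, 0);

fn effective_legacy_version(v: Version) -> (u32, u32) {
    match v {
        // A version loaded from a legacy file is kept as-is...
        Version::Legacy { major, minor } => (major, minor),
        // ...anything else falls back to the automatic minimum.
        Version::Auto | Version::Xml { .. } => DEFAULT_LEGACY,
    }
}

fn main() {
    let (major, minor) = effective_legacy_version(Version::Auto);
    println!("writing legacy file as version {}.{}", major, minor);
}
```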
As promised in #17, I want to provide a few CellType examples. There are different options: the examples can live in the README.md as already started, a dedicated examples folder could be created, or both could be kept in sync.
I like the idea of having executable examples and simultaneously the same ones in the README.md. This of course means manually syncing both, because Markdown cannot include source code from a file automatically, but those examples probably won't change very often. Also, some people might then find the README.md too bloated.
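One common Rust pattern for keeping README examples compiling (a general technique, not something vtkio necessarily does today) is to include the README as crate-level documentation, so cargo test runs its Rust code blocks as doctests:

```rust
// In src/lib.rs: pull README.md in as the crate docs. `cargo test` then
// compiles and runs the README's Rust code blocks as doctests, so the
// README and the executable examples cannot silently drift apart.
#![doc = include_str!("../README.md")]
```

This attribute has been stable since Rust 1.54.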
Hi. First, thanks a lot for this crate! It has been super useful for getting up to speed with a reasonable workflow in a short amount of time for some FEM code that I'm working on. Using your crate, I was able to quickly port some Julia code that I had.
That said, the legacy format is limited, and I will run into its limitations in the not too distant future. I might be interested in investing some time in a crate that would work with e.g. the XML-based VTK formats. Would you like vtkio to remain strictly a crate for legacy IO, or would it make sense to expand the feature set to encompass the XML-based formats as well? Put differently, if I were to start such a venture, should I create a new crate for this purpose, or would you accept contributions in this direction?
I am struggling to create a BezierCurve. Could you give me an example of how this is done?
See title. I think this would make sense and would allow for more rapid prototyping with e.g. functions returning Box<dyn std::error::Error>.
I would happily submit a PR (once time permits...) if you agree that this could be useful.
I wrote a code that creates a simple cube:
fn main() {
use vtkio::model::*;
#[rustfmt::skip]
let points = IOBuffer::F32(vec![
0.0, 0.0, 0.0,
1.0, 0.0, 0.0,
0.0, 1.0, 0.0,
1.0, 1.0, 0.0,
0.0, 0.0, 1.0,
1.0, 0.0, 1.0,
0.0, 1.0, 1.0,
1.0, 1.0, 1.0,
]);
let vertex_numbers = VertexNumbers::XML {
#[rustfmt::skip]
connectivity: vec![
0, 2, 3, 1,
0, 1, 5, 4,
1, 3, 7, 5,
3, 2, 6, 7,
2, 0, 4, 6,
4, 5, 7, 6,
],
offsets: vec![4, 8, 12, 16, 20, 24],
};
let piece = PolyDataPiece {
points,
polys: Some(vertex_numbers),
..Default::default()
};
let vtk = Vtk {
version: Version::new((1, 0)),
title: "simple cube".to_string(),
byte_order: ByteOrder::LittleEndian,
data: DataSet::PolyData {
meta: None,
pieces: vec![Piece::Inline(Box::new(piece))],
},
file_path: None,
};
let mut buf = Vec::<u8>::new();
vtk.write_xml(&mut buf).unwrap();
std::fs::write("simple-cube.vtp", &buf).unwrap();
}
The contents of simple-cube.vtp are as follows:
<VTKFile type="PolyData" version="1.0" byte_order="LittleEndian" header_type="UInt64"><PolyData><Piece NumberOfPoints="8" NumberOfLines="0" NumberOfStrips="0" NumberOfPolys="6" NumberOfVerts="0"><PointData/><CellData/><Points><DataArray type="Float32" format="binary" NumberOfComponents="3">YAAAAAAAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAAAAIA/AACAPwAAAAAAAAAAAAAAAAAAgD8AAIA/AAAAAAAAgD8AAAAAAACAPwAAgD8AAIA/AACAPwAAgD8=</DataArray></Points><Polys><DataArray type="UInt64" Name="connectivity" format="binary" NumberOfComponents="1">wAAAAAAAAAAAAAAAAAAAAAIAAAAAAAAAAwAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAFAAAAAAAAAAQAAAAAAAAAAQAAAAAAAAADAAAAAAAAAAcAAAAAAAAABQAAAAAAAAADAAAAAAAAAAIAAAAAAAAABgAAAAAAAAAHAAAAAAAAAAIAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAYAAAAAAAAABAAAAAAAAAAFAAAAAAAAAAcAAAAAAAAABgAAAAAAAAA=</DataArray><DataArray type="UInt64" Name="offsets" format="binary" NumberOfComponents="1">MAAAAAAAAAAEAAAAAAAAAAgAAAAAAAAADAAAAAAAAAAQAAAAAAAAABQAAAAAAAAAGAAAAAAAAAA=</DataArray></Polys></Piece></PolyData></VTKFile>
When I tried to load it in ParaView (5.8.0), I got the following error message.
Error reading cell offsets: Unsupported array type: vtkUnsignedLongLongArray
ParaView successfully displayed the cube after changing the type of connectivity and offsets to Int64, as shown below.
<VTKFile type="PolyData" version="1.0" byte_order="LittleEndian" header_type="UInt64"><PolyData><Piece NumberOfPoints="8" NumberOfLines="0" NumberOfStrips="0" NumberOfPolys="6" NumberOfVerts="0"><PointData/><CellData/><Points><DataArray type="Float32" format="binary" NumberOfComponents="3">YAAAAAAAAAAAAAAAAAAAAAAAAAAAAIA/AAAAAAAAAAAAAAAAAACAPwAAAAAAAIA/AACAPwAAAAAAAAAAAAAAAAAAgD8AAIA/AAAAAAAAgD8AAAAAAACAPwAAgD8AAIA/AACAPwAAgD8=</DataArray></Points><Polys><DataArray type="Int64" Name="connectivity" format="binary" NumberOfComponents="1">wAAAAAAAAAAAAAAAAAAAAAIAAAAAAAAAAwAAAAAAAAABAAAAAAAAAAAAAAAAAAAAAQAAAAAAAAAFAAAAAAAAAAQAAAAAAAAAAQAAAAAAAAADAAAAAAAAAAcAAAAAAAAABQAAAAAAAAADAAAAAAAAAAIAAAAAAAAABgAAAAAAAAAHAAAAAAAAAAIAAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAYAAAAAAAAABAAAAAAAAAAFAAAAAAAAAAcAAAAAAAAABgAAAAAAAAA=</DataArray><DataArray type="Int64" Name="offsets" format="binary" NumberOfComponents="1">MAAAAAAAAAAEAAAAAAAAAAgAAAAAAAAADAAAAAAAAAAQAAAAAAAAABQAAAAAAAAAGAAAAAAAAAA=</DataArray></Polys></Piece></PolyData></VTKFile>
Hey All,
Is there any way to extract data from a VTK dataset easily?
I couldn't find any evidence of an extraction method for the datasets into arrays.
I tried to write a function myself, but now I am stuck making it generic over the different data types of IOBuffer. To be honest, I have no idea what I am doing here, but it returns an array in the end.
use vtkio::model::*;
fn main() {
let file = "test.vtk";
let mut vtk_file = match Vtk::import(&file)
.expect(&format!("Failed to load file: {:?}", file))
.data
{
DataSet::PolyData { pieces, .. } => pieces,
_ => panic!("OhOh"),
};
let data = &**match &vtk_file[0] {
Piece::Inline(x) => x,
_ => panic!("OhOh2"),
};
let x = &data.data.point[0];
let y = match &x {
Attribute::Field { data_array, .. } => data_array,
_ => panic!("OhOh3"),
};
let mut data: Option<&Vec<i32>> = None;
for elem in y {
if elem.name == "id" {
data = Some(match &elem.data {
IOBuffer::I32(vec) => vec,
_ => panic!("OhOh4"),
});
break;
}
}
let data = if let Some(data) = data {
data
} else {
panic!("OhOh5")
};
println!("{:?}", data);
}
Is there any better way to do this?
vtkio is unable to parse VTK and VTU files generated by pygmsh, which internally uses meshio for generation of VTK and VTU files.
I have tested the following file types:
Paraview is able to visualize each file without problem. The spreadsheet view for each file shows the appropriate data.
Here is the script used to generate the VTK and VTU files:
import pygmsh
with pygmsh.geo.Geometry() as geom:
p = geom.add_polygon(
[
[0.0, 0.0],
[1.0, -0.2],
[1.1, 1.2],
[0.1, 0.7],
],
mesh_size=0.4,
)
geom.add_physical(p.lines[0], label="bottom")
geom.add_physical(p.lines[1], label="right")
geom.add_physical(p.lines[2], label="top")
geom.add_physical(p.lines[3], label="left")
mesh = geom.generate_mesh()
mesh.write("no-compression.vtu", compression=None)
mesh.write("lzma.vtu", compression="lzma")
mesh.write("zlib.vtu", compression="zlib")
mesh.write("ascii.vtu", binary=False)
mesh.write("binary.vtk")
mesh.write("ascii.vtk", binary=False)
Here is the file I am using to test the import of these files:
use std::path::Path;
use vtkio::model::Vtk;
fn main() -> Result<(), Box<dyn std::error::Error>> {
let inputs = &[
"ascii.vtk",
"binary.vtk",
"ascii.vtu",
"no-compression.vtu",
"lzma.vtu",
"zlib.vtu",
];
let vtks = inputs
.iter()
.map(|input| Vtk::import(input));
for (input, vtk) in inputs.iter().zip(vtks) {
print!("{}.. ", input);
match vtk {
Ok(_) => println!("parsed!"),
Err(e) => println!("{}", e),
};
}
Ok(())
}
And the output:
ascii.vtk.. Parse error: Alt
binary.vtk.. Parse error: Alt
ascii.vtu.. XML error: Validation error: InvalidDataFormat
no-compression.vtu.. XML error: Validation error: DataArraySizeMismatch { name: "types", expected: 38, actual: 37 }
lzma.vtu.. XML error: Validation error: Base64Decode(InvalidByte(22, 61))
zlib.vtu.. XML error: Validation error: Base64Decode(InvalidByte(22, 61))
I'm not sure if the problem is with vtkio or meshio. I have opened an issue here since Kitware's ParaView (Kitware also develops VTK) reports no problems with these files.
Thanks for your help!
As found in #38, creating point clouds is not particularly intuitive and we should improve the documentation surrounding this, and perhaps add some helper functions to make this simple and easily discoverable.
I am open to suggestions on exactly what needs to be done.