cesiumgs / 3d-tiles-tools
License: Apache License 2.0
Similar to the automatic cleanup stages in https://github.com/AnalyticalGraphicsInc/gltf-pipeline, 3d-tiles-tools/tools could have a stage to perform any needed cleanup on the tileset. This may include adding a _3DTILESDIFFUSE semantic to the glTFs to support different color blend modes in Cesium: CesiumGS/cesium#4451
Hi,
I have a classified point cloud dataset in .pnts format.
I tried using the upgrade command to convert the pnts dataset to glb, but it seems that I've lost the classification attribute in the process.
When I use the analyse tool on the .glb file, it mentions the classification attribute, but when I open the dataset in the Cesium viewer, I get no preview when I apply the classification style.
Is there another way to convert the point cloud to glb, or is this an issue with the Cesium viewer?
Thank you
I did the conversion using the following command:
npx 3d-tiles-tools upgrade --targetVersion 1.1 -i path/to/pnts/dataset/tileset.json -o ./output/upgraded
This is the result of analyse command:
{
"asset": {
"generator": "glTF-Transform",
"version": "2.0"
},
"accessors": [
{
"type": "VEC3",
"componentType": 5126,
"count": 8277,
"max": [
17.905780792236328,
11.288053512573242,
0.07561510056257248
],
"min": [
17.62045669555664,
11.036800384521484,
-0.08327221870422363
],
"bufferView": 0,
"byteOffset": 0
},
{
"type": "SCALAR",
"componentType": 5121,
"count": 8277,
"bufferView": 0,
"byteOffset": 12
}
],
"bufferViews": [
{
"buffer": 0,
"byteOffset": 0,
"byteLength": 132432,
"byteStride": 16,
"target": 34962
}
],
"buffers": [
{
"byteLength": 132432
}
],
"materials": [
{
"pbrMetallicRoughness": {
"metallicFactor": 0
}
}
],
"meshes": [
{
"primitives": [
{
"attributes": {
"POSITION": 0,
"_CLASSIFICATION": 1
},
"mode": 0,
"material": 0,
"extensions": {
"EXT_structural_metadata": {
"propertyAttributes": [
0
]
}
}
}
]
}
],
"nodes": [
{
"rotation": [
-0.7071067811865475,
0,
0,
0.7071067811865476
],
"mesh": 0
}
],
"scenes": [
{
"nodes": [
0
]
}
],
"extensionsUsed": [
"EXT_structural_metadata"
],
"extensions": {
"EXT_structural_metadata": {
"schema": {
"id": "ID_batch_table",
"name": "Generated from batch_table",
"classes": {
"class_batch_table": {
"name": "Generated from batch_table",
"properties": {
"Classification": {
"name": "Classification",
"description": "Generated from Classification",
"type": "SCALAR",
"componentType": "UINT8",
"required": true
}
}
}
}
},
"propertyAttributes": [
{
"class": "class_batch_table",
"properties": {
"Classification": {
"attribute": "_CLASSIFICATION"
}
}
}
]
}
}
}
Hi, I'm trying to merge (using 3.0) two tilesets with implicit tiling. When viewing the result in Cesium, the following error occurs: "Only box, region, 3DTILES_bounding_volume_S2, and 3DTILES_bounding_volume_cylinder are supported for implicit tiling".
The resulting tileset.json file has a boundingVolume.sphere; in the input tileset.json files I've used boundingVolume.region.
Is it possible for the merge tool to create a boundingVolume.region instead of a boundingVolume.sphere?
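As a sketch of what the merge tool could do when all inputs use regions: the union of two region bounding volumes ([west, south, east, north, minHeight, maxHeight], angles in radians, per the 3D Tiles spec) can be computed component-wise. The function name here is made up for illustration, and this naive version does not handle regions that cross the antimeridian:

```typescript
// Region layout per the 3D Tiles spec:
// [west, south, east, north, minimumHeight, maximumHeight]
type Region = [number, number, number, number, number, number];

// Hypothetical helper: compute a region that contains both inputs.
// Naive component-wise union; does not handle antimeridian crossing.
function unionRegions(a: Region, b: Region): Region {
  return [
    Math.min(a[0], b[0]), // west
    Math.min(a[1], b[1]), // south
    Math.max(a[2], b[2]), // east
    Math.max(a[3], b[3]), // north
    Math.min(a[4], b[4]), // minimum height
    Math.max(a[5], b[5]), // maximum height
  ];
}
```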
There currently is a glbToI3dm command that takes a GLB file and generates an I3DM from that. This I3DM contains a single instance. This is of somewhat limited use, considering that I3DM is now considered a 'legacy' format, and there is no longer any benefit of wrapping a GLB into such an I3DM.
We could consider creating utility functions for creating real instanced models.
(The output should preferably be GLBs with the EXT_mesh_gpu_instancing extension, but we could also look at creating I3DM (maybe with a URI as payload), if it's considered to be worth the effort)
In terms of implementing the functionality itself, this could be a pretty low-hanging fruit. The ExtInstanceFeaturesDemo.ts already shows that creating such a GLB from a bunch of translations is trivial.
The main degree of freedom is: Where does the instancing information come from?
A pragmatic (and generic) first shot could be to just use some JSON file with information like
{
  "translations": [
    [1, 2, 3],
    [2, 3, 4]
  ],
  "rotations": [
    [0, 0, 0, 1],
    [0, 1, 0, 0]
  ],
  "scales": [
    [1, 1, 1],
    [2, 2, 2]
  ]
}
Apparently, there are sources for data that could be relevant for creating such instanced models. For example, @bertt mentioned in the forum that opentrees.org could be a source for creating instanced tree models. Other sources of instancing information could be databases with the locations of wind power plants, for example. Maybe there could even be a thin convenience layer that takes longitude/latitude/height information and converts that into the low-level translation/rotation/scale information.
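As a sketch of such a convenience layer: converting longitude/latitude/height (radians and meters) into an ECEF translation only needs the WGS84 ellipsoid constants. The function name is made up for illustration:

```typescript
// WGS84 ellipsoid constants
const WGS84_A = 6378137.0; // semi-major axis, in meters
const WGS84_E2 = 0.00669437999014; // first eccentricity squared

// Hypothetical helper: geodetic coordinates (radians, meters) to an
// ECEF position that could serve as a per-instance translation.
function lonLatHeightToTranslation(
  lon: number,
  lat: number,
  height: number
): [number, number, number] {
  const sinLat = Math.sin(lat);
  const cosLat = Math.cos(lat);
  // Radius of curvature in the prime vertical
  const n = WGS84_A / Math.sqrt(1.0 - WGS84_E2 * sinLat * sinLat);
  return [
    (n + height) * cosLat * Math.cos(lon),
    (n + height) * cosLat * Math.sin(lon),
    (n * (1.0 - WGS84_E2) + height) * sinLat,
  ];
}
```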
Hi, I'm trying to convert a B3DM file to glb using 3d-tiles-tools.
I tried using the convertB3dmToGlb command, as I wanted to convert the batch table information to a feature table. I used the following command:
npx 3d-tiles-tools convertB3dmToGlb -i ./data/example.b3dm -o ./data/example.glb
But when I execute the above command it throws the following error:
Usage: npx 3d-tiles-tools <command> [options]
Commands:
convert Convert between tilesets and tileset package formats. The
input and output can be paths to tileset JSON files,
'.3tz', or '.3dtiles' files.
glbToB3dm Repackage the input glb as a b3dm with a basic header.
glbToI3dm Repackage the input glb as a i3dm with a basic header.
b3dmToGlb Extract the binary glTF asset from the input b3dm.
i3dmToGlb Extract the binary glTF asset from the input i3dm.
cmptToGlb Extract the binary glTF assets from the input cmpt.
optimizeB3dm Pass the input b3dm through gltf-pipeline. To pass options
to gltf-pipeline, place them after --options. (--options -h
for gltf-pipeline help)
optimizeI3dm Pass the input i3dm through gltf-pipeline. To pass options
to gltf-pipeline, place them after --options. (--options -h
for gltf-pipeline help)
gzip Gzips the input tileset directory.
ungzip Ungzips the input tileset directory.
combine Combines all external tilesets into a single tileset.json
file.
merge Merge any number of tilesets together into a single
tileset.
upgrade Upgrades the input tileset to the latest version of the 3D
Tiles spec. Embedded glTF models will be upgraded to glTF
2.0.
pipeline Execute a pipeline that is provided as a JSON file
analyze Analyze the input file, and write the results to the output
directory. This will accept B3DM, I3DM, PNTS, CMPT, and GLB
files (both for glTF 1.0 and for glTF 2.0), and write files
into the output directory that contain the feature table,
batch table, layout information, the GLB, and the JSON of
the GLB
tilesetToDatabase Create a sqlite database for a tileset. (Deprecated - use
'convert' instead)
databaseToTileset Unpack a tileset database to a tileset folder. (Deprecated
- use 'convert' instead)
Options:
--version Show version number [boolean]
-h, --help Show help [boolean]
-f, --force Output can be overwritten if it already exists.
[boolean] [default: false]
Unknown arguments: i, o, convertB3dmToGlb
As per the help output, convertB3dmToGlb is not listed in the commands, but it is listed in the README file. May I know how to use this functionality?
Hi,
I found your tool when searching for ways to convert .b3dm into .gltf.
Apparently it needs some other piece of software called NPM?
Can you perhaps add just some introduction on what is actually needed to run your tool?
E.g. what is this NPM and where do I get it? That would help a noob like me a lot.
Thanks.
The analyze command currently generates information about the structure of B3DM, I3DM, PNTS, CMPT, and GLB files. This information is low-level and technical, intended for analysis and debugging.
The output could be extended to contain information about the bounding boxes of the respective tiles. This could be added to the output once in the (min,max) representation, but also in the (center,halfAxes) representation.
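The (center,halfAxes) representation could be derived from the (min,max) one with a small helper like this sketch. The function name is made up; the flat array layout mirrors the 3D Tiles "box" bounding volume:

```typescript
// Hypothetical helper: convert an axis-aligned (min, max) pair into
// the 3D Tiles "box" layout: [centerX, centerY, centerZ] followed by
// three half-axis columns. Because the box is axis-aligned, the
// half-axes form a diagonal matrix.
function minMaxToBox(min: number[], max: number[]): number[] {
  const cx = (min[0] + max[0]) / 2;
  const cy = (min[1] + max[1]) / 2;
  const cz = (min[2] + max[2]) / 2;
  const hx = (max[0] - min[0]) / 2;
  const hy = (max[1] - min[1]) / 2;
  const hz = (max[2] - min[2]) / 2;
  return [cx, cy, cz, hx, 0, 0, 0, hy, 0, 0, 0, hz];
}
```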
Hello, after gzipping the model, it cannot be displayed when opened. What is the reason, and how can it be solved?
The diagram that is currently shown in the IMPLEMENTION.md shows the relationship of the PropertyModel and MetadataEntityModel classes.
What is not made explicit in this diagram (and could also be pointed out more clearly in some of the TSDoc comments):
PropertyModel provides "raw" values. It does not take into account normalized, noData, offset, and other modifiers of the property, most importantly because the actual property that appears in the metadata entity may override the offset/scale properties that are defined in the class property. This also implies that the values are numeric when the property type is ENUM, as pointed out in the TSDoc of the PropertyModel.
MetadataEntityModel provides convenience access to the property values, as in "Now what do we really have here?" - including possible offset/scale considerations and their overrides, as well as noData and default handling. This is described in (and handled by) the MetadataValues processValue method.
However, ENUMs are a special beast. For ENUM types, the noData and default values are given as the strings of the enum values, like noData: "ENUM_VALUE_A". This was not properly considered until now. It does raise some questions about where and how to translate between the numeric representation and the string representation. But in any case, the implementation should be changed so that the MetadataEntityModel always returns the string representations of ENUM values, and properly takes the noData and default values into account.
This is already addressed as part of #70 (the first/main commit being f6d0d7b , still to be reviewed and cleaned up), because it originally came up during the validation of ENUM
types in property textures for CesiumGS/3d-tiles-validator#280
While testing out ImplicitToExplicitDemo I noticed that the output tileset uses "contents" instead of "content". This is similar to the issue that was fixed in #30.
I think this would be a matter of updating ImplicitTraversedTile::asFinalTile, ExplicitTraversedTile::asFinalTile, and ImplicitToExplicitDemo::buildExplicitHierarchy.
When I am using the command line to convert .gltf models to .b3dm, it does not generate a tileset.json file. How can I generate a tileset.json file?
For example, on the BoxTextured GLB sample model (https://github.com/KhronosGroup/glTF-Sample-Models/tree/master/2.0/BoxTextured/glTF-Binary), packing and unpacking creates a different GLB:
$ npx 3d-tiles-tools glbToB3dm -i BoxTextured.glb -o BoxTextured.b3dm
$ npx 3d-tiles-tools b3dmToGlb -i BoxTextured.b3dm -o BoxTexturedOut.glb
$ stat -c "%s %n" BoxTextured*.glb
6540 BoxTextured.glb
6544 BoxTexturedOut.glb
According to the Khronos glTF Validator website, the resulting GLB file is valid with warnings. Unfortunately some other tools and validators are less lenient and reject a GLB that doesn't conform to its own stated length.
The b3dm spec states
The actual binary glTF data does not include possible trailing padding bytes. Clients must take into account the length from the binary glTF header, and use this length to determine the part of the tile data that actually represents the binary glTF.
so I would argue that 3d-tiles-tools should also strip out these bytes.
Thoughts?
If this looks valid, I think I can create a pull request sometime in the coming weeks.
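For illustration, the stripping itself is small once the total length is read from the GLB header (bytes 8-11, little-endian, per the glTF 2.0 spec). This is a sketch with a made-up function name, not the actual 3d-tiles-tools code:

```typescript
// Hypothetical helper: trim a GLB buffer (e.g. one extracted from a
// B3DM) down to the total length declared in its own header, dropping
// any trailing padding bytes.
function stripGlbPadding(glb: Buffer): Buffer {
  // The GLB magic 'glTF' as a little-endian uint32 is 0x46546c67
  if (glb.length < 12 || glb.readUInt32LE(0) !== 0x46546c67) {
    throw new Error("Not a GLB buffer");
  }
  // Total length of the GLB, stored at byte offset 8
  const declaredLength = glb.readUInt32LE(8);
  if (declaredLength > glb.length) {
    throw new Error("GLB header declares more bytes than are present");
  }
  return glb.subarray(0, declaredLength);
}
```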
Hello everyone!
I have this structure:
Model/
├─ tileset.json
├─ Data/
│ ├─ a.b3dm
│ ├─ b0.b3dm
│ ├─ b1.b3dm
│ ├─ ... and other 620 .b3dm files
I tried to convert it, but I can only do it one file at a time (using 3d-tiles-tools b3dmToGlb). Unfortunately, that would be very long and laborious, because I have 20 such folders, each with 500-1000 such tiles that need to be converted. And even if I do it manually, I don't understand how I can then correctly combine all these glbs into one model.
I also tried 3d-tiles-tools merge and 3d-tiles-tools combine, but that does not lead to anything; it just creates an output folder that contains exactly the same set of files, but tileset.json changes to this:
{
"asset": {
"version": "1.1"
},
"geometricError": 10000,
"root": {
"boundingVolume": {
"sphere": [
13.637106371665038,
33.194847012305296,
15.729257822775878,
63.32171125738676
]
},
"refine": "ADD",
"geometricError": 10000,
"children": [
{
"boundingVolume": {
"sphere": [
13.637106371665038,
33.194847012305296,
15.729257822775878,
63.32171125738676
]
},
"content": {
"uri": "Model/tileset.json"
},
"geometricError": 4.56985088,
"refine": "REPLACE"
}
]
}
}
What am I doing wrong? Can someone suggest me what needs to be done?
node-stream-zip does not support deflate64, so any zip file using it fails.
One part of #61 was removing the quiet:boolean parameter that determined whether certain console.log messages should be printed, and replacing that with the logger. This flag was supposed to be removed in 42c0436, but unfortunately, there's another boolean flag for the tile content processing, namely processExternalTilesets:boolean, and this led to a state where in some places, a quiet:false was passed in, and is now interpreted as processExternalTilesets:false (for example, in the TileContentProcessing class).
Is it possible to add/adjust code for single sided materials to become double sided?
The previous version of the tools contained a mechanism for some operations that caused the tileset.json to be written in zipped form when it was zipped in the input, and to be written in non-zipped form when it was not zipped in the input. (See, for example, the previous combineTilesets function.)
The current state of the tools tried to emulate this behavior by keeping track of whether the input tileset.json was zipped. This causes a bug where the tileset.json will not be zipped, even when running gzip.
To solve this: The zippedness of the input should not matter. The tools should always be able to read zipped input, but write the output in un-zipped form, unless zipping was the actual operation that should be performed. This can either be via the gzip command, or via a gzip pipeline stage.
(The (legacy) tilesOnly flag for the gzip operation has to be taken into account here, but that has already been generalized, and should be easy to handle.)
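A minimal sketch of the "always be able to read zipped input" part, using Node's zlib and the gzip magic bytes. This is an illustration of the idea, not the tools' actual internal API:

```typescript
import * as zlib from "zlib";

// Sketch: transparently un-gzip input data so that the rest of a
// pipeline always sees plain data, regardless of the "zippedness"
// of the input. Gzip data always starts with the bytes 0x1f 0x8b.
function readUnzipped(data: Buffer): Buffer {
  if (data.length >= 2 && data[0] === 0x1f && data[1] === 0x8b) {
    return zlib.gunzipSync(data);
  }
  return data;
}
```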
The block at
(or the corresponding block in the state after merging #64 ) has to be replaced with a block like this, which also contains some information about what went wrong there: // The 'textureInfoDef' that is created here will be stored
// by glTF-Transform internally. If there are extensions for
// this object (e.g. a KHR_texture_transform extension), then
// the extension information will be added to THE exact object.
// Creating and returning a new object here would cause this
// information to be omitted. So the object that is returned here
// has to be THE exact object that was created internally, but
// extended with the structural metadata extension information.
const basicTextureDef = context.createTextureInfoDef(texture, textureInfo);
const propertyTexturePropertyDef = basicTextureDef as any;
propertyTexturePropertyDef.channels = propertyTextureProperty.getChannels();
propertyTexturePropertyDef.offset = propertyTextureProperty.getOffset();
propertyTexturePropertyDef.scale = propertyTextureProperty.getScale();
propertyTexturePropertyDef.max = propertyTextureProperty.getMax();
propertyTexturePropertyDef.min = propertyTextureProperty.getMin();
return propertyTexturePropertyDef as PropertyTexturePropertyDef;
This may have to be broken down into smaller issues at some point, but some context is summarized here:
Much of the functionality of version 0.2.0 of the tools was built by extracting "generic" functionality from the 3d-tiles-validator. Nearly all of this functionality is exposed by the 3d-tiles-tools on an API level, because it is supposed to be used by the validator, as an internal project. Therefore, all of the API is marked as @internal in the documentation.
The overarching question is now: What should be made 'public', and how?
This refers to the API level for each part, but also to the structure of the libraries. There are some "low-level" functionalities that are basic enough so that one could justify offering them as a dedicated library. (Note: The directory structure in the current state of the tools already reflects a possible way to break down the tools into smaller libraries....). Such libraries could be, for example:
In many ways, this boils down to the question of the intended granularity of the libraries.
For each library itself, there's still the question about the exact API.
(From my Java background, I'm a fan of the purist approach, which can be summarized as: "No public constructors, period". But I'm not (yet) sure whether this is considered to be "idiomatic" in TypeScript.)
The importance of this question is already apparent: The 3d-tiles-validator is supposed to use nearly everything of the 3d-tiles-tools. So the tools will be a dependency of the validator. On the other hand, it would make a lot of sense to use the validator to check whether input/output tilesets of the tools are really valid, meaning that the validator would be a dependency of the tools. This shows that there is a line that should be drawn, but it's not yet clear where to draw this line.
After converting to b3dm, the model is very dark. Is there any method to fix this? Thanks very much.
There currently are a few places where "log messages" are printed - using console.log. In some cases, this is already "funneled" into a single, internal log function (similar to the logCallback in the previous version). But there should be something that offers more fine-grained control over what is logged, and where the log output goes.
Of course, one could just go to https://github.com/topics/logging-library?l=typescript and pick the first result. But maybe there are reasons to put more thought into that, and maybe even reasons to pick one specific library.
Obvious places where this should come in handy:
When upgrading a tileset, it currently prints something like this:
Upgrading asset version number
Upgrading asset version from 1.0 to 1.1
Upgrading refine to be in uppercase
Upgrading content.url to content.uri
Renaming 'url' property for content parent.b3dm to 'uri'
Renaming 'url' property for content ll.b3dm to 'uri'
Renaming 'url' property for content lr.b3dm to 'uri'
Renaming 'url' property for content ur.b3dm to 'uri'
Renaming 'url' property for content ul.b3dm to 'uri'
Upgrading extension declarations
Upgrading GLB in ll.b3dm
Upgrading GLB in lr.b3dm
Upgrading GLB in parent.b3dm
Upgrading GLB in ul.b3dm
Upgrading GLB in ur.b3dm
So it prints messages at different levels of detail. These could just be different log levels (as in INFO and FINE, or FINE and FINER).
When executing a "pipeline", things become ... nested:
Executing pipeline
Executing tilesetStage 0 of 6: upgrade
Executing tilesetStage : upgrade
currentInput: ./specs/data/TilesetOfTilesets
currentOutput: C:\<temp>\tilesetStage-0-upgrade
<The output from `upgrade`, as above>
Executing tilesetStage 1 of 6: combine
Executing tilesetStage : combine
currentInput: C:\<temp>tilesetStage-0-upgrade
currentOutput: C:\<temp>tilesetStage-1-combine
Executing tilesetStage 2 of 6: _b3dmToGlb
Executing tilesetStage : _b3dmToGlb
currentInput: C:\<temp>tilesetStage-1-combine
currentOutput: C:\<temp>tilesetStage-2-_b3dmToGlb
Executing contentStage 0 of 1: b3dmToGlb
Processing source: parent.b3dm with type CONTENT_TYPE_B3DM
to target: parent.glb
Processing source: tileset3/ll.b3dm with type CONTENT_TYPE_B3DM
to target: tileset3/ll.glb
Processing source: lr.b3dm with type CONTENT_TYPE_B3DM
to target: lr.glb
Processing source: ur.b3dm with type CONTENT_TYPE_B3DM
to target: ur.glb
Processing source: ul.b3dm with type CONTENT_TYPE_B3DM
to target: ul.glb
Processing source: tileset.json with type CONTENT_TYPE_TILESET
to target: tileset.json
Executing tilesetStage 3 of 6: _optimizeGlb
Executing tilesetStage : _optimizeGlb
currentInput: C:\<temp>tilesetStage-2-_b3dmToGlb
currentOutput: C:\<temp>tilesetStage-3-_optimizeGlb
Executing contentStage 0 of 1: optimizeGlb
Processing source: parent.glb with type CONTENT_TYPE_GLB
to target: parent.glb
Processing source: tileset3/ll.glb with type CONTENT_TYPE_GLB
to target: tileset3/ll.glb
Processing source: lr.glb with type CONTENT_TYPE_GLB
to target: lr.glb
Processing source: ur.glb with type CONTENT_TYPE_GLB
to target: ur.glb
Processing source: ul.glb with type CONTENT_TYPE_GLB
to target: ul.glb
Processing source: tileset.json with type CONTENT_TYPE_TILESET
to target: tileset.json
Executing tilesetStage 4 of 6: _separateGltf
Executing tilesetStage : _separateGltf
currentInput: C:\<temp>tilesetStage-3-_optimizeGlb
currentOutput: C:\<temp>tilesetStage-4-_separateGltf
Executing contentStage 0 of 1: separateGltf
Processing source: parent.glb with type CONTENT_TYPE_GLB
to target: parent.gltf
Processing source: tileset3/ll.glb with type CONTENT_TYPE_GLB
to target: tileset3/ll.gltf
Processing source: lr.glb with type CONTENT_TYPE_GLB
to target: lr.gltf
Processing source: ur.glb with type CONTENT_TYPE_GLB
to target: ur.gltf
Processing source: ul.glb with type CONTENT_TYPE_GLB
to target: ul.gltf
Processing source: tileset.json with type CONTENT_TYPE_TILESET
to target: tileset.json
Executing tilesetStage 5 of 6: gzip
Executing tilesetStage : gzip
currentInput: C:\<temp>tilesetStage-4-_separateGltf
currentOutput: ./output/result
Whether or not log libraries offer a functionality to keep track of indentation levels has to be investigated. (Maybe the indentation has to be tracked manually - right now, the indentation for the pipeline itself here is just "fixed", but this hits some limits when a certain stage creates own log messages...)
Any glTFs that contain the CESIUM_RTC extension can instead use the RTC_CENTER feature table property: CesiumGS/3d-tiles#263.
There should be a command line function to extract the elements of a CMPT file. Right now, there only is cmptToGlb, which walks through the CMPT "recursively", and extracts all GLBs that are found. So for a CMPT that has a structure like
CMPT
  CMPT
    I3DM
    I3DM
  B3DM
the function will extract the GLBs from both I3DMs, and the GLB from the B3DM. But there is no easy way to just extract the (inner) CMPT and B3DM from the top-level one.
The implementation should be fairly trivial, with only a few degrees of freedom. For example, one could consider a recurse:boolean parameter, to decide whether the result for the example should be the (inner) CMPT+B3DM, or all the "leaves" (I3DM, I3DM, B3DM).
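The non-recursive part could look roughly like this sketch. It relies only on the CMPT header layout from the spec (magic, version, byteLength, tilesLength as four uint32s, with each inner tile storing its own byteLength at offset 8); the function name is made up:

```typescript
// Hypothetical helper: return the immediate inner tiles of a CMPT
// (which may themselves be CMPT, B3DM, I3DM, ...), without recursing.
function extractInnerTiles(cmpt: Buffer): Buffer[] {
  const magic = cmpt.toString("ascii", 0, 4);
  if (magic !== "cmpt") {
    throw new Error(`Expected 'cmpt' magic, found '${magic}'`);
  }
  const tilesLength = cmpt.readUInt32LE(12);
  const tiles: Buffer[] = [];
  let offset = 16; // End of the 16-byte CMPT header
  for (let i = 0; i < tilesLength; i++) {
    // Each inner tile header stores its own byteLength at offset 8
    const tileByteLength = cmpt.readUInt32LE(offset + 8);
    tiles.push(cmpt.subarray(offset, offset + tileByteLength));
    offset += tileByteLength;
  }
  return tiles;
}
```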
This package fails to install due to the old better-sqlite3 dependency. Since this is now a dependency of 3d-tiles-validator, installation of that package fails as well when using the latest HEAD. The dependency should be updated to 8.0.1 to match.
See also CesiumGS/3d-tiles-validator#245
Apparently, FME creates 3D Tiles tilesets where the metallicFactor in the glTF assets in B3DMs is set to 1.0 (the default value). In an ideal world, we'd open an issue in their repo, and this would be fixed in the next release. But in reality, people seem to stumble over that, because it shows up as textures being "too dark" and surfaces being "too shiny", e.g.
The first thread ended with a snippet that is based on a certain commit state of the 3D Tiles tools, using an internal API, and that may break at any point in time. It was just intended as a workaround for the problem that the user had with one particular tileset.
If this comes up more frequently, we could consider offering a more stable (maybe even command-line) functionality for modifying the metallicFactor in all tile contents of a tileset.
Some questions about the compression of tileset package entries are still open. This refers to the specification as well as the implementation.
Regarding the specification:
Details about the entry compression may have to be added to the 3DTILES specification proposal at CesiumGS/3d-tiles#727
The 3TZ specification at https://github.com/erikdahlstrom/3tz-specification/ already covers the entry compression, and says
For optimal read performance, files in the archive should be stored without compression, however, for a good trade of read performance and file size, use the Zstandard compression method. For best compatibility with legacy software, choose the standard deflate compression method, noting that this is the slowest of the three methods.
(The question of whether entries may be compressed with other methods - e.g. with GZIP - remains unanswered here)
Regarding the implementation:
The implementation here (i.e. the TilesetSource and TilesetTarget interfaces) is supposed to handle different package types (and the file system) as transparently as possible. This does raise questions about the intended behavior. For example, with (pseudocode) lines like
// Read the "tileset.json":
const buffer = source.read("tileset.json");
// Write the "tileset.json"
target.write("tileset.json", buffer);
it is not clear at which point the buffer may contain compressed data, how to detect the compression method when reading it, or how to define the compression method when writing it.
For many use-cases, it would be nice if the client didn't have to care about compression. This could largely be achieved by establishing a simple contract for the TilesetSource and TilesetTarget:
This would allow very convenient handling of compression on the API level.
When reading data, the responsibility for detecting the compression method and uncompressing the data could solely be in the TilesetSource implementation.
This is important. If the client had to do something like
let buffer = source.read("tileset.json");
if (compressionMethodOf(buffer) === "gzip") {
  buffer = gzip.uncompress(buffer);
} else if (compressionMethodOf(buffer) === "zstd") {
  buffer = zstd.uncompress(buffer);
} else if (compressionMethodOf(buffer) === "brotli") {
  buffer = brotli.uncompress(buffer);
} else {
  // May be DEFLATE - this cannot be detected
  let uncompressed = undefined;
  try {
    uncompressed = deflate.uncompress(buffer);
  } catch (e) {
    // Nope, that wasn't deflate...
  }
  return buffer;
}
then this would require the client to detect the compression methods and (important:) integrate all the compression libraries as dependencies.
The client should be able to just do a
const buffer = source.read("tileset.json");
and be done.
When writing data, the responsibility for compressing the data could solely be in the TilesetTarget implementation. It would be trivial to have code like
// Write uncompressed data
const target = createTarget();
target.write("tileset.json", buffer);
// Write compressed data by wrapping the target into a compressing one
const compressingTarget = Targets.wrapGzip(target);
compressingTarget.write("tileset.json", buffer);
There may be cases where the client would like to access the compressed data. For example: When the data from a source is supposed to be served over the network, then the client may want to access the compressed data (if it is compressed with a method that can be used as the Content-Encoding).
This may make it necessary to add the corresponding functionality to the TilesetSource, like
// By default, always read uncompressed data
const uncompressedData = source.read("tileset.json");
// But allow determining the compression method
const method = source.getMethod("tileset.json"); // Can return "gzip" or "zstd" or... (what else?)
// Read the "raw" (compressed) data
const compressedData = source.readRaw("tileset.json");
But the decision here will depend on the decisions that are made on the specification level.
$ node bin/3d-tiles-tools.js -i some.b3dm -o out.b3dm optimizeB3dm
>>> Binary glTF version is not 1
Hello, I would like to ask how to solve a problem with merging two 3D Tiles datasets and loading them in Cesium: there is clearly a gap in the merged 3D Tiles. Have you ever encountered this problem?
npx 3d-tiles-tools merge -i C:\Users\xxj\Desktop\newobj\obj\test_obj1\tileset.json -i C:\Users\xxj\Desktop\newobj\obj\test_obj2\tileset.json -o C:\Users\xxj\Desktop\newobj\obj\3dtiles
I use this command to merge
Using the merge utility as presented in the docs starts by producing an output tileset.json root file which references the input tilesets in children directories. Then it continues by copying all the tile contents to the output folder.
npx 3d-tiles-tools merge -i tileset_1/n_0.json -i tileset_2/n_0.json -o ./_merged_tileset
Since there are lots of files to copy, this copy operation can take a lot of time. Having an option to disable tile copying (json + b3dm) would be great! The output tileset would then only reference the input tilesets - probably via a uri relative to the output tileset file. E.g., in the case above, the output tileset would look like
{
"asset": {"version": "1.1"},
"geometricError": 1,
"root": {
"boundingVolume": {...},
"refine": "ADD",
"geometricError": 1,
"children": [
{
"boundingVolume": {...},
"content": {
// Important line: relative reference to merged tilesets
"uri": "../tileset_1/n_0.json"
},
"geometricError": 1,
"refine": "REPLACE"
},
// ...
]
}
}
Note: this might be what the combine utility is made for, but since it does not process my case because of a Content does not have a URI error (the properties are url), I could not check; see this feature request #43
The GltfUtilities class has a method extractDataFromGlb that can be used to extract the JSON and binary data from a GLB file, covering glTF 1.0 as well as glTF 2.0.
However, it currently assumes that the BIN chunk is present, and throws a RangeError when this is not the case. According to the 4.4.3.3. Binary buffer section of the specification:
When the binary buffer is empty or when it is stored by other means, this chunk SHOULD be omitted.
meaning that the implementation has to anticipate the case that there is no BIN chunk.
A quick test (which may be used in a similar form in the specs when this is fixed): It creates a GLB buffer that does not include a BIN chunk, and throws an error when trying to read it:
import { Accessor, Document } from "@gltf-transform/core";
import { NodeIO } from "@gltf-transform/core";
import { GltfUtilities } from "./src/contentProcessing/GltfUtilities";
async function createExampleDocument(): Promise<Document> {
const document = new Document();
const root = document.getRoot();
const scene = document.createScene();
root.setDefaultScene(scene);
const node = document.createNode();
scene.addChild(node);
/*/
// When creating a buffer, it DOES work...
const buffer = document.createBuffer();
const positions = [
0.0, 0.0, 0.0,
1.0, 0.0, 0.0,
0.0, 1.0, 0.0,
1.0, 1.0, 0.0,
];
const positionsAccessor = document.createAccessor();
positionsAccessor.setArray(new Float32Array(positions));
positionsAccessor.setType(Accessor.Type.VEC3);
positionsAccessor.setBuffer(buffer);
const primitive = document.createPrimitive();
primitive.setAttribute("POSITION", positionsAccessor);
const mesh = document.createMesh();
mesh.addPrimitive(primitive);
node.setMesh(mesh);
//*/
return document;
}
async function runTest() {
const inputDocument = await createExampleDocument();
const io = new NodeIO();
const glb = await io.writeBinary(inputDocument);
const data = GltfUtilities.extractDataFromGlb(Buffer.from(glb));
console.log("jsonData", data.jsonData);
console.log("binData", data.binData);
}
runTest();
The current ExtMeshFeaturesDemo.ts creates a primitive that only contains the extension data. It should be extended to create sensible geometry that can be loaded in a viewer.
This should include a fix for the issue that was reported KhronosGroup/glTF-Tutorials#87 (comment) , although the details still have to be investigated.
If a tile url has a leading slash it should be treated as a warning, if not an error.
More ideas to come...
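A sketch of the leading-slash check suggested above (a hypothetical function, not part of any actual validator API):

```typescript
// Hypothetical check: tile content URIs with a leading slash resolve
// against the server root rather than relative to the tileset, so
// they should be flagged as a warning (not an error).
function checkContentUri(uri: string): string | undefined {
  if (uri.startsWith("/")) {
    return (
      `Content URI '${uri}' has a leading slash; ` +
      `it will resolve against the server root`
    );
  }
  return undefined; // No warning
}
```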
Definitely: n tiles per tileset.json, m levels per tileset.json, etc.
As Needed:
Some legacy B3DMs may contain glTF 1.0 (!) assets with oct-encoded normals. In glTF 1.0, these had been decoded with the shader that was part of the glTF file itself. When gltf-pipeline upgrades them to glTF 2.0, these oct-encoded normals result in "invalid data" (i.e. they are seen as "2D normals").
More details are described in #95 (comment), including a small glTF-Transform-based "post-processing" step that should become part of the upgrade functionality.
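The core of such a post-processing step would be the standard octahedron decoding that maps a 2D oct-encoded normal back to a 3D unit vector. A minimal sketch (the wiring into a glTF-Transform pass over the NORMAL accessors is omitted, and the inputs are assumed to already be normalized from bytes into the [-1, 1] range):

```typescript
// Robust oct decoding uses signNotZero instead of Math.sign,
// so that an input of 0 is treated as positive.
function signNotZero(v: number): number {
  return v < 0.0 ? -1.0 : 1.0;
}

// Decode an oct-encoded normal (x, y in [-1, 1]) into a 3D unit vector.
function octDecode(x: number, y: number): [number, number, number] {
  let nx = x;
  let ny = y;
  let nz = 1.0 - Math.abs(nx) - Math.abs(ny);
  if (nz < 0.0) {
    // Fold over the negative-z hemisphere of the octahedron
    const oldX = nx;
    nx = (1.0 - Math.abs(ny)) * signNotZero(oldX);
    ny = (1.0 - Math.abs(oldX)) * signNotZero(ny);
  }
  // Normalize to a unit vector
  const len = Math.sqrt(nx * nx + ny * ny + nz * nz);
  return [nx / len, ny / len, nz / len];
}
```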
The functionality for creating a tileset JSON file from a set of tile content files was requested in #47 and implemented in #51.
Until now, the bounding volumes that are put into the tileset are computed from the points of a point cloud or from the glTF content (with wrappers for handling B3DM or I3DM, the RTC center, etc.). These bounding volumes are computed as axis-aligned bounding volumes. As pointed out in #47 (comment), this does not necessarily result in a tight bounding volume. It would be preferable to compute tighter bounding volumes as actual oriented bounding boxes. A first attempt at just using OrientedBoundingBox::fromPoints did not result in valid bounding volumes (details in #51 (comment)), so this is deferred for now.
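For reference, the current axis-aligned computation boils down to deriving a 3D Tiles bounding `box` from the per-axis min/max of the points. A minimal sketch (the function name is illustrative; the actual implementation also handles the B3DM/I3DM wrappers and RTC centers mentioned above):

```typescript
// Derive a 3D Tiles bounding `box` (center + three half-axis vectors)
// from the per-axis min/max of a set of 3D points.
function axisAlignedBoundingBox(points: number[][]): number[] {
  const min = [Infinity, Infinity, Infinity];
  const max = [-Infinity, -Infinity, -Infinity];
  for (const p of points) {
    for (let i = 0; i < 3; i++) {
      min[i] = Math.min(min[i], p[i]);
      max[i] = Math.max(max[i], p[i]);
    }
  }
  // 3D Tiles box layout: [centerX, centerY, centerZ, then the three
  // half-axis vectors - here axis-aligned, hence the zero entries]
  return [
    (min[0] + max[0]) / 2, (min[1] + max[1]) / 2, (min[2] + max[2]) / 2,
    (max[0] - min[0]) / 2, 0, 0,
    0, (max[1] - min[1]) / 2, 0,
    0, 0, (max[2] - min[2]) / 2,
  ];
}
```

A tighter result would replace the three axis-aligned half-axes with the principal axes of the point set, which is what the (deferred) oriented-bounding-box approach would compute.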
This issue is intended for ... OrientedBoundingBox::fromPoints did not yield valid results.

On CentOS 7, I attempted to install 3d-tiles-tools through yum.
Node.js was installed using the default settings:
yum install nodejs npm
The installed versions are as follows.
# node -v
v16.20.2
# npm -v
8.19.4
The installation was performed using the commands provided in the guide:
npm install 3d-tiles-tools
The following error occurred during installation.
npm ERR! code 1
npm ERR! path /home/bnt/3d-tiles-tools/node_modules/better-sqlite3
npm ERR! command failed
npm ERR! command sh -c -- prebuild-install || node-gyp rebuild --release
npm ERR! make: Entering directory `/home/bnt/3d-tiles-tools/node_modules/better-sqlite3/build'
npm ERR! TOUCH ba23eeee118cd63e16015df367567cb043fed872.intermediate
npm ERR! ACTION deps_sqlite3_gyp_locate_sqlite3_target_copy_builtin_sqlite3 ba23eeee118cd63e16015df367567cb043fed872.intermediate
npm ERR! TOUCH Release/obj.target/deps/locate_sqlite3.stamp
npm ERR! CC(target) Release/obj.target/sqlite3/gen/sqlite3/sqlite3.o
npm ERR! AR(target) Release/obj.target/deps/sqlite3.a
npm ERR! COPY Release/sqlite3.a
npm ERR! CXX(target) Release/obj.target/better_sqlite3/src/better_sqlite3.o
npm ERR! rm ba23eeee118cd63e16015df367567cb043fed872.intermediate
npm ERR! make: Leaving directory `/home/bnt/3d-tiles-tools/node_modules/better-sqlite3/build'
npm ERR! prebuild-install warn install /lib64/libm.so.6: version `GLIBC_2.29' not found (required by /home/bnt/3d-tiles-tools/node_modules/better-sqlite3/build/Release/better_sqlite3.node)
npm ERR! gyp info it worked if it ends with ok
npm ERR! gyp info using [email protected]
npm ERR! gyp info using [email protected] | linux | x64
npm ERR! gyp info find Python using Python version 3.6.8 found at "/usr/bin/python3"
npm ERR! gyp info spawn /usr/bin/python3
npm ERR! gyp info spawn args [
npm ERR! gyp info spawn args '/usr/lib/node_modules/npm/node_modules/node-gyp/gyp/gyp_main.py',
npm ERR! gyp info spawn args 'binding.gyp',
npm ERR! gyp info spawn args '-f',
npm ERR! gyp info spawn args 'make',
npm ERR! gyp info spawn args '-I',
npm ERR! gyp info spawn args '/home/bnt/3d-tiles-tools/node_modules/better-sqlite3/build/config.gypi',
npm ERR! gyp info spawn args '-I',
npm ERR! gyp info spawn args '/usr/lib/node_modules/npm/node_modules/node-gyp/addon.gypi',
npm ERR! gyp info spawn args '-I',
npm ERR! gyp info spawn args '/root/.cache/node-gyp/16.20.2/include/node/common.gypi',
npm ERR! gyp info spawn args '-Dlibrary=shared_library',
npm ERR! gyp info spawn args '-Dvisibility=default',
npm ERR! gyp info spawn args '-Dnode_root_dir=/root/.cache/node-gyp/16.20.2',
npm ERR! gyp info spawn args '-Dnode_gyp_dir=/usr/lib/node_modules/npm/node_modules/node-gyp',
npm ERR! gyp info spawn args '-Dnode_lib_file=/root/.cache/node-gyp/16.20.2/<(target_arch)/node.lib',
npm ERR! gyp info spawn args '-Dmodule_root_dir=/home/bnt/3d-tiles-tools/node_modules/better-sqlite3',
npm ERR! gyp info spawn args '-Dnode_engine=v8',
npm ERR! gyp info spawn args '--depth=.',
npm ERR! gyp info spawn args '--no-parallel',
npm ERR! gyp info spawn args '--generator-output',
npm ERR! gyp info spawn args 'build',
npm ERR! gyp info spawn args '-Goutput_dir=.'
npm ERR! gyp info spawn args ]
npm ERR! gyp info spawn make
npm ERR! gyp info spawn args [ 'BUILDTYPE=Release', '-C', 'build' ]
npm ERR! g++: error: unrecognized command line option ‘-std=gnu++14’
npm ERR! g++: error: unrecognized command line option ‘-std=c++17’
npm ERR! make: *** [Release/obj.target/better_sqlite3/src/better_sqlite3.o] Error 1
npm ERR! gyp ERR! build error
npm ERR! gyp ERR! stack Error: `make` failed with exit code: 2
npm ERR! gyp ERR! stack at ChildProcess.onExit (/usr/lib/node_modules/npm/node_modules/node-gyp/lib/build.js:201:23)
npm ERR! gyp ERR! stack at ChildProcess.emit (node:events:513:28)
npm ERR! gyp ERR! stack at Process.ChildProcess._handle.onexit (node:internal/child_process:293:12)
npm ERR! gyp ERR! System Linux 3.10.0-1160.102.1.el7.x86_64
npm ERR! gyp ERR! command "/usr/bin/node" "/usr/lib/node_modules/npm/node_modules/node-gyp/bin/node-gyp.js" "rebuild" "--release"
npm ERR! gyp ERR! cwd /home/bnt/3d-tiles-tools/node_modules/better-sqlite3
npm ERR! gyp ERR! node -v v16.20.2
npm ERR! gyp ERR! node-gyp -v v9.1.0
npm ERR! gyp ERR! not ok
npm ERR! A complete log of this run can be found in:
npm ERR! /root/.npm/_logs/2024-02-19T13_15_36_907Z-debug-0.log
Can you give me some guidance on how to fix this error and complete the installation?
The TileData payload field carries a note saying that it can contain either the GLB data or a URI.
Functions like glbToI3dm or the (internal) extractGlbBuffers always assume that the payload is an embedded GLB. But this should be generalized. This will include some minor implementations for resolving the GLB data from the URI on the fly, and the corresponding error handling. It may also involve extending the CLI. For example, one could consider offering an option like

... glbToI3dm -i input.glb -o output.i3dm -useUriForGlb
                                          ^--- this

at the command line.
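A generalized handling could first detect which of the two cases it is dealing with. A minimal sketch of such a check (the function name is illustrative, not part of the actual API): an embedded GLB always starts with the magic bytes "glTF" (0x46546C67, little-endian), so any other payload can be treated as a URI string:

```typescript
// Returns true if the payload starts with the GLB magic "glTF"
// (0x46546C67 as a little-endian uint32), false otherwise.
function payloadIsEmbeddedGlb(payload: Uint8Array): boolean {
  if (payload.length < 4) {
    return false;
  }
  const view = new DataView(payload.buffer, payload.byteOffset, payload.byteLength);
  return view.getUint32(0, true) === 0x46546c67;
}
```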
Right now, the package.json specifies

"engines": {
  "node": ">=16.0.0"
},

There might be benefits in being more specific. The exact Node version could be specified in an .nvmrc file, which could also be picked up by the CI script. I'm a bit on the fence here: On the one hand, it could avoid random build failures that are caused by random changes in some Node version. On the other hand, it should be possible to compile and run the tools on every Node version (>=16), and if there is an issue with one version, then this should be detected and fixed.
The implications for users/contributors also have to be investigated (e.g. would pinning the version to 20.10 mean that someone couldn't even compile it without having this exact version?).
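If the version were pinned, one common setup (sketched here; the exact version number is only an example, not a recommendation) would be an .nvmrc file containing the version:

```
20.10.0
```

together with a CI step that picks it up, e.g. via actions/setup-node:

```yaml
- uses: actions/setup-node@v4
  with:
    node-version-file: ".nvmrc"
```

Note that this would only pin the CI and nvm environments; for contributors building locally, npm would still enforce nothing beyond the engines range.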
I used the tool to merge two tilesets. However, some parts of the merged tileset are always rendered in low resolution, regardless of the zoom level. The original two tilesets were rendered nicely before being merged.
I've found that replacing "refine": "ADD" with "refine": "REPLACE", or removing it from the top-level tileset.json, solves the problem.
I think that in this case, because the parent tile has no content of its own, using the REPLACE refinement should work. For some unknown reason, the ADD refinement might cause some rendering issues.
Any thoughts on the behaviour?
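For reference, the workaround described above amounts to changing the refinement of the content-less root in the merged tileset.json, roughly like this (all other properties abbreviated):

```json
{
  "root": {
    "refine": "REPLACE",
    "children": [ ... ]
  }
}
```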
If possible, use writeCallback
or at least use OS temp directories.
When using the convert command to convert a 3D Tiles file-based dataset into a .3dtiles database, the convert command should check that either:
a) a tileset.json exists in the supplied folder before converting, or
b) if the user supplies a specific path to a tileset JSON file (e.g. example.json), it is renamed to tileset.json on conversion.
Otherwise, the user is able to generate a .3dtiles dataset that does not include a tileset.json. When uploaded to hosting services such as Cesium ion, this tileset cannot be used, because the expectation is that a tileset.json is present.
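A sketch of what such a pre-flight check could look like (the function name and shape are hypothetical, not the actual implementation): given the file names found in the input, verify that a tileset JSON is present, and map a user-supplied name to the canonical "tileset.json" that hosting services expect.

```typescript
// Hypothetical pre-flight check for the convert command.
function canonicalTilesetJsonName(files: string[], requested?: string): string {
  if (requested !== undefined) {
    if (!files.includes(requested)) {
      throw new Error(`File not found: ${requested}`);
    }
    // Case b): store the supplied JSON under the canonical name
    return "tileset.json";
  }
  // Case a): a tileset.json must already exist in the supplied folder
  if (!files.includes("tileset.json")) {
    throw new Error("No tileset.json found in the input directory");
  }
  return "tileset.json";
}
```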
Tentative changes for 1.0 that should be handled in the upgrade command:
- asset.gltfUpAxis removed
- BATCHID renamed to _BATCHID

Progress is in https://github.com/AnalyticalGraphicsInc/3d-tiles-tools/tree/2.0-tools
The KtxUtility class is a thin convenience layer around the BinomialLLC Basis Universal encoder. It mainly offers the function convertImageData, where the caller can throw in some arbitrary image data and receive the resulting KTX image data. This hides all the nitty-gritty bits of (compiling!)/loading/initializing/configuring the underlying WASM module.
When calling this method many (many) times, it eventually causes a RuntimeError to be thrown from the WASM part:

RuntimeError: Aborted(). Build with -sASSERTIONS for more info.

(+ some stack trace into the WASM). The tooling and debugging possibilities around WASM ... have a lot of room for improvement. But the observed behavior is a hint at ~"some sort of memory leak".
The WASM module has a delete() function. The documentation of this function is:

/*
 * Just kidding. It is not documented...
 */

But calling this function at the right place and time might resolve this issue.
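Assuming that delete() indeed has to be called once per conversion, a generic pattern (a sketch, not the actual KtxUtility code) would be to wrap each Emscripten-style object so that it is always released, even when the conversion throws:

```typescript
// Anything with an Emscripten-generated delete() method.
interface Deletable {
  delete(): void;
}

// Run `use` with the given WASM-side object, and always call delete()
// afterwards - the WASM heap is not tracked by the JS garbage collector.
function withWasmObject<T extends Deletable, R>(obj: T, use: (o: T) => R): R {
  try {
    return use(obj);
  } finally {
    obj.delete();
  }
}
```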
When using the combine method on tilesets produced by ContextCapture, an error Content does not have a URI is raised. This is probably because the tile contents in the tileset root JSON file (plus subsequent tiles) are referred to via the url property rather than the uri property - which references either B3DM or JSON tile contents.
I know that I can use the 3d-tiles-tools upgrade utility on local tilesets that I do manage - although it will take a lot of time to upgrade loads of tilesets.
But for tilesets that I do not manage, which are stored online on different servers and that I'd like to combine, the easiest way would be for the combine utility to accept tilesets that use either the url or the uri property to reference tile content.
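A sketch of what such a compatibility shim could look like (a simplified stand-in, not the actual combine implementation; the real code would have to apply this to every content of every traversed tile):

```typescript
// A tile content that may use the legacy `url` property
// instead of the current `uri` property.
interface TileContent {
  uri?: string;
  url?: string;
}

// Normalize the legacy `url` property to the `uri` property
// that the current 3D Tiles schema defines.
function normalizeContentUri(content: TileContent): void {
  if (content.uri === undefined && content.url !== undefined) {
    content.uri = content.url;
    delete content.url;
  }
}
```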
Is it possible to apply draco compression to glTF 2.0 b3dms?
When using the draco branch, I just get "Binary glTF version is not 1".
With the 2.0-tools branch, this error occurs:
$ node ./bin/3d-tiles-tools.js optimizeB3dm -i "[...].b3dm" -o ./output/optimized.b3dm -f --options -q
[...]\3d-tiles-tools\tools\lib\isJson.js:34
});
^
SyntaxError: Unexpected token )
And after deleting the faulty line 34, the output is also "Binary glTF version is not 1". Am I doing something wrong?
The structure directory contains very basic structures for the handling of 3D Tiles data. These classes are plain old objects, without any functionality (i.e. no functions - only plain properties). These classes had been created manually in the 3d-tiles-validator, and then moved into the 3d-tiles-tools for the 0.2.0 release.
The structure of these classes directly corresponds to the 3D Tiles specification JSON schema. Therefore, it would probably make sense to replace these classes with classes that are auto-generated from the schema.
Of course, quick web searches show that there are tools for that, e.g. https://www.npmjs.com/package/json-schema-to-typescript - but further tests may be necessary before tackling this. For example, this library links to "json-schema-04", and we'll need one that supports much more recent schema versions.
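As a sketch of what such generated code could look like (hand-written here, loosely following the 3D Tiles JSON schema for the asset object, rather than being actual generator output):

```typescript
// Hypothetical generator output for the 3D Tiles `asset` object:
interface Asset {
  /** The 3D Tiles version targeted by the tileset (required) */
  version: string;
  /** Application-specific version of this tileset (optional) */
  tilesetVersion?: string;
  extensions?: { [key: string]: unknown };
  extras?: unknown;
}
```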
I converted a glb file to B3DM with glbToB3dm.
However, a tileset.json was not created.
Could you tell me how to generate a tileset.json file for the converted B3DM file?
When using CesiumJS v1.113 with a New York City b3dm dataset, I ran into this issue:

RuntimeError: Fragment shader failed to compile. Compile log: ERROR: 0:236: '=' : dimension mismatch ERROR: 0:236: 'assign' : cannot convert from 'highp 2-component vector of float' to 'highp 3-component vector of float'

Then I used this tool and upgraded the dataset with this command:

npx ts-node src/main.ts upgrade -i ..\\NewYork\\tileset.json -o ..\\upgradedNY --targetVersion 1.1

The RuntimeError is gone and the glb tiles are fetched, but no 3D buildings are rendered on the map.
Below is my code (if you need the dataset, please let me know):
<!--from repo server-->
<script src="https://cesium.com/downloads/cesiumjs/releases/1.113/Build/Cesium/Cesium.js"></script>
<link href="https://cesium.com/downloads/cesiumjs/releases/1.113/Build/Cesium/Widgets/widgets.css" rel="stylesheet">
</head>
<body>
<div id="cesiumContainer" style="position:absolute; width:80%; height:80%"></div>
<script>
const viewer = new Cesium.Viewer('cesiumContainer');
// Cesium3DTileset.fromUrl returns a Promise; use async/await so that
// loading errors are actually caught by the try/catch:
(async function() {
  try {
    const tileset = await Cesium.Cesium3DTileset.fromUrl('upgradedNY/tileset.json');
    viewer.scene.primitives.add(tileset);
    viewer.zoomTo(
      tileset,
      new Cesium.HeadingPitchRange(0.5, -0.2, tileset.boundingSphere.radius * 4.0)
    );
  } catch (error) {
    console.log(error);
  }
})();
// v113 error: TypeError: Cannot read properties of undefined (reading 'updateTransform')
//viewer.zoomTo(tileset, new Cesium.HeadingPitchRange(0, -0.5, 0));
// Remove default base layer
viewer.imageryLayers.remove(viewer.imageryLayers.get(0));
const wmts = new Cesium.WebMapTileServiceImageryProvider({
  url: 'https://services.arcgisonline.com/arcgis/rest/services/World_Topo_Map/MapServer/WMTS/',
  layer: 'World_Topo_Map',
  style: 'default',
  tileMatrixSetID: 'default028mm',
});
const arcgis = viewer.imageryLayers.addImageryProvider(wmts);
arcgis.alpha = 1.0; // 0.0 is transparent. 1.0 is opaque.
arcgis.brightness = 1.0; // > 1.0 increases brightness. < 1.0 decreases.
</script>
</body>
This should be easy, but opened to keep track of it: There should be a GitHub action for CI.
Most of the build infrastructure was taken from the 3d-tiles-validator, so it should mainly be a matter of copying the validator CI file here.
Edit: Note that the following refers to the https://github.com/CesiumGS/3d-tiles-validator/tree/2.0-tools branch!
This is the result of diving into a rabbit hole of debugging. I'll try to provide some context, but keep it short:
There are some "legacy features" in 3D Tiles that are still used by other tools and existing assets - for example, the gltfUpAxis property in the asset object. The fact that Cesium for Unreal does not honor this property caused some issues with rotated models, at https://community.cesium.com/t/unreal-load-3dtileset-not-by-ion-looks-wrong-rotation/12835 . The recommendation was to upgrade the tileset with the 3d-tiles-validator. The upgrade seems to work, but the resulting B3DMs contain GLBs that are invalid (and are not rendered by Cesium for Unreal and other tools - some viewers still manage to render them, though).
To reproduce (note: example output files will be given below):
- tileset.json files in all subdirectories (otherwise, the conversion will complain with "More than one root tileset found in directory" - that should probably be fixed as well...)
- node ./bin/3d-tiles-tools.js upgrade --input C:/Data/3dtiles/ --output C:/Data/3dtiles_Upgraded
- node ./bin/3d-tiles-tools.js b3dmToGlb --input C:\Data\3dtiles\Tile_+000_+001\Tile_+000_+001_L20_00000t3.b3dm --output C:\Data\Tile_+000_+001_L20_00000t3-OLD.glb
- node ./bin/3d-tiles-tools.js b3dmToGlb --input C:\Data\3dtiles_Upgraded\Tile_+000_+001\Tile_+000_+001_L20_00000t3.b3dm --output C:\Data\Tile_+000_+001_L20_00000t3-NEW.glb
For the OLD version, it prints an unrelated error about the buffer view target. For the NEW one, the validator reports
{
"code": "GLB_CHUNK_TOO_BIG",
"message": "Chunk (0x004e4942) length (906568) does not fit total GLB length.",
"severity": 0,
"offset": 1528
},
{
"code": "GLB_LENGTH_MISMATCH",
"message": "Declared length (908100) does not match GLB length (908104).",
"severity": 0,
"offset": 908104
},
{
"code": "BUFFER_GLB_CHUNK_TOO_BIG",
"message": "GLB-stored BIN chunk contains 4 extra padding byte(s).",
"severity": 1,
"pointer": "/buffers/0"
},
(omitted unrelated warnings here).
Since the data set was shared publicly in the forum, I hope that it is OK to attach the relevant files here (OLD and NEW referring to before and after the upgrade).