
dracopy's Introduction


DracoPy

import os
import DracoPy

with open('bunny.drc', 'rb') as draco_file:
  mesh = DracoPy.decode(draco_file.read())

print(f"number of points: {len(mesh.points)}")
print(f"number of faces: {len(mesh.faces)}")
print(f"number of normals: {len(mesh.normals)}")

# Note: If mesh.points is an integer numpy array,
# it will be encoded as an integer attribute. Otherwise,
# it will be encoded as floating point.
binary = DracoPy.encode(mesh.points, mesh.faces)
with open('bunny_test.drc', 'wb') as test_file:
  test_file.write(binary)

# If faces is omitted, DracoPy will encode a point cloud
binary = DracoPy.encode(mesh.points)

# Options for encoding:
binary = DracoPy.encode(
  mesh.points, faces=mesh.faces,
  quantization_bits=14, compression_level=1,
  quantization_range=-1, quantization_origin=None,
  create_metadata=False, preserve_order=False,
  colors=mesh.colors
)
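For reference, here is a minimal synthetic input (a single tetrahedron, not taken from the library's test data) illustrating the array shapes the calls above expect: points as an (N, 3) float array and faces as an (M, 3) integer array of vertex indices.

```python
import numpy as np

# A minimal synthetic mesh: 4 vertices and 4 triangular faces.
# points: (N, 3) float array -> encoded as a floating-point attribute.
# faces:  (M, 3) integer array of 0-based vertex indices.
points = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
], dtype=np.float32)

faces = np.array([
    [0, 1, 2],
    [0, 1, 3],
    [0, 2, 3],
    [1, 2, 3],
], dtype=np.uint32)

# binary = DracoPy.encode(points, faces)  # would produce a Draco buffer
print(points.shape, faces.shape)
```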

DracoPy is a Python wrapper for Google's Draco mesh compression library.

Installation

Binary wheels are available for users with Python >= 3.6 and pip >= 20.

Installation from source requires Python >= 3.6, pip >= 10, and a C++ compiler that is fully compatible with C++11.

It supports Linux, OS X, and Windows. Numpy is required.

pip install DracoPy

Acknowledgements

We graciously thank The Stanford 3D Scanning Repository for providing the Stanford Bunny test model.

https://graphics.stanford.edu/data/3Dscanrep/

dracopy's People

Contributors

bretttully · denisri · fcollman · hanseuljun · manuel-castro · william-silversmith · zeruniverse


dracopy's Issues

How to compress an .obj file?

I want to convert an .obj file to .drc in Python. Here is my code:

def test_decoding_improper_file():
  with open(os.path.join(testdata_directory, "bunny.obj"), "rb") as improper_file:
    file_content = improper_file.read()
    m = DracoPy.decode_buffer_to_mesh(file_content)
but I get this error:

src\DracoPy.pyx:148: in DracoPy.decode_buffer_to_mesh
  raise_decoding_error(mesh_struct.decode_status)

raise FileTypeException('Input mesh is not draco encoded')
E   DracoPy.FileTypeException: Input mesh is not draco encoded

src\DracoPy.pyx:137: FileTypeException

How do I handle it?
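The decoder only accepts Draco-encoded buffers, so an .obj file must first be parsed into points and faces arrays before encoding. A minimal sketch of such a parser (a hypothetical helper, not part of DracoPy, handling only triangular `v`/`f` records) whose output could then be passed to DracoPy's encode function:

```python
import numpy as np

def parse_obj(text):
    """Parse 'v' and 'f' lines from a Wavefront OBJ string into
    (points, faces) arrays. Handles only triangular faces and
    ignores normals, texture coordinates, and materials."""
    points, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == 'v':
            points.append([float(x) for x in parts[1:4]])
        elif parts[0] == 'f':
            # OBJ indices are 1-based and may be written as "i/t/n"
            faces.append([int(p.split('/')[0]) - 1 for p in parts[1:4]])
    return (np.array(points, dtype=np.float32),
            np.array(faces, dtype=np.uint32))

obj = """
v 0 0 0
v 1 0 0
v 0 1 0
f 1 2 3
"""
points, faces = parse_obj(obj)
# binary = DracoPy.encode(points, faces)  # then write binary to a .drc file
```

For real-world .obj files, a dedicated loader such as trimesh is more robust than this sketch.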

Removal of DracoPy.cpp from this repository

Thanks for this wonderful and useful library!

While transitioning from installing via pypi to building the package from source, I noticed that the .cpp file generated from the .pyx file is included in the repository.

Given that this .cpp can be derived from the .pyx file, and that its exact contents depend on the system that built it, I would like to suggest removing the .cpp file from this repository.

A search for a Cython example repository also turned up a .gitignore file with the .cpp file ignored. See interface.cpp ignored here:
https://gitlab.multiscale.utah.edu/mahanse/cmake-cpp-cython-python-demo/-/blob/master/.gitignore

Encoding does not return resulting faces and points count

I think it would make sense to be able to get the resulting faces and points count from the encode_mesh_to_buffer method.

Currently I am writing a tool to create glTF files from mesh sequences, and I am adding mesh compression with this library. To correctly set the face and vertex counts, I need to know them after compression. As a workaround I just decode right after encoding, but this is a bit of a waste of time and resources:

encoded_blob = bytearray(DracoPy.encode_mesh_to_buffer(encoded_points, encoded_triangles))

# workaround to get point & face counts back
decoded = DracoPy.decode_buffer_to_mesh(encoded_blob)
triangles_size = len(decoded.faces)
pts_size = int(len(decoded.points) / 3)

Would it be possible to somehow return them?

DracoPy installation

Greetings!

I'm doing "pip install DracoPy" (latest 1.3.0) and I have a problem with that version. It seems like this version 1.3.0 does not have the latest changes (DracoPy.encode(points, faces, tex_coord=None)).

I attach this image to show my problem. I hope it helps! Could you help me fix this?

Thanks!

Custom metadata support

Hi!

First, thank you for developing the library.
I want to use this lib in my Python project. In addition to the 3D geometry (the Draco mesh), I have to serialize/deserialize custom metadata attached to the mesh in the Draco file. I see that the current version of DracoPy doesn't support custom metadata, so I decided to implement it myself in the DracoPy.pyx file. And here I ran into a problem:

I implemented a recursive metadata struct in DracoPy.h:

  struct MetadataObject {
    std::unordered_map<std::string, std::vector<uint8_t>> entries;
    std::unordered_map<std::string, MetadataObject> sub_metadatas;
  };

And in DracoPy.pxd:

cdef extern from "DracoPy.h" namespace "DracoFunctions":
    cdef struct MetadataObject:
        unordered_map[string, vector[uint8_t]] entries
        unordered_map[string, MetadataObject] sub_metadatas

I also added a function (in DracoPy.h) that parses Draco metadata into a MetadataObject and passed it into the encode_mesh function.

When I try to compile my solution, I get a Cython compilation error:

RecursionError: maximum recursion depth exceeded while calling a Python object

I know declaring recursive structures is possible in Cython, but it seems impossible to declare EXTERNAL recursive structures (I could not find anything about this on Google).

So my question is do you have ideas how to forward draco structure Metadata to Python?

Draco Submodule Should Point to Exact Version

I'm trying to use DracoPy to encode point clouds, and send them to a frontend I have, where they are decoded using Draco again (but in typescript, so with Three.js, not DracoPy). I used to do this by writing to temporary files using the draco binaries, reading from the files, and sending over those bytes. This worked, but I'd much prefer to just use DracoPy.

When I encode and decode using DracoPy, it works well. However, when I encode using DracoPy and then decode using TypeScript, the colors get all messed up. My only working theory is that the Draco version DracoPy uses is incompatible with the one I am using on my frontend, which is believable because the snapshot of Draco that DracoPy points to isn't an exact version. Thus, the submodule should point to google/draco@bd1e8de rather than https://github.com/google/draco/commits/befe2d880992e6fdbfd29fe20a3b8664460b92cd.

Note: the specific way in which the colors are screwed up is that they are being interpreted as 0-1 when they are being sent as 0-255.
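As a stopgap on the consuming side, that particular mismatch can be papered over by normalizing the color values; a sketch (the array here is purely illustrative):

```python
import numpy as np

# Hypothetical workaround: convert 0-255 uint8 colors to 0-1 floats
# before handing them to a consumer that expects normalized values.
colors_u8 = np.array([[255, 128, 0]], dtype=np.uint8)
colors_f = colors_u8.astype(np.float32) / 255.0
```

Pinning both sides to the same exact Draco release, as the issue suggests, remains the real fix.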

Decoding error for hemibrain meshes introduced between 0.0.15 and 0.0.18 release

@manuel-castro Thanks for your work on DracoPy. There seems to be an issue with parsing Janelia hemibrain meshes introduced between the 0.0.15 and 0.0.18 releases. Worked example below. I've seen the same behaviour on miniconda with Python 3.6 as well. Thanks for any pointers!

With DracoPy 0.0.15

Python 3.8.5 | packaged by conda-forge | (default, Aug 28 2020, 20:36:22)
[Clang 10.0.1 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import cloudvolume
>>> print("cloudvolume: "+cloudvolume.__version__)
cloudvolume: 3.10.0
>>> from cloudvolume import CloudVolume
>>> vol=CloudVolume(cloudpath="precomputed://gs://neuroglancer-janelia-flyem-hemibrain/v1.1/segmentation", use_https=True)
>>> m=vol.mesh.get(1796817841, lod=3)
>>> m0=m[1796817841]
>>> m0.vertices
array([[130879.99 , 122751.875,  98367.5  ],
       [130944.   , 122559.87 ,  98367.5  ],
       [130944.   , 122687.87 ,  98367.5  ],
       ...,
       [164416.52 , 288768.4  , 215103.28 ],
       [168448.56 , 291904.47 , 215871.3  ],
       [168192.56 , 291648.44 , 215359.28 ]], dtype=float32)
>>> if (abs(m0.vertices[0][2]-98367.5)<0.1):
...   print("OK")
... else:
...   print("Error")
...
OK

With DracoPy 0.0.18

Python 3.8.5 | packaged by conda-forge | (default, Aug 28 2020, 20:36:22)
[Clang 10.0.1 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import cloudvolume
>>> print("cloudvolume: "+cloudvolume.__version__)
cloudvolume: 3.10.0
>>> from cloudvolume import CloudVolume
>>> vol=CloudVolume(cloudpath="precomputed://gs://neuroglancer-janelia-flyem-hemibrain/v1.1/segmentation", use_https=True)
>>> m=vol.mesh.get(1796817841, lod=3)
>>> m0=m[1796817841]
>>> m0.vertices
array([[2.3991288e+09, 2.3969825e+09, 2.3907894e+09],
       [2.3991288e+09, 2.3970153e+09, 2.3907894e+09],
       [2.3991124e+09, 2.3970317e+09, 2.3907894e+09],
       ...,
       [2.3695557e+09, 2.3658857e+09, 2.3866112e+09],
       [2.3694574e+09, 2.3658857e+09, 2.3866276e+09],
       [2.3694902e+09, 2.3658857e+09, 2.3866112e+09]], dtype=float32)
>>> if (abs(m0.vertices[0][2]-98367.5)<0.1):
...   print("OK")
... else:
...   print("Error")
...
Error

DracoPy issue

Following the instructions, I started the installation from a clean miniconda env and found DracoPy was not installed. I did:

pip install DracoPy

After that I tried the examples, but they failed with the following error. Maybe something is wrong with the version of DracoPy?


How to build the wheel on Apple Silicon?

Hello, I tried to install DracoPy from PyPI directly on my Apple Silicon Mac, but it failed while "installing backend dependencies". I think some backend dependencies don't have an appropriate version for this architecture.
Has anyone tried to rebuild the wheel for macOS on Apple Silicon?

OS X installation issue

Ran into issues trying to install this on OS X in a conda environment via pip:

In file included from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/Availability.h:236:0,
                 from /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/stdlib.h:61,
                 from /Users/forrestc/anaconda3/envs/emtoolschool/gcc/include/c++/cstdlib:72,
                 from /Users/forrestc/anaconda3/envs/emtoolschool/gcc/include/c++/bits/stl_algo.h:59,
                 from /Users/forrestc/anaconda3/envs/emtoolschool/gcc/include/c++/algorithm:62,
                 from /private/var/folders/93/pxx43tfd7nqbr7rbdxxm7jpn4fvqmh/T/pip-install-7t93zdju/DracoPy/draco/src/draco/compression/point_cloud/algorithms/dynamic_integer_points_kd_tree_encoder.h:18,
                 from /private/var/folders/93/pxx43tfd7nqbr7rbdxxm7jpn4fvqmh/T/pip-install-7t93zdju/DracoPy/draco/src/draco/compression/point_cloud/algorithms/dynamic_integer_points_kd_tree_encoder.cc:15:
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/usr/include/AvailabilityInternal.h:33:18: error: missing binary operator before token "("
#if __has_include(<AvailabilityInternalPrivate.h>)

compress .pcap file

I have a .pcap file. Can I compress it using this library? If yes, please help me compress it.

setup_requires missing scikit-build

This causes chained installations to fail because scikit-build isn't included. For example, the current cloud-volume graphene branch fails on Travis because it pulls in DracoPy as a dependency, but scikit-build isn't properly specified as needed for installation.

Encode .obj

Hey, thanks for creating this.

I haven't been able to figure out how to load an .obj and encode it to .drc - is this possible?

Thanks,
Joseph.

Error in reshape

In the code there is an error:

DracoPy/src/DracoPy.pyx

Lines 93 to 94 in 1d3057c

if arr.ndim == 1:
  arr = arr.reshape((len(arr), 3))

A flat array of length N cannot be reshaped to (N, 3); that would require 3N elements.

A good answer is:
arr = arr.reshape((len(arr) // 3, 3))
or
arr = arr.reshape((-1, 3))
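To illustrate the difference, a quick check with NumPy (the 12-element array here is just an example):

```python
import numpy as np

arr = np.arange(12, dtype=np.float32)  # a flat array of 12 coordinates

# The buggy reshape asks for len(arr) * 3 == 36 elements and raises:
try:
    arr.reshape((len(arr), 3))
except ValueError as e:
    print("reshape failed:", e)

# The fix infers the row count from the data: 12 values -> 4 points.
fixed = arr.reshape((-1, 3))
print(fixed.shape)  # (4, 3)
```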

Windows Binaries

Hi Manuel!

I was building Kimimaro on Windows which has a (dumb, but annoying to remove) CloudVolume dependency. Unfortunately, it looks like it was crashing while compiling DracoPy. Would you be able to release the Windows wheels?

Thanks!

Installation in a Docker container fails

I'm trying to install CloudVolume, but DracoPy is a dependency of CloudVolume and it's DracoPy that's giving me the following error on installation with pip:

  File "/home/node/.local/lib/python3.11/site-packages/skbuild/setuptools_wrap.py", line 453, in setup
    _check_skbuild_parameters(cmake_install_dir, cmake_source_dir)
  File "/home/node/.local/lib/python3.11/site-packages/skbuild/setuptools_wrap.py", line 282, in _check_skbuild_parameters
    raise SKBuildError(msg)

setup parameter 'cmake_source_dir' set to a nonexistent directory.
  Project Root          : /tmp/pip-install-81onv6ad/dracopy_b9dd79e7ca6f4185a0ad90e446d45051
  CMake Source Directory: ./draco

Here are the relevant parts of my Dockerfile:

FROM node-base as python-base

RUN apk add python3 python3-dev py3-pip

FROM python-base as with-build-dependencies
RUN apk add build-base
RUN apk add cmake
RUN apk add zlib-dev

FROM with-build-dependencies as python-requirements

USER node
RUN pip install numpy --break-system-packages
RUN pip install Cython --break-system-packages
RUN pip install scikit-build --break-system-packages
RUN pip install pandas --break-system-packages

RUN pip install -r requirements.txt --break-system-packages

Pointcloud support

Hi,

Would it be possible to also wrap encoder->EncodePointCloudToBuffer? Did you try it already?

Thanks so much!
Daniel

tex_coord data compression doesn't work

I used the following two pieces of code to judge the encoding performance of tex_coord data:
binary1 = DracoPy.encode(pos_data, face_data, preserve_order=True)
binary2 = DracoPy.encode(pos_data, face_data, tex_coord=np.array(tex_data), preserve_order=True)

tex_coord is a list of 29753 entries, i.e. 29753 × 2 × 4 = 238024 bytes of raw data.

I found that binary1's size is 245450 bytes and binary2's size is 483480 bytes, so I conclude that the tex_coord part of the encoded data is 483480 − 245450 = 238030 bytes.

After encoding, the tex_coord part is 6 bytes larger than the original data, which is rather confusing. I have no idea why; could you help me?

std::vector larger than max_size

I get the following error whenever I try draco.draco_compress(points, faces[:n], colors) with n > 23409:

Traceback (most recent call last):
  File "/opt/env/lib/python3.10/site-packages/IPython/core/interactiveshell.py", line 3460, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<string>", line 1, in <module>
    draco.draco_compress(points, faces[:23410], colors)
  File "/opt/project/src/utils/draco.py", line 18, in draco_compress
    binary = DracoPy.encode(
  File "DracoPy.pyx", line 192, in DracoPy.encode
RuntimeError: cannot create std::vector larger than max_size()

Attributes compression support?

Are you planning to add attribute compression support? Draco itself supports it, as well as texture coordinates, colors, normals, etc.
The ability to compress topology is good, but handling additional mesh data would make your package really outstanding.
