fdarmon / NeuralWarp
Code release of paper Improving neural implicit surfaces geometry with patch warping
Thank you for your great work.
autonomousvision/monosdf#85 (comment)
Taking scan65 as an example, the pair.txt I generated is different from the one in the original dataset. Will this have any impact on the final result?
I used the code you provided to generate pair.txt; my understanding of the pair selection and the file format is sketched below.
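To make sure we are talking about the same thing, this is roughly what I implemented: a simple baseline-angle heuristic and an MVSNet-style file layout. Both are my own assumptions, since I may have misunderstood the expected format, and all function names are mine:

```python
import numpy as np

def rank_source_views(directions, num_src=10, target_deg=15.0):
    # directions: (N, 3) unit viewing directions, one per image.
    # Score each (ref, src) pair by how close the angle between the two
    # viewing directions is to a target baseline angle, then keep the
    # num_src best source views per reference view.
    n = directions.shape[0]
    pairs = {}
    for ref in range(n):
        cos = np.clip(directions @ directions[ref], -1.0, 1.0)
        ang = np.degrees(np.arccos(cos))
        order = np.argsort(np.abs(ang - target_deg))
        pairs[ref] = [int(s) for s in order if s != ref][:num_src]
    return pairs

def write_pair_txt(path, pairs):
    # Assumed MVSNet-style layout: total view count, then per reference
    # view its index and a line "num_src src_0 score_0 src_1 score_1 ...".
    # The scores here are placeholders, not real matching scores.
    with open(path, "w") as f:
        f.write(f"{len(pairs)}\n")
        for ref, srcs in pairs.items():
            f.write(f"{ref}\n{len(srcs)} ")
            f.write(" ".join(f"{s} {1000 - 10 * i}" for i, s in enumerate(srcs)))
            f.write("\n")
```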
Looking forward to your reply.
Hi, your work is impressive!
But I'm a little confused about the warp loss and the validity masks:
1. "our warping-based loss such that every valid patch in the reference image is given the same weight." Why is every valid patch given the same weight?
2. "second, when the reference and source views are on two different sides of the plane defined by xi and the normal ni; third, when a camera center is too close to the plane defined by xi and the normal ni." Why is the binary indicator V set to 0 in these cases? (My current understanding is sketched below.)
Looking forward to your reply.
Hello, first of all thank you for your work. May I ask why the loss drops very slowly when training on a custom dataset, and why the resulting mesh quality is also poor?
As described in your paper, "the results for each method are taken from their original paper".
But why are the results for NeuS [34] different from those in its original paper?
[34] Peng Wang, Lingjie Liu, Yuan Liu, Christian Theobalt, Taku Komura, and Wenping Wang. NeuS: Learning neural implicit surfaces by volume rendering for multi-view reconstruction. In Adv. Neural Inform. Process. Syst., 2021.
Hello, which operating system do you deploy and train on, Windows or Linux? Thank you!
Dear @fdarmon
Thank you very much for your work. In the error visualization in your appendix, the GT points look very clean compared to the provided GT. I wonder whether you preprocessed the DTU GT points to get such a visualization.
Looking forward to your reply!
Hello, if I need to train a custom dataset, what are the format requirements?
Hello,
thanks for your excellent work! I am wondering how you created the result visualization teaser.gif in the README, with a moving camera and a sliding bar to switch between two meshes. Thank you in advance!
Best regards
Hi, thanks for the excellent paper.
I was going through the codebase, and the warping seems to happen only for random pixels sampled on the image plane (clipped at the boundaries), not for patches.
See lines 155-166 in https://github.com/fdarmon/NeuralWarp/blob/main/datasets/scene_dataset.py
Is that right, or am I missing something? (My mental model is sketched below.)
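For reference, this is how I imagined the per-pixel samples turning into patches at warp time. It is only a sketch with my own names, not taken from the repo:

```python
import torch

def make_patches(centers, half_size=2):
    # Expand each sampled center pixel (N, 2) into an (N, P, P, 2) grid
    # of pixel coordinates, with P = 2 * half_size + 1. The patch colors
    # would then be fetched/warped from these coordinates.
    offsets = torch.arange(-half_size, half_size + 1, dtype=centers.dtype)
    dy, dx = torch.meshgrid(offsets, offsets, indexing="ij")
    grid = torch.stack((dx, dy), dim=-1)        # (P, P, 2)
    return centers[:, None, None, :] + grid     # (N, P, P, 2)
```

So my question is whether something equivalent happens later in the pipeline, or whether the loss really is per-pixel.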
Thank you very much for your contribution. I want to try training on my own data, but I am confused about the pair.txt file in "dtu_supp". How do I generate it?
Hello!
Thanks for sharing the code, your work is really impressive! I noticed that you did not include a license file. Under copyright law, we have to assume the most restrictive license (i.e., all derivative works are forbidden, etc.).
Was that intentional, or do you actually not mind other research projects building on top of your code?
Dear author,
Thank you for your open-source code. I noticed that VolSDF normalizes the cameras (to fit within a sphere of radius R=3) on the DTU dataset, whereas NeuralWarp loads camera parameters from the original cameras.npz, which are not normalized. Since this project is built on VolSDF, I want to know why the camera-normalization step was removed (the step I mean is sketched below). Moreover, I think the default value of --bbox_size in extract_mesh.py is not very reasonable: because the cameras are not normalized, it has to be set carefully for each scene. If the value is too large, a lot of empty space is wasted; if it is too small, the extracted mesh is incomplete.
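For clarity, this is the normalization step I am referring to, as I understand it from VolSDF. It is a sketch with my own function name, not code from either repo:

```python
import numpy as np

def normalize_cameras(cam_centers, radius=3.0):
    # Similarity transform that moves the average camera center to the
    # origin and scales so every camera fits inside a sphere of the
    # given radius. The same transform must be applied to the scene.
    centers = np.asarray(cam_centers)                          # (N, 3)
    translate = -centers.mean(axis=0)
    scale = radius / np.linalg.norm(centers + translate, axis=1).max()
    return translate, scale
```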
I would like to hear your viewpoint on this.
Thanks!
Hi, I'm curious why you use the GT RGB as the reference when computing the warp loss, instead of the rendered RGB. That way you could also supervise the rendering network to obtain more accurate colors. (The comparison I have in mind is written out below.)
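In code, the alternative I am suggesting looks like this; all names are hypothetical, and only the choice of ref_colors would change:

```python
import torch

def warp_loss(warped_src_colors, ref_colors, valid):
    # warped_src_colors: (N, 3) source-image colors warped into the
    # reference view; ref_colors: (N, 3) either the GT reference pixels
    # (as in the paper) or the rendered colors (my suggestion);
    # valid: (N,) binary validity mask.
    err = (warped_src_colors - ref_colors).abs().sum(dim=-1)
    return (err * valid).sum() / valid.sum().clamp(min=1)
```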
Dear author,
How can I reconstruct the texture after generating the mesh? Can you give me any suggestions?
Thanks!
Dear author,
According to the paper, the training pipeline has two stages: first, train for 100k iterations with the same settings as VolSDF; then fine-tune for 50k iterations with the proposed method. Does this mean that for stage 2 I need to add the options "--is_continue --timestamp XXXXX"? Moreover, according to the paper, the learning rate in stage 2 is 1e-5, which differs from the learning rate (5.0e-4) in NeuralWarp.conf. Do I need to change the learning rate in the configuration to 1e-5?
Thanks!
Hello. Thank you for your amazing work and for sharing the code!
I tried training your model from scratch on some of the benchmark scenes and had trouble reproducing your results. The model seems to be quite sensitive to the random seed, and even after several attempts the quality I obtained was lower than reported:
| Experiment | Fountain - Full | Fountain - Center | Herzjesu - Full | Herzjesu - Center |
| --- | --- | --- | --- | --- |
| pre-trained | 7.77 | 1.91 | 8.88 | 2.03 |
| seed 2022 | 8.03 | 2.65 | 7.66 | 2.55 |
| seed 42 | 13.36 | 7.43 | 10.54 | 2.58 |
Could you provide some insights regarding this issue?
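For reference, this is the seeding routine I added before each run. These are standard PyTorch determinism settings, not part of the repo:

```python
import random
import numpy as np
import torch

def set_seed(seed: int):
    # Seed every RNG the training touches and disable the
    # non-deterministic cuDNN autotuner.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_seed(2022)  # and 42 for the second run
```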