
Comments (4)

zhengtianyu1996 commented on August 23, 2024

Okay, an update on the NaN/Inf problem:

In the losses.affinity_loss function, edge looks fine and not_ignore looks fine. But after this line runs:

edge = tf.logical_and(edge, not_ignore)

the resulting 'edge' is sometimes an all-zero matrix. That means no effective values are left, so the final edge_loss runs into problems.

I will keep debugging; I hope this helps someone.
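The failure mode described above can be reproduced with a small sketch (NumPy is used here as a stand-in for the TensorFlow ops, and the mask values are made up for illustration): when every edge pixel happens to fall inside the ignore region, the AND of the two masks comes out all False.

```python
import numpy as np

# Hypothetical 2x3 masks: edge pixels exist, but each one coincides
# with an ignored pixel (not_ignore is False exactly there).
edge = np.array([[True,  False, True],
                 [False, True,  False]])
not_ignore = np.array([[False, True,  False],
                       [True,  False, True]])

# Equivalent of: edge = tf.logical_and(edge, not_ignore)
combined = np.logical_and(edge, not_ignore)
print(combined.any())  # False: no effective edge values are left
```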

from adaptive_affinity_fields.

zhengtianyu1996 commented on August 23, 2024

I think the problem is caused by:
edge_indices = tf.where(tf.reshape(edge, [-1]))
Because 'edge' is sometimes an all-zero matrix, edge_indices sometimes has shape (0, 1). Then
edge_loss = tf.gather(edge_loss, edge_indices)
produces Inf values.

So 'edge' and 'not_ignore' should be checked carefully. However, I still don't know whether this is a common problem; maybe it is related to the dataset itself. What do you think? @twke18
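The symptom can be seen without TensorFlow: gathering with an empty index tensor yields an empty loss tensor, and averaging zero elements is undefined. Below is a minimal NumPy sketch of the problem plus one common guard, clamping the denominator so an empty mask contributes zero loss (the variable names are mine, not from the repo):

```python
import numpy as np

edge = np.zeros((2, 3), dtype=bool)      # all-zero edge mask, as observed
edge_loss = np.random.rand(6)            # per-pixel losses (illustrative)

edge_indices = np.flatnonzero(edge.reshape(-1))  # empty, like tf.where here
gathered = edge_loss[edge_indices]               # empty tensor

# Naive mean over zero elements is undefined (NaN, with a warning):
#   naive = gathered.sum() / len(gathered)

# Guard: clamp the denominator so an empty mask yields zero loss.
safe_edge_loss = gathered.sum() / max(len(gathered), 1)
print(safe_edge_loss)  # 0.0 when the mask is empty
```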


arc144 commented on August 23, 2024


I was looking at how ignores_from_label and edges_from_label compute the edge map, and it seems they compute it differently. ignores_from_label iterates backwards, i.e. for st_y in range(2*size,-1,-size):, whereas edges_from_label iterates forward, i.e. for st_y in range(0,2*size+1,size):.

Is this intentional? Could it be the source of the zero-matrix issue when edge = tf.logical_and(edge, not_ignore) is computed?
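Whether the opposite iteration orders matter can be checked directly: the two range expressions visit the same set of neighbor offsets, just in reverse order. So the offsets themselves agree; a mismatch could only come from the order in which the per-neighbor maps are stacked afterwards. A quick sketch, with size standing for the neighborhood half-width:

```python
# Compare the loop ranges used by ignores_from_label and edges_from_label.
size = 1  # example neighborhood size

backward = list(range(2 * size, -1, -size))   # ignores_from_label style
forward = list(range(0, 2 * size + 1, size))  # edges_from_label style

print(backward)  # [2, 1, 0]
print(forward)   # [0, 1, 2]
print(sorted(backward) == sorted(forward))  # True: same offsets, reversed
```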


xychenunc commented on August 23, 2024

Have you obtained improved results using the affinity field loss? I have tried many times, but I can hardly improve over my baseline. I also hit the same issue as yours, and I simply exclude the term that is NaN when computing the total loss.
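The workaround mentioned above, dropping a NaN term from the total loss, can be sketched as follows (the term names and values are made up; in TensorFlow one would use tf.math.is_nan with tf.where instead of math.isnan):

```python
import math

# Hypothetical loss terms; the affinity edge term has gone NaN.
loss_terms = {"seg_loss": 0.75, "edge_loss": float("nan"), "not_edge_loss": 0.25}

# Exclude any term that is NaN when summing the total loss.
total = sum(v for v in loss_terms.values() if not math.isnan(v))
print(total)  # 1.0
```

Note that silently dropping a term changes the effective loss weighting, which may partly explain why results do not improve over the baseline.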

