
Comments (4)

Hvass-Labs commented on May 7, 2024

After some more time searching the internet, I found out how to do this. We need to append the suffix :0 to the name of the op to get its associated tensor. I don't know why, but here's how to do it:

bar = tf.get_default_graph().get_tensor_by_name('network/layer_conv1/Relu:0')
print(bar)

Which gives the following output:

Tensor("network/layer_conv1/Relu:0", shape=(?, 24, 24, 64), dtype=float32)

And we can now run the session to get the output of the convolutional layer as follows:

baz = session.run(bar, feed_dict={x: images_test[0:10, :, :, :]})
print(baz.shape)

Which outputs:

(10, 24, 24, 64)

And this represents:

[input_image, height, width, output_channel]

This took a long time to figure out, and the solution is not obvious at all. One must know low-level details of both Pretty Tensor and TensorFlow to figure out how to do this. Please consider such things both when designing APIs and when documenting them.
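For context, the :0 suffix is TensorFlow's output index: an op can produce several tensors, and a tensor name has the form <op_name>:<output_index> (as the error message later in this thread states). A minimal pure-Python sketch of that naming convention — the helper split_tensor_name is hypothetical, for illustration only:

```python
def split_tensor_name(name):
    """Split a TensorFlow-style tensor name into (op_name, output_index).

    A name like 'network/layer_conv1/Relu:0' refers to output 0 of the
    op 'network/layer_conv1/Relu'; a bare op name carries no ':<index>'.
    """
    op_name, sep, index = name.rpartition(':')
    if not sep:
        # No ':' present: this is an op name, not a tensor name.
        return name, None
    return op_name, int(index)

print(split_tensor_name('network/layer_conv1/Relu:0'))
# ('network/layer_conv1/Relu', 0)
print(split_tensor_name('network/layer_conv1/Relu'))
# ('network/layer_conv1/Relu', None)
```

This is why get_tensor_by_name requires the :0 while get_operation_by_name rejects it: the two lookups address different kinds of graph objects.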


eiderman commented on May 7, 2024

Unfortunately, these graphs have many, many nodes in them. The namespace can help: the result of the activation would be prefix/layer_name/relu, so you can find it that way as a one-off. I've outlined a few more choices below.

The easiest way is to assign the intermediate value back to a variable and do something with it, though this introduces a lot of line noise:

x_pretty = x_pretty.conv2d(...)
do_something_with_image(x_pretty)
x_pretty = x_pretty.max_pool()

I would find that solution to be less than ideal, so I will lay out a couple of other ones:

with pt.defaults_scope(activation_fn=tf.nn.relu):
  seq = x_pretty.sequential()  # Now each call changes seq.
  seq.conv2d(kernel=5, depth=64, name='layer_conv1')
  do_something(seq.as_layer()) # as_layer takes a snapshot
  seq.max_pool(kernel=2, stride=2)
  seq.conv2d(kernel=5, depth=64, name='layer_conv2')
  do_something(seq.as_layer()) # as_layer takes a snapshot
  seq.max_pool(kernel=2, stride=2).flatten()
  seq.fully_connected(size=256, name='layer_fc1')
  do_something_else(seq.as_layer())
  seq.fully_connected(size=128, name='layer_fc2')
  do_something_else(seq.as_layer())
  y_pred, loss = seq.as_layer().softmax_classifier(class_count=10, labels=y_true)

You could also use the callback _method_complete on any Pretty Tensor object. It is called at the end of each method call to support both side-effectful execution (sequential) and the standard execution; the incoming type can be anything Tensor-like, so you'd have to call super for it to be wrapped as a PT.

If you feel like this would be a useful general abstraction to allow a user-defined callback, then I welcome the contribution. It would probably be a good way to standardize summaries as well :)
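As a rough illustration of what such a user-defined per-call callback could look like, here is a minimal chaining wrapper in plain Python. All names here (Chain, on_result, add, scale) are hypothetical and not Pretty Tensor API; the sketch only shows the shape of the abstraction, where every method funnels its result through one hook before chaining continues:

```python
class Chain:
    """Toy chaining wrapper: each method returns a new Chain and, if set,
    reports the intermediate result to a user-supplied callback."""

    def __init__(self, value, on_result=None):
        self.value = value
        self._on_result = on_result

    def _wrap(self, value):
        # Analogous to _method_complete: a single choke point that sees
        # the result of every method call before the chain continues.
        if self._on_result is not None:
            self._on_result(value)
        return Chain(value, self._on_result)

    def add(self, n):
        return self._wrap(self.value + n)

    def scale(self, n):
        return self._wrap(self.value * n)

seen = []
result = Chain(1, on_result=seen.append).add(4).scale(2)
print(result.value, seen)  # 10 [5, 10]
```

In a real implementation the callback would receive each layer's output tensor, which would also make it a natural place to attach summaries.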


Hvass-Labs commented on May 7, 2024

Thanks for the quick answer!

The reason I like Pretty Tensor is the elegant syntax of the chained mode of constructing the network, so I don't want to ruin that.

I think it would be OK for my project if I could just get the output of the layers using their names. However, when I try the following (note that in my own code I have enclosed the construction code above in the namespace 'network'):

bar = tf.get_default_graph().get_tensor_by_name('network/layer_conv1/Relu')
print(bar)

I get this error:

ValueError: The name 'network/layer_conv1/Relu' refers to an Operation, not a Tensor. Tensor names must be of the form "<op_name>:<output_index>".

If instead I have:

bar = tf.get_default_graph().get_operation_by_name('network/layer_conv1/Relu')
print(bar)

I get the following output:

name: "network/layer_conv1/Relu"
op: "Relu"
input: "network/layer_conv1/add"
attr {
  key: "T"
  value {
    type: DT_FLOAT
  }
}

I then try to execute this in the TensorFlow session to get the output of the convolutional layer:

baz = session.run(bar, feed_dict={x: images_test[0:10, :, :, :]})
print(baz)

But I just get None as the result.

Why does this happen, and how should I do it instead?

Thanks again.


geometrikal commented on May 7, 2024

For people who come across this while searching, the OP's tutorials are really helpful:

https://github.com/Hvass-Labs/TensorFlow-Tutorials

