
Comments (8)

Thommy257 commented on July 24, 2024

Hi,
the QuantumTrainer provides a high-level abstraction of a standard machine-learning training routine. It takes a model as input and stores the training statistics. During training, it saves the state of the model in checkpoint files that can be used to instantiate a new model for inference:

model = TketModel.from_checkpoint('path/to/checkpoint')
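
For context, a minimal sketch of that round trip, loosely following the lambeq quantum-pipeline tutorial. The circuits, labels, loss, hyperparameters and checkpoint path below are placeholders, and exact signatures may differ between lambeq versions:

import numpy as np
from lambeq import Dataset, QuantumTrainer, SPSAOptimizer, TketModel
from pytket.extensions.qiskit import AerBackend

backend = AerBackend()
backend_config = {
    'backend': backend,
    'compilation': backend.default_compilation_pass(2),
    'shots': 1024,
}

# Build the model from all circuits so every word symbol gets a parameter.
model = TketModel.from_diagrams(train_circuits + val_circuits,
                                backend_config=backend_config)

def loss(y_hat, y):
    # Placeholder binary cross-entropy over the model's output distribution.
    return -np.sum(y * np.log(y_hat)) / len(y)

trainer = QuantumTrainer(
    model,
    loss_function=loss,
    epochs=100,
    optimizer=SPSAOptimizer,
    optim_hyperparams={'a': 0.05, 'c': 0.06, 'A': 10},
    log_dir='runs',      # checkpoint files are written here during training
    verbose='text',
)
trainer.fit(Dataset(train_circuits, train_labels),
            Dataset(val_circuits, val_labels))

# Later, restore the trained weights for inference:
model = TketModel.from_checkpoint('path/to/checkpoint',
                                  backend_config=backend_config)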


Thommy257 commented on July 24, 2024

The output of your model depends on your circuits/diagrams. If you have one open (qubit) wire, the output will be a 2-d array; if you have two open wires, it will be a 4-d array; and so on. Currently, only the PytorchModel supports additional layers, but we'll add PennyLane and tensorflow-quantum support soon.
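
As an illustration, the number of open wires (and hence the output size) is fixed by the qubit count the ansatz assigns to the open sentence type. The names below follow the lambeq docs; the sentence and dimensions are just examples:

from lambeq import AtomicType, BobcatParser, IQPAnsatz

parser = BobcatParser()
diagram = parser.sentence2diagram('John walks in the park')

# One qubit on the open sentence wire -> the model returns 2 values per diagram.
ansatz_1q = IQPAnsatz({AtomicType.NOUN: 1, AtomicType.SENTENCE: 1}, n_layers=1)
circuit_1q = ansatz_1q(diagram)

# Two qubits on the open sentence wire -> 4 values per diagram, and so on.
ansatz_2q = IQPAnsatz({AtomicType.NOUN: 1, AtomicType.SENTENCE: 2}, n_layers=1)
circuit_2q = ansatz_2q(diagram)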


ACE07-Sev commented on July 24, 2024

So, imagining I have a dataset with one feature and one label, if I feed it to the quantum_trainer.ipynb version, can I extract predictions for binary classification?


ACE07-Sev commented on July 24, 2024

Is the quantum_trainer.ipynb basically creating and training the model on the dataset? If so, how can I extract predictions on new samples?


Thommy257 commented on July 24, 2024

The trainer only wraps the model's training routine. You define an instance of the model first and pass it to the trainer; after training, you can predict the outcome of a new diagram through:

score = model(new_samples_diagrams)
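
A hypothetical way to turn those scores into class labels for a binary task, assuming one open qubit wire so that each score is a 2-d array:

import numpy as np

scores = model(new_samples_diagrams)     # one 2-d array per input diagram
predictions = np.argmax(scores, axis=1)  # class 0 or 1 for each sample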


ACE07-Sev commented on July 24, 2024

Understood. So to call the model on new instances, I first have to pass them through the parser so they're turned into diagrams as well, correct?


Thommy257 commented on July 24, 2024

That is correct. But bear in mind that you have to take care of word tokens that are unknown to the model. If you pass a diagram to the model that contains a word that wasn't part of the training process, it'll fail to calculate the output.

In classical NLP, one would deal with this, for example, by introducing an <unk> token that replaces all rare words in your corpus.
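
A rough sketch of that idea. The helper below is illustrative and not part of lambeq; a plain 'unk' word is used so the masked sentences still parse:

from collections import Counter

def build_vocab(sentences, min_count=2):
    # Keep only words that occur at least min_count times in the corpus.
    counts = Counter(w for s in sentences for w in s.split())
    return {w for w, c in counts.items() if c >= min_count}

def mask_unknown(sentence, vocab, unk='unk'):
    # Replace rare/unseen words with the unk token before parsing.
    return ' '.join(w if w in vocab else unk for w in sentence.split())

vocab = build_vocab(train_sentences)
train_sentences = [mask_unknown(s, vocab) for s in train_sentences]
# Apply the same masking to new sentences before parsing them into diagrams,
# so words the model has never seen are mapped to the trained unk parameters.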


y-richie-y commented on July 24, 2024

The original question has been answered, so this issue will be closed.

