
Comments (2)

cnhuz commented on July 29, 2024

I ran into this when building on Windows as well. Change this block in Dockerfile.base:


RUN cat > /get-models.py <<EOF
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline
AutoModelForSeq2SeqLM.from_pretrained('Helsinki-NLP/opus-mt-zh-en')
AutoTokenizer.from_pretrained('Helsinki-NLP/opus-mt-zh-en')
pipeline('text-generation', model='succinctly/text2image-prompt-generator')
EOF

to:


RUN echo "from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline" > /get-models.py && \
    echo "AutoModelForSeq2SeqLM.from_pretrained('Helsinki-NLP/opus-mt-zh-en')" >> /get-models.py && \
    echo "AutoTokenizer.from_pretrained('Helsinki-NLP/opus-mt-zh-en')" >> /get-models.py && \
    echo "pipeline('text-generation', model='succinctly/text2image-prompt-generator')" >> /get-models.py

and the build succeeds.
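The chain of `echo`s can also be collapsed into a single `printf '%s\n'` call, which avoids repeating the redirect on every line. A minimal sketch of the equivalent shell command (writing to `/tmp/get-models.py` here so it can be tried outside a container; inside the `RUN` instruction the target would stay `/get-models.py`):

```shell
# Sketch: generate the model-download script with one printf call.
# printf '%s\n' prints each argument on its own line, so the four
# arguments below become the four lines of the Python script.
printf '%s\n' \
  "from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, pipeline" \
  "AutoModelForSeq2SeqLM.from_pretrained('Helsinki-NLP/opus-mt-zh-en')" \
  "AutoTokenizer.from_pretrained('Helsinki-NLP/opus-mt-zh-en')" \
  "pipeline('text-generation', model='succinctly/text2image-prompt-generator')" \
  > /tmp/get-models.py
```

Because `printf` quotes each line as a single argument, there is also less risk of the shell mangling quotes than with per-line `echo`s.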

from docker-prompt-generator.

baymax55 commented on July 29, 2024

If building Dockerfile.gpu fails with the same error, replace the following code:
RUN cat > /get-models.py <<EOF
from clip_interrogator import Config, Interrogator
import torch
config = Config()
config.device = 'cuda' if torch.cuda.is_available() else 'cpu'
config.blip_offload = False if torch.cuda.is_available() else True
config.chunk_size = 2048
config.flavor_intermediate_count = 512
config.blip_num_beams = 64
config.clip_model_name = "ViT-H-14/laion2b_s32b_b79k"
ci = Interrogator(config)
EOF

with:

RUN echo "from clip_interrogator import Config, Interrogator" > /get-models.py && \
    echo "import torch" >> /get-models.py && \
    echo "config = Config()" >> /get-models.py && \
    echo "config.device = 'cuda' if torch.cuda.is_available() else 'cpu'" >> /get-models.py && \
    echo "config.blip_offload = False if torch.cuda.is_available() else True" >> /get-models.py && \
    echo "config.chunk_size = 2048" >> /get-models.py && \
    echo "config.flavor_intermediate_count = 512" >> /get-models.py && \
    echo "config.blip_num_beams = 64" >> /get-models.py && \
    echo "config.clip_model_name = \"ViT-H-14/laion2b_s32b_b79k\"" >> /get-models.py && \
    echo "ci = Interrogator(config)" >> /get-models.py

(Note: the first `echo` must use `>` so the file starts fresh, and every redirect target needs the leading slash in `/get-models.py`.)
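Rewriting the heredocs as `echo` chains works, but the underlying cause is that `RUN <<EOF` heredocs are a BuildKit feature, added in the `dockerfile:1.4` syntax. If your Docker version supports BuildKit, a sketch of the root-cause fix is to keep the heredocs and build with BuildKit enabled (the image tag below is just an example name, not from the repo):

```shell
# Alternative: keep the heredoc and enable BuildKit, which added RUN
# heredoc support in the dockerfile 1.4 frontend.
export DOCKER_BUILDKIT=1
# Also make sure the first line of the Dockerfile is the syntax directive:
#   # syntax=docker/dockerfile:1
docker build -f Dockerfile.gpu -t prompt-generator-gpu .
```

On recent Docker Desktop releases BuildKit is the default builder, so the Windows failures described above most likely come from an older Docker version or a disabled BuildKit.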

from docker-prompt-generator.
