
Comments (14)

willmil11 commented on May 25, 2024

@nicolay-r The latest version of Python is not compatible? You could have started with that. Lemme try with the one you specified...


willmil11 commented on May 25, 2024

Please help me :(


willmil11 commented on May 25, 2024

What do I do? I just wanna chat with LaMDA :(


nicolay-r commented on May 25, 2024

Hi @willmil11

Thanks for sharing the traceback and log.

Traceback (most recent call last):
  File "/home/vscode/lamda/LaMDA-rlhf-pytorch/train.py", line 4, in <module>
    from colossalai.core import global_context as gpc
ModuleNotFoundError: No module named 'colossalai.core'

I believe the problem is with colossalai itself, or with your Python version's compatibility with this package.
You can try to reproduce this issue in ipython / python by importing
from colossalai.core import global_context as gpc
If the colossalai 0.3.5 that you have installed is compatible with your Python but the import still fails, then I would recommend manually downgrading colossalai.
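Something along these lines (just a sketch) should tell you both which release pip actually installed and whether that release still ships colossalai.core:

# check_colossalai.py -- minimal smoke test, assumes colossalai is already pip-installed
from importlib.metadata import version

print("installed colossalai:", version("colossalai"))
# ModuleNotFoundError on the next line means this release no longer has colossalai.core
from colossalai.core import global_context as gpc
print("colossalai.core import OK")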


willmil11 commented on May 25, 2024

Hi @nicolay-r, thanks for your quick response, but I still have a question.

What version should I downgrade colossalai to, and how do I do that? And lastly, will I be able to directly chat with LaMDA after the training is complete? How?


nicolay-r commented on May 25, 2024

@willmil11, I did not experiment with LaMDA in this project, since I was not able to launch training (Google Colab). That was almost a year ago. If anything related to LoRA and parameter-efficient tuning has changed since then, it is worth a try. You may try to launch it and judge by the loss during the training process, as well as the overall training time it takes.

As for colossalai, you may have a look at the available versions: https://pypi.org/project/colossalai/#history
The quickest way, I believe, is to manually downgrade to the most recent prior version, and so on.


nicolay-r commented on May 25, 2024

Another option is to check whether colossalai actually works by running its unit tests and the related examples.


willmil11 commented on May 25, 2024

@nicolay-r I see what you're saying, but I still don't understand which version of colossalai you suggest I downgrade to (do you want me to try them all? There are hundreds on the GitHub page but only about 10 on pypi.org).


nicolay-r commented on May 25, 2024

@willmil11, I mean the releases on PyPI: https://pypi.org/project/colossalai/#history
I can't recommend a specific one in your particular case, since there might be other dependency requirements.
So you can try 0.3.4 and then work down to 0.3.0.
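If you want to automate that, a rough sketch (the version list is just the recent 0.3.x releases on PyPI; adjust it as needed):

# downgrade_probe.py -- sketch: install each release, then probe the import in a fresh interpreter
import subprocess, sys

for ver in ("0.3.4", "0.3.3", "0.3.2", "0.3.1", "0.3.0"):
    subprocess.run([sys.executable, "-m", "pip", "install", f"colossalai=={ver}"], check=True)
    probe = subprocess.run([sys.executable, "-c",
                            "from colossalai.core import global_context as gpc"])
    if probe.returncode == 0:
        print(f"colossalai=={ver} still provides colossalai.core")
        break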


nicolay-r commented on May 25, 2024

@willmil11, did switching versions sort out the issue?


willmil11 commented on May 25, 2024

@nicolay-r 0.3.4 said that colossalai.core doesn't exist, and 0.3.0 said:

Traceback (most recent call last):
  File "/home/vscode/lamda/LaMDA-rlhf-pytorch/train.py", line 3, in <module>
    import colossalai
  File "/usr/local/lib/python3.11/dist-packages/colossalai/__init__.py", line 1, in <module>
    from .initialize import (
  File "/usr/local/lib/python3.11/dist-packages/colossalai/initialize.py", line 18, in <module>
    from colossalai.amp import AMP_TYPE, convert_to_amp
  File "/usr/local/lib/python3.11/dist-packages/colossalai/amp/__init__.py", line 8, in <module>
    from colossalai.context import Config
  File "/usr/local/lib/python3.11/dist-packages/colossalai/context/__init__.py", line 4, in <module>
    from .moe_context import MOE_CONTEXT
  File "/usr/local/lib/python3.11/dist-packages/colossalai/context/moe_context.py", line 8, in <module>
    from colossalai.tensor import ProcessGroup
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/__init__.py", line 2, in <module>
    from .colo_parameter import ColoParameter
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/colo_parameter.py", line 5, in <module>
    from colossalai.tensor.colo_tensor import ColoTensor
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/colo_tensor.py", line 11, in <module>
    from colossalai.tensor.tensor_spec import ColoTensorSpec
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/tensor_spec.py", line 10, in <module>
    @dataclass
     ^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 1220, in dataclass
    return wrap(cls)
           ^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 1210, in wrap
    return _process_class(cls, init, repr, eq, order, unsafe_hash,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 958, in _process_class
    cls_fields.append(_get_field(cls, name, type, kw_only))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 815, in _get_field
    raise ValueError(f'mutable default {type(f.default)} for field '
ValueError: mutable default <class 'colossalai.tensor.distspec._DistSpec'> for field dist_attr is not allowed: use default_factory

lemme try previous versions...


willmil11 commented on May 25, 2024

Tried everything from 0.3.0 down to 0.2.0, which is the oldest version available on pip; it didn't work, same error.


willmil11 commented on May 25, 2024

And here's what I get if I run it with the colossalai command:

root@rpi4-20220808:/home/vscode/lamda/LaMDA-rlhf-pytorch# colossalai run --nproc_per_node 1 train.py
Traceback (most recent call last):
  File "/usr/local/bin/colossalai", line 5, in <module>
    from colossalai.cli import cli
  File "/usr/local/lib/python3.11/dist-packages/colossalai/__init__.py", line 1, in <module>
    from .initialize import (
  File "/usr/local/lib/python3.11/dist-packages/colossalai/initialize.py", line 18, in <module>
    from colossalai.amp import AMP_TYPE, convert_to_amp
  File "/usr/local/lib/python3.11/dist-packages/colossalai/amp/__init__.py", line 8, in <module>
    from colossalai.context import Config
  File "/usr/local/lib/python3.11/dist-packages/colossalai/context/__init__.py", line 4, in <module>
    from .moe_context import MOE_CONTEXT
  File "/usr/local/lib/python3.11/dist-packages/colossalai/context/moe_context.py", line 8, in <module>
    from colossalai.tensor import ProcessGroup
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/__init__.py", line 2, in <module>
    from .colo_parameter import ColoParameter
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/colo_parameter.py", line 5, in <module>
    from colossalai.tensor.colo_tensor import ColoTensor
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/colo_tensor.py", line 11, in <module>
    from colossalai.tensor.tensor_spec import ColoTensorSpec
  File "/usr/local/lib/python3.11/dist-packages/colossalai/tensor/tensor_spec.py", line 10, in <module>
    @dataclass
     ^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 1220, in dataclass
    return wrap(cls)
           ^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 1210, in wrap
    return _process_class(cls, init, repr, eq, order, unsafe_hash,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 958, in _process_class
    cls_fields.append(_get_field(cls, name, type, kw_only))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/dataclasses.py", line 815, in _get_field
    raise ValueError(f'mutable default {type(f.default)} for field '
ValueError: mutable default <class 'colossalai.tensor.distspec._DistSpec'> for field dist_attr is not allowed: use default_factory


nicolay-r commented on May 25, 2024

Hi @willmil11! You need to try the other versions down from 0.3.4 as well.
For example, 0.3.3 does not have the colossalai.core issue on Python 3.10.12, so you may switch to that.
Notebook: https://colab.research.google.com/drive/1x2KrBCR1pAd5Huk6qCMmPlq8SyQDGdes?usp=sharing
As for your most recent exception, I believe the most recent Python (3.11+) is not compatible: python/cpython#99401
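For reference, the kind of failure in your traceback can be reproduced on 3.11 with a toy dataclass; DistSpec / BadSpec / GoodSpec below are only stand-ins for colossalai's _DistSpec and ColoTensorSpec, not the real classes:

# repro_dataclass_default.py -- toy reproduction of the 3.11 "mutable default" check
from dataclasses import dataclass, field

@dataclass
class DistSpec:                            # non-frozen dataclasses are unhashable, like colossalai's _DistSpec
    placement: str = "replicate"

try:
    @dataclass
    class BadSpec:
        dist_attr: DistSpec = DistSpec()   # Python 3.11+ raises ValueError here; 3.10 accepted it
except ValueError as e:
    print("3.11 behaviour:", e)

@dataclass
class GoodSpec:
    dist_attr: DistSpec = field(default_factory=DistSpec)   # accepted on every Python version
print("with default_factory:", GoodSpec())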

