
mlchain-python's People

Contributors

duonglong289, lamhoangtung, meocong, pdd-vn, phamngoclinh96


mlchain-python's Issues

objc[28043]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called.

Hi there

I found a bug when running mlchain run on my MBP (macOS 10.15.6, but I suspect it can also be reproduced on older versions).

  • When wrapper: None is configured in mlconfig and mlchain run is executed, API calls work without issue.

  • When wrapper: gunicorn is configured in mlconfig and mlchain run is executed, a bug prevents functions from being called:

    objc[28043]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called.
    objc[28043]: +[__NSCFConstantString initialize] may have been in progress in another thread when fork() was called. We cannot safely call it or ignore it in the fork() child process. Crashing instead. Set a breakpoint on objc_initializeAfterForkError to debug.

The problem disappears when I run this command before mlchain run:
export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES

From my testing:
On Windows: gunicorn cannot be installed, so an alternative should be found.
On Linux: everything works fine.

Need to expose sock from Gunicorn wrapper

As far as I know, mlchain supports serving behind nginx on port 8080 through mlchain serve ...
However, inspecting the code, there is no Unix socket exposed for nginx to use.

Please add support for this. Thank you
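For context, Gunicorn itself can already bind to a Unix domain socket that nginx proxies to via an upstream entry such as `server unix:/tmp/mlchain.sock;`. A minimal sketch of the settings involved — the socket path and worker count are illustrative, not mlchain's current behavior:

```python
# gunicorn_conf.py — illustrative: bind a Unix socket instead of a TCP port,
# so nginx can reach the app through the filesystem rather than the network.
bind = "unix:/tmp/mlchain.sock"
workers = 2
```

Exposing an equivalent `bind` option from the Gunicorn wrapper would be enough for the nginx use case.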

Remove + compress js files

There are unnecessary and uncompressed JavaScript files:

docs/js/chat.js
docs/js/custom.js
docs/js/termynal.js
mlchain/server/templates/swaggerui/swagger-ui-standalone-preset.js

Compress css files

The following CSS files should be compressed:

docs/css/custom.css
docs/css/style.css
docs/css/termynal.css
mlchain/server/static/Source-Sans-Pro.css
mlchain/server/templates/swaggerui/swagger-ui.css

Support on ARM

Hi there, we are experimenting with this library on ARM-based hardware (e.g. NVIDIA Jetson).
It would be good to cover this in CI/CD and the future plan.

Update mlchain init mlconfig.yml

Hello there, as far as I know, mlchain supports many features. However, the mlconfig.yml generated by mlchain init could showcase them better.
E.g.:

  • mlchain supports mode for exported environment variables
  • Notes for production vs. dev environments: wrapper, host (localhost -> 0.0.0.0)
  • etc.
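A sketch of what a richer generated mlconfig.yml could look like — apart from the standard fields (name, entry_file, host, port, server, wrapper), the mode section and its keys are assumptions for illustration only:

```yaml
name: my-model            # service name
entry_file: server.py     # file exposing the ServeModel instance
host: 0.0.0.0             # 0.0.0.0 for production, localhost for dev
port: 8001
server: flask             # flask / quart / grpc
wrapper: gunicorn         # gunicorn for production; None for local dev
mode:                     # assumed: per-environment exported variables
  default: dev
  env:
    dev:
      debug: true
    prod:
      debug: false
```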

ModuleNotFoundError: No module named 'code.server'; 'code' is not a package

Some users may encounter this when running mlchain run:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/gunicorn/arbiter.py", line 583, in spawn_worker
    worker.init_process()
  File "/usr/local/lib/python3.7/dist-packages/gunicorn/workers/gthread.py", line 92, in init_process
    super().init_process()
  File "/usr/local/lib/python3.7/dist-packages/gunicorn/workers/base.py", line 119, in init_process
    self.load_wsgi()
  File "/usr/local/lib/python3.7/dist-packages/gunicorn/workers/base.py", line 144, in load_wsgi
    self.wsgi = self.app.wsgi()
  File "/usr/local/lib/python3.7/dist-packages/gunicorn/app/base.py", line 67, in wsgi
    self.callable = self.load()
  File "/usr/local/lib/python3.7/dist-packages/mlchain/cli/run.py", line 205, in load
    serve_model = get_model(entry_file, serve_model=True)
  File "/usr/local/lib/python3.7/dist-packages/mlchain/cli/run.py", line 310, in get_model
    module = importlib.import_module(import_name)
  File "/usr/lib/python3.7/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 962, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'code.server'; 'code' is not a package
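A likely cause (an assumption, not confirmed in the issue): the project directory is named code, which is shadowed by Python's standard-library code module. Since the stdlib code is a plain module rather than a package, importing code.server fails. A quick check:

```python
import importlib.util

# The stdlib `code` module wins the import and it is a plain module,
# not a package, so `code.server` cannot be resolved inside it.
spec = importlib.util.find_spec("code")
print(spec.origin)                      # path into the stdlib
print(spec.submodule_search_locations)  # None → not a package
```

Renaming the project directory to something that does not clash with a stdlib module (e.g. app/) avoids the error.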

Better mlchain exported variables converter

Hello there,

As far as I know, exported environment variables are treated as strings in Python.
mlchain already supports type indicators on function parameters to convert, e.g., image binary to a numpy array. However, when I export an int8 variable, I still get a string in my code.

Please consider adding this; it would be a nice feature.
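As a workaround in the meantime, a small helper can cast exported variables explicitly — this helper is illustrative and not part of mlchain's API:

```python
import os

def env_var(name, cast=str, default=None):
    """Read an environment variable and cast it; env vars are always strings."""
    raw = os.environ.get(name)
    return default if raw is None else cast(raw)

os.environ["BATCH_SIZE"] = "8"
batch_size = env_var("BATCH_SIZE", int)
print(batch_size, type(batch_size).__name__)  # 8 int
```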

ImportError: cannot import name 'InvalidDataError' from 'hyperframe.exceptions'

@vuonghoainam reported a bug when installing mlchain 0.1.7 and running it with mlchain run:

    from .h2 import H2Protocol
  File "/usr/local/lib/python3.7/site-packages/hypercorn/protocol/h2.py", line 4, in <module>
    import h2.connection
  File "/usr/local/lib/python3.7/site-packages/h2/connection.py", line 33, in <module>
    from .frame_buffer import FrameBuffer
  File "/usr/local/lib/python3.7/site-packages/h2/frame_buffer.py", line 9, in <module>
    from hyperframe.exceptions import InvalidFrameError, InvalidDataError
ImportError: cannot import name 'InvalidDataError' from 'hyperframe.exceptions' (/usr/local/lib/python3.7/site-packages/hyperframe/exceptions.py)
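This looks like a version mismatch between h2 and hyperframe: InvalidDataError only exists in some hyperframe releases, so an h2 that expects it fails against an older hyperframe. A hedged sketch of a requirements pin that makes pip resolve a matching pair — the exact version boundaries are assumptions:

```
# requirements.txt fragment (illustrative) — upgrade both together
h2>=4.0.0
hyperframe>=6.0.0
```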

Mlchain installation problem with virtual environment (anaconda)

ERROR: Failed building wheel for bottleneck
Failed to build bottleneck
ERROR: Could not build wheels for bottleneck which use PEP 517 and cannot be installed directly

Ubuntu: 18.04.5 LTS
Python: 3.6
conda: 4.8.3

ImportError: cannot import name 'ResponseClosed' from 'httpx'

Traceback (most recent call last):
  File "/home/techainer_docker/miniconda/bin/mlchain", line 5, in <module>
    from mlchain.cli.main import main
  File "/home/techainer_docker/miniconda/lib/python3.8/site-packages/mlchain/__init__.py", line 34, in <module>
    from .client import Client
  File "/home/techainer_docker/miniconda/lib/python3.8/site-packages/mlchain/client/__init__.py", line 3, in <module>
    from .grpc_client import GrpcClient
  File "/home/techainer_docker/miniconda/lib/python3.8/site-packages/mlchain/client/grpc_client.py", line 4, in <module>
    from .base import MLClient
  File "/home/techainer_docker/miniconda/lib/python3.8/site-packages/mlchain/client/base.py", line 10, in <module>
    from httpx import (
ImportError: cannot import name 'ResponseClosed' from 'httpx' (/home/techainer_docker/miniconda/lib/python3.8/site-packages/httpx/__init__.py)
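Likely the installed httpx release is newer than the one mlchain 0.1.x was written against — httpx reworked its exception hierarchy around the 0.14 release, removing names like ResponseClosed. A hedged requirements pin sketch; the exact boundary version is an assumption:

```
# requirements.txt fragment (illustrative)
httpx<0.14
```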

TensorFlow 1 Session hanging when run with pure python but not with CLI

An internal Techainer project discovered a bug: running python3 server.py with content like this:

from mlchain.base import ServeModel
from model import Model
from mlchain import mlconfig

# mlconfig.load_config('mlconfig.yaml')
model = Model(weight_path=mlconfig.weight,
              debug=mlconfig.debug)
model = ServeModel(model)

if __name__ == "__main__":
    from mlchain.rpc.server.flask_server import FlaskServer
    FlaskServer(model).run(bind=['127.0.0.1:8004'], gunicorn=True)

causes the self.sess.run inside the model class to hang forever, while using the mlchain run CLI does not.

Note that this model uses TensorFlow 1.14; 1.15 suffers from the same problem.

This DOES NOT affect production usage, since we only use mlchain run, but the bug is worth further examination.
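One plausible explanation (an assumption, not confirmed in the issue): a TF1 session created in the parent process before Gunicorn forks its workers often does not survive in the child, so sess.run blocks forever. A common mitigation is lazy, per-process initialization — the sketch below is illustrative, with build_session standing in for the real tf.Session() setup; this is not mlchain's actual mechanism:

```python
import os

class LazyModel:
    """Create the heavyweight session only in the process that serves requests."""

    def __init__(self):
        self._sess = None
        self._pid = None

    def _ensure_session(self):
        # Re-create the session if we are in a different (forked) process.
        if self._sess is None or self._pid != os.getpid():
            self._sess = self.build_session()
            self._pid = os.getpid()

    def build_session(self):
        return object()  # placeholder for tf.Session() + graph loading

    def predict(self, x):
        self._ensure_session()
        return x  # placeholder for self._sess.run(...)
```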

TypeError: 'coroutine' object is not iterable

When I ran QuartServer, I got an exception.

Traceback (most recent call last):                                                                                                      
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/uvicorn/protocols/http/httptools_impl.py", line 396, in run_asgi       
    result = await app(self.scope, self.receive, self.send)                                                                             
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/uvicorn/middleware/proxy_headers.py", line 45, in __call__             
    return await self.app(scope, receive, send)                                                                                         
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/app.py", line 2117, in __call__                                  
    await self.asgi_app(scope, receive, send)                                                                                           
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/app.py", line 2140, in asgi_app                                  
    await asgi_handler(receive, send)                                                                                                   
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/asgi.py", line 33, in __call__                                   
    _raise_exceptions(done)                                                                                                             
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/asgi.py", line 256, in _raise_exceptions                         
    raise task.exception()                                                                                                              
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/asgi.py", line 84, in handle_request                             
    await asyncio.wait_for(self._send_response(send, response), timeout=timeout)                                                        
  File "/usr/lib/python3.7/asyncio/tasks.py", line 442, in wait_for                                                                     
    return fut.result()                                                                                                                 
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/asgi.py", line 98, in _send_response                             
    async for data in body:                                                                                                             
  File "/mnt/hdd/spaces/miles/.local/lib/python3.7/site-packages/quart/wrappers/response.py", line 129, in _aiter                       
    for data in iterable:  # type: ignore                                                                                               
TypeError: 'coroutine' object is not iterable

My environment:

MlChain 0.1.9
Flask 1.1.2
Quart 0.14.1

Defining api routes

I've recently started using this framework. I can't find instructions on how to define a specific URL path for the API beyond IP:port. What I want is to deploy an API at a URL like IP:port/route/to/api. How do I achieve that? Thank you.
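Until the framework supports path prefixes natively, a reverse proxy can map a prefix onto the server. A hedged nginx sketch — the upstream address and the prefix are assumptions, and mlchain's own endpoint layout is unchanged behind it:

```nginx
# Illustrative: expose the mlchain server under /route/to/api/
location /route/to/api/ {
    proxy_pass http://127.0.0.1:8001/;  # trailing slash strips the prefix
}
```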

Support Starlette

Starlette is more stable than Quart, so we are considering removing Quart and integrating Starlette as the default server (currently Flask).

Sentry integration

  • Sentry integration
  • Integrate Sentry Performance for Mlchain Client and Task
  • Add GPU info when error with Sentry
  • Ignore Python Packages when logging Sentry
  • Fix Sentry failing when started under Gunicorn

PyYAML Breaking Dependency

With mlchain 0.1.8rc1, I encountered the following bug during the initialization phase with mlchain. Please help resolve this. Thank you.
This Stack Overflow thread should be helpful: Stackoverflow
face_detection_1 | Traceback (most recent call last):
face_detection_1 |   File "/usr/local/bin/mlchain", line 11, in <module>
face_detection_1 |     sys.exit(main())
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/mlchain/cli/main.py", line 50, in main
face_detection_1 |     cli.main(args=sys.argv[1:], prog_name="python -m mlchain" if as_module else None)
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 782, in main
face_detection_1 |     rv = self.invoke(ctx)
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 1259, in invoke
face_detection_1 |     return _process_result(sub_ctx.command.invoke(sub_ctx))
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 1066, in invoke
face_detection_1 |     return ctx.invoke(self.callback, **ctx.params)
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 610, in invoke
face_detection_1 |     return callback(*args, **kwargs)
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/mlchain/cli/run.py", line 104, in run_command
face_detection_1 |     config = mlconfig.load_file(config)
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/mlchain/config.py", line 152, in load_file
face_detection_1 |     return load_yaml(path)
face_detection_1 |   File "/usr/local/lib/python3.7/dist-packages/mlchain/config.py", line 144, in load_yaml
face_detection_1 |     return yaml.load(f, Loader=yaml.FullLoader)
face_detection_1 | AttributeError: module 'yaml' has no attribute 'FullLoader'
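yaml.FullLoader was only added in PyYAML 5.1, so this traceback indicates an older PyYAML in the environment. Upgrading PyYAML to >= 5.1 resolves it; alternatively, yaml.safe_load works across versions and is the safer default for config files. A small sketch (the YAML content is illustrative):

```python
import yaml

# safe_load exists on old and new PyYAML alike, and never constructs
# arbitrary Python objects — appropriate for mlconfig-style files.
data = yaml.safe_load("name: face_detection\nport: 8001")
print(data)  # {'name': 'face_detection', 'port': 8001}
```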
