waynelwz / starwhale

This project forked from star-whale/starwhale

An MLOps tool.

Home Page: starwhale.vercel.app

License: Apache License 2.0

starwhale's Introduction

An MLOps/LLMOps Platform

πŸš€ ️☁️ Starwhale Cloud is now open to the public, try it! πŸŽ‰πŸ»

English | δΈ­ζ–‡

What is Starwhale

Starwhale is an MLOps/LLMOps platform that makes model creation, evaluation, and publication much easier. It aims to be a handy tool for data scientists and machine learning engineers. Starwhale helps you:

  • πŸ—οΈ Keep track of your training/testing dataset history including data items and their labels, so that you can easily access them.
  • 🧳 Manage your model packages that you can share across your team.
  • 🌊 Run your models in different environments, either on a Nvidia GPU server or on an embedded device like Cherry Pi.
  • πŸ”₯ Create a online service with interactive Web UI for your models.

Key Concepts

🦍 Starwhale Instance

Each deployment of Starwhale is called an instance. All instances can be managed by the Starwhale Client (swcli). You can start using Starwhale with one of the following instance types:

  • πŸ‘» Starwhale Standalone: Rather than a running service, Starwhale Standalone is a repository that resides in your local file system. It is created and managed by the Starwhale Client (SWCLI). You only need to install SWCLI to use it. Currently, each user on a single machine can have only ONE Starwhale Standalone instance. We recommend using Starwhale Standalone to build and test your datasets, runtimes, and models before pushing them to Starwhale Server/Cloud instances.
  • 🎍 Starwhale Server: Starwhale Server is a service deployed on your own servers. Besides the text-only results from the Starwhale Client (SWCLI), Starwhale Server provides a Web UI for you to manage your datasets and models, evaluate your models in your local Kubernetes cluster, and review the evaluation results.
  • ☁️ Starwhale Cloud: Starwhale Cloud is a managed service hosted on public clouds. By registering an account on https://cloud.starwhale.cn, you are ready to use Starwhale without installing, operating, or maintaining your own instances. Starwhale Cloud also provides public resources for you to download, such as datasets, runtimes, and models. Check the "starwhale/public" project on Starwhale Cloud for more details.

Starwhale tries to keep concepts consistent across different types of instances. In this way, people can easily exchange data and migrate between them.
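Because the concepts are consistent, the same Python SDK call works against any instance type just by changing the URI. A minimal sketch, assuming a dataset named "mnist" exists both on the local Standalone instance and in the Cloud "starwhale:public" project (the dataset names here are illustrative):

from starwhale import dataset

# a short name resolves against the default (local Standalone) instance
local_ds = dataset("mnist")

# a full URI targets a project on a Server/Cloud instance
cloud_ds = dataset("https://cloud.starwhale.cn/project/starwhale:public/dataset/mnist")

# the loading API is identical regardless of where the dataset lives
print(len(local_ds), len(cloud_ds))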

🐘 Starwhale Dataset

Starwhale Dataset offers efficient data storage, loading, and visualization capabilities, making it a dedicated data management tool tailored for machine learning and deep learning.

import torch
from starwhale import dataset, Image

# build dataset for starwhale cloud instance
with dataset("https://cloud.starwhale.cn/project/starwhale:public/dataset/test-image", create="empty") as ds:
    for i in range(100):
        ds.append({"image": Image(f"{i}.png"), "label": i})
    ds.commit()

# load dataset
ds = dataset("https://cloud.starwhale.cn/project/starwhale:public/dataset/test-image")
print(len(ds))
print(ds[0].features.image.to_pil())
print(ds[0].features.label)

torch_ds = ds.to_pytorch()
torch_loader = torch.utils.data.DataLoader(torch_ds, batch_size=5)
print(next(iter(torch_loader)))

πŸ‡ Starwhale Model

Starwhale Model is a standard format for packaging machine learning models that can be used for various purposes, like model fine-tuning, model evaluation, and online serving. A Starwhale Model contains the model file, inference code, configuration files, and any other files required to run the model.

# model build
swcli model build . --module mnist.evaluate --runtime pytorch/version/v1 --name mnist

# model copy from standalone to cloud
swcli model cp mnist https://cloud.starwhale.cn/project/starwhale:public

# model run
swcli model run --uri mnist --runtime pytorch --dataset mnist
swcli model run --workdir . --module mnist.evaluator --handler mnist.evaluator:MNISTInference.cmp
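The --module flag above points at a Python module that defines the model's handlers. A minimal sketch of what such a module could contain (a hypothetical mnist/evaluate.py; it reuses the @evaluation.predict decorator shown in the Starwhale Evaluation section below, while load_model and run_inference are placeholder helpers, not Starwhale APIs):

import typing as t

from starwhale import evaluation

# placeholder helpers standing in for real model loading and inference code
def load_model():
    ...

def run_inference(model, image):
    ...

# predict handler: invoked for each dataset row when the model is run
@evaluation.predict(replicas=1)
def predict_image(data: dict, external: t.Optional[dict] = None):
    model = load_model()
    return run_inference(model, data["image"])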

🐌 Starwhale Runtime

Starwhale Runtime aims to provide a reproducible and sharable running environment for Python programs. You can easily share your working environment with your teammates or outsiders, and vice versa. Furthermore, you can run your programs on Starwhale Server or Starwhale Cloud without worrying about dependencies.

# build from runtime.yaml, conda env, docker image or shell
swcli runtime build --yaml runtime.yaml
swcli runtime build --conda pytorch --name pytorch-runtime --cuda 11.4
swcli runtime build --docker pytorch/pytorch:1.9.0-cuda11.1-cudnn8-runtime
swcli runtime build --shell --name pytorch-runtime

# runtime activate
swcli runtime activate pytorch

# integrated with model and dataset
swcli model run --uri test --runtime pytorch
swcli model build . --runtime pytorch
swcli dataset build --runtime pytorch

πŸ„ Starwhale Evaluation

Starwhale Evaluation enables users to evaluate sophisticated, production-ready distributed models by writing just a few lines of code with the Starwhale Python SDK.

import typing as t
import gradio
from starwhale import Image, evaluation
from starwhale.api.service import api

def model_generate(image):
    ...
    return predict_value, probability_matrix

# predict handler: runs for each dataset row; results are auto-logged
@evaluation.predict(
    resources={"nvidia.com/gpu": 1},
    replicas=4,
)
def predict_image(data: dict, external: t.Optional[dict] = None):
    return model_generate(data["image"])

# evaluate handler: aggregates the auto-logged predict results
@evaluation.evaluate(use_predict_auto_log=True, needs=[predict_image])
def evaluate_results(predict_result_iter: t.Iterator):
    for _data in predict_result_iter:
        ...
    evaluation.log_summary({"accuracy": 0.95, "benchmark": "test"})

# online serving endpoint with an interactive Gradio Web UI
@api(gradio.File(), gradio.Label())
def predict_view(file: t.Any) -> t.Any:
    with open(file.name, "rb") as f:
        data = Image(f.read(), shape=(28, 28, 1))
    _, prob = predict_image({"image": data})
    return {i: p for i, p in enumerate(prob)}

Installation

πŸ‰ Starwhale Standalone

Requirements: Python 3.7–3.11 on Linux or macOS.

python3 -m pip install starwhale
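To confirm the installation, a quick check using only the standard library (importlib.metadata needs Python 3.8+; on 3.7 use the importlib_metadata backport):

from importlib.metadata import version

import starwhale  # verifies the SDK imports cleanly

print(version("starwhale"))  # prints the installed Starwhale version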

πŸ₯­ Starwhale Server

Starwhale Server is delivered as a Docker image, which can be run with Docker directly or deployed to a Kubernetes cluster. For a laptop environment, Minikube is an appropriate choice.

minikube start --addons ingress
helm repo add starwhale https://star-whale.github.io/charts
helm repo update
helm pull starwhale/starwhale --untar --untardir ./charts

helm upgrade --install starwhale ./charts/starwhale -n starwhale --create-namespace -f ./charts/starwhale/values.minikube.global.yaml

Quick Tour

We use MNIST as the hello world example to show the basic Starwhale Model workflow.

πŸͺ… MNIST Evaluation in Starwhale Standalone

πŸͺ† MNIST Evaluation in Starwhale Server

Examples

Documentation, Community, and Support

Contributing

πŸŒΌπŸ‘PRs are always welcomed πŸ‘πŸΊ. See Contribution to Starwhale for more details.

License

Starwhale is licensed under the Apache License 2.0.

starwhale's People

Contributors

anda-ren, dependabot[bot], dreamlandliu, goldenxinxing, jialeicui, lijing-susan, tianweidut, waynelwz, xuchuan, yetone
