Topic: pre-trained-model (Goto Github)
Something interesting about pre-trained-model.
A collection of Audio and Speech pre-trained models.
User: balavenkatesh3322
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, with massive pre-trained Chinese ALBERT models.
User: brightmart
Home Page: https://arxiv.org/pdf/1909.11942.pdf
The implementation of our ICCV 2023 paper "Downstream-agnostic Adversarial Examples"
Organization: cgcl-codes
Home Page: https://arxiv.org/pdf/2307.12280.pdf
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpora, and a leaderboard.
Organization: chineseglue
Home Page: https://www.CLUEbenchmarks.com
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand-new neural operator.
User: d-li14
Home Page: https://arxiv.org/abs/2103.06255
PyTorch implementation of Lambda Network and pretrained Lambda-ResNet.
User: d-li14
Home Page: https://arxiv.org/abs/2102.08602
Official repository for the Uni-Mol series methods.
Organization: dptech-corp
Eden AI: simplify the use and deployment of AI technologies by providing a unique API that connects to the best possible AI engines.
Organization: edenai
Home Page: https://www.edenai.co/
Papers and datasets on cross-domain recommendation, transfer learning, pre-training, and self-supervised learning.
User: fajieyuan
Exploring Visual Prompts for Adapting Large-Scale Models
User: hjbahng
[NeurIPS 2023] "GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks"
User: hkuds
Home Page: https://arxiv.org/abs/2311.04245
"HiGPT: Heterogeneous Graph Language Models"
User: hkuds
Home Page: https://higpt-hku.github.io
"UrbanGPT: Spatio-Temporal Large Language Models"
User: hkuds
Home Page: https://urban-gpt.github.io
MiniRBT (a series of small Chinese pre-trained models)
Organization: iflytek
Meta-Learning for EEG, Sleep Staging, Transfer Learning, Pre-trained EEG, and PSG datasets (IEEE Journal of Biomedical and Health Informatics)
User: iobt-vistec
Universal Joint Feature Extraction for P300 EEG Classification Using Multi-Task Autoencoder (IEEE Access)
User: iobt-vistec
[AAAI 2023] The implementation of the paper "Energy-Motivated Equivariant Pretraining for 3D Molecular Graphs"
User: jiaor17
Home Page: https://arxiv.org/abs/2207.08824
Performance testing of 24 machine learning models on a Raspberry Pi using TensorFlow Lite and the Google Coral USB Accelerator.
User: jiteshsaini
[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch implementation of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
User: keyu-tian
Home Page: https://arxiv.org/abs/2301.03580
Thank you BART! Rewarding Pre-Trained Models Improves Formality Style Transfer (ACL 2021)
User: laihuiyuan
Home Page: https://arxiv.org/abs/2105.06947
Simple, fast, and easy to read. Yes, we use the PyTorch framework!
User: lornatang
Pretrained model for Chinese scientific text.
User: lvyufeng
Brain Tumor Classification Using Pre-trained Models
User: masoudnick
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Organization: microsoft
Home Page: https://aka.ms/GeneralAI
Self-supervised contrastive learning for time series via time-frequency consistency.
Organization: mims-harvard
Home Page: https://zitniklab.hms.harvard.edu/projects/TF-C/
[MICCAI 2019] [MEDIA 2020] Models Genesis
User: mrgiovanni
[ICLR 2024] Supervised Pre-Trained 3D Models for Medical Image Analysis
User: mrgiovanni
Home Page: https://www.cs.jhu.edu/~alanlab/Pubs23/li2023suprem.pdf
A work in progress to build out solutions in Rust for MLOps.
Organization: nogibjj
Home Page: https://nogibjj.github.io/rust-tutorial/
[ICLR 2024] Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
Organization: nvlabs
Home Page: https://arxiv.org/abs/2306.06189
[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
Organization: nvlabs
Home Page: https://arxiv.org/abs/2206.09959
An official implementation of Advancing Radiograph Representation Learning with Masked Record Modeling (ICLR'23)
Organization: rl4m
Powerful handwritten text recognition. A simple-to-use, unofficial implementation of the paper "TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models".
User: rsommerfeld
This repository is the official implementation of our paper "MVP: Multi-task Supervised Pre-training for Natural Language Generation".
Organization: rucaibox
Source code for our EMNLP'21 paper "Raise a Child in Large Language Model: Towards Effective and Generalizable Fine-tuning"
User: runxinxu
Fine-tune a video generator using pretrained Stable Diffusion and Disney models.
User: saba99
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
Organization: salesforce
Official repository of the AAAI 2022 paper "GALAXY: A Generative Pre-trained Model for Task-Oriented Dialog with Semi-Supervised Learning and Explicit Policy Injection"
Organization: siat-nlp
An Open-Source Framework for Prompt-Learning.
Organization: thunlp
Home Page: https://thunlp.github.io/OpenPrompt/
Code of the CVPR 2021 Oral paper "A Recurrent Vision-and-Language BERT for Navigation"
User: yiconghong
Pre-trained Chinese ELECTRA models.
User: ymcui
Home Page: http://electra.hfl-rc.com
PERT: Pre-training BERT with Permuted Language Model
User: ymcui
Home Page: https://arxiv.org/abs/2203.06906
A curated list of papers on pre-training for graph neural networks (Pre-train4GNN).
User: yuanchenbei
ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
User: zhangyikaii
Home Page: https://zhijian.readthedocs.io
Searching prompt modules for parameter-efficient transfer learning.
User: zhangyuanhan-ai
Must-read papers on Knowledge Editing for Large Language Models.
Organization: zjunlp
An open-sourced knowledgeable Large Language Model framework.
Organization: zjunlp
Home Page: http://knowlm.zjukg.cn/
[ICLR 2024] Domain-Agnostic Molecular Generation with Chemical Feedback
Organization: zjunlp
Home Page: https://huggingface.co/spaces/zjunlp/MolGen
Must-read papers on NLP for science.
Organization: zjunlp