Name: Yuchen Bian
Type: User
Company: Baidu
Bio: Yuchen Bian is a researcher at Baidu Research. He obtained his Ph.D. degree from Penn State University in 2019. His research interests cover NLP and graph mining.
Location: 1195 Bordeaux Dr, Sunnyvale, CA 94089
Blog: http://sites.psu.edu/yuchenbian/
Yuchen Bian's Projects
Local community detection for a given set of query nodes has attracted much research attention recently. The query nodes play an essential role in detection effectiveness. Existing methods perform well when a query node comes from the core region of the target community. However, they struggle with the query-bias issue and perform especially unsatisfactorily when the query nodes come from different communities, or when certain query nodes lie in community-overlapping or community-boundary regions. To address these issues, we take a new angle: replacing the original “intractable” query nodes with new detection-friendly query nodes. In this paper, we propose an effective ATP (Amplified Topology Potential) algorithm to detect core nodes of the target communities w.r.t. the original query nodes. For each query node, ATP first builds a query-oriented topology potential field around the query node by aggregating random walk with restart scores. It then amplifies the topology potential values so that core nodes of target communities are easily distinguished. Graph-size-independent fast approximation strategies are also proposed, together with sound theoretical foundations. Extensive experiments on four real networks using ten state-of-the-art local community detection methods verify that the replacement strategy improves detection effectiveness and efficiency for tough query cases. Please refer to the ICDM 2020 paper (Rethinking Local Community Detection: Query Nodes Replacement) for details.
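For intuition, here is a minimal sketch of the two steps just described, under simplifying assumptions: random-walk-with-restart (RWR) scores are computed from a single query node, aggregated into a rough query-oriented potential, and then amplified so that core nodes stand out. The one-hop aggregation rule and the `gamma` exponent are illustrative placeholders, not the paper's exact formulation.

```python
import numpy as np

def rwr_scores(A, query, restart=0.15, iters=100):
    """Random walk with restart (RWR) from a single query node.
    A: dense (n, n) adjacency matrix; returns per-node visiting scores."""
    n = A.shape[0]
    deg = np.maximum(A.sum(axis=0, keepdims=True), 1e-12)
    P = A / deg                        # column-stochastic transition matrix
    e = np.zeros(n)
    e[query] = 1.0                     # restart distribution: all mass on the query
    r = e.copy()
    for _ in range(iters):
        r = (1.0 - restart) * (P @ r) + restart * e
    return r

def amplified_potential(A, query, gamma=2.0):
    """Toy query-oriented potential: each node aggregates its own RWR mass
    plus its neighbors', then the field is raised to a power gamma so core
    nodes separate from boundary nodes. (Both the aggregation rule and
    gamma are illustrative assumptions, not the paper's formulation.)"""
    r = rwr_scores(A, query)
    potential = r + A @ r              # aggregate RWR scores over one-hop neighborhoods
    return potential ** gamma          # amplification sharpens the core region

# Toy example: triangle {0, 1, 2} with pendant node 3 hanging off node 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(amplified_potential(A, query=3))  # triangle nodes outscore the boundary query node
```

Even with the boundary node 3 as the query, the amplified field peaks inside the triangle, which is the kind of separation that makes a replacement query node easy to pick.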
A list of efficient attention modules
Distillation of a BERT model with the Catalyst framework
CoV-Seq: COVID-19 Genomic Sequence Database and Visualization
Books for machine learning, deep learning, math, NLP, CV, RL, etc.
Graph Transformer Networks (Authors' PyTorch implementation for the NeurIPS 19 paper)
《机器学习宝典》 (Machine Learning Treasury) includes: Google's Machine Learning Crash Course (the techniques) + the Machine Learning Glossary (the mnemonics) + Rules of Machine Learning (the lessons learned) + common-sense questions in machine learning (the fundamentals). A reference for machine learning and deep learning researchers and enthusiasts!
Learning paths and knowledge summaries for machine learning and deep learning
Implementations of basic RL algorithms in minimal lines of code! (PyTorch-based)
Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
Must-read Papers on pre-trained language models.
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXt, EfficientNet, EfficientNetV2, NFNet, Vision Transformer, MixNet, MobileNet-V3/V2, RegNet, DPN, CSPNet, and more
Semantic similarity measures from Babylon Health
Recent Transformer-based CV works and related research.
🤗Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch (usage sketch below)
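As a usage illustration for the last item, a minimal sketch assuming the vit-pytorch package (`pip install vit-pytorch`); the hyperparameter values below are arbitrary placeholders, not recommended settings.

```python
import torch
from vit_pytorch import ViT

# Single-encoder Vision Transformer: the image is split into patches,
# each patch is linearly embedded, and the resulting token sequence is
# passed through one transformer encoder with a classification head.
model = ViT(
    image_size=256,    # input resolution (assumed square)
    patch_size=32,     # (256/32)^2 = 64 patch tokens per image
    num_classes=1000,  # size of the classification head
    dim=1024,          # token embedding dimension
    depth=6,           # number of encoder layers
    heads=16,          # attention heads per layer
    mlp_dim=2048,      # hidden size of each feed-forward block
)

img = torch.randn(1, 3, 256, 256)  # dummy batch of one RGB image
logits = model(img)                # shape: (1, 1000)
```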