Shanghai University Of Engineering Science - Shanghai - https://www.sues.edu.cn/
Stars
Hands-On Artificial Intelligence for IoT, published by Packt
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
"Dive into LLMs" (《动手学大模型》), a series of hands-on programming tutorials for large language models.
A project for predicting CSI 300 stocks, covering stock data download, data cleaning, LSTM model training and testing, and real-time prediction.
Integrated physics-based and ligand-based modeling.
Topological transformer for protein-ligand complex interaction prediction.
Chinese text classification using BERT and ERNIE.
Chinese named entity recognition (NER), including the latest Chinese NER papers, related tools and datasets, as well as Chinese pre-trained models, word vectors, and NER surveys.
Champion solution of the Tianchi traditional-Chinese-medicine instruction-manual entity recognition challenge; Chinese named entity recognition (NER); BERT-CRF, BERT-SPAN & BERT-MRC; PyTorch.
Chinese named entity recognition (with concrete implementations of several models: HMM, CRF, BiLSTM, and BiLSTM+CRF).
LaTeX-format paper templates, including Elsevier, arXiv, and IEEE Access.
ConCare: Personalized Clinical Feature Embedding via Capturing the Healthcare Context (AAAI-2020)
This is a repository with the code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the ACL 2021 paper "Analyzing Sou…
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
Explore a conversational AI powered by Mistral 7B LLM for our college website. Engage with this intelligent assistant to get information on courses, events, and campus resources. Built with cutting…
Chat-甄嬛 is a chat language model that imitates the speaking style of Zhen Huan, built by LoRA fine-tuning ChatGLM2 on all of Zhen Huan's lines and dialogue from the script of Empresses in the Palace (《甄嬛传》).
Galformer: A Transformer with Generative Decoding and a Hybrid Loss Function for Multi-Step Stock Market Index Prediction
CNN+BiLSTM+Attention multivariate time series prediction implemented in Keras.
The GitHub repository for the paper "Informer" accepted by AAAI 2021.
Several transformer architectures were cloned from other repositories and compared on the time series forecasting task over several common series (W-transformer, FEDformer, Autoformer, Informer, Temporal Fusio…
An Awesome List of the latest time series papers and code from top AI venues.
Official implementation of our ICLR 2023 paper "Crossformer: Transformer Utilizing Cross-Dimension Dependency for Multivariate Time Series Forecasting"
Official implementation of "Periodicity Decoupling Framework for Long-term Series Forecasting" (ICLR 2024)
Using Transformer deep learning architecture to predict stock prices.