Stars
A GPT-4 AI Tutor Prompt for customizable personalized learning experiences.
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
A repository for pretraining from scratch plus SFT of a small-parameter Chinese LLaMa2; a single 24 GB GPU is enough to obtain a chat-llama2 with basic Chinese question-answering ability.
The largest collection of PyTorch image encoders / backbones. Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Instant voice cloning by MIT and MyShell. Audio foundation model.
The fast version of DeLong's method for computing the covariance of unadjusted AUC.
PyPI package to calculate comprehensive confidence intervals for classification positive rate, precision, NPV, and recall using a labeled sample via exact & approximate Frequentist & Bayesian setups.
The official code for "Towards Generalist Foundation Model for Radiology by Leveraging Web-scale 2D&3D Medical Data".
Everything about note management. All in Zotero.
Official code for "LLM-CXR: Instruction-Finetuned LLM for CXR Image Understanding and Generation"
A new collection of medical VQA dataset based on MIMIC-CXR. Part of the work 'EHRXQA: A Multi-Modal Question Answering Dataset for Electronic Health Records with Chest X-ray Images, NeurIPS 2023 D&B'.
Effortless data labeling with AI support from Segment Anything and other awesome models.
This repository contains code to train a self-supervised learning model on chest X-ray images that lack explicit annotations and evaluate this model's performance on pathology-classification tasks.
PyTorch implementation of MAE: https://arxiv.org/abs/2111.06377
Code and models for ICML 2024 paper, NExT-GPT: Any-to-Any Multimodal Large Language Model
ToolJet is the open-source foundation of ToolJet AI - the AI-native platform for building internal tools, dashboards, business applications, workflows and AI agents 🚀
The open-source notification inbox infrastructure. Email, SMS, push, and Slack integrations.
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Original reference implementation of "3D Gaussian Splatting for Real-Time Radiance Field Rendering"
MedicalGPT: Training Your Own Medical GPT Model with ChatGPT Training Pipeline. Trains medical LLMs, implementing incremental pretraining (PT), supervised fine-tuning (SFT), RLHF, DPO, ORPO, and GRPO.
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation