(UTC +08:00) - https://liu-jc.github.io/ - in/juncheng-liu
Starred repositories
[Survey] A Comprehensive Survey of Self-Evolving AI Agents: A New Paradigm Bridging Foundation Models and Lifelong Agentic Systems
A curated list of papers, code, data, and other resources focused on multimodal time series analysis.
[ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts"
Unified Training of Universal Time Series Forecasting Transformers
A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
PyGWalker: Turn your dataframe into an interactive UI for visual analysis
🦜🔗 Build context-aware reasoning applications
Provides a practical interactive interface for LLMs such as GPT/GLM, with special optimizations for paper reading/polishing/writing. Modular design; supports custom shortcut buttons & function plugins, project analysis & self-translation for Python, C++, and other codebases, and PDF/LaTeX paper translation & summarization. Supports querying multiple LLMs in parallel and local models such as chatglm3. Integrates Tongyi Qianwen, deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, m…
A playbook for systematically maximizing the performance of deep learning models.
Differentiable ODE solvers with full GPU support and O(1)-memory backpropagation.
How does Heterophily Impact the Robustness of Graph Neural Networks? Theoretical Connections and Practical Implications (KDD'22)
The implementation of the NeurIPS 2022 paper "Parameter-free Dynamic Graph Embedding for Link Prediction".
Here you can find the material used for our tutorials.
A collection of resources about partial differential equations, graph neural networks, deep learning, and dynamical system simulation.
"Graph Neural Controlled Differential Equations for Traffic Forecasting", AAAI 2022
The implementation of LSCALE: Latent Space Clustering-Based Active Learning for Node Classification
The official implementation of EIGNN: Efficient Infinite-Depth Graph Neural Networks (NeurIPS 2021)
[ICML 2022] Graph Stochastic Attention (GSAT) for interpretable and generalizable graph learning.
Official code of GIND (Optimization-Induced Graph Implicit Nonlinear Diffusion)
Code and dataset for paper "GRAND+: Scalable Graph Random Neural Networks"
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)