Shenzhen University, Shenzhen
https://github.com/kyodocn

Stars
Python tool for converting files and office documents to Markdown.
An AI personal tutor built with Llama 3.1
An online tool for batch-downloading WeChat Official Account articles, with support for exporting read counts and comment data. No environment setup is required: it can be used via the online site (https://down.mptext.top), and it also supports private Docker deployment and Cloudflare deployment. Multiple download formats are supported; the HTML format reproduces the article layout and styling with full fidelity.
A tool to parse TMX files and provide a simple searchable UI.
A simple python script for converting translation memories from TMX to Stardict format (Searchable from Goldendict)
Desktop application of new Bing's AI-powered chat (Windows, macOS and Linux)
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
Browser extension and cross-platform desktop application for word-selection translation based on the ChatGPT API.
Unlock your displays on your Mac! Flexible HiDPI scaling, XDR/HDR extra brightness, virtual screens, DDC control, extra dimming, PIP/streaming, EDID override and lots more!
This is a content file for "Pandas 100 Knocks". I don't offer a license; I own the copyright on this source code.
Materials for the Hugging Face Diffusion Models Course
Japanese word embedding with Sudachi and NWJC 🌿
🤱🏻 Turn any webpage into a lightweight desktop app with one command.
Censorship circumvention and free internet access: one-click scripts and tutorials for building VPN/proxy servers on a VPS, free Shadowsocks/SS/SSR/V2Ray/GoFlyway accounts and nodes, circumvention browsers and software, and YouTube video download and mirror links, covering Windows, macOS, Linux, iOS, Android, and routers…
"Deep Learning from Scratch" (『ゼロから作る Deep Learning』, O'Reilly Japan, 2016)
KoichiYasuoka/pynlpir
Forked from tsroten/pynlpir. A Python wrapper around the NLPIR/ICTCLAS Chinese segmentation software.
A curated list of resources dedicated to Python libraries, LLMs, dictionaries, and corpora of NLP for Japanese
Tokenizer POS-tagger Lemmatizer and Dependency-parser for modern and contemporary Japanese with BERT models
Extend character support to 5,000+, lift editing limits, integrate with DeepL (Pro or API), enable formal/informal tone selection, and provide privacy‑friendly device fingerprint rotation.
A Cython MeCab wrapper for fast, pythonic Japanese tokenization and morphological analysis.
Use custom tokenizers in spacy-transformers