Welcome to my personal GitHub repository! This repository contains tools and tutorials for AI/ML development, particularly focused on large language model deployment and infrastructure.
A comprehensive Jupyter notebook that automates the setup of the DeepSeek R1 14B language model using Ollama, with Nginx reverse proxy and Ngrok tunneling for public access.
File: deepseek-r1-14b-ollama-nginx.ipynb
This notebook provides a complete, automated setup for running the DeepSeek R1 14B model locally with professional-grade infrastructure:
- Model Deployment: Downloads and configures DeepSeek R1 14B via Ollama
- Reverse Proxy: Sets up Nginx for production-ready request handling
- Public Access: Creates secure tunnels using Ngrok for external access
- Environment Configuration: Optimizes settings for performance and compatibility
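As a rough sketch, the reverse-proxy piece is conceptually similar to the Nginx config below. This is illustrative only, not the exact file the notebook writes; the server name, timeouts, and headers here are assumptions:

```nginx
# Illustrative reverse-proxy config: forward public traffic on port 80
# to the local Ollama server (default port 11434).
server {
    listen 80;
    server_name _;

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host localhost;           # Ollama expects a local Host header
        proxy_set_header X-Real-IP $remote_addr;   # preserve the caller's IP for logs
        proxy_read_timeout 300s;                   # LLM responses can take a while
    }
}
```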
Before running the notebook, ensure you have:
- Python 3.6 or higher
- Sufficient disk space (DeepSeek R1 14B requires ~8GB)
- Internet connection for model download
- Ngrok account and auth token (sign up at ngrok.com to get one)
- Root/sudo access (for Nginx installation)
1. Clone this repository

   ```bash
   git clone https://github.com/JarrydGordon/JarrydGordon.git
   cd JarrydGordon
   ```

2. Install the required Python packages

   ```bash
   pip install jupyter pyngrok
   ```

3. Get your Ngrok auth token
   - Sign up at ngrok.com
   - Copy your auth token from the dashboard

4. Run the notebook

   ```bash
   jupyter notebook deepseek-r1-14b-ollama-nginx.ipynb
   ```

5. Follow the prompts
   - Enter your Ngrok auth token when requested
   - Wait for the automated setup to complete
The notebook automatically installs and configures:
- Ollama: Local LLM server
- DeepSeek R1 14B: The language model
- Nginx: Web server and reverse proxy
- Ngrok: Secure tunneling service
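Once these components are installed, you can sanity-check that the local Ollama server is up before exposing it publicly. This is a quick manual check, not a step the notebook requires; 11434 is Ollama's default port:

```shell
# Query Ollama's local model-list endpoint; if the server is down,
# fall back to a readable message instead of failing silently.
curl -s http://127.0.0.1:11434/api/tags || echo "Ollama server not reachable"
```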
Once setup is complete:
- The notebook will display a public URL (e.g., `https://xxxxx.ngrok-free.app`)
- Access your DeepSeek R1 model via the Ollama API:

  ```bash
  curl -X POST https://your-ngrok-url.ngrok-free.app/api/generate \
    -H "Content-Type: application/json" \
    -d '{"model": "deepseek-r1:14b", "prompt": "Hello, world!"}'
  ```
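If you prefer calling the endpoint from Python instead of curl, here is a minimal standard-library sketch. `BASE_URL` is a placeholder for the URL the notebook prints, and the helper names are my own, not part of the notebook:

```python
import json
from urllib import request

BASE_URL = "https://your-ngrok-url.ngrok-free.app"  # placeholder: use the URL the notebook prints


def build_generate_request(prompt, model="deepseek-r1:14b", base_url=BASE_URL):
    """Build the URL and JSON body for Ollama's /api/generate endpoint."""
    body = {"model": model, "prompt": prompt, "stream": False}  # stream=False -> one JSON reply
    return base_url + "/api/generate", json.dumps(body).encode("utf-8")


def generate(prompt, **kwargs):
    """POST the request and return the generated text."""
    url, data = build_generate_request(prompt, **kwargs)
    req = request.Request(url, data=data, headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Setting `"stream": False` asks Ollama to return a single JSON object rather than a stream of partial responses, which keeps the client simple.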
Internet → Ngrok Tunnel → Nginx (Port 80) → Ollama (Port 11434) → DeepSeek R1 14B
- Security: The Ngrok tunnel makes your model publicly accessible. Monitor usage and consider authentication for production use.
- Resources: DeepSeek R1 14B requires significant computational resources. Performance may vary based on hardware.
- Costs: Ngrok free tier has usage limits. Consider upgrading for heavy usage.
Common Issues:
- "Ollama command not found"
  - Restart the kernel and re-run the installation cells
  - Ensure `/usr/local/bin` is in your PATH
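If the binary did land in `/usr/local/bin` (a common install location on Linux, though your system may differ), you can add it to the current shell session's PATH like this:

```shell
# Prepend /usr/local/bin so the ollama binary can be resolved in this session
export PATH="/usr/local/bin:$PATH"
# Verify the shell can now find the command (prints its path if found)
command -v ollama || echo "ollama still not on PATH"
```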
- Nginx permission errors
  - Make sure you're running with sudo privileges
  - Check that port 80 is not already in use
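One way to check whether another service already holds port 80 without extra tools is bash's `/dev/tcp` pseudo-device (a bash-specific feature; on other shells, use `ss -ltn` or `netstat` instead):

```shell
# Try to open a TCP connection to localhost:80; success means something is listening.
if (exec 3<>/dev/tcp/127.0.0.1/80) 2>/dev/null; then
  echo "port 80 is in use"
else
  echo "port 80 appears free"
fi
```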
- Ngrok tunnel fails
  - Verify your auth token is correct
  - Check your Ngrok account limits
- Model download timeout
  - Ensure a stable internet connection
  - The 14B model is large (~8GB) and may take time to download
Contributions are welcome! If you have improvements or bug fixes:
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
This project is open source and available under the MIT License.
- GitHub: @JarrydGordon
- Feel free to open an issue for questions or suggestions!
⭐ If you find this repository helpful, please consider giving it a star!