Labels: enhancement (New feature or request)
📌 Summary
Bring the remote Ollama backend hack to rep+ officially, by providing a Google Colab notebook that:
- Installs and runs Ollama inside Colab
- Sets up a Cloudflare Tunnel
- Outputs a public endpoint for rep+ to use as an LLM backend
This does NOT run rep+ in Colab — it only powers its AI Explain / Suggest Attack features remotely when local resources are limited or Ollama can't be installed.
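A minimal sketch of what the notebook cells could look like. The model name, port, and cloudflared download path are assumptions for illustration; a user-provided tunnel token would replace the tokenless quick-tunnel shown in the last cell:

```shell
# Cell 1: install Ollama inside the Colab VM (official install script)
curl -fsSL https://ollama.com/install.sh | sh

# Cell 2: start the Ollama API server in the background (default port 11434)
nohup ollama serve > ollama.log 2>&1 &
sleep 5

# Cell 3: pull a model (llama3 here is just an example choice)
ollama pull llama3

# Cell 4: install cloudflared and expose the API through a Cloudflare Tunnel.
# With a user-provided token (named tunnel):
#   ./cloudflared tunnel run --token "$TUNNEL_TOKEN"
# Without a token, a throwaway quick tunnel that prints a public URL:
wget -q https://github.com/cloudflare/cloudflared/releases/latest/download/cloudflared-linux-amd64 -O cloudflared
chmod +x cloudflared
./cloudflared tunnel --url http://localhost:11434
```

rep+ would then be pointed at the printed tunnel URL instead of `http://localhost:11434`.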
🎯 What the Notebook Should Do
- Install Ollama and dependencies in Colab
- Download a chosen model (e.g., DeepSeek R1 or Llama 3)
- Start the Ollama API server
- Establish a Cloudflare Tunnel using a user-provided token
- Print the public tunnel URL that rep+ can point to
🔥 Why This Is Needed
Right now, rep+ is a Chrome DevTools HTTP Repeater with AI: great for replaying and poking API calls, explaining behavior, and suggesting attack vectors. But its built-in dependency on a local Ollama instance means:
- Many users can’t run heavy models locally
- Setup is painful (GPU drivers, large model storage)
A Colab + Tunnel backend:
- Solves hardware limits (RAM/CPU) instantly
- Lets rep+ run AI features easily on any machine
- Great for security researchers on lightweight laptops
🙏 Please consider adding this Colab remote Ollama notebook to the rep+ project; it would make the AI features useful, accessible, and way less painful for everyone.
Status: Backlog