- Talking to Botty 💬
- Configuring Botty 🔧
- Extending Botty's Responses ✍️
- Running Botty Locally 🚀
- Botty in the ☁️
Botty is built on top of socket.io by default, so you should use a socket.io client implementation to talk to it.
In JavaScript we can simply do something like:
```js
import io from 'socket.io-client';

// Connect to the Botty server
const socket = io(
  'https://botty.alexgurr.com',
  { transports: ['websocket', 'polling', 'flashsocket'] }
);

// Listen for Botty's replies
socket.on('bot-message', (message) => {
  // do something
});
```

Botty is simple and cleanly written. This makes it easy to swap socket.io out for a generic WebSocket or REST solution, for example.
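To send something to Botty, emit on the user message event. The event names in the sketch below (`'user-message'`, `'bot-typing'`) are assumptions for illustration and must match the `USER_MESSAGE_EVENT` and `BOT_TYPING_EVENT` values configured in `constants.js` (described below):

```js
// Assumed event names - replace them with the values of USER_MESSAGE_EVENT and
// BOT_TYPING_EVENT from constants.js if they differ
socket.emit('user-message', 'Hi Botty!');

// Fired while Botty 'types' a reply (never emitted if MAX_TYPING_S is falsy)
socket.on('bot-typing', () => {
  // e.g. show a typing indicator in your UI
});
```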
Botty has an easy-to-change constants file, `constants.js`, in the root of the server. Here's what you can change (a sketch of an example file follows the lists below):
General configuration
- `PORT`: the port the Botty server should listen on.
- `RESPONSES_FILE_PATH`: the file path of the dataset Botty should source its responses from. Expects a CSV file with columns matching `RESPONSES_INPUT_KEY` and `RESPONSES_OUTPUT_KEY` below.
- `RESPONSES_INPUT_KEY`: the name of the input (matched phrase) column in the CSV file above.
- `RESPONSES_OUTPUT_KEY`: the name of the output (message) column in the CSV file above.
- `USER_MESSAGE_EVENT`: the event string Botty listens to for user socket messages.
- `BOT_MESSAGE_EVENT`: the event string Botty emits for its response messages.
- `BOT_TYPING_EVENT`: the event string Botty emits when typing a response. If `MAX_TYPING_S` is falsy, this event will never be emitted.
Things to make Botty seem more real
- `DEFAULT_RESPONSE`: the message Botty replies with if it finds no matching response.
- `RESPONSE_MATCH_THRESHOLD`: Botty's response-matching tolerance. The lower this value, the looser the matches.
- `MIN_TYPING_S`: the minimum time Botty should 'type' for, in seconds.
- `MAX_TYPING_S`: the maximum time Botty should 'type' for, in seconds. Set this to 0 to skip typing events.
- `MIN_NATURAL_PAUSE_S`: the minimum pause Botty will take before emitting its first event, in seconds.
- `MAX_NATURAL_PAUSE_S`: the maximum pause Botty will take before emitting its first event, in seconds.
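Putting it together, a `constants.js` might look something like the sketch below. The values are illustrative placeholders rather than Botty's actual defaults (apart from `'bot-message'` and `response_dataset.csv`, which appear elsewhere in this README), and whether the file uses CommonJS or ES module exports depends on the server code:

```js
// Illustrative sketch only - placeholder values, not Botty's real defaults
module.exports = {
  // General configuration
  PORT: 8080,
  RESPONSES_FILE_PATH: './response_dataset.csv',
  RESPONSES_INPUT_KEY: 'input',
  RESPONSES_OUTPUT_KEY: 'output',
  USER_MESSAGE_EVENT: 'user-message',
  BOT_MESSAGE_EVENT: 'bot-message',
  BOT_TYPING_EVENT: 'bot-typing',

  // Things to make Botty seem more real
  DEFAULT_RESPONSE: "Sorry, I didn't quite get that.",
  RESPONSE_MATCH_THRESHOLD: 0.5,
  MIN_TYPING_S: 1,
  MAX_TYPING_S: 3, // set to 0 to skip typing events entirely
  MIN_NATURAL_PAUSE_S: 0.5,
  MAX_NATURAL_PAUSE_S: 2
};
```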
Botty has a default dataset file called response_dataset.csv. This is easily extendable, or you can provide your own. If you want to bring your own file, simply change the value of the `RESPONSES_FILE_PATH` constant and make sure it's in the correct format (see constants above).
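For reference, a minimal dataset in that format could look like the rows below, where the `input` and `output` column headers are placeholders that must match whatever `RESPONSES_INPUT_KEY` and `RESPONSES_OUTPUT_KEY` are set to:

```csv
input,output
how are you,I'm great thanks! How are you?
what's your name,I'm Botty 🤖
```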
Feel free to open a pull request to extend the default file.
```sh
# install dependencies
yarn

# Botty will be available through socket.io on the port defined by the PORT constant
yarn start
```
Botty is currently hosted and waiting to chat to your app at https://botty.alexgurr.com/. The server isn't free - if you'd like to help out you could
Botty is actively used in the Chatter ReactJS Coding Challenge.
