
bgent

A flexible, scalable, and customizable agent for production apps. Comes with batteries included: a database, deployment tooling, and examples using Supabase and Cloudflare.



Connect With Us

Join the Discord server

Features

  • 🛠 Simple and extensible
  • 🎨 Customizable to your use case
  • 📚 Easily ingest and interact with your documents
  • 💾 Retrievable memory and document store
  • ☁️ Serverless architecture
  • 🚀 Deployable in minutes at scale with Cloudflare
  • 👥 Multi-agent and room support
  • 🎯 Goal-directed behavior
  • 📦 Comes with ready-to-deploy examples

What can I use it for?

  • 🤖 Chatbots
  • 🕵️ Autonomous agents
  • 📈 Business process handling
  • 🎮 Video game NPCs

Try the agent

npx bgent

Installation

Install bgent along with a database adapter — SQLite for quick local development, or Supabase for deployments at scale:

npm install bgent

# Select your database adapter
npm install sqlite-vss better-sqlite3 # for sqlite (simple, for local development)
npm install @supabase/supabase-js # for supabase (more complicated but can be deployed at scale)

Set up environment variables

You will need a Supabase account, as well as an OpenAI developer account.

Copy .dev.vars.example to .dev.vars and fill in the environment variables:

SUPABASE_URL="https://your-supabase-url.supabase.co"
SUPABASE_SERVICE_API_KEY="your-supabase-service-api-key"
OPENAI_API_KEY="your-openai-api-key"
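As a sanity check before constructing the runtime, you can verify these variables are actually set. The helper below is not part of bgent, just a minimal sketch:

```typescript
// Hypothetical helper (not part of bgent): report which required
// environment variables are missing so the process can fail fast.
const REQUIRED_VARS = ["SUPABASE_URL", "SUPABASE_SERVICE_API_KEY", "OPENAI_API_KEY"];

function missingVars(env: Record<string, string | undefined>): string[] {
  // Keep only the names that are unset or empty in the given environment.
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Example usage: throw at startup if anything is missing.
// const missing = missingVars(process.env);
// if (missing.length > 0) {
//   throw new Error(`Missing environment variables: ${missing.join(", ")}`);
// }
```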

SQLite Local Setup (Easiest)

You can use SQLite for local development. This is the easiest way to get started with bgent.

import { BgentRuntime, SqliteDatabaseAdapter } from "bgent";
import Database from "better-sqlite3";

const sqliteDatabaseAdapter = new SqliteDatabaseAdapter(new Database(":memory:"));

const runtime = new BgentRuntime({
  serverUrl: "https://api.openai.com/v1",
  token: process.env.OPENAI_API_KEY, // Can be an API key or JWT token for your AI services
  databaseAdapter: sqliteDatabaseAdapter,
  // ... other options
});

Supabase Local Setup

First, you will need to install the Supabase CLI. You can install it using the instructions here.

Once you have the CLI installed, you can run the following commands to set up a local Supabase instance:

supabase init
supabase start

You can now start the bgent project with npm run dev and it will connect to the local Supabase instance by default.

NOTE: You will need Docker installed for this to work. If that is an issue for you, use the Supabase Cloud Setup instructions below instead.

Supabase Cloud Setup

This library uses Supabase as a database. You can set up a free account at supabase.io and create a new project.

  • Step 1: On the Supabase All Projects Dashboard, select “New Project”.
  • Step 2: Select the organization to store the new project in, then assign a database name, password, and region.
  • Step 3: Select “Create New Project”.
  • Step 4: Wait for the database to be set up. This can take a few minutes while Supabase provisions the project.
  • Step 5: Select the “SQL Editor” tab from the left navigation menu.
  • Step 6: Copy in your own SQL dump file, or use the file provided in the bgent repository at src/supabase/db.sql. Note: you can run supabase db dump against a pre-existing Supabase database to generate a SQL dump file.
  • Step 7: Paste the SQL code into the SQL Editor and hit Run in the bottom right.
  • Step 8: Select the “Database” tab from the left navigation menu to verify that all of the tables have been added properly.

Once you've set up your Supabase project, you can find your API key by going to the "Settings" tab and then "API". You will need to set the SUPABASE_URL and SUPABASE_SERVICE_API_KEY environment variables in your .dev.vars file.

Local Model Setup

While bgent uses OpenAI's gpt-3.5-turbo by default, you can use a local model by setting the serverUrl to a local endpoint. The LocalAI project is a great way to run a local model behind an OpenAI-compatible API endpoint.

const runtime = new BgentRuntime({
  serverUrl: process.env.LOCALAI_URL,
  token: process.env.LOCALAI_TOKEN, // Can be an API key or JWT token for your AI service
  // ... other options
});

Development

npm run dev # start the server
npm run shell # start the shell in another terminal to talk to the default agent

Usage

import { BgentRuntime, SupabaseDatabaseAdapter, SqliteDatabaseAdapter } from "bgent";
import Database from "better-sqlite3";

const sqliteDatabaseAdapter = new SqliteDatabaseAdapter(new Database(":memory:"));

// You can also use Supabase like this:
// const supabaseDatabaseAdapter = new SupabaseDatabaseAdapter(
//   process.env.SUPABASE_URL,
//   process.env.SUPABASE_SERVICE_API_KEY,
// );

const runtime = new BgentRuntime({
  serverUrl: "https://api.openai.com/v1",
  token: process.env.OPENAI_API_KEY, // Can be an API key or JWT token for your AI services
  databaseAdapter: sqliteDatabaseAdapter,
  actions: [
    /* your custom actions */
  ],
  evaluators: [
    /* your custom evaluators */
  ],
  model: "gpt-3.5-turbo", // whatever model you want to use
  embeddingModel: "text-embedding-3-small", // whatever embedding model you want to use
});

Custom Actions

Bgent is customized through actions and evaluators. Actions are functions that are called when a user input is received, and evaluators are functions that are called when a condition is met at the end of a conversation turn.

An example of an action is wait (the agent should stop and wait for the user to respond) or elaborate (the agent should elaborate and write another message in the conversation).

An example of an evaluator is fact (the agent should summarize the conversation so far).

import { wait, fact } from "bgent";

const runtime = new BgentRuntime({
  // ... other options
  actions: [wait],
  evaluators: [fact],
});

// OR you can register actions and evaluators after the runtime has been created
runtime.registerAction(wait);
runtime.registerEvaluator(fact);
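A custom action can be sketched as below. Note that the `Action` and `Message` shapes here are simplified local stand-ins, not bgent's exact exported types — check the library's type definitions for the real interface before relying on these field names:

```typescript
// Simplified stand-ins for bgent's types (illustrative only, not the real API).
interface Message {
  user_id: string;
  content: { content: string };
  room_id: string;
}

interface Action {
  name: string;
  description: string;
  validate: (message: Message) => Promise<boolean>;
  handler: (message: Message) => Promise<{ content: string }>;
}

// A hypothetical action that replies with a greeting.
const greet: Action = {
  name: "GREET",
  description: "Reply with a greeting when the user says hello",
  // Only fire when the incoming text contains the word "hello".
  validate: async (message) => /\bhello\b/i.test(message.content.content),
  // Produce the reply content for the conversation.
  handler: async (message) => ({ content: `Hello from room ${message.room_id}!` }),
};
```

You would then pass it to the runtime via `actions: [greet]` or register it after construction, as with the built-in actions.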

Custom Data Sources

If you want to add custom data into the context that is sent to the LLM, you can create a Provider and add it to the runtime.

import { type BgentRuntime, type Message, type Provider, type State } from "bgent";

const time: Provider = {
  // eslint-disable-next-line @typescript-eslint/no-unused-vars
  get: async (_runtime: BgentRuntime, _message: Message, _state?: State) => {
    const currentTime = new Date().toLocaleTimeString("en-US");
    return "The current time is: " + currentTime;
  },
};

const runtime = new BgentRuntime({
  // ... other options
  providers: [time],
});

Handling User Input

The BgentRuntime instance has a handleMessage method that can be used to handle user input. The method returns a promise that resolves to the agent's response.

You will need to make sure that the room_id already exists in the database. You can use the Supabase client to create new users and rooms if necessary.

const message = {
  user_id: "user-uuid", // Replace with the sender's UUID
  content: { content: content }, // The message content
  room_id: "room-uuid", // Replace with the room's UUID
};

const response = await bgentRuntime.handleMessage(message);
console.log("Agent response:", response);
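Ensuring the room row exists can be sketched as below. The "rooms" table name and the upsert call are assumptions taken from a typical Supabase schema — confirm them against src/supabase/db.sql in the bgent repository:

```typescript
// Hedged sketch: make sure a room row exists before calling handleMessage.
// The "rooms" table name is an assumption; check src/supabase/db.sql for
// the actual schema used by bgent.
type SupabaseLike = {
  from: (table: string) => {
    upsert: (row: Record<string, unknown>) => Promise<{ error: unknown }>;
  };
};

async function ensureRoom(supabase: SupabaseLike, roomId: string): Promise<void> {
  // Upsert is idempotent: it inserts the row if missing and leaves it alone otherwise.
  const { error } = await supabase.from("rooms").upsert({ id: roomId });
  if (error) {
    throw new Error(`Could not create room ${roomId}: ${String(error)}`);
  }
}
```

With the real supabase-js client, you would call `await ensureRoom(supabase, "room-uuid")` before the first `handleMessage` for that room.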

Example Agents

There are two examples set up for Cloudflare in src/agents:

  • The simple example is a minimal agent that can be deployed to Cloudflare Workers.
  • The cj example is a more complex agent that can introduce users to each other. It is also deployable to Cloudflare Workers, and is the default agent in Cojourney.

An external example of an agent is afbot, an Aframe Discord bot that uses bgent as a backend.

Deploy to Cloudflare

To deploy an agent to Cloudflare, run npm run deploy, which deploys the cj agent by default. To deploy your own agent, see the afbot example.

API Documentation

Complete API documentation is available at https://bgent.org/docs

Contributions Welcome

This project is made by people like you. No contribution is too small. We welcome your input and support. Please file an issue if you notice something that needs to be resolved, or join us on Discord to discuss working with us on fixes and new features.