documentation updates; AI classes renamed
We now have AiApi, OllamaAiApi, and OpenAiApi. Documentation updated to add a bit more high-level clarity to the text originally generated by the agent.
This commit is contained in: parent 1edc3a85b8, commit f1b5a560a3
```diff
@@ -43,7 +43,7 @@ pnpm --filter gadget-drone dev
 ## TypeScript Strictness
 
 | Package | Strictness |
-|---|---|
+| ------------ | ---------------------------------------------------------------------------------- |
 | `@gadget/ai` | `strict: true` |
 | gadget-drone | `strict: true` |
 | gadget-code | `strict: true`, `noUnusedLocals`, `noUnusedParameters`, `noUncheckedIndexedAccess` |
```
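Of those flags, `noUncheckedIndexedAccess` (enabled only in gadget-code) has the most visible day-to-day effect; a minimal illustrative sketch, not taken from the repo:

```typescript
// Under noUncheckedIndexedAccess, indexing into a Record (or array) yields
// `number | undefined`, so the compiler forces a guard or fallback.
const contextWindow: Record<string, number> = { ollama: 4096 };

function lookupWindow(sdk: string): number {
  const size = contextWindow[sdk]; // type: number | undefined
  return size ?? 2048;             // explicit fallback satisfies the checker
}

console.log(lookupWindow("ollama")); // 4096
console.log(lookupWindow("openai")); // 2048 (fallback)
```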
```diff
@@ -52,7 +52,7 @@ pnpm --filter gadget-drone dev
 
 When adding a new feature or service, determine its scope:
 
-- **Shared concern** (AI, logging, config schema) → goes in `@gadget/ai`
+- **Shared AI concern** (AI logging, config schema) → goes in `@gadget/ai`
 - **Drone-only** (Bull queue, workspace file operations) → goes in gadget-drone
 - **Web-only** (Express routes, Mongoose models, session management) → goes in gadget-code
 
```
README.md (26 lines changed)
```diff
@@ -5,7 +5,7 @@ A self-hosted **Agentic Engineering Platform (AEP)** — an IDE that drives auto
 ## Projects
 
 | Package | Role |
-|---|---|
+| -------------- | ------------------------------------------------------------------------ |
 | `gadget-code` | Web service — agentic IDE, browser UI, API server |
 | `gadget-drone` | Worker process — runs the agentic workflow loop in workspace directories |
 | `@gadget/ai` | Shared AI API abstraction — Ollama and OpenAI, called by both |
```
```diff
@@ -43,6 +43,26 @@ pnpm --filter gadget-drone dev
 
 ## Architecture
 
-gadget-code runs on server infrastructure (MongoDB, Redis, etc.) and serves the browser-based IDE. gadget-drone runs on end-user machines, connecting via WebSocket to gadget-code, and executes the agentic workflow loop against local project directories via remote control. gadget-drone never connects directly to MongoDB or Redis — it communicates entirely through the Gadget Code API.
+### @gadget/ai
 
-AI calls are handled by `@gadget/ai`, which both projects depend on. This keeps all AI SDK knowledge in one place.
+AI API calls are handled by `@gadget/ai`, which both projects depend on. This keeps all AI SDK knowledge in one place, and currently implements:
+
+- [AiApi](./packages/ai/src/api.ts) - abstract base class for all AI APIs/SDKs
+- [OllamaAiApi](./packages/ai/src/ollama.ts) - Ollama API implementation
+- [OpenAiApi](./packages/ai/src/openai.ts) - OpenAI API implementation
+
+### gadget-drone
+
+gadget-drone is a headless process that runs on end-user machines, connecting via Socket.IO to gadget-code to receive and execute work orders for the agentic workflow loop.
+
+At startup, gadget-drone examines `process.cwd()` to determine whether it is a workspace directory, and if so, starts a worker process that connects to gadget-code and waits for work orders. It processes work orders in project directories within the gadget-drone workspace directory, and communicates events, status, and results to the IDE via gadget-code's web services and Socket.IO.
+
+gadget-drone never connects directly to MongoDB or Redis — it communicates entirely through the Gadget Code API.
+
+### gadget-code
+
+gadget-code runs on server infrastructure (MongoDB, Redis, etc.) and serves the browser-based IDE. The IDE connects to gadget-code via Socket.IO to send and receive chat-session commands relayed to gadget-drone.
+
+gadget-code can be stacked on a single host for local development, where it can still achieve significant scale. It can also be deployed in tiers, potentially made of clusters, with the web tier horizontally scaled for production use with high availability.
+
+Libraries such as [redis-adapter](https://github.com/socketio/socket.io-redis-adapter) and [redis-emitter](https://github.com/socketio/socket.io-redis-emitter) handle real-time message routing and distribution in the Socket.IO layer among gadget-code, gadget-drone, and the IDE running in the browser.
```
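The class layout described in that section can be sketched in miniature. The names `AiApi`, `OllamaAiApi`, and `createAiApi` come from this commit's diff; the member shapes below are simplified assumptions, not the repo's real types:

```typescript
// Simplified stand-ins for the real types in packages/ai/src/api.ts.
interface IAiProvider {
  name: string;
  sdk: "ollama" | "openai";
}

abstract class AiApi {
  constructor(protected provider: IAiProvider) {}
  abstract listModels(): Promise<void>;
}

class OllamaAiApi extends AiApi {
  async listModels(): Promise<void> {
    // the real implementation would call the ollama SDK here
  }
}

// Factory mirroring the switch in packages/ai/src/index.ts.
function createAiApi(provider: IAiProvider): AiApi {
  switch (provider.sdk) {
    case "ollama":
      return new OllamaAiApi(provider);
    default:
      throw new Error(`Unknown AI SDK: ${provider.sdk}`);
  }
}

const ai = createAiApi({ name: "local", sdk: "ollama" });
console.log(ai instanceof OllamaAiApi); // true
```

Both gadget-code and gadget-drone would depend only on the `AiApi` surface, so swapping providers is a configuration change rather than a code change.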
```diff
@@ -53,8 +53,8 @@ console.log(result.stats.duration.text); // formatted, e.g. "00:00:02"
 
 Abstract base class. Currently implemented:
 
-- **`AiOllamaApi`** — Ollama provider
-- **`AiOpenAiApi`** — OpenAI provider (stubbed)
+- **`OllamaAiApi`** — Ollama provider
+- **`OpenAiApi`** — OpenAI provider (stubbed)
 
 #### `ai.generate(model, options, streamCallback?)`
 
```
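The `streamCallback?` parameter in that signature suggests chunk-by-chunk delivery alongside the final promise. A hedged sketch of that contract; the function body is invented, only the callback shape mirrors the docs:

```typescript
type IAiResponseStreamFn = (chunk: string) => void;

// Invented stand-in for generate(): emits chunks to the callback as they
// "arrive", then resolves with the concatenated text.
async function generateText(
  prompt: string,
  streamCallback?: IAiResponseStreamFn,
): Promise<string> {
  const chunks = ["Echo: ", prompt]; // pretend these arrived from the model
  let full = "";
  for (const chunk of chunks) {
    streamCallback?.(chunk);
    full += chunk;
  }
  return full;
}

const parts: string[] = [];
generateText("greet", (c) => parts.push(c)).then((text) => {
  console.log(text);         // "Echo: greet"
  console.log(parts.length); // 2
});
```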
```diff
@@ -83,7 +83,7 @@ Configured via `IAiProvider` with `sdk: "ollama"`. Uses the `ollama` npm package
 
 ### OpenAI
 
-Configured via `IAiProvider` with `sdk: "openai"`. Stubbed — `chat()` and `generate()` throw `"Not yet implemented"`. Implement by wiring the `openai` npm package following the same pattern as `AiOllamaApi`.
+Configured via `IAiProvider` with `sdk: "openai"`. Stubbed — `chat()` and `generate()` throw `"Not yet implemented"`. Implement by wiring the `openai` npm package following the same pattern as `OllamaAiApi`.
 
 ## Duration Formatting
 
```
|||||||
@ -19,11 +19,11 @@ export {
|
|||||||
AiApi,
|
AiApi,
|
||||||
} from "./api.js";
|
} from "./api.js";
|
||||||
|
|
||||||
export { AiOllamaApi } from "./ollama.js";
|
export { OllamaAiApi } from "./ollama.js";
|
||||||
export { AiOpenAiApi } from "./openai.js";
|
export { OpenAiApi } from "./openai.js";
|
||||||
|
|
||||||
import { AiOllamaApi } from "./ollama.js";
|
import { OllamaAiApi } from "./ollama.js";
|
||||||
import { AiOpenAiApi } from "./openai.js";
|
import { OpenAiApi } from "./openai.js";
|
||||||
import type { IAiProvider } from "./api.js";
|
import type { IAiProvider } from "./api.js";
|
||||||
import type { IAiLogger } from "./api.js";
|
import type { IAiLogger } from "./api.js";
|
||||||
import type { AiApi } from "./api.js";
|
import type { AiApi } from "./api.js";
|
||||||
```diff
@@ -31,9 +31,9 @@ import type { AiApi } from "./api.js";
 export function createAiApi(provider: IAiProvider, logger?: IAiLogger): AiApi {
   switch (provider.sdk) {
     case "ollama":
-      return new AiOllamaApi(provider, logger);
+      return new OllamaAiApi(provider, logger);
     case "openai":
-      return new AiOpenAiApi(provider, logger);
+      return new OpenAiApi(provider, logger);
     default:
       throw new Error(`Unknown AI SDK: ${(provider as IAiProvider).sdk}`);
```
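The cast in the `default` branch exists because TypeScript narrows `provider.sdk` to `never` once every known case is handled. An alternative sketch, not in the repo, makes that exhaustiveness check explicit so adding a new SDK variant becomes a compile error instead of a runtime throw:

```typescript
type AiSdk = "ollama" | "openai";

// assertNever only type-checks when every AiSdk variant has been handled
// above it; adding a variant without a case breaks the build.
function assertNever(value: never): never {
  throw new Error(`Unknown AI SDK: ${String(value)}`);
}

function describeSdk(sdk: AiSdk): string {
  switch (sdk) {
    case "ollama":
      return "Ollama provider";
    case "openai":
      return "OpenAI provider (stubbed)";
    default:
      return assertNever(sdk);
  }
}

console.log(describeSdk("ollama")); // "Ollama provider"
```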
|||||||
@ -18,7 +18,7 @@ import {
|
|||||||
IAiResponseStreamFn,
|
IAiResponseStreamFn,
|
||||||
} from "./api.js";
|
} from "./api.js";
|
||||||
|
|
||||||
export class AiOllamaApi extends AiApi {
|
export class OllamaAiApi extends AiApi {
|
||||||
protected client: Ollama;
|
protected client: Ollama;
|
||||||
|
|
||||||
constructor(provider: IAiProvider, logger?: IAiLogger) {
|
constructor(provider: IAiProvider, logger?: IAiLogger) {
|
||||||
```diff
@@ -30,11 +30,11 @@ export class AiOllamaApi extends AiApi {
   }
 
   async listModels(): Promise<void> {
-    await this.log.debug("AiOllamaApi.listModels called");
+    await this.log.debug("OllamaAiApi.listModels called");
   }
 
   async probeModel(modelId: string): Promise<void> {
-    await this.log.debug("AiOllamaApi.probeModel called", { modelId });
+    await this.log.debug("OllamaAiApi.probeModel called", { modelId });
   }
 
   async generate(
```
||||||
@ -42,7 +42,7 @@ export class AiOllamaApi extends AiApi {
|
|||||||
options: IAiGenerateOptions,
|
options: IAiGenerateOptions,
|
||||||
streamCallback?: IAiResponseStreamFn,
|
streamCallback?: IAiResponseStreamFn,
|
||||||
): Promise<IAiGenerateResponse> {
|
): Promise<IAiGenerateResponse> {
|
||||||
await this.log.debug("AiOllamaApi.generate called", {
|
await this.log.debug("OllamaAiApi.generate called", {
|
||||||
provider: model.provider.name,
|
provider: model.provider.name,
|
||||||
modelId: model.modelId,
|
modelId: model.modelId,
|
||||||
});
|
});
|
||||||
```diff
@@ -85,7 +85,7 @@ export class AiOllamaApi extends AiApi {
     options: IAiChatOptions,
     streamCallback?: IAiResponseStreamFn,
   ): Promise<IAiChatResponse> {
-    await this.log.debug("AiOllamaApi.chat called", {
+    await this.log.debug("OllamaAiApi.chat called", {
       provider: model.provider.name,
       modelId: model.modelId,
     });
```
```diff
@@ -14,17 +14,17 @@ import {
   IAiResponseStreamFn,
 } from "./api.js";
 
-export class AiOpenAiApi extends AiApi {
+export class OpenAiApi extends AiApi {
   constructor(provider: IAiProvider, logger?: IAiLogger) {
     super(provider, logger);
   }
 
   async listModels(): Promise<void> {
-    await this.log.debug("AiOpenAiApi.listModels called");
+    await this.log.debug("OpenAiApi.listModels called");
   }
 
   async probeModel(modelId: string): Promise<void> {
-    await this.log.debug("AiOpenAiApi.probeModel called", { modelId });
+    await this.log.debug("OpenAiApi.probeModel called", { modelId });
   }
 
   async generate(
```
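Per the docs hunk above, the stub in `openai.ts` would eventually wrap the `openai` npm package following the Ollama pattern. A hedged sketch of just the payload-shaping step, with no network call: `ChatOptionsSketch` is an invented stand-in for the repo's `IAiChatOptions`, while the `{ model, messages }` request shape is what the OpenAI chat completions API expects:

```typescript
// Simplified stand-in; the repo's IAiChatOptions in packages/ai/src/api.ts
// is not shown in this diff.
interface ChatOptionsSketch {
  system?: string;
  prompt: string;
}

type ChatMessage = { role: "system" | "user"; content: string };

// Build the { model, messages } body that chat.completions.create() takes,
// from generic options; no network call happens here.
function toOpenAiRequest(modelId: string, options: ChatOptionsSketch) {
  const messages: ChatMessage[] = [];
  if (options.system) {
    messages.push({ role: "system", content: options.system });
  }
  messages.push({ role: "user", content: options.prompt });
  return { model: modelId, messages };
}

const req = toOpenAiRequest("gpt-4o-mini", { system: "be brief", prompt: "hi" });
console.log(req.messages.length); // 2
```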