make reasoning effort configurable; remove sign up concept

- Implemented reasoning effort setting in SESSION panel of Chat Session View
- Removed all ability to "sign up" for an account
Rob Colbert 2026-05-08 11:40:30 -04:00
parent 63e812b7c3
commit 11bdd5e3b0
31 changed files with 464 additions and 355 deletions


@@ -147,11 +147,11 @@ email:
  enabled: true
  maxConnections: 5
  maxMessages: 100
contact:
  to: "Support <support@example.com>"
# MinIO/S3 configuration (optional)
minio:
  endpoint: "localhost"
  port: 9000
  useSsl: false
@@ -161,14 +161,10 @@ minio:
  uploads: "gadget-uploads"
  images: "gadget-images"
  videos: "gadget-videos"
  audios: "gadget-audios"
-
-# User settings
-user:
-  signupEnabled: false
```
## gadget-drone.yaml Reference
```yaml
# Basic settings

docs/reasoning-effort.md (Normal file, 171 lines)

@@ -0,0 +1,171 @@
# Reasoning Effort
**Status:** ✅ **IMPLEMENTED**
**Last Updated:** May 8, 2026
## Overview
Reasoning effort controls how much an AI model "thinks" before responding. Models with reasoning capabilities (like DeepSeek-R1, QwQ, OpenAI o1/o3) can produce internal chain-of-thought tokens before generating their final answer. The reasoning effort setting lets users balance between speed and thoroughness.
## User Setting
The reasoning effort is configured per chat session via a dropdown in the Session sidebar:
| Value | Effect |
|----------|-----------------------------------------------------|
| **Off** | No thinking output. Model responds immediately. |
| **Low** | Minimal thinking. Faster responses, less depth. |
| **Medium** | Balanced thinking. Default reasoning depth. |
| **High** | Maximum thinking. Slower but more thorough. |
The dropdown is **disabled** when the selected model does not have `hasThinking: true` in its capabilities.
## Data Flow
```
User selects "High" in Reasoning dropdown
→ PUT /api/v1/chat-sessions/:id { reasoningEffort: "high" }
→ Stored in MongoDB ChatSession.reasoningEffort
→ When creating a turn:
ChatTurn.reasoningEffort = ChatSession.reasoningEffort (snapshotted)
→ Drone receives work order with populated turn
→ agent.ts reads turn.reasoningEffort, maps "off" → false
→ Passes to AiService.chat() as params.reasoning
→ Provider SDK receives the appropriate parameter
```
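The snapshot step in the flow above can be sketched as a tiny helper (`snapshotEffort` is a hypothetical name for illustration; the real code assigns the field inline when constructing the `ChatTurn`):

```typescript
type ReasoningEffort = 'off' | 'low' | 'medium' | 'high';

// Hypothetical helper illustrating the snapshot semantics: the session's
// current effort is copied onto the turn at creation time, so changing the
// session setting later never affects turns already in flight.
function snapshotEffort(session: { reasoningEffort?: ReasoningEffort }): ReasoningEffort {
  return session.reasoningEffort ?? 'off';
}
```

Because the value is copied rather than referenced, changing the Session dropdown mid-conversation only affects turns created afterward.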
## Provider Mapping
Each AI provider uses a different parameter name for reasoning effort. The `@gadget/ai` abstraction handles the translation:
| Provider | Parameter | Values |
|----------|------------------|---------------------------------|
| Ollama | `think` | `false`, `"low"`, `"medium"`, `"high"` |
| OpenAI | `reasoning_effort` | `"low"`, `"medium"`, `"high"` |
### Mapping Logic (in `gadget-drone/src/services/agent.ts`)
```typescript
const reasoningEffort = turn.reasoningEffort || "off";
const reasoning: boolean | "low" | "medium" | "high" =
reasoningEffort === "off" ? false : reasoningEffort;
```
- `"off"` → `false` (disables thinking entirely)
- `"low"` → `"low"` (minimal thinking)
- `"medium"` → `"medium"` (balanced)
- `"high"` → `"high"` (maximum thinking)
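The same mapping, written as a standalone function for illustration (`toProviderReasoning` is a hypothetical name; the production code inlines the ternary shown above):

```typescript
type ReasoningEffort = 'off' | 'low' | 'medium' | 'high';
type ProviderReasoning = boolean | 'low' | 'medium' | 'high';

// "off" collapses to `false` so the provider layer never sees an unknown
// level; every other level passes through unchanged.
function toProviderReasoning(effort: ReasoningEffort): ProviderReasoning {
  return effort === 'off' ? false : effort;
}
```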
### Ollama Implementation (`packages/ai/src/ollama.ts`)
```typescript
const response = await this.client.chat({
model: model.modelId,
messages,
stream: true,
think: model.params.reasoning, // boolean | "low" | "medium" | "high"
});
```
When `think` is `false`, the Ollama SDK disables thinking. When set to a string level, the model allocates corresponding effort.
### OpenAI Implementation (`packages/ai/src/openai.ts`)
```typescript
const response = await this.client.chat.completions.create({
model: model.modelId,
messages,
tools,
stream: true,
...(typeof model.params.reasoning === "string"
? { reasoning_effort: model.params.reasoning }
: {}),
});
```
The `reasoning_effort` parameter is only passed when the value is a string (`"low"`, `"medium"`, `"high"`). When `false`, the parameter is omitted — standard non-reasoning models would reject it.
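The conditional spread can be isolated into a hypothetical `buildParams` helper (not part of the codebase) to make the omission behavior explicit:

```typescript
// Includes `reasoning_effort` only when the mapped value is a string level.
// `false` yields no key at all, which is what non-reasoning models require.
function buildParams(reasoning: boolean | 'low' | 'medium' | 'high'): Record<string, unknown> {
  return {
    ...(typeof reasoning === 'string' ? { reasoning_effort: reasoning } : {}),
  };
}
```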
## Streaming Thinking Content
When reasoning effort is enabled and the model produces thinking tokens, they are streamed back in real-time:
1. **Provider SDK** emits thinking tokens in stream chunks
2. **Provider implementation** (`ollama.ts` / `openai.ts`) maps them to `IAiStreamChunk` with `type: 'thinking'`
3. **Drone** forwards via Socket.IO as `thinking(content)` events
4. **Frontend** renders thinking content in distinct muted blocks
### Thinking Chunk Handling
**Ollama:**
```typescript
if (chunk.message.thinking) {
await streamCallback({
type: 'thinking',
data: chunk.message.thinking,
});
}
```
**OpenAI:**
```typescript
if ('reasoning' in delta && delta.reasoning) {
await streamCallback({
type: 'thinking',
data: delta.reasoning as string,
});
}
```
## Type Definitions
### `ReasoningEffort` (in `packages/api/src/interfaces/chat-session.ts`)
```typescript
export type ReasoningEffort = "off" | "low" | "medium" | "high";
```
### `IAiModelConfig.params.reasoning` (in `packages/ai/src/api.ts`)
```typescript
params: {
reasoning: boolean | "high" | "medium" | "low";
// ...
}
```
Note: The `IAiModelConfig` type uses `boolean | "high" | "medium" | "low"` (no `"off"`). The `"off"` value from the user-facing setting is mapped to `false` before reaching the AI provider layer.
### Mongoose Schema
**ChatSession** (`gadget-code/src/models/chat-session.ts`):
```typescript
reasoningEffort: {
type: String,
enum: ["off", "low", "medium", "high"],
default: "off",
}
```
**ChatTurn** (`gadget-code/src/models/chat-turn.ts`):
```typescript
reasoningEffort: {
type: String,
enum: ["off", "low", "medium", "high"],
default: "off",
}
```
## Model Capability Detection
The `hasThinking` capability is detected during model probing:
- **Ollama**: checks if model capabilities array includes `"reasoning"`
- **OpenAI**: checks if model features include `"reasoning_effort"` or fallback detection by model ID (`o1`, `o3`, `reasoning`)
The frontend uses this capability flag to enable/disable the Reasoning dropdown.
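The OpenAI-side detection described above could be sketched as follows (the `ProbedModel` shape and `detectHasThinking` name are assumptions for illustration, not the actual probing code):

```typescript
interface ProbedModel {
  id: string;
  features?: string[]; // reported by the provider, when available
}

// hasThinking is true when the provider advertises reasoning support,
// or as a fallback when the model ID suggests a reasoning model.
function detectHasThinking(model: ProbedModel): boolean {
  if (model.features?.includes('reasoning_effort')) return true;
  const id = model.id.toLowerCase();
  return id.startsWith('o1') || id.startsWith('o3') || id.includes('reasoning');
}
```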
## Related Documentation
- [Streaming Responses](./streaming-responses.md) — How thinking tokens are streamed to the IDE
- [Socket Protocol](./socket-protocol.md) — Socket.IO event definitions
- [Architecture](./architecture.md) — Overall system architecture


@@ -426,6 +426,15 @@ marked.setOptions({
- Thinking and response mixed in same block (mode detection broken)
- Tool calls not appearing (tool call events not routed)
+
+## Reasoning Effort
+The reasoning effort setting controls how much a model thinks before responding. See [Reasoning Effort](./reasoning-effort.md) for full documentation.
+
+Key integration points with streaming:
+- When reasoning effort is **Off** (`false`), no thinking tokens are produced
+- When set to **Low/Medium/High**, the model allocates corresponding thinking depth
+- Thinking tokens stream through the same path as response tokens but with `type: 'thinking'`
+
## Future Enhancements
Potential improvements not yet implemented:
@@ -477,3 +486,4 @@ Potential improvements not yet implemented:
- [Socket Protocol](./socket-protocol.md) — Socket.IO event definitions
- [ChatTurn Interface](../packages/api/src/interfaces/chat-turn.ts) — TypeScript types
- [ChatTurn Model](../gadget-code/src/models/chat-turn.ts) — Mongoose schema
+- [Reasoning Effort](./reasoning-effort.md) — How thinking/reasoning is controlled


@@ -112,18 +112,15 @@ DTP_HTTPS_PORT="3443"
DTP_HTTPS_KEY_FILE="/path/to/ssl/key.pem"
DTP_HTTPS_CRT_FILE="/path/to/ssl/cert.pem"
# Session
DTP_SESSION_TRUST_PROXY="enabled"
DTP_SESSION_COOKIE_SECURE="enabled"
DTP_SESSION_COOKIE_SAMESITE="strict"
-
-# User Signup
-DTP_USER_SIGNUP="enabled"
```
## Features
-- User authentication (sign up, sign in, sign out)
+- User authentication (sign in, sign out)
- JWT-based session management
- RESTful API structure
- Socket.io real-time communication
@@ -153,12 +150,11 @@ DTP_USER_SIGNUP="enabled"
## API Endpoints
| Method | Endpoint | Description |
| ------ | ---------------- | ------------------ |
-| POST | `/auth/sign-up` | Create new account |
| POST | `/auth/sign-in` | Authenticate |
| GET | `/auth/sign-out` | Sign out |
| GET | `/api/v1/user` | Get current user |
## License


@@ -7,7 +7,6 @@ import StatusBar from './components/StatusBar';
import Home from './pages/Home';
import ProjectManager from './pages/ProjectManager';
import SignIn from './pages/SignIn';
-import SignUp from './pages/SignUp';
import ChatSessionView from './pages/ChatSessionView';
import DroneManager from './pages/DroneManager';
@@ -124,27 +123,17 @@ export default function App() {
        <Route path="/projects/:slug" element={<ProjectManager user={user} />} />
        <Route path="/drones" element={<DroneManager user={user} />} />
        <Route path="/projects/:projectId/chat-session/:sessionId" element={<ChatSessionView />} />
        <Route
          path="/sign-in"
          element={
            user ? (
              <Navigate to="/" replace />
            ) : (
              <SignIn onSuccess={handleSignInSuccess} />
            )
          }
        />
-       <Route
-         path="/sign-up"
-         element={
-           user ? (
-             <Navigate to="/" replace />
-           ) : (
-             <SignUp onSuccess={handleSignInSuccess} />
-           )
-         }
-       />
      </Routes>
    </main>
    <StatusBar statusMessage={statusMessage} projectSlug={currentProject} />
  </div>


@@ -1,52 +0,0 @@
import { Link } from 'react-router-dom';
interface User {
_id: string;
email: string;
displayName: string;
}
interface NavbarProps {
user: User | null;
onSignOut: () => void;
}
export default function Navbar({ user, onSignOut }: NavbarProps) {
return (
<nav className="bg-bg-secondary border-b border-border">
<div className="max-w-4xl mx-auto px-4 py-3 flex items-center justify-between">
<Link to="/" className="text-xl font-semibold text-text">
DTP Web App
</Link>
<div className="flex items-center gap-4">
{user ? (
<>
<span className="text-text-muted">{user.displayName}</span>
<button
onClick={onSignOut}
className="px-4 py-2 bg-primary hover:bg-primary-hover text-white rounded-lg transition-colors"
>
Sign Out
</button>
</>
) : (
<>
<Link
to="/sign-in"
className="px-4 py-2 text-text-muted hover:text-text transition-colors"
>
Sign In
</Link>
<Link
to="/sign-up"
className="px-4 py-2 bg-primary hover:bg-primary-hover text-white rounded-lg transition-colors"
>
Sign Up
</Link>
</>
)}
</div>
</div>
</nav>
);
}


@@ -280,6 +280,7 @@ export interface ChatSession {
  mode: ChatSessionMode;
  provider: string | AiProvider;
  selectedModel: string;
+ reasoningEffort?: string;
  stats: ChatSessionStats;
  pins: Array<{ _id?: string; content: string }>;
}


@@ -40,12 +40,14 @@ export default function ChatSessionView() {
  const [isUpdatingMode, setIsUpdatingMode] = useState(false);
  const [isUpdatingProvider, setIsUpdatingProvider] = useState(false);
  const [isUpdatingModel, setIsUpdatingModel] = useState(false);
+ const [isUpdatingReasoning, setIsUpdatingReasoning] = useState(false);
  const [logExpanded, setLogExpanded] = useState(false);
  const [logs, setLogs] = useState<LogEntry[]>([]);
  const [toast, setToast] = useState<string | null>(null);
  const [providers, setProviders] = useState<AiProvider[]>([]);
  const [selectedProviderId, setSelectedProviderId] = useState<string>('');
  const [selectedModelId, setSelectedModelId] = useState<string>('');
+ const [sessionReasoningEffort, setSessionReasoningEffort] = useState<string>('off');
  const messagesEndRef = useRef<HTMLDivElement>(null);
  const inputRef = useRef<HTMLTextAreaElement>(null);
@@ -104,6 +106,7 @@ export default function ChatSessionView() {
          : sessionData.provider?._id;
        setSelectedProviderId(providerId || '');
        setSelectedModelId(sessionData.selectedModel || '');
+       setSessionReasoningEffort(sessionData.reasoningEffort || 'off');
      }
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Failed to load session');
@@ -455,6 +458,34 @@ export default function ChatSessionView() {
    }
  };
+ const getSelectedModelCapabilities = useCallback(() => {
+   const provider = providers.find(p => p._id === selectedProviderId);
+   if (!provider) return null;
+   const model = provider.models.find(m => m.id === selectedModelId);
+   return model?.capabilities || null;
+ }, [providers, selectedProviderId, selectedModelId]);
+
+ const handleReasoningChange = async (e: React.ChangeEvent<HTMLSelectElement>) => {
+   if (!session) return;
+   const newValue = e.target.value;
+   if (newValue === sessionReasoningEffort) return;
+   setIsUpdatingReasoning(true);
+   try {
+     const updatedSession = await chatSessionApi.update(session._id, {
+       reasoningEffort: newValue
+     });
+     setSession(updatedSession);
+     setSessionReasoningEffort(newValue);
+     showToast(`Reasoning effort set to ${newValue}`);
+   } catch (err) {
+     showToast(`Failed to change reasoning effort: ${err instanceof Error ? err.message : 'Unknown error'}`);
+   } finally {
+     setIsUpdatingReasoning(false);
+   }
+ };
  const handleModelChange = async (e: React.ChangeEvent<HTMLSelectElement>) => {
    if (!session) return;
@@ -731,6 +762,25 @@ export default function ChatSessionView() {
            ))}
          </select>
        </div>
+       <div>
+         <div className="text-xs text-text-muted">Reasoning</div>
+         <select
+           value={sessionReasoningEffort}
+           onChange={handleReasoningChange}
+           disabled={isUpdatingReasoning || !getSelectedModelCapabilities()?.hasThinking}
+           className="w-full mt-1 px-2 py-1.5 bg-bg-tertiary border border-border-default rounded text-text-primary text-sm focus:border-brand focus:outline-none disabled:opacity-50 disabled:cursor-not-allowed"
+           title={
+             !getSelectedModelCapabilities()?.hasThinking
+               ? "Selected model does not support reasoning"
+               : "Controls how much the model thinks before responding"
+           }
+         >
+           <option value="off">Off</option>
+           <option value="low">Low</option>
+           <option value="medium">Medium</option>
+           <option value="high">High</option>
+         </select>
+       </div>
      </div>
    </div>


@@ -77,14 +77,8 @@ export default function SignIn({ onSuccess }: SignInProps) {
            {loading ? 'Signing in...' : 'Sign In'}
          </button>
        </div>
      </form>
-     <p className="mt-4 text-center text-text-muted">
-       Don't have an account?{' '}
-       <Link to="/sign-up" className="text-brand hover:underline">
-         Sign up
-       </Link>
-     </p>
    </div>
  </div>
  );
}


@@ -1,122 +0,0 @@
import { useState } from 'react';
import { Link } from 'react-router-dom';
import { api, AuthResponse, User } from '../lib/api';
interface SignUpProps {
onSuccess: (user: User, token: string) => void;
}
export default function SignUp({ onSuccess }: SignUpProps) {
const [email, setEmail] = useState('');
const [displayName, setDisplayName] = useState('');
const [password, setPassword] = useState('');
const [passwordVerify, setPasswordVerify] = useState('');
const [error, setError] = useState('');
const [loading, setLoading] = useState(false);
const handleSubmit = async (e: React.FormEvent) => {
e.preventDefault();
setError('');
if (password !== passwordVerify) {
setError('Passwords do not match');
return;
}
setLoading(true);
try {
const response = await api.post<AuthResponse>('/auth/sign-up', {
email,
password,
displayName,
});
onSuccess(response.user, response.token);
} catch (err) {
setError(err instanceof Error ? err.message : 'Sign up failed');
} finally {
setLoading(false);
}
};
return (
<div className="min-h-screen flex items-center justify-center bg-bg-primary">
<div className="w-full max-w-md p-8">
<h1 className="text-2xl font-bold text-center mb-6">Create Account</h1>
{error && (
<div className="mb-4 p-3 bg-red-900/30 border border-red-700 rounded-lg text-red-200">
{error}
</div>
)}
<form onSubmit={handleSubmit} className="space-y-4">
<div>
<label htmlFor="email" className="block text-sm text-text-muted mb-1">
Email Address
</label>
<input
id="email"
type="email"
value={email}
onChange={(e) => setEmail(e.target.value)}
className="w-full px-4 py-2 bg-bg-tertiary border border-border-default rounded-lg text-text-primary focus:outline-none focus:border-brand"
required
/>
</div>
<div>
<label htmlFor="displayName" className="block text-sm text-text-muted mb-1">
Display Name
</label>
<input
id="displayName"
type="text"
value={displayName}
onChange={(e) => setDisplayName(e.target.value)}
className="w-full px-4 py-2 bg-bg-tertiary border border-border-default rounded-lg text-text-primary focus:outline-none focus:border-brand"
required
/>
</div>
<div>
<label htmlFor="password" className="block text-sm text-text-muted mb-1">
Password
</label>
<input
id="password"
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
className="w-full px-4 py-2 bg-bg-tertiary border border-border-default rounded-lg text-text-primary focus:outline-none focus:border-brand"
required
/>
</div>
<div>
<label htmlFor="passwordVerify" className="block text-sm text-text-muted mb-1">
Verify Password
</label>
<input
id="passwordVerify"
type="password"
value={passwordVerify}
onChange={(e) => setPasswordVerify(e.target.value)}
className="w-full px-4 py-2 bg-bg-tertiary border border-border-default rounded-lg text-text-primary focus:outline-none focus:border-brand"
required
/>
</div>
<div className="flex gap-4 pt-2">
<Link
to="/"
className="flex-1 px-4 py-2 text-center border border-border-default rounded-lg hover:bg-bg-tertiary transition-colors"
>
Cancel
</Link>
<button
type="submit"
disabled={loading}
className="flex-1 px-4 py-2 bg-brand hover:bg-red-700 text-white rounded-lg transition-colors disabled:opacity-50"
>
{loading ? 'Creating...' : 'Sign Up'}
</button>
</div>
</form>
</div>
</div>
);
}


@@ -47,6 +47,15 @@ export default defineConfig({
  build: {
    outDir: path.join(rootDir, 'dist', 'client'),
    emptyOutDir: true,
+   rolldownOptions: {
+     output: {
+       codeSplitting: {
+         minSize: 20000,
+         groups: [
+           {
+             name: 'vendor',
+             test: /[\\/]node_modules[\\/]/,
+             priority: 10,
+             maxSize: 250000,
+           },
+         ],
+       },
+     },
+   },
  },
  resolve: {
    alias: {


@@ -124,10 +124,9 @@ export default {
      audios: yamlConfig.minio?.buckets?.audios || "dtp-audios",
    },
  },
  user: {
-   signupEnabled: yamlConfig.user?.signupEnabled === true,
    passwordSalt: yamlConfig.auth.passwordSalt,
  },
  https: {
    enabled: yamlConfig.https?.enabled === true,
    address: yamlConfig.https?.address || "127.0.0.1",


@@ -178,6 +178,7 @@ class ChatSessionController extends DtpController {
      provider: string;
      selectedModel: string;
      mode: ChatSessionMode;
+     reasoningEffort: "off" | "low" | "medium" | "high";
    }> = {};
    if (updates.name !== undefined) {
@@ -196,6 +197,13 @@ class ChatSessionController extends DtpController {
        throw new Error(`Invalid mode: ${updates.mode}`);
      }
    }
+   if (updates.reasoningEffort !== undefined) {
+     const validValues = ["off", "low", "medium", "high"] as const;
+     if (!validValues.includes(updates.reasoningEffort)) {
+       throw new Error(`Invalid reasoningEffort: ${updates.reasoningEffort}. Must be one of: ${validValues.join(", ")}`);
+     }
+     allowedUpdates.reasoningEffort = updates.reasoningEffort;
+   }
    const session = await ChatSessionService.update(
      res.locals.chatSession._id,


@@ -26,60 +26,14 @@ export class AuthController extends DtpController {
  }
  async start(): Promise<void> {
-   this.router.post("/sign-up", this.postSignUp.bind(this));
    this.router.post("/sign-in", this.postSignIn.bind(this));
    this.router.post("/renew-token", this.postRenewToken.bind(this));
    this.router.get("/welcome", this.getWelcomeView.bind(this));
-   this.router.get("/sign-up", this.getSignUpForm.bind(this));
    this.router.get("/sign-in", this.getSignInForm.bind(this));
    this.router.get("/sign-out", this.getSignOut.bind(this));
  }
async postSignUp(
req: Request,
res: Response,
next: NextFunction
): Promise<void> {
try {
const user = await UserService.create(
req.body.email,
req.body.password,
req.body.displayName
);
req.session.user = {
_id: user._id,
email: user.email,
displayName: user.displayName,
flags: user.flags,
};
const token = await SessionService.createJsonWebToken(user);
req.session.token = token;
req.session.type = SessionType.WEB;
req.session.save((err: Error) => {
if (err) {
return next(err);
}
res.status(201).json({
success: true,
user: {
_id: user._id.toString(),
email: user.email,
displayName: user.displayName,
flags: user.flags,
},
token,
});
});
} catch (error) {
this.log.error("failed to process new user sign-up", { error });
return next(error);
}
}
  async postSignIn(
    req: Request,
    res: Response,
@@ -143,12 +97,6 @@
    });
  }
-  async getSignUpForm(_req: Request, res: Response): Promise<void> {
-    res.status(200).json({
-      success: true,
-      form: "sign-up",
-    });
-  }
  async getSignInForm(_req: Request, res: Response): Promise<void> {
    res.status(200).json({
      success: true,


@@ -68,12 +68,11 @@ export abstract class DtpController implements DtpComponent {
   * @param res Response The response being generated.
   * @param next NextFunction The next function to call when done.
   */
  middleware(req: Request, res: Response, next: NextFunction) {
    res.locals.request = req;
    res.locals.currentView = this.slug;
-   res.locals.signupEnabled = env.user.signupEnabled;
    next();
  }
  hmacMiddleware() {
    return async (req: Request, res: Response, next: NextFunction) => {


@@ -25,6 +25,11 @@ export const ChatSessionSchema = new Schema<IChatSession>({
  },
  provider: { type: String, required: true, ref: "AiProvider" },
  selectedModel: { type: String, required: true },
+ reasoningEffort: {
+   type: String,
+   enum: ["off", "low", "medium", "high"],
+   default: "off",
+ },
  stats: {
    turnCount: { type: Number, default: 0, required: true },
    toolCallCount: { type: Number, default: 0, required: true },


@@ -63,6 +63,11 @@ export const ChatTurnSchema = new Schema<IChatTurn>({
  session: { type: String, required: true, ref: "ChatSession" },
  provider: { type: String, required: true, ref: "AiProvider" },
  llm: { type: String, required: true }, // id/name of the model used to process the prompt
+ reasoningEffort: {
+   type: String,
+   enum: ["off", "low", "medium", "high"],
+   default: "off",
+ },
  mode: {
    type: String,
    enum: ChatSessionMode,


@@ -15,6 +15,7 @@ import {
  IProject,
  ChatTurnStatus,
  ChatTurnDocument,
+ ReasoningEffort,
} from "@gadget/api";
import { DtpService } from "../lib/service.js";
@@ -178,6 +179,7 @@ class ChatSessionService extends DtpService {
    provider: GadgetId;
    selectedModel: string;
    mode: ChatSessionMode;
+   reasoningEffort: ReasoningEffort;
  }>,
): Promise<IChatSession> {
  const session = await ChatSession.findById(chatSessionId);
@@ -203,6 +205,9 @@ class ChatSessionService extends DtpService {
  if (updates.mode !== undefined) {
    session.mode = updates.mode;
  }
+ if (updates.reasoningEffort !== undefined) {
+   session.reasoningEffort = updates.reasoningEffort;
+ }
  await session.save();
@@ -248,6 +253,7 @@ class ChatSessionService extends DtpService {
    session: session._id,
    provider: session.provider,
    llm: session.selectedModel,
+   reasoningEffort: (session as IChatSession).reasoningEffort || "off",
    mode: session.mode,
    status: ChatTurnStatus.Processing,
    prompts: {


@@ -150,7 +150,7 @@ describe('Environment Configuration', () => {
  const content = fs.readFileSync(envPath, 'utf-8');
  expect(content).toContain('https');
- expect(content).toContain('port: parseInt');
+ expect(content).toContain('port: yamlConfig.https?.port');
  expect(content).toContain('keyFile');
  expect(content).toContain('crtFile');
});


@@ -46,20 +46,22 @@ describe("CodeSession", () => {
      status: "available",
    } as IDroneRegistration;
-   mockProject = {
-     _id: nanoid(),
-     slug: "test-project",
-     name: "Test Project",
-   } as IProject;
    mockChatSession = {
      _id: nanoid(),
      name: "Test Session",
      mode: "build",
      provider: nanoid(),
      selectedModel: "llama3.1",
+     user: mockUser,
+     project: mockProject,
    } as IChatSession;
+   mockProject = {
+     _id: nanoid(),
+     slug: "test-project",
+     name: "Test Project",
+   } as IProject;
    codeSession = new CodeSession(mockSocket, mockUser);
  });
@@ -81,29 +83,32 @@ describe("CodeSession", () => {
  });
  describe("onSubmitPrompt", () => {
-   it("should throw error if no drone is selected", async () => {
+   it("should return error if no drone is selected", async () => {
      codeSession.setChatSession(mockChatSession, mockProject);
+     const cb = vi.fn();
-     await expect(codeSession.onSubmitPrompt("test prompt")).rejects.toThrow(
-       "No drone selected",
-     );
+     await codeSession.onSubmitPrompt("test prompt", cb);
+
+     expect(cb).toHaveBeenCalledWith(false, { message: "No drone selected" });
    });
-   it("should throw error if no chat session is active", async () => {
+   it("should return error if no chat session is active", async () => {
      codeSession.setSelectedDrone(mockDrone);
+     const cb = vi.fn();
-     await expect(codeSession.onSubmitPrompt("test prompt")).rejects.toThrow(
-       "No chat session active",
-     );
+     await codeSession.onSubmitPrompt("test prompt", cb);
+
+     expect(cb).toHaveBeenCalledWith(false, { message: "No chat session active" });
    });
-   it("should throw error if no project is selected", async () => {
+   it("should return error if no project is selected", async () => {
      codeSession.setSelectedDrone(mockDrone);
      codeSession.setChatSession(mockChatSession, undefined as any);
+     const cb = vi.fn();
-     await expect(codeSession.onSubmitPrompt("test prompt")).rejects.toThrow(
-       "No project selected",
-     );
+     await codeSession.onSubmitPrompt("test prompt", cb);
+
+     expect(cb).toHaveBeenCalledWith(false, { message: "No project selected" });
    });
    it("should create a ChatTurn and emit processWorkOrder to drone", async () => {
@@ -117,6 +122,9 @@ describe("CodeSession", () => {
      vi.mocked(ChatTurn).mockImplementation(function () {
        return mockTurn as any;
      });
+     (vi.mocked(ChatTurn) as any).populate = vi
+       .fn()
+       .mockResolvedValue(mockTurn);
      const mockDroneSession = {
        socket: {
@@ -129,7 +137,8 @@ describe("CodeSession", () => {
        mockDroneSession as any,
      );
-     await codeSession.onSubmitPrompt("test prompt");
+     const cb = vi.fn();
+     await codeSession.onSubmitPrompt("test prompt", cb);
      expect(ChatTurn).toHaveBeenCalledWith(
        expect.objectContaining({
@@ -139,10 +148,9 @@ describe("CodeSession", () => {
          provider: mockChatSession.provider,
          llm: mockChatSession.selectedModel,
          status: ChatTurnStatus.Processing,
-         prompts: {
-           user: "test prompt",
-           system: undefined,
-         },
+         prompts: expect.objectContaining({
+           user: "test prompt",
+         }),
        }),
      );
@@ -162,12 +170,15 @@ describe("CodeSession", () => {
      const mockTurn = {
        _id: nanoid(),
        status: ChatTurnStatus.Processing,
-       response: "",
+       errorMessage: "",
        save: vi.fn().mockResolvedValue(undefined),
      };
      vi.mocked(ChatTurn).mockImplementation(function () {
        return mockTurn as any;
      });
+     (vi.mocked(ChatTurn) as any).populate = vi
+       .fn()
+       .mockResolvedValue(mockTurn);
      const mockDroneSession = {
        socket: {
@@ -183,10 +194,11 @@ describe("CodeSession", () => {
        mockDroneSession as any,
      );
-     await codeSession.onSubmitPrompt("test prompt");
+     const cb = vi.fn();
+     await codeSession.onSubmitPrompt("test prompt", cb);
      expect(mockTurn.status).toBe(ChatTurnStatus.Error);
-     expect(mockTurn.response).toBe("Drone is busy");
+     expect(mockTurn.errorMessage).toBe("Drone is busy");
      expect(mockTurn.save).toHaveBeenCalled();
    });
  });


@@ -100,30 +100,25 @@ describe("DroneSession", () => {
   });

   describe("onThinking", () => {
-    it("should route thinking event to code session and update ChatTurn", async () => {
+    it("should route thinking event to code session", async () => {
       const mockCodeSession = {
         socket: { emit: vi.fn() },
+        onThinking: vi.fn(),
       };
       vi.mocked(SocketService.getCodeSessionByChatSessionId).mockReturnValue(
         mockCodeSession as any,
       );
-      vi.mocked(ChatTurn.findByIdAndUpdate).mockResolvedValue({} as any);

       droneSession.setChatSessionId(mockChatSessionId);
-      droneSession.setCurrentTurnId(mockTurnId);

       await droneSession.onThinking("thinking content");

       expect(SocketService.getCodeSessionByChatSessionId).toHaveBeenCalledWith(
         mockChatSessionId,
       );
-      expect(mockCodeSession.socket.emit).toHaveBeenCalledWith(
-        "thinking",
+      expect(mockCodeSession.onThinking).toHaveBeenCalledWith(
         "thinking content",
       );
-      expect(ChatTurn.findByIdAndUpdate).toHaveBeenCalledWith(mockTurnId, {
-        thinking: "thinking content",
-      });
     });

     it("should log warning if no chat session is active", async () => {
@@ -134,30 +129,25 @@ describe("DroneSession", () => {
   });

   describe("onResponse", () => {
-    it("should route response event to code session and update ChatTurn", async () => {
+    it("should route response event to code session", async () => {
       const mockCodeSession = {
         socket: { emit: vi.fn() },
+        onResponse: vi.fn(),
       };
       vi.mocked(SocketService.getCodeSessionByChatSessionId).mockReturnValue(
         mockCodeSession as any,
       );
-      vi.mocked(ChatTurn.findByIdAndUpdate).mockResolvedValue({} as any);

       droneSession.setChatSessionId(mockChatSessionId);
-      droneSession.setCurrentTurnId(mockTurnId);

       await droneSession.onResponse("response content");

       expect(SocketService.getCodeSessionByChatSessionId).toHaveBeenCalledWith(
         mockChatSessionId,
       );
-      expect(mockCodeSession.socket.emit).toHaveBeenCalledWith(
-        "response",
+      expect(mockCodeSession.onResponse).toHaveBeenCalledWith(
         "response content",
       );
-      expect(ChatTurn.findByIdAndUpdate).toHaveBeenCalledWith(mockTurnId, {
-        response: "response content",
-      });
     });

     it("should log warning if no chat session is active", async () => {
@@ -171,8 +161,10 @@ describe("DroneSession", () => {
     it("should route toolCall event to code session and update ChatTurn", async () => {
       const mockCodeSession = {
         socket: { emit: vi.fn() },
+        onToolCall: vi.fn(),
       };
       const mockTurn = {
+        blocks: [],
         toolCalls: [],
         stats: { toolCallCount: 0 },
         save: vi.fn().mockResolvedValue(undefined),
@@ -195,8 +187,7 @@ describe("DroneSession", () => {
       expect(SocketService.getCodeSessionByChatSessionId).toHaveBeenCalledWith(
         mockChatSessionId,
       );
-      expect(mockCodeSession.socket.emit).toHaveBeenCalledWith(
-        "toolCall",
+      expect(mockCodeSession.onToolCall).toHaveBeenCalledWith(
         "call-123",
         "readFile",
         '{"path":"test.ts"}',
@@ -225,6 +216,7 @@ describe("DroneSession", () => {
     it("should update ChatTurn status and emit to code session on success", async () => {
       const mockCodeSession = {
         socket: { emit: vi.fn() },
+        onWorkOrderComplete: vi.fn(),
       };
       const mockTurn = {
         status: ChatTurnStatus.Processing,
@@ -242,8 +234,7 @@ describe("DroneSession", () => {
       expect(ChatTurn.findById).toHaveBeenCalledWith(mockTurnId);
       expect(mockTurn.status).toBe(ChatTurnStatus.Finished);
       expect(mockTurn.save).toHaveBeenCalled();
-      expect(mockCodeSession.socket.emit).toHaveBeenCalledWith(
-        "workOrderComplete",
+      expect(mockCodeSession.onWorkOrderComplete).toHaveBeenCalledWith(
         mockTurnId,
         true,
         undefined,
@@ -254,10 +245,11 @@ describe("DroneSession", () => {
     it("should update ChatTurn to Error status on failure", async () => {
       const mockCodeSession = {
         socket: { emit: vi.fn() },
+        onWorkOrderComplete: vi.fn(),
       };
       const mockTurn = {
         status: ChatTurnStatus.Processing,
-        response: "",
+        errorMessage: "",
         save: vi.fn().mockResolvedValue(undefined),
       };
       vi.mocked(SocketService.getCodeSessionByChatSessionId).mockReturnValue(
@@ -274,7 +266,7 @@ describe("DroneSession", () => {
       );
       expect(mockTurn.status).toBe(ChatTurnStatus.Error);
-      expect(mockTurn.response).toBe("Agent crashed");
+      expect(mockTurn.errorMessage).toBe("Agent crashed");
       expect(mockTurn.save).toHaveBeenCalled();
     });


@@ -32,7 +32,7 @@ test.describe('Authentication Flow', () => {
     const userData = await page.evaluate(() => localStorage.getItem('dtp_user'));
     expect(userData).toBeNull();
     const content = await page.content();
-    expect(content).toContain('Sign Up Today!');
+    expect(content).toContain('Sign In');
   });
 });


@@ -5,6 +5,11 @@ import { fileURLToPath } from 'node:url';
 const __dirname = path.dirname(fileURLToPath(import.meta.url));

 export default defineConfig({
+  resolve: {
+    alias: {
+      '@': path.resolve(__dirname, 'src'),
+    },
+  },
   test: {
     globals: true,
     environment: 'node',


@@ -131,12 +131,16 @@ class AgentService extends GadgetService {
     }

     try {
+      const reasoningEffort = turn.reasoningEffort || "off";
+      const reasoning: boolean | "low" | "medium" | "high" =
+        reasoningEffort === "off" ? false : reasoningEffort;
       const response = await AiService.chat(
         turn.provider,
         {
           modelId: turn.llm,
           params: {
-            reasoning: false,
+            reasoning,
             temperature: 0.8,
             topP: 0.9,
             topK: 40,
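The effort-to-parameter translation this hunk inlines can be isolated as a small pure function. The sketch below is illustrative only (`toReasoningParam` is a hypothetical name; the types mirror this commit's `ReasoningEffort` and `IDroneModelConfig` changes):

```typescript
// "off" (or an unset field on older turns) disables reasoning outright;
// any other effort value passes through unchanged as the model hint.
type ReasoningEffort = "off" | "low" | "medium" | "high";

function toReasoningParam(
  effort?: ReasoningEffort,
): boolean | "low" | "medium" | "high" {
  const reasoningEffort = effort ?? "off";
  return reasoningEffort === "off" ? false : reasoningEffort;
}
```

Defaulting the absent field to `"off"` keeps pre-existing turns, which were saved before `reasoningEffort` existed, on the old `reasoning: false` behavior.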


@@ -36,7 +36,7 @@ export interface IDroneModelConfig {
   provider: DbAiProvider | GadgetId;
   modelId: string;
   params: {
-    reasoning: boolean;
+    reasoning: boolean | "low" | "medium" | "high";
     temperature: number;
     topP: number;
     topK: number;


@@ -372,4 +372,75 @@ describe('OllamaAiApi', () => {
       expect(response.response).toBe('Here are the results');
     });
   });
+
+  describe('probeModel', () => {
+    it('should detect thinking capability from "thinking" (Ollama convention)', async () => {
+      mockOllamaClient.show.mockResolvedValue({
+        capabilities: ['completion', 'vision', 'tools', 'thinking'],
+        details: { family: 'gemma4' },
+        model_info: {},
+        modified_at: '2026-04-04T06:20:40.211Z',
+      });
+
+      const result = await api.probeModel('gemma4:e4b');
+
+      expect(result.capabilities.hasThinking).toBe(true);
+      expect(result.capabilities.canCallTools).toBe(true);
+      expect(result.capabilities.hasVision).toBe(true);
+    });
+
+    it('should detect thinking capability from "reasoning" (OpenAI convention)', async () => {
+      mockOllamaClient.show.mockResolvedValue({
+        capabilities: ['completion', 'reasoning'],
+        details: { family: 'deepseek' },
+        model_info: {},
+        modified_at: '2026-04-04T06:20:40.211Z',
+      });
+
+      const result = await api.probeModel('deepseek-r1');
+
+      expect(result.capabilities.hasThinking).toBe(true);
+    });
+
+    it('should set hasThinking false when neither thinking nor reasoning in capabilities', async () => {
+      mockOllamaClient.show.mockResolvedValue({
+        capabilities: ['completion'],
+        details: { family: 'llama' },
+        model_info: {},
+        modified_at: '2026-04-04T06:20:40.211Z',
+      });
+
+      const result = await api.probeModel('llama3.2');
+
+      expect(result.capabilities.hasThinking).toBe(false);
+    });
+
+    it('should detect vision, tools, and embedding capabilities', async () => {
+      mockOllamaClient.show.mockResolvedValue({
+        capabilities: ['completion', 'vision', 'tools', 'embeddings'],
+        details: { family: 'llama' },
+        model_info: {},
+        modified_at: '2026-04-04T06:20:40.211Z',
+      });
+
+      const result = await api.probeModel('some-model');
+
+      expect(result.capabilities.hasVision).toBe(true);
+      expect(result.capabilities.canCallTools).toBe(true);
+      expect(result.capabilities.hasEmbedding).toBe(true);
+    });
+
+    it('should extract settings from Modelfile parameters', async () => {
+      mockOllamaClient.show.mockResolvedValue({
+        capabilities: ['completion'],
+        details: { family: 'llama' },
+        model_info: {},
+        parameters: 'temperature 0.7\ntop_k 40\ntop_p 0.9\nnum_ctx 4096',
+        modified_at: '2026-04-04T06:20:40.211Z',
+      });
+
+      const result = await api.probeModel('llama3.2');
+
+      expect(result.settings).toBeDefined();
+      expect(result.settings!.temperature).toBe(0.7);
+      expect(result.settings!.topK).toBe(40);
+      expect(result.settings!.topP).toBe(0.9);
+      expect(result.settings!.numCtx).toBe(4096);
+    });
+  });
 });
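The Modelfile-parameter test above expects the probe to lift `temperature`, `top_k`, `top_p`, and `num_ctx` out of Ollama's plain-text `parameters` field. A minimal parser consistent with those expectations might look like this (a hypothetical standalone helper; the real extraction lives inside `OllamaAiApi.probeModel`):

```typescript
// Ollama's `show` response exposes Modelfile parameters as plain text,
// one "name value" pair per line. Known numeric parameters are mapped
// onto the camelCase settings names the probe tests assert on.
type OllamaSettings = {
  temperature?: number;
  topK?: number;
  topP?: number;
  numCtx?: number;
};

function parseModelfileParams(parameters: string): OllamaSettings {
  const keyMap: Record<string, keyof OllamaSettings> = {
    temperature: "temperature",
    top_k: "topK",
    top_p: "topP",
    num_ctx: "numCtx",
  };
  const settings: OllamaSettings = {};
  for (const line of parameters.split("\n")) {
    const [name, value] = line.trim().split(/\s+/);
    const key = name !== undefined ? keyMap[name] : undefined;
    if (key && value !== undefined && !Number.isNaN(Number(value))) {
      settings[key] = Number(value);
    }
  }
  return settings;
}
```

Unknown or non-numeric parameters are simply skipped, so a Modelfile with extra directives degrades gracefully.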


@@ -108,7 +108,7 @@ export class OllamaAiApi extends AiApi {
         !!modelInfo?.["vision_model"] ||
         !!modelInfo?.["clip"],
       hasEmbedding: capabilities.includes("embeddings"),
-      hasThinking: capabilities.includes("reasoning"),
+      hasThinking: capabilities.includes("thinking") || capabilities.includes("reasoning"),
       isInstructTuned:
         modelId.toLowerCase().includes("instruct") ||
         modelId.toLowerCase().includes("chat") ||
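The capability check changed above reduces to a one-line predicate. Sketched in isolation (`detectThinking` is a hypothetical name for illustration; the real code inlines the expression):

```typescript
// Ollama reports model capabilities as a string list. Ollama's own
// convention is "thinking", while some model manifests use the
// OpenAI-style "reasoning" name, so the probe accepts either.
function detectThinking(capabilities: string[]): boolean {
  return (
    capabilities.includes("thinking") || capabilities.includes("reasoning")
  );
}
```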


@@ -203,6 +203,9 @@ export class OpenAiApi extends AiApi {
         { role: "user" as const, content: options.prompt },
       ],
       stream: true,
+      ...(typeof model.params.reasoning === "string"
+        ? { reasoning_effort: model.params.reasoning as "low" | "medium" | "high" }
+        : {}),
     });

     let accumulatedResponse = "";
@@ -316,6 +319,9 @@ export class OpenAiApi extends AiApi {
       messages,
       tools,
       stream: true,
+      ...(typeof model.params.reasoning === "string"
+        ? { reasoning_effort: model.params.reasoning as "low" | "medium" | "high" }
+        : {}),
     });

     let accumulatedResponse = "";
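The conditional spread added in both hunks can be factored into a helper to show the intent. A sketch only (`reasoningExtras` is a hypothetical name; the commit inlines the spread at each call site):

```typescript
// Only string-valued reasoning settings become OpenAI's `reasoning_effort`
// request field. Boolean values (reasoning on/off) contribute nothing,
// since the Chat Completions API has no "off" effort value.
function reasoningExtras(
  reasoning: boolean | "low" | "medium" | "high",
): { reasoning_effort?: "low" | "medium" | "high" } {
  return typeof reasoning === "string" ? { reasoning_effort: reasoning } : {};
}
```

Spreading the result (`...reasoningExtras(model.params.reasoning)`) keeps the field entirely absent, rather than `undefined`, when no effort is set.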


@@ -22,6 +22,8 @@ export interface IChatSessionPin {
   content: string;
 }

+export type ReasoningEffort = "off" | "low" | "medium" | "high";
+
 export interface IChatSession {
   _id: GadgetId;
   createdAt: Date;
@@ -32,6 +34,7 @@ export interface IChatSession {
   mode: ChatSessionMode;
   provider: IAiProvider | GadgetId;
   selectedModel: string;
+  reasoningEffort?: ReasoningEffort;
   stats: {
     turnCount: number;
     toolCallCount: number;


@@ -9,7 +9,7 @@ import type { IProject } from "./project.js";
 import type { IChatSession } from "./chat-session.js";
 import type { IAiProvider } from "./ai-provider.js";

-import { ChatSessionMode } from "./chat-session.js";
+import { ChatSessionMode, ReasoningEffort } from "./chat-session.js";
 import { GadgetId } from "../lib/gadget-id.ts";

 export enum ChatTurnStatus {
@@ -80,6 +80,7 @@ export interface IChatTurn {
   session: IChatSession | GadgetId;
   provider: IAiProvider | GadgetId;
   llm: string; // id/name of the model used to process the prompt
+  reasoningEffort?: ReasoningEffort;
   mode: ChatSessionMode;
   status: ChatTurnStatus;
   prompts: IChatTurnPrompts;


@@ -112,10 +112,7 @@ export interface GadgetCodeConfig {
       audios?: string;
     };
   };
-  user?: {
-    signupEnabled?: boolean;
-  };
 }

 /**
  * Gadget Drone Worker Configuration