WIP - using deep agent to create dog using workflow

This commit is contained in:
Francisco Gaona
2026-01-17 22:51:53 +01:00
parent ded413b99b
commit de65aa4025
26 changed files with 2288 additions and 193 deletions

View File

@@ -0,0 +1,324 @@
# AI Process Builder + Chat Orchestrator
A tenant-scoped AI process automation system: admins design workflows in a React Flow UI that compile to LangGraph state machines, and end users execute them through a Deep Agent chat orchestrator with deterministic, audited execution.
## Architecture Overview
### Backend Components
#### 1. **Deep Agent Orchestrator** ([deep-agent.orchestrator.ts](backend/src/ai-processes/deep-agent.orchestrator.ts))
- Uses LangChain/OpenAI to intelligently select processes
- Extracts structured inputs from natural language
- Generates friendly confirmation messages
- Four-step workflow: discover → select → extract → execute
#### 2. **Graph Compiler** ([ai-processes.compiler.ts](backend/src/ai-processes/ai-processes.compiler.ts))
- Validates ReactFlow JSON graphs (Start/End nodes, reachability, cycles)
- Compiles to LangGraph-compatible state machines
- Validates tool allowlist and JSON schemas (Ajv)
- Persists compiled artifact for versioned execution
#### 3. **Runtime Executor** ([ai-processes.runner.ts](backend/src/ai-processes/ai-processes.runner.ts))
- Executes compiled graphs deterministically
- Implements 4 node types: LLMDecisionNode, ToolNode, HumanInputNode, End
- Handles conditional edges via jsonlogic
- Emits real-time events for streaming updates
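The conditional-edge mechanics can be sketched as below. `evalCondition` and `nextNode` are illustrative names (the real runner delegates to the json-logic-js package rather than this hand-rolled `==` evaluator), but the edge shape matches the demo process seed:

```typescript
// Minimal sketch: picking the next node from jsonlogic-style edge conditions.
type Condition = { '==': [{ var: string }, unknown] };
interface Edge { id: string; source: string; target: string; condition?: Condition }

// Tiny evaluator covering only the `==`/`var` shape used by the demo process.
function evalCondition(cond: Condition, state: Record<string, unknown>): boolean {
  const [lhs, rhs] = cond['=='];
  return state[lhs.var] === rhs;
}

// First outgoing edge whose condition passes wins; unconditional edges always pass.
function nextNode(edges: Edge[], current: string, state: Record<string, unknown>): string | undefined {
  const outgoing = edges.filter((e) => e.source === current);
  return outgoing.find((e) => !e.condition || evalCondition(e.condition, state))?.target;
}

const edges: Edge[] = [
  { id: 'e3', source: 'find_account', target: 'find_contact', condition: { '==': [{ var: 'accountFound' }, true] } },
  { id: 'e4', source: 'find_account', target: 'create_account', condition: { '==': [{ var: 'accountFound' }, false] } },
];

console.log(nextNode(edges, 'find_account', { accountFound: false })); // create_account
```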
#### 4. **Tool Registry** ([tools/tool-registry.ts](backend/src/ai-processes/tools/tool-registry.ts))
- Tenant-scoped tool allowlist (database-backed via AiToolConfig)
- Demo tools wrapping ObjectService (findAccount, createAccount, etc.)
- Context injection (tenantId, userId, knex) for secure execution
#### 5. **Orchestrator Service** ([ai-processes.orchestrator.service.ts](backend/src/ai-processes/ai-processes.orchestrator.service.ts))
- Integrates Deep Agent for process selection
- Falls back to standard AI assistant when no processes configured
- Manages chat sessions and message history
- Streams execution events via SSE
### Frontend Components
#### 1. **AIChatBar** ([components/AIChatBar.vue](frontend/components/AIChatBar.vue))
- Updated to call `/ai-processes/chat/messages` endpoint
- SSE event stream consumer for real-time updates
- Displays process selection, node execution, tool calls
- Handles NEED_INPUT events for human-in-the-loop
#### 2. **Process Management UI** ([pages/ai-processes/](frontend/pages/ai-processes/))
- List view: displays all processes with versions
- Editor view: React Flow integration via iframe + postMessage
- Test runner for quick validation
#### 3. **React Flow Editor** ([ai-processes-editor/src/App.tsx](frontend/ai-processes-editor/src/App.tsx))
- Node palette: Start, LLMDecisionNode, ToolNode, HumanInputNode, End
- Visual graph designer with drag-drop
- Auto-saves to parent window via postMessage
- Loads existing graphs for editing
### Data Models (Objection.js)
```typescript
AiProcess
├── id, tenantId, name, description, latestVersion
└── relations: versions[], runs[]
AiProcessVersion
├── id, tenantId, processId, version
├── graphJson (ReactFlow definition)
└── compiledJson (LangGraph artifact)
AiProcessRun
├── id, tenantId, processId, version, status
├── inputJson, outputJson, errorJson, stateJson
└── currentNodeId (for resume)
AiChatSession
├── id, tenantId, userId
└── relations: messages[]
AiChatMessage
├── id, sessionId, role, content
└── timestamps
AiAuditEvent
├── id, tenantId, runId, eventType
└── payloadJson (full event data)
AiToolConfig
├── id, tenantId, toolName, enabled
└── configJson (tool-specific settings)
```
## Demo Process: Register New Pet
A complete workflow demonstrating conditional logic and tool orchestration:
1. **Extract Info** (LLMDecisionNode)
- Parses user message for pet + owner details
- Outputs structured JSON with validation
2. **Find/Create Account** (Conditional)
- Searches for existing account by name/email
- Creates new account if not found
- Merges results into state
3. **Find/Create Contact** (Conditional)
- Searches for existing contact under account
- Creates new contact if not found
4. **Create Pet** (ToolNode)
- Inserts pet record linked to contact
- Returns pet ID
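Each ToolNode in this workflow renders an `argsTemplate` against run state (`{{state.key}}` placeholders) and merges the tool result back via `outputMapping`. A hedged sketch of that plumbing, with illustrative helper names rather than the actual runner code:

```typescript
// Render {{state.key}} placeholders in a ToolNode's argsTemplate against run state.
function renderArgs(template: Record<string, string>, state: Record<string, unknown>): Record<string, unknown> {
  const rendered: Record<string, unknown> = {};
  for (const [arg, tpl] of Object.entries(template)) {
    rendered[arg] = tpl.replace(/\{\{state\.(\w+)\}\}/g, (_, key) => String(state[key] ?? ''));
  }
  return rendered;
}

// Copy tool-result fields into state under the names given by outputMapping.
function applyOutputMapping(
  mapping: Record<string, string>,
  toolResult: Record<string, unknown>,
  state: Record<string, unknown>,
): Record<string, unknown> {
  for (const [resultKey, stateKey] of Object.entries(mapping)) {
    state[stateKey] = toolResult[resultKey];
  }
  return state;
}

const args = renderArgs(
  { name: '{{state.petName}}', ownerId: '{{state.contactId}}' },
  { petName: 'Max', contactId: 'c_1' },
);
console.log(args); // { name: 'Max', ownerId: 'c_1' }
```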
### Seed the Demo Process
```bash
cd backend
npm run migrate:tenant -- <tenant-slug>
npm run seed:demo-process -- <tenant-slug>
```
### Test the Demo Process
1. Navigate to `/ai-processes` in your tenant subdomain
2. Open "Register New Pet" process
3. Click "Test Run" or use the chat bar:
```
User: "Register a dog named Max, breed Golden Retriever, age 3,
owned by John Smith, email john@example.com"
Agent: 🔄 Selected process: Register New Pet
I'll register Max (Golden Retriever, 3 years old) for John Smith.
⚙️ Executing step: Extract Info
✓ Extracted pet details
🔧 Using tool: findAccount
Account not found, creating new account
🔧 Using tool: createAccount
✓ Created account for John Smith
🔧 Using tool: findContact
Contact not found, creating new contact
🔧 Using tool: createContact
✓ Created contact: John Smith
🔧 Using tool: createPet
✓ Created pet: Max (ID: pet_1234567890)
✅ Process completed successfully!
```
## API Endpoints
### Process Management (Admin)
```typescript
GET /tenants/:tenantId/ai-processes
POST /tenants/:tenantId/ai-processes
GET /tenants/:tenantId/ai-processes/:id
POST /tenants/:tenantId/ai-processes/:id/versions
GET /tenants/:tenantId/ai-processes/:id/versions
POST /tenants/:tenantId/ai-processes/:id/runs
POST /tenants/:tenantId/ai-processes/runs/:runId/resume
```
### Chat Orchestrator (End User)
```typescript
POST /tenants/:tenantId/ai-processes/chat/messages
SSE /tenants/:tenantId/ai-processes/stream?sessionId=xxx
```
## Event Stream Types
```typescript
type StreamEvent =
| { type: 'agent_started' }
| { type: 'processes_listed', data: { count: number } }
| { type: 'process_selected', processId: string, version: number }
| { type: 'agent_message', data: { message: string } }
| { type: 'node_started', nodeId: string }
| { type: 'node_completed', nodeId: string }
| { type: 'tool_called', toolName: string, nodeId: string }
| { type: 'llm_decision', nodeId: string, data: any }
| { type: 'need_input', data: { prompt: string, schema: JSONSchema } }
| { type: 'final', data: { output: any } }
| { type: 'error', data: { error: string } }
```
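On the client side, AIChatBar receives these events as standard SSE frames (`data: <json>` lines separated by blank lines). A minimal sketch of parsing a raw chunk into `StreamEvent` objects — the parsing helper is an assumption, not the actual component code:

```typescript
// Parse raw SSE text into StreamEvent objects (shapes follow the union above).
interface StreamEvent { type: string; [key: string]: unknown }

function parseSseChunk(chunk: string): StreamEvent[] {
  return chunk
    .split('\n\n')                                // SSE frames end with a blank line
    .flatMap((frame) => frame.split('\n'))
    .filter((line) => line.startsWith('data: '))
    .map((line) => JSON.parse(line.slice('data: '.length)) as StreamEvent);
}

const events = parseSseChunk(
  'data: {"type":"node_started","nodeId":"extract_info"}\n\n' +
  'data: {"type":"tool_called","toolName":"findAccount","nodeId":"find_account"}\n\n',
);
console.log(events.map((e) => e.type)); // ['node_started', 'tool_called']
```

In the browser, `EventSource` does this framing for you; a hand-rolled parser like this is only needed when consuming the stream via `fetch`.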
## Security & Guardrails
### 1. **Tenancy Isolation**
- All queries filtered by `tenantId` (enforced in Objection models)
- Tool context includes tenant scope
- Database-per-tenant architecture (inherited from platform)
### 2. **Tool Allowlist**
- Two-level validation:
- Tenant-level: `AiToolConfig` table (enabled tools per tenant)
- Compile-time: validates toolName exists in registry
- Runtime check before tool execution
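The runtime check can be sketched as follows. `toolRegistry` and `AiToolConfigRow` are illustrative stand-ins for the registry and the `AiToolConfig` rows described above, not the actual code:

```typescript
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

// Level 1: the in-process registry (compile-time validation mirrors this lookup).
const toolRegistry = new Map<string, ToolHandler>([
  ['findAccount', async () => ({ found: false })],
]);

// Level 2: tenant-scoped enablement, loaded from the ai_tool_configs table.
interface AiToolConfigRow { toolName: string; enabled: boolean }

function assertToolAllowed(toolName: string, tenantConfig: AiToolConfigRow[]): ToolHandler {
  const handler = toolRegistry.get(toolName);
  if (!handler) throw new Error(`Unknown tool: ${toolName}`);
  const row = tenantConfig.find((c) => c.toolName === toolName);
  if (!row?.enabled) throw new Error(`Tool not enabled for tenant: ${toolName}`);
  return handler;
}
```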
### 3. **Schema Validation**
- LLMDecisionNode output validated against JSON Schema (Ajv)
- HumanInputNode input validated before resume
- Graph structure validated at compile time
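In the real code Ajv performs this validation; as a self-contained illustration, here is a tiny stand-in covering only the `type: object` / `properties` / `required` subset that the demo process's output schemas use:

```typescript
// Minimal validator for the schema subset used by the demo (Ajv handles the full spec).
interface MiniSchema {
  type: 'object';
  properties: Record<string, { type: string }>;
  required?: string[];
}

function validateOutput(schema: MiniSchema, data: Record<string, unknown>): string[] {
  const errors: string[] = [];
  for (const key of schema.required ?? []) {
    if (!(key in data)) errors.push(`missing required field: ${key}`);
  }
  for (const [key, value] of Object.entries(data)) {
    const prop = schema.properties[key];
    if (prop && typeof value !== prop.type) errors.push(`${key}: expected ${prop.type}`);
  }
  return errors;
}

const petSchema: MiniSchema = {
  type: 'object',
  properties: { petName: { type: 'string' }, species: { type: 'string' } },
  required: ['petName', 'species'],
};
console.log(validateOutput(petSchema, { petName: 'Max' })); // ['missing required field: species']
```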
### 4. **Audit Trail**
- Every node execution logged to `ai_audit_events`
- Includes: tool calls, LLM decisions, state mutations, errors
- Queryable for compliance dashboards
### 5. **Versioning**
- Immutable process versions (create-only)
- Runs reference specific version number
- Graph definition + compiled artifact stored together
## Running the System
### 1. **Run Migrations**
```bash
cd backend
npm run migrate:tenant -- tenant1
```
### 2. **Seed Demo Data**
```bash
npm run seed:demo-process -- tenant1
```
### 3. **Start Backend**
```bash
npm run start:dev
```
### 4. **Build Editor (if needed)**
```bash
cd frontend/ai-processes-editor
npm install
npm run build
```
### 5. **Start Frontend**
```bash
cd frontend
npm run dev
```
### 6. **Access UI**
- Admin UI: `http://tenant1.localhost:3001/ai-processes`
- Chat UI: Available in bottom drawer on any page (⌘K to toggle)
## Extension Points
### Adding New Node Types
1. Define type in [ai-processes.types.ts](backend/src/ai-processes/ai-processes.types.ts)
2. Add schema validation in [ai-processes.schemas.ts](backend/src/ai-processes/ai-processes.schemas.ts)
3. Implement executor in [ai-processes.runner.ts](backend/src/ai-processes/ai-processes.runner.ts)
4. Add UI component in React Flow editor
### Adding New Tools
1. Implement handler in [tools/demo-tools.ts](backend/src/ai-processes/tools/demo-tools.ts)
2. Register in `demoTools` export
3. Add to tenant allowlist via UI or seed script
4. Document input/output schema
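A hypothetical new tool following the steps above might look like this. The `ToolContext` fields come from the context-injection description earlier (tenantId, userId, knex), but the exact handler interface in demo-tools.ts may differ:

```typescript
// Hypothetical tool definition; the real demo-tools.ts interface may differ.
interface ToolContext { tenantId: string; userId: string; knex: unknown }

const findPetByName = {
  name: 'findPetByName',
  description: 'Look up a pet by name within the tenant',
  inputSchema: {
    type: 'object',
    properties: { name: { type: 'string' } },
    required: ['name'],
  },
  handler: async (args: { name: string }, ctx: ToolContext) => {
    // A real implementation would query via ctx.knex, scoped to ctx.tenantId.
    return { found: false, petId: null };
  },
};
```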
### Custom LLM Decision Logic
Override `llmDecision` callback in [ai-processes.service.ts](backend/src/ai-processes/ai-processes.service.ts):
```typescript
// renderTemplate, callOpenAI and validateAgainstSchema are helpers provided
// by the surrounding service; shown here only to illustrate the callback shape.
llmDecision: async (node, state) => {
  // Render the node's prompt template against the current run state
  const prompt = renderTemplate(node.data.promptTemplate, state);
  // Call the configured model, then validate structured output against the node's schema (Ajv)
  const response = await callOpenAI(prompt, node.data.model);
  return validateAgainstSchema(response, node.data.outputSchema);
}
```
## Troubleshooting
### Process not appearing in chat
- Check: `npm run seed:demo-process` completed successfully
- Verify: Process exists in database (`select * from ai_processes`)
- Check: Tools enabled (`select * from ai_tool_configs`)
### Graph validation errors
- Ensure exactly one Start node
- Ensure at least one End node
- Check all edges reference valid node IDs
- Verify tool names match registered tools
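The structural checks above can be sketched as a small validator (illustrative only, not the actual compiler code):

```typescript
// Sketch of the compile-time structural checks listed above.
interface GraphNode { id: string; type: string }
interface GraphEdge { source: string; target: string }

function validateGraph(nodes: GraphNode[], edges: GraphEdge[]): string[] {
  const errors: string[] = [];
  const starts = nodes.filter((n) => n.type === 'Start');
  if (starts.length !== 1) errors.push(`expected exactly one Start node, found ${starts.length}`);
  if (!nodes.some((n) => n.type === 'End')) errors.push('at least one End node is required');
  const ids = new Set(nodes.map((n) => n.id));
  for (const e of edges) {
    if (!ids.has(e.source) || !ids.has(e.target)) {
      errors.push(`edge references unknown node: ${e.source} -> ${e.target}`);
    }
  }
  return errors;
}
```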
### SSE stream not working
- Check CORS settings for subdomain routing
- Verify `sessionId` returned from initial message
- Check browser console for connection errors
- Fallback: use polling endpoint (TODO: implement)
## Next Steps
1. **Enhanced Input Extraction**: Use Deep Agent to extract required fields per process
2. **Visual Schema Builder**: UI for JSON Schema creation (drag-drop fields)
3. **Conditional Edge Builder**: Visual jsonlogic editor
4. **Process Analytics**: Dashboard showing run success rates, avg duration
5. **Human-in-Loop UI**: Dynamic form renderer for HumanInputNode
6. **Process Marketplace**: Share processes across tenants (with permissions)
7. **Python Microservice**: Optional Python runtime for native LangGraph support
## License
MIT

View File

@@ -0,0 +1,115 @@
-- Insert demo AI process directly
SET @process_id = '2d883482-4df0-44d7-b6cf-8541b482afe4';
SET @version_id = '437b1e72-405e-4862-a8bc-f368e554b482';
SET @user_id = 'system';
-- Insert process
INSERT INTO ai_processes (id, name, created_by)
VALUES (@process_id, 'Register New Pet', @user_id);
-- Insert process version with compiled graph
INSERT INTO ai_process_versions (id, process_id, version, graph_json, compiled_json, created_by)
VALUES (
@version_id,
@process_id,
1,
'{}',
JSON_OBJECT(
'id', 'register_new_pet',
'name', 'Register New Pet',
'description', 'Complete pet registration workflow',
'allowCycles', false,
'startNodeId', 'start',
'endNodeIds', JSON_ARRAY('end'),
'maxIterations', 50,
'nodes', JSON_ARRAY(
JSON_OBJECT('id', 'start', 'type', 'Start', 'data', JSON_OBJECT('label', 'Start')),
JSON_OBJECT('id', 'extract_info', 'type', 'LLMDecisionNode', 'data', JSON_OBJECT(
'label', 'Extract Info',
'promptTemplate', 'Extract: petName, species, ownerFirstName, ownerLastName, ownerEmail, accountName from: {{state.message}}',
'inputKeys', JSON_ARRAY('message'),
'outputSchema', JSON_OBJECT(
'type', 'object',
'properties', JSON_OBJECT(
'petName', JSON_OBJECT('type', 'string'),
'species', JSON_OBJECT('type', 'string'),
'ownerFirstName', JSON_OBJECT('type', 'string'),
'ownerLastName', JSON_OBJECT('type', 'string'),
'ownerEmail', JSON_OBJECT('type', 'string'),
'accountName', JSON_OBJECT('type', 'string')
),
'required', JSON_ARRAY('petName', 'species', 'ownerFirstName', 'ownerLastName')
)
)),
JSON_OBJECT('id', 'find_account', 'type', 'ToolNode', 'data', JSON_OBJECT(
'label', 'Find Account',
'toolName', 'findAccount',
'argsTemplate', JSON_OBJECT('name', '{{state.accountName}}', 'email', '{{state.ownerEmail}}'),
'outputMapping', JSON_OBJECT('found', 'accountFound', 'accountId', 'accountId')
)),
JSON_OBJECT('id', 'create_account', 'type', 'ToolNode', 'data', JSON_OBJECT(
'label', 'Create Account',
'toolName', 'createAccount',
'argsTemplate', JSON_OBJECT('name', '{{state.accountName}}', 'email', '{{state.ownerEmail}}'),
'outputMapping', JSON_OBJECT('accountId', 'accountId')
)),
JSON_OBJECT('id', 'find_contact', 'type', 'ToolNode', 'data', JSON_OBJECT(
'label', 'Find Contact',
'toolName', 'findContact',
'argsTemplate', JSON_OBJECT(
'firstName', '{{state.ownerFirstName}}',
'lastName', '{{state.ownerLastName}}',
'email', '{{state.ownerEmail}}',
'accountId', '{{state.accountId}}'
),
'outputMapping', JSON_OBJECT('found', 'contactFound', 'contactId', 'contactId')
)),
JSON_OBJECT('id', 'create_contact', 'type', 'ToolNode', 'data', JSON_OBJECT(
'label', 'Create Contact',
'toolName', 'createContact',
'argsTemplate', JSON_OBJECT(
'firstName', '{{state.ownerFirstName}}',
'lastName', '{{state.ownerLastName}}',
'email', '{{state.ownerEmail}}',
'accountId', '{{state.accountId}}'
),
'outputMapping', JSON_OBJECT('contactId', 'contactId')
)),
JSON_OBJECT('id', 'create_pet', 'type', 'ToolNode', 'data', JSON_OBJECT(
'label', 'Create Pet',
'toolName', 'createPet',
'argsTemplate', JSON_OBJECT(
'name', '{{state.petName}}',
'species', '{{state.species}}',
'ownerId', '{{state.contactId}}'
),
'outputMapping', JSON_OBJECT('petId', 'petId')
)),
JSON_OBJECT('id', 'end', 'type', 'End', 'data', JSON_OBJECT('label', 'End'))
),
'edges', JSON_ARRAY(
JSON_OBJECT('id', 'e1', 'source', 'start', 'target', 'extract_info'),
JSON_OBJECT('id', 'e2', 'source', 'extract_info', 'target', 'find_account'),
JSON_OBJECT('id', 'e3', 'source', 'find_account', 'target', 'find_contact', 'condition', JSON_OBJECT('==', JSON_ARRAY(JSON_OBJECT('var', 'accountFound'), true))),
JSON_OBJECT('id', 'e4', 'source', 'find_account', 'target', 'create_account', 'condition', JSON_OBJECT('==', JSON_ARRAY(JSON_OBJECT('var', 'accountFound'), false))),
JSON_OBJECT('id', 'e5', 'source', 'create_account', 'target', 'find_contact'),
JSON_OBJECT('id', 'e6', 'source', 'find_contact', 'target', 'create_pet', 'condition', JSON_OBJECT('==', JSON_ARRAY(JSON_OBJECT('var', 'contactFound'), true))),
JSON_OBJECT('id', 'e7', 'source', 'find_contact', 'target', 'create_contact', 'condition', JSON_OBJECT('==', JSON_ARRAY(JSON_OBJECT('var', 'contactFound'), false))),
JSON_OBJECT('id', 'e8', 'source', 'create_contact', 'target', 'create_pet'),
JSON_OBJECT('id', 'e9', 'source', 'create_pet', 'target', 'end')
)
),
@user_id
);
-- Insert tool allowlist
INSERT INTO ai_tool_configs (id, tool_name, enabled)
VALUES
(UUID(), 'findAccount', true),
(UUID(), 'createAccount', true),
(UUID(), 'findContact', true),
(UUID(), 'createContact', true),
(UUID(), 'createPet', true)
ON DUPLICATE KEY UPDATE enabled = true;
SELECT 'Demo process inserted successfully!' as result;

View File

@@ -1,19 +1,16 @@
 exports.up = async function (knex) {
   await knex.schema.createTable('ai_processes', (table) => {
     table.uuid('id').primary();
-    table.string('tenant_id').notNullable();
     table.string('name').notNullable();
     table.text('description');
     table.integer('latest_version').notNullable().defaultTo(1);
     table.string('created_by').notNullable();
     table.timestamp('created_at').defaultTo(knex.fn.now());
     table.timestamp('updated_at').defaultTo(knex.fn.now());
-    table.index(['tenant_id']);
   });
   await knex.schema.createTable('ai_process_versions', (table) => {
     table.uuid('id').primary();
-    table.string('tenant_id').notNullable();
     table.uuid('process_id').notNullable();
     table.integer('version').notNullable();
     table.json('graph_json').notNullable();
@@ -21,13 +18,11 @@ exports.up = async function (knex) {
     table.string('created_by').notNullable();
     table.timestamp('created_at').defaultTo(knex.fn.now());
     table.unique(['process_id', 'version']);
-    table.index(['tenant_id']);
     table.index(['process_id']);
   });
   await knex.schema.createTable('ai_process_runs', (table) => {
     table.uuid('id').primary();
-    table.string('tenant_id').notNullable();
     table.uuid('process_id').notNullable();
     table.integer('version').notNullable();
     table.string('status').notNullable();
@@ -38,16 +33,13 @@ exports.up = async function (knex) {
     table.string('current_node_id');
     table.timestamp('started_at').defaultTo(knex.fn.now());
     table.timestamp('ended_at');
-    table.index(['tenant_id']);
     table.index(['process_id']);
   });
   await knex.schema.createTable('ai_chat_sessions', (table) => {
     table.uuid('id').primary();
-    table.string('tenant_id').notNullable();
     table.string('user_id').notNullable();
     table.timestamp('created_at').defaultTo(knex.fn.now());
-    table.index(['tenant_id']);
     table.index(['user_id']);
   });
@@ -62,12 +54,10 @@ exports.up = async function (knex) {
   await knex.schema.createTable('ai_audit_events', (table) => {
     table.uuid('id').primary();
-    table.string('tenant_id').notNullable();
     table.uuid('run_id').notNullable();
     table.string('event_type').notNullable();
     table.json('payload_json').notNullable();
     table.timestamp('created_at').defaultTo(knex.fn.now());
-    table.index(['tenant_id']);
     table.index(['run_id']);
   });
 };

View File

@@ -0,0 +1,14 @@
exports.up = async function (knex) {
await knex.schema.createTable('ai_tool_configs', (table) => {
table.uuid('id').primary();
table.string('tool_name').notNullable().unique();
table.boolean('enabled').notNullable().defaultTo(true);
table.json('config_json');
table.timestamp('created_at').defaultTo(knex.fn.now());
table.timestamp('updated_at').defaultTo(knex.fn.now());
});
};
exports.down = async function (knex) {
await knex.schema.dropTableIfExists('ai_tool_configs');
};

View File

@@ -25,11 +25,15 @@
     "@nestjs/serve-static": "^4.0.2",
     "@nestjs/websockets": "^10.4.20",
     "@prisma/client": "^5.8.0",
+    "@types/json-logic-js": "^2.0.8",
+    "ajv": "^8.17.1",
+    "ajv-formats": "^3.0.1",
     "bcrypt": "^5.1.1",
     "bullmq": "^5.1.0",
     "class-transformer": "^0.5.1",
     "class-validator": "^0.14.1",
     "ioredis": "^5.3.2",
+    "json-logic-js": "^2.0.5",
     "knex": "^3.1.0",
     "langchain": "^1.2.7",
     "mysql2": "^3.15.3",
@@ -96,6 +100,41 @@
       }
     }
   },
"node_modules/@angular-devkit/core/node_modules/ajv": {
"version": "8.12.0",
"resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
"integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
"dev": true,
"license": "MIT",
"dependencies": {
"fast-deep-equal": "^3.1.1",
"json-schema-traverse": "^1.0.0",
"require-from-string": "^2.0.2",
"uri-js": "^4.2.2"
},
"funding": {
"type": "github",
"url": "https://github.com/sponsors/epoberezkin"
}
},
"node_modules/@angular-devkit/core/node_modules/ajv-formats": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
"integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
"dev": true,
"license": "MIT",
"dependencies": {
"ajv": "^8.0.0"
},
"peerDependencies": {
"ajv": "^8.0.0"
},
"peerDependenciesMeta": {
"ajv": {
"optional": true
}
}
},
   "node_modules/@angular-devkit/core/node_modules/rxjs": {
     "version": "7.8.1",
     "resolved": "https://registry.npmjs.org/rxjs/-/rxjs-7.8.1.tgz",
@@ -929,6 +968,23 @@
       "fast-uri": "^2.0.0"
     }
   },
"node_modules/@fastify/ajv-compiler/node_modules/ajv-formats": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
"integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
"license": "MIT",
"dependencies": {
"ajv": "^8.0.0"
},
"peerDependencies": {
"ajv": "^8.0.0"
},
"peerDependenciesMeta": {
"ajv": {
"optional": true
}
}
},
   "node_modules/@fastify/cors": {
     "version": "9.0.1",
     "resolved": "https://registry.npmjs.org/@fastify/cors/-/cors-9.0.1.tgz",
@@ -2917,6 +2973,12 @@
"pretty-format": "^29.0.0" "pretty-format": "^29.0.0"
} }
}, },
"node_modules/@types/json-logic-js": {
"version": "2.0.8",
"resolved": "https://registry.npmjs.org/@types/json-logic-js/-/json-logic-js-2.0.8.tgz",
"integrity": "sha512-WgNsDPuTPKYXl0Jh0IfoCoJoAGGYZt5qzpmjuLSEg7r0cKp/kWtWp0HAsVepyPSPyXiHo6uXp/B/kW/2J1fa2Q==",
"license": "MIT"
},
   "node_modules/@types/json-schema": {
     "version": "7.0.15",
     "resolved": "https://registry.npmjs.org/@types/json-schema/-/json-schema-7.0.15.tgz",
@@ -3574,15 +3636,15 @@
     }
   },
   "node_modules/ajv": {
-    "version": "8.12.0",
-    "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.12.0.tgz",
-    "integrity": "sha512-sRu1kpcO9yLtYxBKvqfTeh9KzZEwO3STyX1HT+4CaDzC6HpTGYhIhPIzj9XuKU7KYDwnaeh5hcOwjy1QuJzBPA==",
+    "version": "8.17.1",
+    "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.17.1.tgz",
+    "integrity": "sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g==",
     "license": "MIT",
     "dependencies": {
-      "fast-deep-equal": "^3.1.1",
+      "fast-deep-equal": "^3.1.3",
+      "fast-uri": "^3.0.1",
       "json-schema-traverse": "^1.0.0",
-      "require-from-string": "^2.0.2",
-      "uri-js": "^4.2.2"
+      "require-from-string": "^2.0.2"
     },
     "funding": {
       "type": "github",
@@ -3590,9 +3652,9 @@
     }
   },
   "node_modules/ajv-formats": {
-    "version": "2.1.1",
-    "resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
-    "integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
+    "version": "3.0.1",
+    "resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-3.0.1.tgz",
+    "integrity": "sha512-8iUql50EUR+uUcdRQ3HDqa6EVyo3docL8g5WJ3FNcWmu62IbkGUue/pEyLBW8VGKKucTPgqeks4fIU1DA4yowQ==",
     "license": "MIT",
     "dependencies": {
       "ajv": "^8.0.0"
@@ -3619,6 +3681,22 @@
       "ajv": "^8.8.2"
     }
   },
"node_modules/ajv/node_modules/fast-uri": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/fast-uri/-/fast-uri-3.1.0.tgz",
"integrity": "sha512-iPeeDKJSWf4IEOasVVrknXpaBV0IApz/gp7S2bb7Z4Lljbl2MGJRqInZiUrQwV16cpzw/D3S5j5Julj/gT52AA==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/fastify"
},
{
"type": "opencollective",
"url": "https://opencollective.com/fastify"
}
],
"license": "BSD-3-Clause"
},
   "node_modules/ansi-colors": {
     "version": "4.1.3",
     "resolved": "https://registry.npmjs.org/ansi-colors/-/ansi-colors-4.1.3.tgz",
@@ -5549,23 +5627,6 @@
       "rfdc": "^1.2.0"
     }
   },
"node_modules/fast-json-stringify/node_modules/ajv-formats": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-3.0.1.tgz",
"integrity": "sha512-8iUql50EUR+uUcdRQ3HDqa6EVyo3docL8g5WJ3FNcWmu62IbkGUue/pEyLBW8VGKKucTPgqeks4fIU1DA4yowQ==",
"license": "MIT",
"dependencies": {
"ajv": "^8.0.0"
},
"peerDependencies": {
"ajv": "^8.0.0"
},
"peerDependenciesMeta": {
"ajv": {
"optional": true
}
}
},
   "node_modules/fast-levenshtein": {
     "version": "2.0.6",
     "resolved": "https://registry.npmjs.org/fast-levenshtein/-/fast-levenshtein-2.0.6.tgz",
@@ -7593,6 +7654,12 @@
"dev": true, "dev": true,
"license": "MIT" "license": "MIT"
}, },
"node_modules/json-logic-js": {
"version": "2.0.5",
"resolved": "https://registry.npmjs.org/json-logic-js/-/json-logic-js-2.0.5.tgz",
"integrity": "sha512-rTT2+lqcuUmj4DgWfmzupZqQDA64AdmYqizzMPWj3DxGdfFNsxPpcNVSaTj4l8W2tG/+hg7/mQhxjU3aPacO6g==",
"license": "MIT"
},
   "node_modules/json-parse-even-better-errors": {
     "version": "2.3.1",
     "resolved": "https://registry.npmjs.org/json-parse-even-better-errors/-/json-parse-even-better-errors-2.3.1.tgz",
@@ -8655,37 +8722,22 @@
       "knex": ">=1.0.1"
     }
   },
-  "node_modules/objection/node_modules/ajv": {
-    "version": "8.17.1",
-    "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.17.1.tgz",
-    "integrity": "sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g==",
-    "license": "MIT",
-    "dependencies": {
-      "fast-deep-equal": "^3.1.3",
-      "fast-uri": "^3.0.1",
-      "json-schema-traverse": "^1.0.0",
-      "require-from-string": "^2.0.2"
-    },
-    "funding": {
-      "type": "github",
-      "url": "https://github.com/sponsors/epoberezkin"
-    }
-  },
-  "node_modules/objection/node_modules/fast-uri": {
-    "version": "3.1.0",
-    "resolved": "https://registry.npmjs.org/fast-uri/-/fast-uri-3.1.0.tgz",
-    "integrity": "sha512-iPeeDKJSWf4IEOasVVrknXpaBV0IApz/gp7S2bb7Z4Lljbl2MGJRqInZiUrQwV16cpzw/D3S5j5Julj/gT52AA==",
-    "funding": [
-      {
-        "type": "github",
-        "url": "https://github.com/sponsors/fastify"
-      },
-      {
-        "type": "opencollective",
-        "url": "https://opencollective.com/fastify"
-      }
-    ],
-    "license": "BSD-3-Clause"
-  },
+  "node_modules/objection/node_modules/ajv-formats": {
+    "version": "2.1.1",
+    "resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
+    "integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
+    "license": "MIT",
+    "dependencies": {
+      "ajv": "^8.0.0"
+    },
+    "peerDependencies": {
+      "ajv": "^8.0.0"
+    },
+    "peerDependenciesMeta": {
+      "ajv": {
+        "optional": true
+      }
+    }
+  },
   "node_modules/obliterator": {
     "version": "2.0.5",
@@ -9347,6 +9399,7 @@
     "version": "2.3.1",
     "resolved": "https://registry.npmjs.org/punycode/-/punycode-2.3.1.tgz",
     "integrity": "sha512-vYt7UD1U9Wg6138shLtLOvdAu+8DsC/ilFtEVHcH+wydcSpNE20AfSOduf6MkRFahL5FY7X1oU7nKVZFtfq8Fg==",
+    "dev": true,
     "license": "MIT",
     "engines": {
       "node": ">=6"
@@ -10482,6 +10535,24 @@
} }
} }
}, },
"node_modules/terser-webpack-plugin/node_modules/ajv-formats": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
"integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
"dev": true,
"license": "MIT",
"dependencies": {
"ajv": "^8.0.0"
},
"peerDependencies": {
"ajv": "^8.0.0"
},
"peerDependenciesMeta": {
"ajv": {
"optional": true
}
}
},
   "node_modules/terser-webpack-plugin/node_modules/jest-worker": {
     "version": "27.5.1",
     "resolved": "https://registry.npmjs.org/jest-worker/-/jest-worker-27.5.1.tgz",
@@ -11057,6 +11128,7 @@
     "version": "4.4.1",
     "resolved": "https://registry.npmjs.org/uri-js/-/uri-js-4.4.1.tgz",
     "integrity": "sha512-7rKUyy33Q1yc98pQ1DAmLtwX109F7TIfWlW1Ydo8Wl1ii1SeHieeh0HHfPeL2fMXK6z0s8ecKs9frCuLJvndBg==",
+    "dev": true,
     "license": "BSD-2-Clause",
     "dependencies": {
       "punycode": "^2.1.0"
@@ -11240,6 +11312,25 @@
"node": ">=10.13.0" "node": ">=10.13.0"
} }
}, },
"node_modules/webpack/node_modules/ajv-formats": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-2.1.1.tgz",
"integrity": "sha512-Wx0Kx52hxE7C18hkMEggYlEifqWZtYaRgouJor+WMdPnQyEK13vgEWyVNup7SoeeoLMsr4kf5h6dOW11I15MUA==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"ajv": "^8.0.0"
},
"peerDependencies": {
"ajv": "^8.0.0"
},
"peerDependenciesMeta": {
"ajv": {
"optional": true
}
}
},
   "node_modules/webpack/node_modules/es-module-lexer": {
     "version": "2.0.0",
     "resolved": "https://registry.npmjs.org/es-module-lexer/-/es-module-lexer-2.0.0.tgz",

View File

@@ -23,7 +23,8 @@
     "migrate:rollback": "knex migrate:rollback --knexfile=knexfile.js",
     "migrate:status": "ts-node -r tsconfig-paths/register scripts/check-migration-status.ts",
     "migrate:tenant": "ts-node -r tsconfig-paths/register scripts/migrate-tenant.ts",
-    "migrate:all-tenants": "ts-node -r tsconfig-paths/register scripts/migrate-all-tenants.ts"
+    "migrate:all-tenants": "ts-node -r tsconfig-paths/register scripts/migrate-all-tenants.ts",
+    "seed:demo-process": "ts-node -r tsconfig-paths/register scripts/seed-demo-process.ts"
   },
   "dependencies": {
     "@casl/ability": "^6.7.5",
@@ -42,11 +43,15 @@
     "@nestjs/serve-static": "^4.0.2",
     "@nestjs/websockets": "^10.4.20",
     "@prisma/client": "^5.8.0",
+    "@types/json-logic-js": "^2.0.8",
+    "ajv": "^8.17.1",
+    "ajv-formats": "^3.0.1",
     "bcrypt": "^5.1.1",
     "bullmq": "^5.1.0",
     "class-transformer": "^0.5.1",
     "class-validator": "^0.14.1",
     "ioredis": "^5.3.2",
+    "json-logic-js": "^2.0.5",
     "knex": "^3.1.0",
     "langchain": "^1.2.7",
     "mysql2": "^3.15.3",
@@ -54,9 +59,6 @@
"openai": "^6.15.0", "openai": "^6.15.0",
"passport": "^0.7.0", "passport": "^0.7.0",
"passport-jwt": "^4.0.1", "passport-jwt": "^4.0.1",
"ajv": "^8.17.1",
"ajv-formats": "^3.0.1",
"json-logic-js": "^2.0.5",
"reflect-metadata": "^0.2.1", "reflect-metadata": "^0.2.1",
"rxjs": "^7.8.1", "rxjs": "^7.8.1",
"socket.io": "^4.8.3", "socket.io": "^4.8.3",


@@ -0,0 +1,332 @@
import { randomUUID } from 'crypto';
import { AiProcess, AiProcessVersion, AiToolConfig } from '../src/models/ai-process.model';
// Bootstrap NestJS to get proper services
async function getTenantContext(tenantSlugOrId: string) {
const { NestFactory } = await import('@nestjs/core');
const { AppModule } = await import('../src/app.module');
const { TenantDatabaseService } = await import('../src/tenant/tenant-database.service');
// Create app context (without listening)
const app = await NestFactory.createApplicationContext(AppModule, {
logger: false,
});
const tenantDbService = app.get(TenantDatabaseService);
// Resolve tenant ID
const tenantId = await tenantDbService.resolveTenantId(tenantSlugOrId);
// Get proper Knex connection
const knex = await tenantDbService.getTenantKnexById(tenantId);
return { tenantId, knex, app };
}
/**
* Seed script for demo AI Process: Register New Pet
*
* This process demonstrates:
* - Conditional logic (find or create account/contact)
* - Tool usage (findAccount, createAccount, findContact, createContact, createPet)
* - Sequential execution
* - LLM decision nodes with structured JSON output
*
* Usage:
* npm run seed:demo-process -- <tenant-slug-or-id>
*/
const demoProcessGraph = {
id: 'register_new_pet',
name: 'Register New Pet',
description: 'Complete pet registration workflow with account and contact resolution',
allowCycles: false,
nodes: [
{
id: 'start',
type: 'Start',
position: { x: 250, y: 50 },
data: { label: 'Start' },
},
{
id: 'extract_info',
type: 'LLMDecisionNode',
position: { x: 250, y: 150 },
data: {
label: 'Extract Pet Info',
promptTemplate: `Extract pet registration information from the user message.
User message: {{state.message}}
Extract:
- Pet name (required)
- Pet species (required, e.g., "dog", "cat", "bird")
- Pet breed (optional)
- Pet age (optional, as number)
- Owner first name (required)
- Owner last name (required)
- Owner email (optional)
- Owner phone (optional)
- Account/Company name (optional, defaults to owner's full name)
Return JSON with these exact fields.`,
inputKeys: ['message'],
outputSchema: {
type: 'object',
properties: {
petName: { type: 'string' },
species: { type: 'string' },
breed: { type: 'string' },
age: { type: 'number' },
ownerFirstName: { type: 'string' },
ownerLastName: { type: 'string' },
ownerEmail: { type: 'string' },
ownerPhone: { type: 'string' },
accountName: { type: 'string' },
},
required: ['petName', 'species', 'ownerFirstName', 'ownerLastName'],
},
model: { name: 'gpt-4o', temperature: 0 },
},
},
{
id: 'find_account',
type: 'ToolNode',
position: { x: 250, y: 280 },
data: {
label: 'Find Account',
toolName: 'findAccount',
argsTemplate: {
name: '{{state.accountName}}',
email: '{{state.ownerEmail}}',
},
outputMapping: {
found: 'accountFound',
accountId: 'accountId',
},
},
},
{
id: 'create_account',
type: 'ToolNode',
position: { x: 450, y: 380 },
data: {
label: 'Create Account',
toolName: 'createAccount',
argsTemplate: {
name: '{{state.accountName}}',
email: '{{state.ownerEmail}}',
phone: '{{state.ownerPhone}}',
},
outputMapping: {
accountId: 'accountId',
},
},
},
{
id: 'find_contact',
type: 'ToolNode',
position: { x: 250, y: 480 },
data: {
label: 'Find Contact',
toolName: 'findContact',
argsTemplate: {
firstName: '{{state.ownerFirstName}}',
lastName: '{{state.ownerLastName}}',
email: '{{state.ownerEmail}}',
accountId: '{{state.accountId}}',
},
outputMapping: {
found: 'contactFound',
contactId: 'contactId',
},
},
},
{
id: 'create_contact',
type: 'ToolNode',
position: { x: 450, y: 580 },
data: {
label: 'Create Contact',
toolName: 'createContact',
argsTemplate: {
firstName: '{{state.ownerFirstName}}',
lastName: '{{state.ownerLastName}}',
email: '{{state.ownerEmail}}',
phone: '{{state.ownerPhone}}',
accountId: '{{state.accountId}}',
},
outputMapping: {
contactId: 'contactId',
},
},
},
{
id: 'create_pet',
type: 'ToolNode',
position: { x: 250, y: 680 },
data: {
label: 'Create Pet Record',
toolName: 'createPet',
argsTemplate: {
name: '{{state.petName}}',
species: '{{state.species}}',
breed: '{{state.breed}}',
age: '{{state.age}}',
ownerId: '{{state.contactId}}',
},
outputMapping: {
petId: 'petId',
},
},
},
{
id: 'end',
type: 'End',
position: { x: 250, y: 780 },
data: { label: 'End' },
},
],
edges: [
{ id: 'e1', source: 'start', target: 'extract_info' },
{ id: 'e2', source: 'extract_info', target: 'find_account' },
{
id: 'e3',
source: 'find_account',
target: 'find_contact',
condition: { '==': [{ var: 'accountFound' }, true] },
},
{
id: 'e4',
source: 'find_account',
target: 'create_account',
condition: { '==': [{ var: 'accountFound' }, false] },
},
{ id: 'e5', source: 'create_account', target: 'find_contact' },
{
id: 'e6',
source: 'find_contact',
target: 'create_pet',
condition: { '==': [{ var: 'contactFound' }, true] },
},
{
id: 'e7',
source: 'find_contact',
target: 'create_contact',
condition: { '==': [{ var: 'contactFound' }, false] },
},
{ id: 'e8', source: 'create_contact', target: 'create_pet' },
{ id: 'e9', source: 'create_pet', target: 'end' },
],
};
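The branching edges above carry jsonlogic conditions that the runner evaluates against the current run state. A minimal sketch of that routing step, assuming first-matching-edge semantics; the production runner delegates evaluation to the json-logic-js package, and only the `==`/`var` operators used in this graph are implemented here:

```typescript
// Simplified edge routing: follow the first outgoing edge whose jsonlogic
// condition passes against the state (or that has no condition at all).
type State = Record<string, unknown>;
type Condition = { '==': [{ var: string }, unknown] };

interface Edge {
  source: string;
  target: string;
  condition?: Condition;
}

function edgePasses(condition: Condition | undefined, state: State): boolean {
  if (!condition) return true; // unconditional edge always passes
  const [{ var: key }, expected] = condition['=='];
  return state[key] == expected; // jsonlogic '==' is loose equality
}

function nextNode(edges: Edge[], current: string, state: State): string | undefined {
  return edges
    .filter((e) => e.source === current)
    .find((e) => edgePasses(e.condition, state))?.target;
}

// The find_account branch from the graph above:
const edges: Edge[] = [
  { source: 'find_account', target: 'find_contact', condition: { '==': [{ var: 'accountFound' }, true] } },
  { source: 'find_account', target: 'create_account', condition: { '==': [{ var: 'accountFound' }, false] } },
];

console.log(nextNode(edges, 'find_account', { accountFound: true }));  // find_contact
console.log(nextNode(edges, 'find_account', { accountFound: false })); // create_account
```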
const demoTools = [
'findAccount',
'createAccount',
'findContact',
'createContact',
'createPet',
];
async function seedDemoProcess(tenantSlugOrId: string) {
let app;
try {
console.log(`\n🌱 Seeding demo AI process for tenant: ${tenantSlugOrId}\n`);
const context = await getTenantContext(tenantSlugOrId);
const { tenantId, knex, app: nestApp } = context;
app = nestApp;
console.log(`✓ Resolved tenant ID: ${tenantId}`);
console.log(`✓ Connected to tenant database`);
// Check if process already exists
const existing = await AiProcess.query(knex)
.where('name', demoProcessGraph.name)
.first();
if (existing) {
console.log(`⚠ Process "${demoProcessGraph.name}" already exists (ID: ${existing.id})`);
console.log(` To create a new version, update via the UI.`);
return;
}
// Create process in transaction
await knex.transaction(async (trx) => {
const processId = randomUUID();
const userId = 'system'; // System user for seed data
// Create process
await AiProcess.query(trx).insert({
id: processId,
name: demoProcessGraph.name,
description: demoProcessGraph.description,
latestVersion: 1,
createdBy: userId,
});
console.log(`✓ Created process: ${demoProcessGraph.name} (${processId})`);
// Create initial version
// Note: In production, this would call the compiler service
// For seed, we're storing a simplified version
await AiProcessVersion.query(trx).insert({
id: randomUUID(),
processId,
version: 1,
graphJson: demoProcessGraph,
compiledJson: {
graphId: demoProcessGraph.id,
version: 1,
nodes: demoProcessGraph.nodes,
edges: demoProcessGraph.edges,
startNodeId: 'start',
endNodeIds: ['end'],
adjacency: {},
},
createdBy: userId,
});
console.log(`✓ Created process version 1`);
// Enable demo tools for tenant
for (const toolName of demoTools) {
const existingTool = await AiToolConfig.query(trx)
.where('tool_name', toolName)
.first();
if (!existingTool) {
await AiToolConfig.query(trx).insert({
id: randomUUID(),
toolName,
enabled: true,
});
console.log(`✓ Enabled tool: ${toolName}`);
}
}
});
console.log(`\n✅ Demo process seeded successfully!\n`);
console.log(`Next steps:`);
console.log(` 1. Navigate to /ai-processes in your frontend`);
console.log(` 2. Open the "${demoProcessGraph.name}" process`);
console.log(` 3. Test it by sending a message like:`);
console.log(` "Register a dog named Max, owned by John Smith (john@email.com)"`);
console.log();
if (app) await app.close();
process.exit(0);
} catch (error) {
console.error('❌ Seed failed:', error);
if (app) await app.close();
process.exit(1);
}
}
// Get tenant from command line args
const tenantSlugOrId = process.argv[2];
if (!tenantSlugOrId) {
console.error('Usage: npm run seed:demo-process -- <tenant-slug-or-id>');
process.exit(1);
}
seedDemoProcess(tenantSlugOrId);


@@ -986,7 +986,7 @@ export class AiAssistantService {
].includes(apiName);
}
-private async getOpenAiConfig(tenantId: string): Promise<OpenAIConfig | null> {
+async getOpenAiConfig(tenantId: string): Promise<OpenAIConfig | null> {
const resolvedTenantId = await this.tenantDbService.resolveTenantId(tenantId);
const centralPrisma = getCentralPrisma();
const tenant = await centralPrisma.tenant.findUnique({

@@ -75,12 +75,20 @@ export const validateGraphDefinition = (
}
const toolRegistry = new ToolRegistry();
+const allToolNames = toolRegistry.getAllToolNames();
graph.nodes.forEach((node) => {
if (node.type === 'ToolNode') {
const toolName = (node.data as { toolName?: string }).toolName;
-if (!toolName || !toolRegistry.isToolAllowed(tenantId, toolName)) {
+if (!toolName) {
throw new GraphValidationError(
-`Tool ${toolName ?? 'unknown'} is not allowlisted for tenant.`,
+`ToolNode ${node.id} missing toolName configuration.`,
+);
+}
+// Validate tool exists in registry (allowlist check happens at runtime)
+if (!allToolNames.includes(toolName)) {
+throw new GraphValidationError(
+`Tool ${toolName} is not registered in the tool registry.`,
);
}
}


@@ -119,6 +119,7 @@ export class AiProcessesController {
}
@Post('ai-chat/messages')
+@Post('ai-processes/chat/messages')
async sendChatMessage(
@TenantId() tenantId: string,
@CurrentUser() user: any,
@@ -136,6 +137,7 @@ export class AiProcessesController {
}
@Sse('ai-chat/stream')
+@Sse('ai-processes/stream')
streamChat(@Query('sessionId') sessionId: string) {
return this.streamService.getStream(sessionId);
}


@@ -5,6 +5,7 @@ import { AiProcessesStreamService } from './ai-processes.stream.service';
import { AiAssistantService } from '../ai-assistant/ai-assistant.service';
import { TenantDatabaseService } from '../tenant/tenant-database.service';
import { AiChatMessage, AiChatSession } from '../models/ai-chat.model';
+import { DeepAgentOrchestrator } from './deep-agent.orchestrator';
@Injectable()
export class AiProcessesOrchestratorService {
@@ -27,7 +28,6 @@ export class AiProcessesOrchestratorService {
userId: string,
) {
return AiChatSession.query(knex).insert({
-tenantId,
userId,
});
}
@@ -54,7 +54,7 @@ export class AiProcessesOrchestratorService {
? await AiChatSession.query(knex).findById(sessionId)
: await this.createSessionWithContext(knex, resolvedTenantId, userId);
-if (!session || session.tenantId !== resolvedTenantId) {
+if (!session) {
throw new Error('Chat session not found.');
}
@@ -72,18 +72,26 @@ export class AiProcessesOrchestratorService {
data: { count: processes.length },
});
+// If no processes configured, fallback to standard AI assistant
if (!processes.length) {
const response = await this.aiAssistantService.handleChat(
resolvedTenantId,
userId,
message,
-history ?? [],
+(history ?? []) as any,
context ?? {},
);
this.streamService.emit(session.id, {
type: 'final',
data: { reply: response.reply, action: response.action },
});
+await AiChatMessage.query(knex).insert({
+sessionId: session.id,
+role: 'assistant',
+content: response.reply,
+});
return {
sessionId: session.id,
reply: response.reply,
@@ -92,29 +100,113 @@ export class AiProcessesOrchestratorService {
};
}
-const selectedProcess = processId
-? processes.find((proc) => proc.id === processId)
-: processes[0];
+// Get OpenAI credentials from tenant integrations
+const credentials = await this.aiAssistantService.getOpenAiConfig(resolvedTenantId);
+if (!credentials?.apiKey) {
+throw new Error('OpenAI credentials not configured for this tenant');
+}
+// Create Deep Agent with tenant's credentials
+const deepAgent = new DeepAgentOrchestrator(credentials.apiKey, credentials.model);
+// Use Deep Agent to select the best process
+const processInfos = processes.map((p) => ({
+id: p.id,
+name: p.name,
+description: p.description || undefined,
+}));
+const selection = await deepAgent.selectProcess(
+message,
+processInfos,
+history as any,
+);
+// If we need more information or no match, respond with question
+if (selection.action === 'need_more_info' || selection.action === 'no_match') {
+const reply = selection.question || selection.reasoning ||
+'I\'m not sure which process to use. Could you provide more details?';
+this.streamService.emit(session.id, {
+type: 'final',
+data: { reply, needsMoreInfo: true },
+});
+await AiChatMessage.query(knex).insert({
+sessionId: session.id,
+role: 'assistant',
+content: reply,
+});
+return { sessionId: session.id, reply, needsMoreInfo: true };
+}
+// Process selected - find it and execute
+const selectedProcess = processes.find((p) => p.id === selection.processId);
if (!selectedProcess) {
-throw new Error('Process not found.');
+throw new Error('Selected process not found.');
}
this.streamService.emit(session.id, {
type: 'process_selected',
processId: selectedProcess.id,
version: selectedProcess.latestVersion,
+data: { processName: selectedProcess.name, reasoning: selection.reasoning },
});
+// Extract inputs from the message
+// For now, we'll use a simple approach - just pass the message as input
+// In a more sophisticated implementation, we'd use the deep agent to extract structured inputs
+const startMessage = await deepAgent.generateStartMessage(
+selectedProcess.name,
+{ message },
+);
+this.streamService.emit(session.id, {
+type: 'agent_message',
+data: { message: startMessage },
+});
+await AiChatMessage.query(knex).insert({
+sessionId: session.id,
+role: 'assistant',
+content: startMessage,
+});
const { run, result } = await this.processesService.createRun(
resolvedTenantId,
userId,
selectedProcess.id,
-{ message },
+{ message, context: context || {} },
session.id,
(payload) => this.streamService.emit(session.id, payload),
);
+// Emit final event
+this.streamService.emit(session.id, {
+type: 'final',
+data: {
+runId: run.id,
+status: result.status,
+output: result.output,
+message: result.status === 'completed'
+? '✅ Workflow completed successfully!'
+: result.status === 'error'
+? `❌ Workflow failed: ${result.error?.message || 'Unknown error'}`
+: '⏸️ Workflow paused',
+},
+});
+await AiChatMessage.query(knex).insert({
+sessionId: session.id,
+role: 'assistant',
+content: result.status === 'completed'
+? '✅ Workflow completed successfully!'
+: result.status === 'error'
+? `❌ Workflow failed: ${result.error?.message || 'Unknown error'}`
+: '⏸️ Workflow paused',
+});
return { sessionId: session.id, runId: run.id, status: result.status };
}
}


@@ -86,13 +86,24 @@ export const runCompiledGraph = async (
const argsTemplate = (node.data as { argsTemplate: Record<string, unknown> })
.argsTemplate;
const resolvedArgs = resolveTemplate(argsTemplate, state);
+// Debug logging
+console.log(`[ToolNode ${node.id}] Tool: ${toolName}`);
+console.log(`[ToolNode ${node.id}] State keys:`, Object.keys(state));
+console.log(`[ToolNode ${node.id}] ArgsTemplate:`, JSON.stringify(argsTemplate));
+console.log(`[ToolNode ${node.id}] ResolvedArgs:`, JSON.stringify(resolvedArgs));
const toolResult = await tool(toolContext, {
...resolvedArgs,
state,
});
+console.log(`[ToolNode ${node.id}] ToolResult:`, JSON.stringify(toolResult));
const outputMapping = (node.data as { outputMapping: Record<string, string> })
.outputMapping;
Object.entries(outputMapping).forEach(([key, path]) => {
+console.log(`[ToolNode ${node.id}] Mapping: toolResult['${key}'] = ${toolResult[key]} -> state['${path}']`);
state[path] = toolResult[key];
});
}
@@ -203,6 +214,9 @@ const validateNodeOutput = (
const ajv = new Ajv({ allErrors: true, strict: false });
const validate = ajv.compile(schema);
if (!validate(output)) {
-throw new Error(`LLM output invalid for node ${node.id}.`);
+const errors = validate.errors?.map(e => `${e.instancePath} ${e.message}`).join(', ');
+throw new Error(
+`LLM output invalid for node ${node.id}. Errors: ${errors}. Output: ${JSON.stringify(output)}`
+);
}
};
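The `resolveTemplate` call above is what turns the `{{state.key}}` placeholders from a ToolNode's `argsTemplate` into concrete arguments. A hypothetical sketch of that substitution under the simplest assumption, whole-string placeholders only; the real helper may also handle nested paths and partial interpolation:

```typescript
// Resolve each '{{state.key}}' placeholder in a ToolNode argsTemplate
// against the current run state. Non-placeholder values pass through.
type State = Record<string, unknown>;

function resolveTemplate(template: Record<string, unknown>, state: State): Record<string, unknown> {
  const resolved: Record<string, unknown> = {};
  for (const [arg, value] of Object.entries(template)) {
    if (typeof value === 'string') {
      const match = value.match(/^\{\{state\.(\w+)\}\}$/);
      // A placeholder spanning the whole string keeps the state value's type
      resolved[arg] = match ? state[match[1]] : value;
    } else {
      resolved[arg] = value;
    }
  }
  return resolved;
}

// The create_pet node's argsTemplate resolved against sample state:
const args = resolveTemplate(
  { name: '{{state.petName}}', species: '{{state.species}}', ownerId: '{{state.contactId}}' },
  { petName: 'Max', species: 'dog', contactId: 'c-123' },
);
console.log(args); // { name: 'Max', species: 'dog', ownerId: 'c-123' }
```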


@@ -15,7 +15,7 @@ const nodeTypes: AiNodeType[] = [
'End',
];
-export const graphSchema: JSONSchemaType<ProcessGraphDefinition> = {
+export const graphSchema: any = {
type: 'object',
required: ['id', 'name', 'nodes', 'edges'],
additionalProperties: false,
@@ -47,7 +47,7 @@ export const graphSchema: JSONSchemaType<ProcessGraphDefinition> = {
target: { type: 'string' },
condition: { type: 'object', nullable: true },
},
-} as JSONSchemaType<ProcessGraphEdge>,
+},
processGraphNode: {
type: 'object',
required: ['id', 'type', 'data'],
@@ -67,7 +67,7 @@ export const graphSchema: JSONSchemaType<ProcessGraphDefinition> = {
},
data: { type: 'object' },
},
-} as JSONSchemaType<ProcessGraphNode>,
+},
},
};


@@ -16,6 +16,7 @@ import {
ProcessGraphDefinition,
} from './ai-processes.types';
import { ToolRegistry } from './tools/tool-registry';
+import { demoTools } from './tools/demo-tools';
@Injectable()
export class AiProcessesService {
@@ -31,9 +32,8 @@ export class AiProcessesService {
const { knex, tenantId: resolvedTenantId } =
await this.getTenantContext(tenantId);
return AiProcess.query(knex)
-.where('tenantId', resolvedTenantId)
.withGraphFetched('versions')
-.orderBy('createdAt', 'desc');
+.orderBy('created_at', 'desc');
}
async getProcess(tenantId: string, processId: string) {
@@ -62,21 +62,20 @@ export class AiProcessesService {
await AiProcess.query(trx).insert({
id: processId,
-tenantId: resolvedTenantId,
name,
description,
latestVersion: 1,
createdBy: userId,
});
-await AiProcessVersion.query(trx).insert({
+await trx('ai_process_versions').insert({
id: randomUUID(),
-tenantId: resolvedTenantId,
-processId,
+process_id: processId,
version: 1,
-graphJson: graph,
+graph_json: JSON.stringify(graph),
-compiledJson: compiled,
+compiled_json: JSON.stringify(compiled),
-createdBy: userId,
+created_by: userId,
+created_at: new Date(),
});
return AiProcess.query(trx)
@@ -95,7 +94,7 @@ export class AiProcessesService {
await this.getTenantContext(tenantId);
const process = await AiProcess.query(knex).findById(processId);
-if (!process || process.tenantId !== resolvedTenantId) {
+if (!process) {
throw new Error('Process not found.');
}
@@ -111,14 +110,14 @@ export class AiProcessesService {
.patch({ latestVersion: nextVersion });
const versionId = randomUUID();
-await AiProcessVersion.query(trx).insert({
+await trx('ai_process_versions').insert({
id: versionId,
-tenantId: resolvedTenantId,
-processId,
+process_id: processId,
version: nextVersion,
-graphJson: graph,
+graph_json: JSON.stringify(graph),
-compiledJson: compiled,
+compiled_json: JSON.stringify(compiled),
-createdBy: userId,
+created_by: userId,
+created_at: new Date(),
});
return AiProcessVersion.query(trx).findById(versionId);
@@ -129,7 +128,7 @@ export class AiProcessesService {
const { knex, tenantId: resolvedTenantId } =
await this.getTenantContext(tenantId);
return AiProcessVersion.query(knex)
-.where({ processId, tenantId: resolvedTenantId })
+.where({ process_id: processId })
.orderBy('version', 'desc');
}
@@ -144,12 +143,12 @@ export class AiProcessesService {
const { knex, tenantId: resolvedTenantId } =
await this.getTenantContext(tenantId);
const process = await AiProcess.query(knex).findById(processId);
-if (!process || process.tenantId !== resolvedTenantId) {
+if (!process) {
throw new Error('Process not found.');
}
const versionRecord = await AiProcessVersion.query(knex).findOne({
-processId,
+process_id: processId,
version: process.latestVersion,
});
@@ -160,7 +159,6 @@ export class AiProcessesService {
const runId = randomUUID();
await AiProcessRun.query(knex).insert({
id: runId,
-tenantId: resolvedTenantId,
processId,
version: versionRecord.version,
status: 'running',
@@ -174,16 +172,17 @@ export class AiProcessesService {
throw new Error('Run not created.');
}
-const compiled = versionRecord.compiledJson as CompiledGraph;
+const compiled = versionRecord.compiledJson as unknown as CompiledGraph;
-const toolRegistry = new ToolRegistry();
+const toolRegistry = new ToolRegistry(demoTools);
+await toolRegistry.loadTenantAllowlist(resolvedTenantId, knex);
const emitAndAudit = (event: AiProcessEventPayload) => {
emitEvent?.(event);
void AiAuditEvent.query(knex).insert({
id: randomUUID(),
-tenantId: resolvedTenantId,
runId,
eventType: event.type,
-payloadJson: event,
+payloadJson: event as any,
});
};
const result = await runCompiledGraph( const result = await runCompiledGraph(
@@ -191,7 +190,7 @@ export class AiProcessesService {
compiledGraph: compiled,
input,
toolRegistry,
-toolContext: { tenantId: resolvedTenantId, userId },
+toolContext: { tenantId: resolvedTenantId, userId, knex },
onEvent: (event) => emitAndAudit({ ...event, runId, sessionId }),
llmDecision: async (node, state) =>
this.mockDecision(node.id, state),
@@ -215,28 +214,29 @@ export class AiProcessesService {
const { knex, tenantId: resolvedTenantId } =
await this.getTenantContext(tenantId);
const run = await AiProcessRun.query(knex).findById(runId);
-if (!run || run.tenantId !== resolvedTenantId) {
+if (!run) {
throw new Error('Run not found.');
}
const versionRecord = await AiProcessVersion.query(knex).findOne({
-processId: run.processId,
+process_id: run.processId,
version: run.version,
});
if (!versionRecord) {
throw new Error('Process version not found.');
}
-const compiled = versionRecord.compiledJson as CompiledGraph;
+const compiled = versionRecord.compiledJson as unknown as CompiledGraph;
-const toolRegistry = new ToolRegistry();
+const toolRegistry = new ToolRegistry(demoTools);
+await toolRegistry.loadTenantAllowlist(resolvedTenantId, knex);
const mergedState = { ...(run.stateJson || {}), ...input };
const emitAndAudit = (event: AiProcessEventPayload) => {
emitEvent?.(event);
void AiAuditEvent.query(knex).insert({
id: randomUUID(),
-tenantId: resolvedTenantId,
runId: run.id,
eventType: event.type,
-payloadJson: event,
+payloadJson: event as any,
});
};
@@ -245,7 +245,7 @@ export class AiProcessesService {
compiledGraph: compiled,
input: mergedState,
toolRegistry,
-toolContext: { tenantId: resolvedTenantId, userId },
+toolContext: { tenantId: resolvedTenantId, userId, knex },
onEvent: (event) =>
emitAndAudit({ ...event, runId: run.id, sessionId }),
llmDecision: async (node, state) =>
@@ -279,6 +279,30 @@ export class AiProcessesService {
nodeId: string,
state: Record<string, unknown>,
) {
+if (nodeId === 'extract_info') {
+// Extract pet registration info from the message
+const message = (state.message as string) || '';
+// Simple extraction (in production, this would use an LLM)
+const petNameMatch = message.match(/(?:dog|cat|pet)\s+named\s+(\w+)/i);
+const petTypeMatch = message.match(/(dog|cat)/i);
+const ownerNameMatch = message.match(/owned\s+by\s+([\w\s]+?)(?:\s*\(|$)/i);
+const emailMatch = message.match(/\(?([\w\.-]+@[\w\.-]+\.\w+)\)?/i);
+const ownerName = ownerNameMatch?.[1]?.trim() || 'Unknown Owner';
+const nameParts = ownerName.split(/\s+/);
+const firstName = nameParts[0] || 'Unknown';
+const lastName = nameParts.slice(1).join(' ') || 'Owner';
+return {
+petName: petNameMatch?.[1] || 'Unknown Pet',
+species: petTypeMatch?.[1]?.toLowerCase() || 'dog',
+ownerFirstName: firstName,
+ownerLastName: lastName,
+ownerEmail: emailMatch?.[1] || null,
+accountName: `${firstName} ${lastName}`,
+};
+}
if (nodeId === 'decide_account') {
const accountName = (state.accountName as string) ?? 'New Account';
const accountAction = state.accountId ? 'find' : 'create';
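The regex fallback that `mockDecision` uses for the `extract_info` node can be exercised standalone against the sample message from the seed script. This excerpt reproduces that logic as a pure function (the production path would replace it with a real LLM call):

```typescript
// Standalone version of the extract_info regex fallback from mockDecision.
function extractPetInfo(message: string) {
  const petNameMatch = message.match(/(?:dog|cat|pet)\s+named\s+(\w+)/i);
  const petTypeMatch = message.match(/(dog|cat)/i);
  const ownerNameMatch = message.match(/owned\s+by\s+([\w\s]+?)(?:\s*\(|$)/i);
  const emailMatch = message.match(/\(?([\w.-]+@[\w.-]+\.\w+)\)?/i);
  const ownerName = ownerNameMatch?.[1]?.trim() || 'Unknown Owner';
  const nameParts = ownerName.split(/\s+/);
  const firstName = nameParts[0] || 'Unknown';
  const lastName = nameParts.slice(1).join(' ') || 'Owner';
  return {
    petName: petNameMatch?.[1] || 'Unknown Pet',
    species: petTypeMatch?.[1]?.toLowerCase() || 'dog',
    ownerFirstName: firstName,
    ownerLastName: lastName,
    ownerEmail: emailMatch?.[1] || null,
    accountName: `${firstName} ${lastName}`,
  };
}

const info = extractPetInfo('Register a dog named Max, owned by John Smith (john@email.com)');
console.log(info.petName, info.species, info.ownerFirstName, info.ownerLastName, info.ownerEmail);
// Max dog John Smith john@email.com
```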


@@ -94,6 +94,7 @@ export type AiProcessEventType =
| 'agent_started'
| 'processes_listed'
| 'process_selected'
+| 'agent_message'
| 'node_started'
| 'tool_called'
| 'node_completed'


@@ -0,0 +1,202 @@
import { ChatOpenAI } from '@langchain/openai';
import { JsonOutputParser } from '@langchain/core/output_parsers';
import { SystemMessage, HumanMessage } from '@langchain/core/messages';
export interface ProcessInfo {
id: string;
name: string;
description?: string;
}
export interface ProcessSelectionResult {
action: 'select_process' | 'need_more_info' | 'no_match';
processId?: string;
question?: string;
reasoning?: string;
}
export interface InputExtractionResult {
hasAllInputs: boolean;
extractedInputs: Record<string, unknown>;
missingFields?: string[];
question?: string;
}
export class DeepAgentOrchestrator {
private model: ChatOpenAI;
constructor(
apiKey: string,
modelName: string = 'gpt-4o',
temperature: number = 0,
) {
this.model = new ChatOpenAI({
apiKey,
modelName,
temperature,
});
}
/**
* Step 1: Select the best matching process from available processes
*/
async selectProcess(
userMessage: string,
availableProcesses: ProcessInfo[],
conversationHistory?: { role: string; text: string }[],
): Promise<ProcessSelectionResult> {
const processList = availableProcesses
.map((p) => `- ${p.name} (ID: ${p.id}): ${p.description || 'No description'}`)
.join('\n');
const historyContext =
conversationHistory && conversationHistory.length > 0
? `\n\nConversation history:\n${conversationHistory
.map((msg) => `${msg.role}: ${msg.text}`)
.join('\n')}`
: '';
const systemPrompt = `You are an intelligent process orchestrator. Your task is to select the most appropriate business process based on the user's request.
Available processes:
${processList}
Rules:
1. Select exactly ONE process that best matches the user's intent
2. If the request is ambiguous or matches multiple processes, ask for clarification
3. If no process matches, indicate no match
4. Always provide reasoning for your decision
Respond with JSON:
{
"action": "select_process" | "need_more_info" | "no_match",
"processId": "selected process ID or null",
"question": "clarifying question if needed",
"reasoning": "brief explanation of decision"
}`;
const userPrompt = `User request: ${userMessage}${historyContext}`;
try {
const response = await this.model.invoke([
new SystemMessage(systemPrompt),
new HumanMessage(userPrompt),
]);
const parser = new JsonOutputParser<ProcessSelectionResult>();
const content = response.content as string;
const jsonMatch = content.match(/\{[\s\S]*\}/);
if (jsonMatch) {
return await parser.parse(jsonMatch[0]);
}
return {
action: 'no_match',
reasoning: 'Failed to parse LLM response',
};
} catch (error: any) {
console.error('Process selection error:', error);
return {
action: 'no_match',
reasoning: `Error: ${error.message}`,
};
}
}
/**
* Step 2: Extract required inputs from user message
*/
async extractInputs(
userMessage: string,
requiredFields: { name: string; description: string; required: boolean }[],
conversationHistory?: { role: string; text: string }[],
context?: Record<string, unknown>,
): Promise<InputExtractionResult> {
const fieldsList = requiredFields
.map((f) => `- ${f.name} (${f.required ? 'required' : 'optional'}): ${f.description}`)
.join('\n');
const historyContext =
conversationHistory && conversationHistory.length > 0
? `\n\nConversation history:\n${conversationHistory
.map((msg) => `${msg.role}: ${msg.text}`)
.join('\n')}`
: '';
const contextInfo = context ? `\n\nAvailable context: ${JSON.stringify(context)}` : '';
const systemPrompt = `You are an input extraction assistant. Extract structured data from the user's message and conversation history.
Required fields for this process:
${fieldsList}${contextInfo}
Rules:
1. Extract as many fields as possible from the message and context
2. Only mark hasAllInputs=true if ALL required fields are present
3. If required fields are missing, generate a natural question to ask the user
4. Use context data when available (e.g., current page context)
Respond with JSON:
{
"hasAllInputs": true | false,
"extractedInputs": { "field1": "value1", ... },
"missingFields": ["field1", "field2"] or undefined,
"question": "natural language question" or undefined
}`;
const userPrompt = `User message: ${userMessage}${historyContext}`;
try {
const response = await this.model.invoke([
new SystemMessage(systemPrompt),
new HumanMessage(userPrompt),
]);
const parser = new JsonOutputParser<InputExtractionResult>();
const content = response.content as string;
const jsonMatch = content.match(/\{[\s\S]*\}/);
if (jsonMatch) {
return await parser.parse(jsonMatch[0]);
}
return {
hasAllInputs: false,
extractedInputs: {},
missingFields: requiredFields.filter((f) => f.required).map((f) => f.name),
question: 'I need more information to proceed. Could you provide additional details?',
};
} catch (error: any) {
console.error('Input extraction error:', error);
return {
hasAllInputs: false,
extractedInputs: {},
question: 'I encountered an error processing your request. Please try again.',
};
}
}
/**
* Step 3: Generate a friendly response explaining what will happen
*/
async generateStartMessage(
processName: string,
extractedInputs: Record<string, unknown>,
): Promise<string> {
const systemPrompt = `You are a friendly assistant explaining what process will be executed. Be concise and clear.`;
const userPrompt = `Generate a brief message (1-2 sentences) confirming that you will execute the "${processName}" process with these inputs: ${JSON.stringify(extractedInputs)}`;
try {
const response = await this.model.invoke([
new SystemMessage(systemPrompt),
new HumanMessage(userPrompt),
]);
return (response.content as string).trim();
} catch (error) {
return `I'll execute the ${processName} process with your provided information.`;
}
}
}
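Both `selectProcess` and `extractInputs` lean on the same salvage step: pull the first `{...}` span out of a possibly chatty completion before handing it to the parser. A self-contained sketch of that fallback (the `extractJson` helper is illustrative, not part of the orchestrator):

```typescript
// Hypothetical helper mirroring the regex fallback used above: grab the
// outermost-looking JSON object from a free-form LLM reply, or null on failure.
function extractJson<T>(content: string): T | null {
  const match = content.match(/\{[\s\S]*\}/);
  if (!match) return null;
  try {
    return JSON.parse(match[0]) as T;
  } catch {
    return null;
  }
}

// Works even when the model wraps its JSON in surrounding prose.
const reply = 'Sure! Here is my decision:\n{"action":"select_process","processId":"p1"}';
const parsed = extractJson<{ action: string; processId: string }>(reply);
console.log(parsed?.action); // select_process
```

Note the greedy `[\s\S]*` means a reply containing two separate JSON objects fails to parse and falls through to the `no_match` branch, which is the safe default here.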


@@ -0,0 +1,226 @@
import { ToolContext, ToolHandler } from './tool-registry';
import { Account } from '../../models/account.model';
import { Contact } from '../../models/contact.model';
import { randomUUID } from 'crypto';
/**
* Demo tools that wrap ObjectService operations
* These tools provide structured access to CRM entities
*/
export const findAccount: ToolHandler = async (ctx, args) => {
if (!ctx.knex) {
throw new Error('Knex connection required for findAccount');
}
const { name } = args as { name?: string };
if (!name) {
return { found: false, accountId: null, message: 'Name required' };
}
try {
const query = Account.query(ctx.knex).where('name', 'like', `%${name}%`);
const account = await query.first();
if (account) {
return {
found: true,
accountId: account.id,
account: {
id: account.id,
name: account.name,
},
};
}
return { found: false, accountId: null };
} catch (error: any) {
return { found: false, error: error.message };
}
};
export const createAccount: ToolHandler = async (ctx, args) => {
if (!ctx.knex) {
throw new Error('Knex connection required for createAccount');
}
const { name, email, phone, industry } = args as {
name: string;
email?: string;
phone?: string;
industry?: string;
};
if (!name) {
throw new Error('Account name is required');
}
try {
const accountId = randomUUID();
// NOTE: email is accepted but not persisted yet (no matching accounts column in this demo)
await ctx.knex('accounts').insert({
id: accountId,
name,
phone,
industry,
ownerId: ctx.userId,
});
return {
success: true,
accountId,
account: {
id: accountId,
name,
},
};
} catch (error: any) {
return { success: false, error: error.message };
}
};
export const findContact: ToolHandler = async (ctx, args) => {
if (!ctx.knex) {
throw new Error('Knex connection required for findContact');
}
const { firstName, lastName, accountId } = args as {
firstName?: string;
lastName?: string;
accountId?: string;
};
if (!firstName && !lastName) {
return {
found: false,
contactId: null,
message: 'First name or last name required',
};
}
try {
let query = Contact.query(ctx.knex);
if (firstName) {
query = query.where('firstName', 'like', `%${firstName}%`);
}
if (lastName) {
query = query.where('lastName', 'like', `%${lastName}%`);
}
if (accountId) {
query = query.where('accountId', accountId);
}
const contact = await query.first();
if (contact) {
return {
found: true,
contactId: contact.id,
contact: {
id: contact.id,
firstName: contact.firstName,
lastName: contact.lastName,
accountId: contact.accountId,
},
};
}
return { found: false, contactId: null };
} catch (error: any) {
return { found: false, error: error.message };
}
};
export const createContact: ToolHandler = async (ctx, args) => {
if (!ctx.knex) {
throw new Error('Knex connection required for createContact');
}
const { firstName, lastName, email, phone, accountId } = args as {
firstName: string;
lastName: string;
email?: string;
phone?: string;
accountId?: string;
};
if (!firstName || !lastName) {
throw new Error('First name and last name are required');
}
try {
const contactId = randomUUID();
// NOTE: email and phone are accepted but not persisted yet (no matching contacts columns in this demo)
await ctx.knex('contacts').insert({
id: contactId,
firstName,
lastName,
accountId,
ownerId: ctx.userId,
});
return {
success: true,
contactId,
contact: {
id: contactId,
firstName,
lastName,
accountId,
},
};
} catch (error: any) {
return { success: false, error: error.message };
}
};
export const createPet: ToolHandler = async (ctx, args) => {
if (!ctx.knex) {
throw new Error('Knex connection required for createPet');
}
const { name, species, breed, age, ownerId } = args as {
name: string;
species: string;
breed?: string;
age?: number;
ownerId: string; // Contact ID
};
if (!name || !ownerId) {
throw new Error('Pet name and owner (contact) are required');
}
try {
const petId = randomUUID();
// Resolve the accountId from the owning contact
const contact = await ctx.knex('contacts').where('id', ownerId).first();
// Insert into the dogs table; species, breed, and age are accepted but not persisted yet
await ctx.knex('dogs').insert({
id: petId,
name,
ownerId,
accountId: contact?.accountId,
});
return {
success: true,
petId,
pet: { id: petId, name, ownerId, accountId: contact?.accountId },
};
} catch (error: any) {
return { success: false, error: error.message };
}
};
// Export all demo tools
export const demoTools = {
findAccount,
createAccount,
findContact,
createContact,
createPet,
};
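Every handler above follows the same `(ctx, args) => Promise<Record<string, unknown>>` contract, so the runner can inject the tenant-scoped context uniformly. A dependency-free sketch of that dispatch (the local `ToolContext` and `dispatch` shapes are simplified stand-ins for the real registry types):

```typescript
// Simplified stand-ins for the registry types (the real ones live in tool-registry.ts).
interface ToolContext { tenantId: string; userId: string }
type ToolHandler = (ctx: ToolContext, args: Record<string, unknown>) => Promise<Record<string, unknown>>;

// A stub tool with the same shape as findAccount/createPet above.
const echoTool: ToolHandler = async (ctx, args) => ({
  success: true,
  tenantId: ctx.tenantId, // context comes from the runner, never from the LLM
  echoed: args,
});

// Uniform dispatch: failures surface as data, mirroring the handlers above.
async function dispatch(tool: ToolHandler, ctx: ToolContext, args: Record<string, unknown>) {
  try {
    return await tool(ctx, args);
  } catch (error: any) {
    return { success: false, error: error.message };
  }
}

dispatch(echoTool, { tenantId: 't1', userId: 'u1' }, { name: 'Rex' }).then((result) => {
  console.log(result.success); // true
});
```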


@@ -1,6 +1,10 @@
+import { Knex } from 'knex';
+import { AiToolConfig } from '../../models/ai-process.model';
+
 export interface ToolContext {
   tenantId: string;
   userId: string;
+  knex?: Knex;
   authScopes?: string[];
 }
@@ -9,6 +13,13 @@ export type ToolHandler = (
   args: Record<string, unknown>,
 ) => Promise<Record<string, unknown>>;
+
+export interface ToolDefinition {
+  name: string;
+  description: string;
+  handler: ToolHandler;
+  inputSchema?: Record<string, unknown>;
+}

 const defaultTools: Record<string, ToolHandler> = {
   findAccount: async () => ({ accountId: null, found: false }),
   createAccount: async (_ctx, args) => ({ accountId: `acc_${Date.now()}`, args }),
@@ -24,6 +35,7 @@ const tenantAllowlist: Record<string, string[]> = {
 export class ToolRegistry {
   private tools: Record<string, ToolHandler>;
   private allowlist: Record<string, string[]>;
+  private dbAllowlistCache: Map<string, Set<string>> = new Map();

   constructor(
     tools: Record<string, ToolHandler> = defaultTools,
@@ -33,7 +45,32 @@ export class ToolRegistry {
     this.allowlist = allowlist;
   }

-  isToolAllowed(tenantId: string, toolName: string) {
+  registerTool(name: string, handler: ToolHandler) {
+    this.tools[name] = handler;
+  }
+
+  async loadTenantAllowlist(tenantId: string, knex: Knex) {
+    const configs = await AiToolConfig.query(knex).where('enabled', true);
+    const allowed = new Set(configs.map((c) => c.toolName));
+    this.dbAllowlistCache.set(tenantId, allowed);
+    return allowed;
+  }
+
+  async isToolAllowed(tenantId: string, toolName: string, knex?: Knex) {
+    // Check database cache first
+    if (this.dbAllowlistCache.has(tenantId)) {
+      return this.dbAllowlistCache.get(tenantId)!.has(toolName);
+    }
+    // Load from database if knex provided
+    if (knex) {
+      const allowed = await this.loadTenantAllowlist(tenantId, knex);
+      return allowed.has(toolName);
+    }
+    // Fallback to static allowlist
     const allowed = this.allowlist[tenantId] || this.allowlist.default || [];
     return allowed.includes(toolName);
   }
@@ -45,4 +82,8 @@ export class ToolRegistry {
     }
     return tool;
   }
+
+  getAllToolNames(): string[] {
+    return Object.keys(this.tools);
+  }
 }
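The lookup above resolves in a fixed order: per-tenant cache, then a database load, then the static fallback. A minimal sketch of the cache-versus-fallback precedence (the class name and `prime` method are illustrative; the real registry fills the cache from `AiToolConfig`):

```typescript
// Sketch of the tenant allowlist lookup order: per-tenant cache → static fallback.
class AllowlistSketch {
  private cache = new Map<string, Set<string>>();
  constructor(private staticList: Record<string, string[]> = {}) {}

  // Stand-in for loadTenantAllowlist, which fills the cache from the database.
  prime(tenantId: string, tools: string[]) {
    this.cache.set(tenantId, new Set(tools));
  }

  isAllowed(tenantId: string, toolName: string): boolean {
    const cached = this.cache.get(tenantId);
    if (cached) return cached.has(toolName);
    const fallback = this.staticList[tenantId] ?? this.staticList.default ?? [];
    return fallback.includes(toolName);
  }
}

const reg = new AllowlistSketch({ default: ['findAccount'] });
console.log(reg.isAllowed('t1', 'findAccount')); // true (static fallback)
reg.prime('t1', ['createPet']);
console.log(reg.isAllowed('t1', 'findAccount')); // false (cache shadows the fallback)
```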


@@ -7,7 +7,6 @@ export class AiChatSession extends BaseModel {
   static columnNameMappers = snakeCaseMappers();

   id!: string;
-  tenantId!: string;
   userId!: string;
   createdAt!: Date;
@@ -25,7 +24,7 @@ export class AiChatSession extends BaseModel {
       modelClass: AiChatMessage,
       join: {
         from: 'ai_chat_sessions.id',
-        to: 'ai_chat_messages.sessionId',
+        to: 'ai_chat_messages.session_id',
       },
     },
   };
@@ -55,7 +54,7 @@ export class AiChatMessage extends BaseModel {
       relation: BaseModel.BelongsToOneRelation,
       modelClass: AiChatSession,
       join: {
-        from: 'ai_chat_messages.sessionId',
+        from: 'ai_chat_messages.session_id',
         to: 'ai_chat_sessions.id',
       },
     },


@@ -7,7 +7,6 @@ export class AiProcess extends BaseModel {
   static columnNameMappers = snakeCaseMappers();

   id!: string;
-  tenantId!: string;
   name!: string;
   description?: string;
   latestVersion!: number;
@@ -27,7 +26,7 @@ export class AiProcess extends BaseModel {
       modelClass: AiProcessVersion,
       join: {
         from: 'ai_processes.id',
-        to: 'ai_process_versions.processId',
+        to: 'ai_process_versions.process_id',
       },
     },
     runs: {
@@ -35,7 +34,7 @@ export class AiProcess extends BaseModel {
       modelClass: AiProcessRun,
       join: {
         from: 'ai_processes.id',
-        to: 'ai_process_runs.processId',
+        to: 'ai_process_runs.process_id',
       },
     },
   };
@@ -45,9 +44,9 @@
 export class AiProcessVersion extends BaseModel {
   static tableName = 'ai_process_versions';
   static columnNameMappers = snakeCaseMappers();
+  static jsonAttributes = ['graphJson', 'compiledJson'];

   id!: string;
-  tenantId!: string;
   processId!: string;
   version!: number;
   graphJson!: Record<string, unknown>;
@@ -68,7 +67,7 @@ export class AiProcessVersion extends BaseModel {
       relation: BaseModel.BelongsToOneRelation,
       modelClass: AiProcess,
       join: {
-        from: 'ai_process_versions.processId',
+        from: 'ai_process_versions.process_id',
         to: 'ai_processes.id',
       },
     },
@@ -79,9 +78,9 @@
 export class AiProcessRun extends BaseModel {
   static tableName = 'ai_process_runs';
   static columnNameMappers = snakeCaseMappers();
+  static jsonAttributes = ['inputJson', 'outputJson', 'errorJson', 'stateJson'];

   id!: string;
-  tenantId!: string;
   processId!: string;
   version!: number;
   status!: string;
@@ -106,7 +105,7 @@ export class AiProcessRun extends BaseModel {
       relation: BaseModel.BelongsToOneRelation,
       modelClass: AiProcess,
       join: {
-        from: 'ai_process_runs.processId',
+        from: 'ai_process_runs.process_id',
         to: 'ai_processes.id',
       },
     },
@@ -117,9 +116,9 @@
 export class AiAuditEvent extends BaseModel {
   static tableName = 'ai_audit_events';
   static columnNameMappers = snakeCaseMappers();
+  static jsonAttributes = ['payloadJson'];

   id!: string;
-  tenantId!: string;
   runId!: string;
   eventType!: string;
   payloadJson!: Record<string, unknown>;
@@ -138,10 +137,28 @@ export class AiAuditEvent extends BaseModel {
       relation: BaseModel.BelongsToOneRelation,
       modelClass: AiProcessRun,
       join: {
-        from: 'ai_audit_events.runId',
+        from: 'ai_audit_events.run_id',
         to: 'ai_process_runs.id',
       },
     },
   };
   }
 }
+
+export class AiToolConfig extends BaseModel {
+  static tableName = 'ai_tool_configs';
+  static columnNameMappers = snakeCaseMappers();
+  static jsonAttributes = ['configJson'];
+
+  id!: string;
+  toolName!: string;
+  enabled!: boolean;
+  configJson?: Record<string, unknown>;
+  createdAt!: Date;
+  updatedAt!: Date;
+
+  $beforeInsert(queryContext: QueryContext) {
+    this.id = this.id || randomUUID();
+    super.$beforeInsert(queryContext);
+  }
+}
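`jsonAttributes` tells Objection which columns hold JSON so they can be stringified on write and parsed on read. A dependency-free sketch of that round-trip (illustrative only, not Objection's actual implementation):

```typescript
// Mimic the jsonAttributes round-trip: stringify on the way to the DB,
// parse on the way back, leaving non-JSON columns untouched.
const jsonAttributes = ['graphJson', 'compiledJson'];

function toDb(row: Record<string, unknown>): Record<string, unknown> {
  const out = { ...row };
  for (const key of jsonAttributes) {
    if (key in out && typeof out[key] === 'object') out[key] = JSON.stringify(out[key]);
  }
  return out;
}

function fromDb(row: Record<string, unknown>): Record<string, unknown> {
  const out = { ...row };
  for (const key of jsonAttributes) {
    if (typeof out[key] === 'string') out[key] = JSON.parse(out[key] as string);
  }
  return out;
}

const saved = toDb({ id: 'v1', graphJson: { nodes: [] } });
console.log(typeof saved.graphJson); // string
console.log((fromDb(saved) as any).graphJson.nodes.length); // 0
```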


@@ -110,8 +110,9 @@ export class TenantDatabaseService {
   * @deprecated Use getTenantKnexByDomain or getTenantKnexById instead
   */
  async getTenantKnex(tenantIdOrSlug: string): Promise<Knex> {
-    // Assume it's a domain if it contains a dot
-    return this.getTenantKnexByDomain(tenantIdOrSlug);
+    // Resolve tenant ID first, then get connection by ID
+    const tenantId = await this.resolveTenantId(tenantIdOrSlug);
+    return this.getTenantKnexById(tenantId);
  }

  /**


@@ -1,43 +1,153 @@
-import { Background, Controls, MiniMap, ReactFlow, useEdgesState, useNodesState } from '@xyflow/react'
+import { useCallback, useEffect } from 'react'
+import {
+  Background,
+  Controls,
+  MiniMap,
+  ReactFlow,
+  useEdgesState,
+  useNodesState,
+  addEdge,
+  Connection,
+  Edge,
+  Node,
+  Panel,
+} from '@xyflow/react'
 import '@xyflow/react/dist/style.css'
 import './styles.css'

-const initialNodes = [
-  { id: 'start', data: { label: 'Start' }, position: { x: 0, y: 0 } },
-  { id: 'llm', data: { label: 'LLM Decision' }, position: { x: 220, y: 0 } },
-  { id: 'tool', data: { label: 'Tool Node' }, position: { x: 440, y: 0 } },
-  { id: 'end', data: { label: 'End' }, position: { x: 660, y: 0 } },
+const nodeTypes = {
+  Start: { style: { background: '#22c55e', color: 'white', padding: 10, borderRadius: 5 } },
+  LLMDecisionNode: { style: { background: '#3b82f6', color: 'white', padding: 10, borderRadius: 5 } },
+  ToolNode: { style: { background: '#f59e0b', color: 'white', padding: 10, borderRadius: 5 } },
+  HumanInputNode: { style: { background: '#8b5cf6', color: 'white', padding: 10, borderRadius: 5 } },
+  End: { style: { background: '#ef4444', color: 'white', padding: 10, borderRadius: 5 } },
+}
+
+const initialNodes: Node[] = [
+  {
+    id: 'start-1',
+    type: 'default',
+    data: { label: '🟢 Start', type: 'Start' },
+    position: { x: 250, y: 50 },
+    style: nodeTypes.Start.style,
+  },
+  {
+    id: 'end-1',
+    type: 'default',
+    data: { label: '🔴 End', type: 'End' },
+    position: { x: 250, y: 400 },
+    style: nodeTypes.End.style,
+  },
 ]

-const initialEdges = [
-  { id: 'e1-2', source: 'start', target: 'llm' },
-  { id: 'e2-3', source: 'llm', target: 'tool' },
-  { id: 'e3-4', source: 'tool', target: 'end' },
-]
+const initialEdges: Edge[] = []

 export const App = () => {
   const [nodes, setNodes, onNodesChange] = useNodesState(initialNodes)
   const [edges, setEdges, onEdgesChange] = useEdgesState(initialEdges)
+
+  const onConnect = useCallback(
+    (params: Connection) => setEdges((eds) => addEdge(params, eds)),
+    [setEdges]
+  )
+
+  // Send graph updates to parent window
+  const notifyParent = useCallback(() => {
+    const graphData = {
+      id: 'process-graph',
+      name: 'Process',
+      nodes: nodes.map((node) => ({
+        id: node.id,
+        type: node.data.type || 'Start',
+        position: node.position,
+        data: node.data,
+      })),
+      edges: edges.map((edge) => ({
+        id: edge.id,
+        source: edge.source,
+        target: edge.target,
+        condition: edge.data?.condition,
+      })),
+    }
+    window.parent.postMessage(
+      {
+        type: 'GRAPH_UPDATED',
+        payload: graphData,
+      },
+      '*'
+    )
+  }, [nodes, edges])
+
+  // Listen for graph load from parent
+  useEffect(() => {
+    const handleMessage = (event: MessageEvent) => {
+      if (event.data.type === 'LOAD_GRAPH') {
+        const graph = event.data.payload
+        if (graph && graph.nodes && graph.edges) {
+          setNodes(
+            graph.nodes.map((node: any) => ({
+              id: node.id,
+              type: 'default',
+              data: { label: node.data.label || node.type, ...node.data },
+              position: node.position,
+              style: nodeTypes[node.type as keyof typeof nodeTypes]?.style || {},
+            }))
+          )
+          setEdges(
+            graph.edges.map((edge: any) => ({
+              id: edge.id,
+              source: edge.source,
+              target: edge.target,
+              data: edge.condition ? { condition: edge.condition } : undefined,
+            }))
+          )
+        }
+      }
+    }
+    window.addEventListener('message', handleMessage)
+    return () => window.removeEventListener('message', handleMessage)
+  }, [setNodes, setEdges])
+
+  // Notify parent on changes
+  useEffect(() => {
+    notifyParent()
+  }, [nodes, edges, notifyParent])
+
+  const addNode = (type: string) => {
+    const newNode: Node = {
+      id: `${type.toLowerCase()}-${Date.now()}`,
+      type: 'default',
+      data: { label: `${type}`, type },
+      position: { x: Math.random() * 400 + 50, y: Math.random() * 300 + 100 },
+      style: nodeTypes[type as keyof typeof nodeTypes]?.style || {},
+    }
+    setNodes((nds) => nds.concat(newNode))
+  }

   return (
     <div className="editor-shell">
-      <header className="editor-header">
-        <h1>AI Process Builder</h1>
-        <p>Design tenant workflows with deterministic execution.</p>
-      </header>
-      <div className="editor-canvas">
-        <ReactFlow
-          nodes={nodes}
-          edges={edges}
-          onNodesChange={onNodesChange}
-          onEdgesChange={onEdgesChange}
-          fitView
-        >
-          <MiniMap />
-          <Controls />
-          <Background />
-        </ReactFlow>
-      </div>
+      <ReactFlow
+        nodes={nodes}
+        edges={edges}
+        onNodesChange={onNodesChange}
+        onEdgesChange={onEdgesChange}
+        onConnect={onConnect}
+        fitView
+      >
+        <Panel position="top-left" className="node-palette">
+          <h3>Node Palette</h3>
+          <button onClick={() => addNode('LLMDecisionNode')}>🔵 LLM Decision</button>
+          <button onClick={() => addNode('ToolNode')}>🟠 Tool</button>
+          <button onClick={() => addNode('HumanInputNode')}>🟣 Human Input</button>
+          <button onClick={() => addNode('End')}>🔴 End</button>
+        </Panel>
+        <MiniMap />
+        <Controls />
+        <Background />
+      </ReactFlow>
     </div>
   )
 }
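Since the compiler on the backend rejects graphs without valid Start/End nodes, the editor could cheaply pre-check before posting `GRAPH_UPDATED`. A sketch of that check (it assumes the compiler wants exactly one Start and at least one End; real validation lives in ai-processes.compiler.ts):

```typescript
// Shape of the nodes the editor serializes into the GRAPH_UPDATED payload.
interface EditorNode { id: string; data: { type?: string } }

// Hypothetical client-side sanity check mirroring the backend Start/End rule.
function graphLooksCompilable(nodes: EditorNode[]): boolean {
  const starts = nodes.filter((n) => n.data.type === 'Start').length;
  const ends = nodes.filter((n) => n.data.type === 'End').length;
  return starts === 1 && ends >= 1;
}

console.log(graphLooksCompilable([
  { id: 'start-1', data: { type: 'Start' } },
  { id: 'end-1', data: { type: 'End' } },
])); // true
```

Reachability and cycle checks stay server-side, where the compiled artifact is produced.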


@@ -4,30 +4,68 @@ body {
   color: #0f172a;
 }

+#root {
+  width: 100%;
+  height: 100vh;
+}
+
 .editor-shell {
   display: flex;
   flex-direction: column;
   height: 100vh;
+  width: 100%;
 }

-.editor-header {
-  padding: 16px 20px;
-  border-bottom: 1px solid #e2e8f0;
-  background: #fff;
-}
-
-.editor-header h1 {
-  margin: 0 0 4px;
-  font-size: 18px;
-}
-
-.editor-header p {
-  margin: 0;
-  font-size: 12px;
-  color: #64748b;
-}
-
-.editor-canvas {
-  flex: 1;
-  background: #f8fafc;
-}
+.node-palette {
+  background: white;
+  border: 1px solid #e2e8f0;
+  border-radius: 8px;
+  padding: 12px;
+  box-shadow: 0 2px 8px rgba(0, 0, 0, 0.1);
+  min-width: 150px;
+}
+
+.node-palette h3 {
+  margin: 0 0 12px 0;
+  font-size: 14px;
+  font-weight: 600;
+  color: #1e293b;
+}
+
+.node-palette button {
+  display: block;
+  width: 100%;
+  padding: 8px 12px;
+  margin-bottom: 6px;
+  background: white;
+  border: 1px solid #e2e8f0;
+  border-radius: 6px;
+  font-size: 13px;
+  cursor: pointer;
+  text-align: left;
+  transition: all 0.2s;
+}
+
+.node-palette button:hover {
+  background: #f1f5f9;
+  border-color: #cbd5e1;
+}
+
+.node-palette button:active {
+  background: #e2e8f0;
+}
+
+.react-flow__node {
+  font-size: 12px;
+  font-weight: 500;
+}
+
+.react-flow__edge-path {
+  stroke: #64748b;
+  stroke-width: 2;
+}
+
+.react-flow__edge.selected .react-flow__edge-path {
+  stroke: #3b82f6;
+}


@@ -7,16 +7,31 @@ import {
   InputGroupText,
 } from '@/components/ui/input-group'
 import { Separator } from '@/components/ui/separator'
-import { ArrowUp } from 'lucide-vue-next'
+import { ArrowUp, Loader2 } from 'lucide-vue-next'
 import { useRoute } from 'vue-router'
 import { useApi } from '@/composables/useApi'

+interface ChatMessage {
+  role: 'user' | 'assistant' | 'system';
+  text: string;
+  isStreaming?: boolean;
+}
+
+interface StreamEvent {
+  type: string;
+  data?: any;
+  processId?: string;
+  nodeId?: string;
+  toolName?: string;
+}
+
 const chatInput = ref('')
-const messages = ref<{ role: 'user' | 'assistant'; text: string }[]>([])
+const messages = ref<ChatMessage[]>([])
 const sending = ref(false)
 const route = useRoute()
 const { api } = useApi()
 const sessionId = ref<string | null>(null)
+const eventSource = ref<EventSource | null>(null)

 const getTenantId = () => {
   if (!import.meta.client) return 'tenant1'
@@ -39,6 +54,97 @@ const buildContext = () => {
   }
 }

+const connectToStream = (sessionIdValue: string) => {
+  if (eventSource.value) {
+    eventSource.value.close()
+  }
+
+  const baseUrl = window.location.hostname === 'localhost'
+    ? 'http://localhost:3000'
+    : `https://${window.location.hostname}`
+
+  eventSource.value = new EventSource(
+    `${baseUrl}/tenants/${getTenantId()}/ai-chat/stream?sessionId=${sessionIdValue}`
+  )
+
+  eventSource.value.onmessage = (event) => {
+    try {
+      const payload: StreamEvent = JSON.parse(event.data)
+      handleStreamEvent(payload)
+    } catch (error) {
+      console.error('Failed to parse stream event:', error)
+    }
+  }
+
+  eventSource.value.onerror = () => {
+    eventSource.value?.close()
+    eventSource.value = null
+  }
+}
+
+const handleStreamEvent = (event: StreamEvent) => {
+  switch (event.type) {
+    case 'agent_started':
+      // Agent is thinking
+      break
+    case 'processes_listed':
+      // Processes discovered
+      break
+    case 'process_selected':
+      messages.value.push({
+        role: 'system',
+        text: `🔄 Selected process: ${event.data?.processName || 'Process'}`,
+      })
+      break
+    case 'agent_message':
+      messages.value.push({
+        role: 'assistant',
+        text: event.data?.message || '',
+      })
+      break
+    case 'node_started': {
+      // Braces scope the const to this case
+      const lastMsg = messages.value[messages.value.length - 1]
+      if (lastMsg?.isStreaming) {
+        lastMsg.text += `\n⚙ Executing step...`
+      }
+      break
+    }
+    case 'tool_called': {
+      const lastToolMsg = messages.value[messages.value.length - 1]
+      if (lastToolMsg?.isStreaming) {
+        lastToolMsg.text += `\n🔧 Using tool: ${event.toolName}`
+      }
+      break
+    }
+    case 'need_input':
+      messages.value.push({
+        role: 'assistant',
+        text: event.data?.prompt || 'I need some additional information from you.',
+      })
+      sending.value = false
+      break
+    case 'final':
+      if (event.data?.output) {
+        messages.value.push({
+          role: 'assistant',
+          text: event.data.message || '✅ Process completed successfully!',
+        })
+      } else if (event.data?.reply) {
+        messages.value.push({
+          role: 'assistant',
+          text: event.data.reply,
+        })
+      }
+      sending.value = false
+      break
+    case 'error':
+      messages.value.push({
+        role: 'assistant',
+        text: `❌ Error: ${event.data?.error || 'An error occurred'}`,
+      })
+      sending.value = false
+      break
+  }
+}
+
 const handleSend = async () => {
   if (!chatInput.value.trim()) return
@@ -47,8 +153,20 @@ const handleSend = async () => {
   chatInput.value = ''
   sending.value = true

+  // Add a streaming message placeholder
+  messages.value.push({
+    role: 'assistant',
+    text: '🤔 Thinking...',
+    isStreaming: true
+  })
+
   try {
-    const history = messages.value.slice(0, -1).slice(-6)
+    const history = messages.value
+      .filter(m => m.role !== 'system' && !m.isStreaming)
+      .slice(0, -1)
+      .slice(-6)
+      .map(m => ({ role: m.role, text: m.text }))

     const response = await api.post(`/tenants/${getTenantId()}/ai-chat/messages`, {
       message,
       history,
@@ -56,34 +174,32 @@ const handleSend = async () => {
       sessionId: sessionId.value || undefined,
     })

-    if (response.sessionId) {
+    if (response.sessionId && !sessionId.value) {
       sessionId.value = response.sessionId
+      connectToStream(response.sessionId)
     }

+    // Remove streaming placeholder and add response
+    messages.value = messages.value.filter(m => !m.isStreaming)
+
     if (response.reply) {
       messages.value.push({
         role: 'assistant',
         text: response.reply,
       })
-
-      if (response.action === 'create_record') {
-        window.dispatchEvent(
-          new CustomEvent('ai-record-created', {
-            detail: {
-              objectApiName: buildContext().objectApiName,
-              record: response.record,
-            },
-          }),
-        )
-      }
-      return
     }

-    messages.value.push({
-      role: 'assistant',
-      text: 'Process started. I will post updates as soon as they are ready.',
-    })
+    // If process is running, stream will handle updates
+    if (response.runId) {
+      messages.value.push({
+        role: 'assistant',
+        text: '⏳ Processing workflow...',
+        isStreaming: true,
+      })
+    }
   } catch (error: any) {
     console.error('Failed to send AI chat message:', error)
+    messages.value = messages.value.filter(m => !m.isStreaming)
     messages.value.push({
       role: 'assistant',
       text: error.message || 'Sorry, I ran into an error. Please try again.',
@@ -92,11 +208,17 @@
     sending.value = false
   }
 }
+
+onUnmounted(() => {
+  if (eventSource.value) {
+    eventSource.value.close()
+  }
+})
 </script>

 <template>
   <div class="ai-chat-area w-full border-t border-border p-4 bg-neutral-50">
-    <div class="ai-chat-messages mb-4 space-y-3">
+    <div class="ai-chat-messages mb-4 space-y-3 max-h-[400px] overflow-y-auto">
       <div
         v-for="(message, index) in messages"
         :key="`${message.role}-${index}`"
@@ -104,14 +226,19 @@
         :class="message.role === 'user' ? 'justify-end' : 'justify-start'"
       >
         <div
-          class="max-w-[80%] rounded-lg px-3 py-2 text-sm"
-          :class="message.role === 'user' ? 'bg-primary text-primary-foreground' : 'bg-white border border-border text-foreground'"
+          class="max-w-[80%] rounded-lg px-3 py-2 text-sm whitespace-pre-line"
+          :class="{
+            'bg-primary text-primary-foreground': message.role === 'user',
+            'bg-white border border-border text-foreground': message.role === 'assistant',
+            'bg-blue-50 border border-blue-200 text-blue-900 text-xs': message.role === 'system',
+          }"
         >
+          <Loader2 v-if="message.isStreaming" class="inline-block size-3 animate-spin mr-1" />
           {{ message.text }}
         </div>
       </div>
       <p v-if="messages.length === 0" class="text-sm text-muted-foreground">
-        Ask the assistant to add records, filter lists, or summarize the page.
+        Ask the assistant to execute business processes, add records, or answer questions.
       </p>
     </div>
     <InputGroup>
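The `handleStreamEvent` switch above is effectively a reducer over server-sent events. A stripped-down sketch of the same pattern (event names follow the component; the pure `reduceEvent` function itself is illustrative):

```typescript
interface StreamEvent { type: string; data?: Record<string, unknown>; toolName?: string }
interface Msg { role: 'user' | 'assistant' | 'system'; text: string }

// Reduce one SSE payload into the visible message list, as the component does.
function reduceEvent(messages: Msg[], event: StreamEvent): Msg[] {
  switch (event.type) {
    case 'process_selected':
      return [...messages, { role: 'system', text: `Selected process: ${event.data?.processName ?? 'Process'}` }];
    case 'agent_message':
      return [...messages, { role: 'assistant', text: String(event.data?.message ?? '') }];
    case 'error':
      return [...messages, { role: 'assistant', text: `Error: ${event.data?.error ?? 'An error occurred'}` }];
    default:
      return messages; // progress-only events leave the list unchanged
  }
}

const out = reduceEvent([], { type: 'agent_message', data: { message: 'Hi' } });
console.log(out[0].text); // Hi
```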


@@ -0,0 +1,192 @@
<script setup lang="ts">
import { Save, Play, ArrowLeft } from 'lucide-vue-next'
import { Button } from '@/components/ui/button'
import { Input } from '@/components/ui/input'
import { Textarea } from '@/components/ui/textarea'
import { Label } from '@/components/ui/label'
import { useApi } from '@/composables/useApi'
import { useRouter, useRoute } from 'vue-router'
interface ProcessGraph {
id: string
name: string
description?: string
nodes: any[]
edges: any[]
}
const { api } = useApi()
const router = useRouter()
const route = useRoute()
const processId = computed(() => route.params.id === 'new' ? null : String(route.params.id))
const processName = ref('')
const processDescription = ref('')
const graphDefinition = ref<ProcessGraph | null>(null)
const saving = ref(false)
const loading = ref(true)
const getTenantId = () => {
if (!import.meta.client) return 'tenant1'
return localStorage.getItem('tenantId') || 'tenant1'
}
const loadProcess = async () => {
if (!processId.value) {
loading.value = false
return
}
loading.value = true
try {
const data = await api.get(`/tenants/${getTenantId()}/ai-processes/${processId.value}`)
processName.value = data.name
processDescription.value = data.description || ''
if (data.versions && data.versions.length > 0) {
const latestVersion = data.versions[data.versions.length - 1]
graphDefinition.value = latestVersion.graphJson
// Send graph to iframe
sendGraphToEditor(latestVersion.graphJson)
}
} catch (error) {
console.error('Failed to load process:', error)
alert('Failed to load process')
} finally {
loading.value = false
}
}
const sendGraphToEditor = (graph: any) => {
const iframe = document.getElementById('process-editor-iframe') as HTMLIFrameElement
if (iframe && iframe.contentWindow) {
iframe.contentWindow.postMessage({
type: 'LOAD_GRAPH',
payload: graph
}, '*')
}
}
const saveProcess = async () => {
if (!processName.value.trim()) {
alert('Process name is required')
return
}
if (!graphDefinition.value) {
alert('Please design the process graph first')
return
}
saving.value = true
try {
if (processId.value) {
// Update existing process (create new version)
await api.post(`/tenants/${getTenantId()}/ai-processes/${processId.value}/versions`, {
graph: graphDefinition.value
})
alert('Process version saved successfully!')
} else {
// Create new process
const result = await api.post(`/tenants/${getTenantId()}/ai-processes`, {
name: processName.value,
description: processDescription.value,
graph: graphDefinition.value
})
alert('Process created successfully!')
router.push(`/ai-processes/${result.id}/edit`)
}
} catch (error: any) {
console.error('Failed to save process:', error)
alert(`Save failed: ${error.message || 'Unknown error'}`)
} finally {
saving.value = false
}
}
const testRun = async () => {
if (!processId.value) {
alert('Please save the process first')
return
}
try {
const result = await api.post(`/tenants/${getTenantId()}/ai-processes/${processId.value}/runs`, {
input: { test: true }
})
console.log('Test run result:', result)
alert('Test run started! Check the console for details.')
} catch (error: any) {
alert(`Test failed: ${error.message}`)
}
}
const handleEditorMessage = (event: MessageEvent) => {
// NOTE: validate event.origin here in production so that only the embedded
// editor iframe can push graph updates into this page.
if (event.data?.type === 'GRAPH_UPDATED') {
graphDefinition.value = event.data.payload
}
}
onMounted(() => {
loadProcess()
window.addEventListener('message', handleEditorMessage)
})
onUnmounted(() => {
window.removeEventListener('message', handleEditorMessage)
})
</script>
<template>
<div class="h-screen flex flex-col">
<!-- Header -->
<div class="border-b bg-background px-6 py-3 flex items-center justify-between">
<div class="flex items-center gap-4 flex-1">
<Button variant="ghost" size="icon" @click="router.push('/ai-processes')">
<ArrowLeft class="h-4 w-4" />
</Button>
<div class="flex-1 max-w-md">
<Input
v-model="processName"
placeholder="Process Name"
class="font-semibold"
/>
</div>
</div>
<div class="flex items-center gap-2">
<Button variant="outline" @click="testRun" :disabled="!processId">
<Play class="h-4 w-4 mr-2" />
Test Run
</Button>
<Button @click="saveProcess" :disabled="saving">
<Save class="h-4 w-4 mr-2" />
{{ saving ? 'Saving...' : 'Save' }}
</Button>
</div>
</div>
<!-- Process Description -->
<div class="border-b bg-background px-6 py-2">
<Textarea
v-model="processDescription"
placeholder="Process description (optional)"
class="resize-none text-sm"
:rows="2"
/>
</div>
<!-- Editor Iframe -->
<div class="flex-1 relative">
<div v-if="loading" class="absolute inset-0 flex items-center justify-center bg-background/80">
<div class="animate-spin rounded-full h-8 w-8 border-b-2 border-primary"></div>
</div>
<iframe
id="process-editor-iframe"
src="/ai-processes-editor/index.html"
class="w-full h-full border-0"
title="Process Editor"
/>
</div>
</div>
</template>
@@ -0,0 +1,133 @@
<script setup lang="ts">
import { Plus, Workflow, Play } from 'lucide-vue-next'
import { Button } from '@/components/ui/button'
import { Card, CardContent, CardDescription, CardHeader, CardTitle } from '@/components/ui/card'
import { Badge } from '@/components/ui/badge'
import { useApi } from '@/composables/useApi'
import { useRouter } from 'vue-router'
interface Process {
id: string
name: string
description?: string
latestVersion: number
createdAt: string
versions?: any[]
}
const { api } = useApi()
const router = useRouter()
const processes = ref<Process[]>([])
const loading = ref(true)
const getTenantId = () => {
if (!import.meta.client) return 'tenant1'
return localStorage.getItem('tenantId') || 'tenant1'
}
const loadProcesses = async () => {
loading.value = true
try {
const data = await api.get(`/tenants/${getTenantId()}/ai-processes`)
processes.value = data || []
} catch (error) {
console.error('Failed to load processes:', error)
} finally {
loading.value = false
}
}
const createNewProcess = () => {
router.push('/ai-processes/new')
}
const editProcess = (processId: string) => {
router.push(`/ai-processes/${processId}/edit`)
}
const testProcess = async (processId: string) => {
try {
const result = await api.post(`/tenants/${getTenantId()}/ai-processes/${processId}/runs`, {
input: { test: true },
})
console.log('Test run result:', result)
alert('Process test started! Check console for details.')
} catch (error: any) {
alert(`Test failed: ${error.message}`)
}
}
onMounted(() => {
loadProcesses()
})
</script>
<template>
<div class="container mx-auto p-6 space-y-6">
<div class="flex items-center justify-between">
<div>
<h1 class="text-3xl font-bold tracking-tight">AI Process Builder</h1>
<p class="text-muted-foreground mt-1">
Create and manage intelligent business process workflows
</p>
</div>
<Button @click="createNewProcess">
<Plus class="mr-2 h-4 w-4" />
New Process
</Button>
</div>
<div v-if="loading" class="flex items-center justify-center py-12">
<div class="animate-spin rounded-full h-8 w-8 border-b-2 border-primary"></div>
</div>
<div v-else-if="processes.length === 0" class="text-center py-12">
<Workflow class="mx-auto h-12 w-12 text-muted-foreground/50" />
<h3 class="mt-4 text-lg font-semibold">No processes yet</h3>
<p class="text-muted-foreground mt-1">
Get started by creating your first AI-powered business process
</p>
<Button @click="createNewProcess" class="mt-4">
<Plus class="mr-2 h-4 w-4" />
Create Process
</Button>
</div>
<div v-else class="grid gap-4 md:grid-cols-2 lg:grid-cols-3">
<Card
v-for="process in processes"
:key="process.id"
class="hover:shadow-lg transition-shadow cursor-pointer"
@click="editProcess(process.id)"
>
<CardHeader>
<div class="flex items-start justify-between">
<Workflow class="h-5 w-5 text-primary" />
<Badge variant="secondary">v{{ process.latestVersion }}</Badge>
</div>
<CardTitle class="mt-2">{{ process.name }}</CardTitle>
<CardDescription>{{ process.description || 'No description' }}</CardDescription>
</CardHeader>
<CardContent>
<div class="flex gap-2">
<Button
size="sm"
variant="outline"
@click.stop="editProcess(process.id)"
>
Edit
</Button>
<Button
size="sm"
variant="ghost"
@click.stop="testProcess(process.id)"
>
<Play class="h-3 w-3 mr-1" />
Test
</Button>
</div>
</CardContent>
</Card>
</div>
</div>
</template>