# Interactive Data Assembly Tool (IDAT)

## Overview
The Interactive Data Assembly Tool (IDAT) is AIDDDMAP's visual workspace for creating, modifying, and managing data workflows. It pairs a canvas-based interface with AI agent assistance to streamline data preparation, analysis, and publishing.
## Core Features

### 1. Visual Canvas Environment

```typescript
interface IDATPanelConfig {
  dimensions: {
    width: number;
    height: number;
  };
  grid: {
    size: number;
    snap: boolean;
  };
  theme: {
    background: string;
    gridColor: string;
    nodeColors: Record<string, string>;
  };
}
```
The canvas provides:
- Drag-and-drop functionality for datasets and agents
- Real-time node connections and workflow visualization
- Interactive grid system with snapping capabilities
- Customizable themes and visual styles
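A minimal configuration sketch tying these features together (the concrete values are illustrative, not defaults):

```typescript
const panelConfig: IDATPanelConfig = {
  dimensions: { width: 1920, height: 1080 },
  grid: { size: 16, snap: true }, // snap-to-grid enabled
  theme: {
    background: "#1e1e2e",
    gridColor: "#313244",
    nodeColors: { dataset: "#89b4fa", agent: "#a6e3a1" },
  },
};
```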
### 2. Node Types

```typescript
enum NodeType {
  DATASET = "dataset",
  AGENT = "agent",
  TRANSFORM = "transform",
  OUTPUT = "output",
  ENCRYPTION = "encryption",
}

interface BaseNode {
  id: string;
  type: NodeType;
  position: { x: number; y: number };
  data: any;
}

interface DatasetNode extends BaseNode {
  type: NodeType.DATASET;
  data: {
    name: string;
    schema: SchemaDefinition;
    preview: DataPreview;
    metadata: DatasetMetadata;
  };
}
```
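Only `DatasetNode` is spelled out above. Assuming the remaining node types follow the same `BaseNode` pattern, `TransformNode` and `OutputNode` might look roughly like this (the field names are illustrative, not part of the documented API):

```typescript
interface TransformNode extends BaseNode {
  type: NodeType.TRANSFORM;
  data: {
    operation: string; // e.g. "filter", "join", "aggregate"
    parameters: Record<string, any>;
  };
}

interface OutputNode extends BaseNode {
  type: NodeType.OUTPUT;
  data: {
    destination: string; // e.g. a dataset name or sink URI
    format: string;
  };
}
```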
### 3. Agent Integration

IDAT integrates directly with AIDDDMAP's AI agents:

```typescript
interface AgentNode extends BaseNode {
  type: NodeType.AGENT;
  data: {
    agent: BaseAgent;
    status: AgentStatus;
    configuration: AgentConfig;
  };
}

// Example: deploying an agent to the canvas
const workflow = new DataWorkflow();
workflow.addNode({
  type: NodeType.AGENT,
  data: {
    agent: new DataCuratorAgent(),
    configuration: {
      mode: "automatic",
      triggers: ["new_data", "schema_change"],
    },
  },
});
```
### 4. Real-time Processing

```typescript
interface LiveDataConfig {
  source: DataSource;
  bufferSize: number;
  updateInterval: number;
  errorHandling: ErrorHandlingStrategy;
}

class LiveDataHandler {
  private buffer: CircularBuffer;
  private interval: number;
  private subscribers: Array<(data: any) => Promise<void>> = [];

  constructor(config: LiveDataConfig) {
    this.buffer = new CircularBuffer(config.bufferSize);
    this.interval = config.updateInterval;
  }

  async processIncomingData(data: any): Promise<void> {
    await this.buffer.add(data);
    await this.notifySubscribers(data);
  }

  // Fan incoming items out to registered listeners.
  private async notifySubscribers(data: any): Promise<void> {
    await Promise.all(this.subscribers.map((notify) => notify(data)));
  }
}
```
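A usage sketch for the handler; `"retry"` stands in for whatever `ErrorHandlingStrategy` actually accepts, since that type isn't defined on this page:

```typescript
const handler = new LiveDataHandler({
  source: myDataSource, // any DataSource implementation (assumed to exist)
  bufferSize: 1024,
  updateInterval: 250,
  errorHandling: "retry", // placeholder strategy value
});

await handler.processIncomingData({ sensor: "temp-01", value: 21.4 });
```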
### 5. Workflow Management

```typescript
interface WorkflowState {
  status: "idle" | "processing" | "error";
  progress: number;
  activeNodes: string[];
  errors: WorkflowError[];
  metrics: {
    processedItems: number;
    throughput: number;
    latency: number;
  };
}

// Example: creating a data preparation pipeline
const pipeline = new DataPipeline()
  .addSource(new DatasetNode("raw_data"))
  .addTransformation(new CleansingNode())
  .addAgent(new DataCuratorAgent())
  .addValidation(new QualityCheckNode())
  .addDestination(new OutputNode("processed_data"));
```
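Executing the pipeline and inspecting its state might look like the following. `execute()` matches the workflow examples later on this page; `getState()` is a hypothetical accessor returning the `WorkflowState` shape above:

```typescript
await pipeline.execute();

const state: WorkflowState = pipeline.getState(); // hypothetical accessor
if (state.status === "error") {
  console.error(state.errors);
} else {
  console.log(
    `Processed ${state.metrics.processedItems} items at ${state.metrics.throughput} items/s`,
  );
}
```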
## User Interface Components

### 1. Main Canvas
- Interactive workspace for visual workflow creation
- Zoom and pan controls
- Grid system with snap-to-grid functionality
- Node selection and multi-select capabilities
### 2. Right Drawer (Agent Panel)
- Access to all available AI agents
- Agent configuration controls
- Real-time agent status monitoring
- Drag-and-drop deployment to canvas
### 3. Left Drawer (Task Zone)
- Active task management
- Live data monitoring
- Quick access to common operations
- Status notifications and alerts
### 4. Toolbar
- Common operations (save, load, export)
- View controls (zoom, fit, center)
- Undo/redo functionality
- Layout options
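The panels above suggest a programmatic surface as well. The sketch below is purely hypothetical (none of these handles are confirmed by this page); it only illustrates how the toolbar and canvas operations relate:

```typescript
// Hypothetical handles, for illustration only
const canvas = idat.getCanvas();

canvas.zoomToFit(); // view controls
canvas.setGrid({ size: 16, snap: true }); // grid + snapping
canvas.selectNodes(["node-1", "node-2"]); // multi-select

idat.toolbar.undo(); // undo/redo
idat.toolbar.export({ format: "json" }); // save/load/export
```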
## Integration Features

### 1. Encryption Integration

```typescript
interface EncryptionNode extends BaseNode {
  type: NodeType.ENCRYPTION;
  mode: "FHE" | "ZK" | "Basic";
  config: FHEConfig | ZKConfig | BasicEncryptionConfig;
}

// Adding encryption to a workflow
workflow.addNode({
  type: NodeType.ENCRYPTION,
  mode: "FHE",
  config: {
    scheme: "CKKS",
    parameters: {
      polyModulusDegree: 8192,
      coeffModulusBits: [60, 40, 40, 60],
    },
  },
});
```
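The same node type covers the other modes. A `ZK` configuration might look like this; `ZKConfig` isn't documented here, so the fields are placeholders:

```typescript
workflow.addNode({
  type: NodeType.ENCRYPTION,
  mode: "ZK",
  config: {
    // placeholder fields; see the ZKConfig reference for the real ones
    circuit: "range_proof",
    provingKey: "pk.bin",
  },
});
```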
### 2. UADM Integration

```typescript
// Deploy custom agent from UADM
const customAgent = await uadm.deployAgent({
  agent: new CustomAgent(),
  config: {
    hardwareAbstraction: {
      protocols: ["http", "websocket"],
      deviceTypes: ["server"],
    },
  },
});

workflow.addNode({
  type: NodeType.AGENT,
  data: { agent: customAgent },
});
```
## Best Practices

### 1. Performance Optimization

```typescript
// Implement data streaming for large datasets
const stream = new DataStream({
  batchSize: 1000,
  parallel: true,
  maxConcurrency: 4,
});

stream.on("data", async (batch) => {
  await workflow.processBatch(batch);
});
```
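Batching amortizes per-item overhead, and capping `maxConcurrency` bounds how many batches are in flight at once, which keeps memory use predictable when the consumer is slower than the source.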
### 2. Error Handling

```typescript
try {
  await workflow.execute();
} catch (error) {
  if (error instanceof ValidationError) {
    await workflow.handleValidationError(error);
  } else if (error instanceof ProcessingError) {
    await workflow.retry({
      maxAttempts: 3,
      backoff: "exponential",
    });
  } else {
    throw error; // don't swallow unexpected errors
  }
}
```
### 3. Resource Management

```typescript
const resources = new ResourceManager({
  maxMemory: "4GB",
  maxConcurrency: 8,
  timeout: 30000,
});

workflow.setResourceConstraints(resources);
```
## Example Workflows

### 1. Data Preparation Pipeline

```typescript
const preparationWorkflow = new DataWorkflow()
  .addSource(new DatasetNode("raw_data"))
  .addAgent(
    new DataQualityAgent({
      validations: ["completeness", "accuracy"],
    }),
  )
  .addAgent(
    new DataCuratorAgent({
      enrichment: ["metadata", "schema"],
    }),
  )
  .addEncryption(
    new EncryptionNode({
      mode: "FHE",
    }),
  )
  .addDestination(new OutputNode("marketplace"));

await preparationWorkflow.execute();
```
### 2. Real-time Analytics

```typescript
const realtimeWorkflow = new RealTimeWorkflow()
  .setSource(
    new StreamingSource({
      type: "websocket",
      url: "ws://data-stream",
    }),
  )
  .addProcessing(
    new WindowedAggregation({
      window: "5m",
      operation: "average",
    }),
  )
  .addVisualization(new LiveDashboard());
```
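Unlike the preparation pipeline, a streaming workflow runs until stopped. Assuming `RealTimeWorkflow` exposes a `start()` entry point analogous to `execute()`, kicking it off would be:

```typescript
await realtimeWorkflow.start(); // assumed entry point, by analogy with execute()
```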