🎛️ Interaction Models

How will users engage with this AI, and what are its inputs and outputs?

Paradigm | Description | Primary Input | Primary Output | Interface Patterns
---|---|---|---|---
Direct Manipulation | User controls the system through structured UI elements. | Structured Inputs | Artifacts, Actions, Visualizations | Context Chips, Config Panels, AI-Enhanced Tables, Toolbars, Progress Views |
Conversation | User and AI exchange dialogue in natural language or structured prompts. | Natural Language, Embedded UI Controls | Responses, Artifacts, Actions, Commands | Chat Interfaces, Prompt Bars, Agent Switching, Embedded UI Inputs |
Proactive AI | AI initiates actions, suggestions, or notifications without explicit user prompts. | Event-Driven, Contextual Triggers | Notifications, Actions, Commands | Notification Cards, Autonomous Agents, Smart Alerts, System Feedback |
Ambient Assistance | AI quietly enhances the experience in the background without taking focus from the user's task. | Multimodal Signals, Passive Monitoring | Inline Suggestions, Visualizations, Commands | Inline Suggestions, Passive Monitoring, Adaptive Interfaces
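For teams that keep these paradigms in code (for example, to drive pattern documentation or prototype scaffolding), the table can be encoded as plain data. The sketch below is illustrative TypeScript under that assumption; the `Paradigm`, `InteractionModel`, and `interactionModels` names are hypothetical and not part of any published library.

```typescript
// Illustrative sketch only: the paradigm table above encoded as data so a
// prototype or design-system codebase could reference it programmatically.

type Paradigm =
  | "direct-manipulation"
  | "conversation"
  | "proactive-ai"
  | "ambient-assistance";

interface InteractionModel {
  description: string;
  primaryInputs: string[];
  primaryOutputs: string[];
  interfacePatterns: string[];
}

const interactionModels: Record<Paradigm, InteractionModel> = {
  "direct-manipulation": {
    description: "User controls the system through structured UI elements.",
    primaryInputs: ["Structured Inputs"],
    primaryOutputs: ["Artifacts", "Actions", "Visualizations"],
    interfacePatterns: ["Context Chips", "Config Panels", "AI-Enhanced Tables", "Toolbars", "Progress Views"],
  },
  "conversation": {
    description: "User and AI exchange dialogue in natural language or structured prompts.",
    primaryInputs: ["Natural Language", "Embedded UI Controls"],
    primaryOutputs: ["Responses", "Artifacts", "Actions", "Commands"],
    interfacePatterns: ["Chat Interfaces", "Prompt Bars", "Agent Switching", "Embedded UI Inputs"],
  },
  "proactive-ai": {
    description: "AI initiates actions, suggestions, or notifications without explicit user prompts.",
    primaryInputs: ["Event-Driven Triggers", "Contextual Triggers"],
    primaryOutputs: ["Notifications", "Actions", "Commands"],
    interfacePatterns: ["Notification Cards", "Autonomous Agents", "Smart Alerts", "System Feedback"],
  },
  "ambient-assistance": {
    description: "AI quietly enhances the experience in the background without taking focus from the user's task.",
    primaryInputs: ["Multimodal Signals", "Passive Monitoring"],
    primaryOutputs: ["Inline Suggestions", "Visualizations", "Commands"],
    interfacePatterns: ["Inline Suggestions", "Passive Monitoring", "Adaptive Interfaces"],
  },
};

// Example usage: look up which interface patterns fit a chosen paradigm.
const patterns = interactionModels["conversation"].interfacePatterns;
console.log(patterns); // ["Chat Interfaces", "Prompt Bars", ...]
```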