Developer Documentation

This documentation provides an overview of how the simulated MindEase AI assistant works, including its mock API behavior and frontend integration.

Endpoints

  • POST /api/chat – Accepts user input and returns a supportive text response
  • GET /api/sentiments – Simulates analysis of emotions such as sadness, stress, and burnout
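Since both endpoints are simulated, there is no finalized response contract yet. The shapes below are illustrative assumptions of what the mock payloads might look like:

```javascript
// Hypothetical response shapes for the simulated endpoints.
// Field names ("reply", "emotions", "label", "score") are assumptions.

// POST /api/chat
const chatResponse = {
  reply: "That sounds really hard. Would you like to talk about it?",
};

// GET /api/sentiments
const sentimentsResponse = {
  emotions: [
    { label: "sadness", score: 0.7 },
    { label: "stress", score: 0.5 },
    { label: "burnout", score: 0.3 },
  ],
};
```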

Data Simulation

No real data is collected or stored. All conversations are handled in-browser using JavaScript.

SDK Roadmap

  • Q3 2025 – JavaScript SDK for local testing
  • Q4 2025 – Python API wrapper (simulated)

🛠️ API Integration Guide

The MindEase AI platform allows for future expansion using a simple keyword-detection simulation. While no backend is currently used, we envision a modular API interface for community developers.
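A minimal sketch of what that keyword-detection simulation might look like in browser JavaScript. The keyword lists and canned replies here are illustrative assumptions, not the actual MindEase word list:

```javascript
// Keyword-detection sketch: scan the user's text for trigger words
// and return a canned supportive response. Keywords and replies are
// illustrative assumptions only.
const RESPONSES = [
  { keywords: ["burned out", "burnout", "exhausted"],
    reply: "Burnout is tough. Remember to take small breaks when you can." },
  { keywords: ["sad", "down", "unhappy"],
    reply: "I'm sorry you're feeling low. Want to talk through it?" },
  { keywords: ["stress", "stressed", "anxious"],
    reply: "Stress can be overwhelming. Try a slow breath with me." },
];

const DEFAULT_REPLY = "Thanks for sharing. I'm here to listen.";

function simulateChat(text) {
  const lower = text.toLowerCase();
  const match = RESPONSES.find((r) =>
    r.keywords.some((k) => lower.includes(k))
  );
  return match ? match.reply : DEFAULT_REPLY;
}
```

Because the matching is a pure function over the input text, it runs entirely in-browser with no data leaving the page.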

  • POST /api/chat – Accepts input, returns simulated empathetic response
  • GET /api/emotion – Returns mock emotion data for given text payloads
  • GET /api/suggestions – Returns recommended reflection prompts
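Until a real backend exists, all three routes could be served from a single mock handler. The paths come from the list above; the response shapes and the `mockApi` function name are assumptions for illustration:

```javascript
// Single mock handler routing the three simulated endpoints.
// Response shapes are illustrative assumptions.
function mockApi(method, path, body = {}) {
  if (method === "POST" && path === "/api/chat") {
    return { reply: `I hear you: "${body.text}". That sounds difficult.` };
  }
  if (method === "GET" && path === "/api/emotion") {
    return { emotions: [{ label: "stress", score: 0.5 }] };
  }
  if (method === "GET" && path === "/api/suggestions") {
    return { prompts: ["What drained your energy today?",
                       "Name one thing you're grateful for."] };
  }
  return { error: "unknown endpoint" };
}
```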

Authentication

MindEase does not require auth tokens for local simulation. Future gated features may integrate token-based auth for $MIND holders.
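If token-based auth is added later, requests would likely carry a bearer token. The header name, token format, and `withAuth` helper below are assumptions; nothing is enforced in the current local simulation:

```javascript
// Sketch of building a token-gated request for a future API.
// The Authorization header and helper name are assumptions.
function withAuth(options, token) {
  return {
    ...options,
    headers: { ...(options.headers || {}), Authorization: `Bearer ${token}` },
  };
}

const opts = withAuth({ method: "GET" }, "demo-token");
```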

Sample Integration

fetch('/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ text: 'I feel burned out' }),
})
  .then((res) => res.json())
  .then((data) => console.log(data.reply));

📊 Mock SDK Functions

A lightweight SDK is planned to allow offline simulations, journaling, and mood tracking. The SDK will support local storage and analysis of text for educational use only.
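The SDK is not yet published, so the sketch below only shows one way local journaling could sit on top of browser storage. The function names and storage key are assumptions; the `store` argument lets the same code run outside the browser (in the browser, pass `window.localStorage`):

```javascript
// Sketch of planned local journaling helpers. Names and the storage
// key are assumptions, not the shipped SDK surface.
const KEY = "mindease.journal";

function addEntry(store, text, mood) {
  const entries = JSON.parse(store.getItem(KEY) || "[]");
  entries.push({ text, mood, at: new Date().toISOString() });
  store.setItem(KEY, JSON.stringify(entries));
}

function listEntries(store) {
  return JSON.parse(store.getItem(KEY) || "[]");
}

// Minimal in-memory stand-in for localStorage (for tests/demos).
function memoryStore() {
  const data = new Map();
  return {
    getItem: (k) => (data.has(k) ? data.get(k) : null),
    setItem: (k, v) => data.set(k, String(v)),
  };
}
```

Keeping entries as a JSON array under a single key keeps the sketch simple and matches the document's browser-only, no-server constraint.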

Coming Features

  • Guided journaling prompts
  • Emoji-based mood tracking
  • Browser-only sentiment visualization
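As a sketch of how emoji-based mood tracking could feed the browser-only visualization, check-ins might map to numeric scores and be averaged. The emoji set and score values are illustrative assumptions:

```javascript
// Emoji mood tracking sketch: map emoji check-ins to scores and
// average them. Emoji choices and scores are assumptions.
const MOOD_SCORES = { "😄": 2, "🙂": 1, "😐": 0, "😟": -1, "😢": -2 };

function averageMood(checkIns) {
  const scores = checkIns
    .map((e) => MOOD_SCORES[e])
    .filter((s) => s !== undefined);
  if (scores.length === 0) return null;
  return scores.reduce((a, b) => a + b, 0) / scores.length;
}
```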