jostlebot and Claude Opus 4.5 committed
Commit ae87306 · 1 parent: 0eda47f

Add interactive FEEL and NEED tools + update HF description

- FEEL tool: Clickable feelings list organized by needs met/unmet
- NEED tool: Clickable needs list organized by NVC categories
- Users can self-identify feelings/needs, then get AI reflection
- Updated README.md with correct "Tend & Send ARI Prototype" title
- Purple/indigo branding colors

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
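The FEEL/NEED interaction described above is, at its core, a toggle over a selection array: each tap adds a label or removes it again, and a summary line re-renders. A minimal sketch of that pattern, with hypothetical `toggleItem`/`summarize` helpers (the commit itself mutates `selectedFeelings` in place and toggles a CSS class on the button):

```javascript
// Hypothetical pure helper illustrating the tap-to-toggle selection pattern.
// Returns a new array with `item` added if absent, or removed if present.
function toggleItem(selection, item) {
  const idx = selection.indexOf(item);
  if (idx > -1) {
    // Already selected: remove it.
    return selection.slice(0, idx).concat(selection.slice(idx + 1));
  }
  // Not selected yet: append it.
  return selection.concat([item]);
}

// The comma-separated summary shown in the "Selected feelings:" box.
function summarize(selection) {
  return selection.join(', ');
}
```

The commit's `toggleFeeling`/`toggleNeed` pair this logic with DOM updates (`classList.add('selected')` and a summary re-render).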

Files changed (2)
  1. README.md +30 -21
  2. static/index.html +259 -3
README.md CHANGED
@@ -1,41 +1,50 @@
 ---
-title: Tolerate Space Lab
-emoji: 🌿
-colorFrom: green
-colorTo: gray
+title: Tend & Send ARI Prototype
+emoji: 💜
+colorFrom: indigo
+colorTo: green
 sdk: docker
 pinned: false
 license: mit
 ---
 
-# Tolerate Space Lab
+# Tend & Send - ARI Prototype
 
-A therapeutic intervention prototype that builds distress tolerance through intentional response delays, concurrent somatic journaling, and pattern reflection.
+Assistive Relational Intelligence tools for human-to-human connection.
 
-**This is a relational workout, not synthetic intimacy.**
+**These tools exist to support YOUR relationship. ARI never becomes the relationship - it augments human-to-human connection.**
 
 ## What This Tool Does
 
-The Tolerate Space Lab helps users practice sitting with the uncertainty and discomfort that arises when waiting for a response from someone whose message matters to them.
+Practice difficult conversations with a simulated partner while using ARI tools to build relational capacity.
 
-### The Intervention
+### The ARI Toolkit
 
-1. **Attachment activation**: Engage in a simulated text conversation
-2. **The stretch**: Responses are intentionally delayed, gradually lengthening
-3. **Somatic awareness**: Journal what arises in your body while waiting
-4. **Pattern reflection**: Receive a clinical debrief at session end
+- **TEND** - Transform messages with NVC clarity and warmth
+- **NVC** - Guided I-statement builder
+- **FEEL** - Identify feelings in the conversation
+- **NEED** - Extract underlying needs
+- **LISTEN** - Receive mode: hear them before reacting
+- **REPAIR** - Craft genuine repair after ruptures
+- **SOMA** - Body check-in before responding
+- **LOVE** - Slow down, breathe, return to presence
 
-### Clinical Foundation
+### Practice Conversations
+
+- Choose attachment style: Anxious, Avoidant, Disorganized, Secure
+- Set difficulty level: Gentle to Crisis
+- Partner responds like a real person (not a therapist)
+
+## Clinical Foundation
 
 Informed by:
-- Distress tolerance principles from DBT
-- Somatic awareness practices (Tara Brach, Sarah Peyton)
-- Attachment theory research
-- Trauma-informed design
+- Nonviolent Communication (Marshall Rosenberg)
+- Attachment Theory
+- Somatic awareness practices
+- Trauma-informed design principles
 
 ## Created By
 
-Jocelyn Skillman, LMHC — exploring Clinical UX and Assistive Relational Intelligence.
+Jocelyn Skillman, LMHC
 
-*This tool is offered as a prototype for educational and demonstration purposes. It is not a substitute for professional mental health care.*
-# Rebuild trigger Tue Jan 20 18:08:23 PST 2026
+*Reflection prompts only - not therapy. If in crisis, contact a professional.*
 
static/index.html CHANGED
@@ -613,6 +613,71 @@
 
 .right-input button:hover { background: var(--accent-light); }
 
+/* Feelings/Needs Button Grid */
+.selection-grid {
+  display: flex;
+  flex-wrap: wrap;
+  gap: 8px;
+  margin: 12px 0;
+}
+
+.selection-btn {
+  padding: 8px 12px;
+  background: var(--bg-input);
+  border: 1px solid var(--border);
+  color: var(--text-secondary);
+  border-radius: 20px;
+  font-size: 0.8rem;
+  cursor: pointer;
+  transition: all 0.2s;
+}
+
+.selection-btn:hover {
+  border-color: var(--accent);
+  color: var(--accent-light);
+}
+
+.selection-btn.selected {
+  background: var(--accent);
+  border-color: var(--accent);
+  color: white;
+}
+
+.selection-category {
+  font-size: 0.7rem;
+  text-transform: uppercase;
+  letter-spacing: 0.05em;
+  color: var(--text-muted);
+  margin: 16px 0 8px;
+  padding-top: 12px;
+  border-top: 1px solid var(--border);
+}
+
+.selection-category:first-child {
+  border-top: none;
+  margin-top: 0;
+  padding-top: 0;
+}
+
+.selected-items {
+  background: rgba(99, 102, 241, 0.1);
+  border: 1px solid var(--accent);
+  border-radius: 8px;
+  padding: 12px;
+  margin-top: 16px;
+}
+
+.selected-items h4 {
+  font-size: 0.8rem;
+  color: var(--accent-light);
+  margin-bottom: 8px;
+}
+
+.selected-items p {
+  font-size: 0.9rem;
+  line-height: 1.5;
+}
+
 /* Love/Breath Modal */
 .breath-overlay {
   position: fixed;
@@ -878,14 +943,33 @@
 const TOOL_MAP = {
   tend: { name: 'TEND Transform', icon: '✨', api: 'tend' },
   nvc: { name: 'Guided NVC', icon: '📝', api: 'guided_nvc' },
-  feel: { name: 'Feelings', icon: '💜', api: 'feelings_needs' },
-  need: { name: 'Needs', icon: '🎯', api: 'feelings_needs' },
+  feel: { name: 'Feelings Check', icon: '💜', api: null, interactive: true },
+  need: { name: 'Needs Check', icon: '🎯', api: null, interactive: true },
   listen: { name: 'Receive Mode', icon: '👂', api: 'receive_mode' },
   repair: { name: 'Repair Support', icon: '🙏', api: 'repair_support' },
   soma: { name: 'Somatic Check-in', icon: '🫁', api: 'somatic_checkin' },
   love: { name: 'Slow Down', icon: '💗', api: null }
 };
 
+// NVC Feelings vocabulary
+const FEELINGS_DATA = {
+  'When needs ARE met': ['Calm', 'Content', 'Grateful', 'Hopeful', 'Joyful', 'Peaceful', 'Relieved', 'Safe', 'Tender', 'Touched'],
+  'When needs are NOT met': ['Angry', 'Anxious', 'Confused', 'Disappointed', 'Frustrated', 'Helpless', 'Hurt', 'Lonely', 'Overwhelmed', 'Sad', 'Scared', 'Tense', 'Tired', 'Worried']
+};
+
+// NVC Needs vocabulary
+const NEEDS_DATA = {
+  'Connection': ['Acceptance', 'Belonging', 'Closeness', 'To be heard', 'To be seen', 'To be understood', 'Trust', 'Warmth'],
+  'Autonomy': ['Choice', 'Freedom', 'Independence', 'Space', 'Spontaneity'],
+  'Meaning': ['Contribution', 'Growth', 'Purpose', 'To matter', 'Creativity'],
+  'Peace': ['Ease', 'Harmony', 'Order', 'Rest', 'Safety', 'Security'],
+  'Honesty': ['Authenticity', 'Integrity', 'To be known', 'Transparency'],
+  'Physical': ['Air', 'Food', 'Movement', 'Rest', 'Safety', 'Touch']
+};
+
+let selectedFeelings = [];
+let selectedNeeds = [];
+
 const PARTNER_PROMPTS = {
   anxious: {
     gentle: `You ARE a real person with anxious attachment. NOT a therapist. React like a real human - seek reassurance, worry, maybe defensive. 1-3 sentences, casual text.`,
@@ -960,10 +1044,115 @@
   // Open right panel
   openRightPanel(tool);
 
-  // Call API
+  // Handle interactive tools differently
+  if (tool === 'feel') {
+    showFeelingsSelection();
+    return;
+  }
+
+  if (tool === 'need') {
+    showNeedsSelection();
+    return;
+  }
+
+  // Call API for other tools
   callToolAPI(tool, userInput);
 }
 
+function showFeelingsSelection() {
+  selectedFeelings = [];
+  const content = document.getElementById('tool-content');
+
+  let html = '<p style="margin-bottom: 16px; color: var(--text-secondary);">What feelings are present right now? Tap to select.</p>';
+
+  for (const [category, feelings] of Object.entries(FEELINGS_DATA)) {
+    html += `<div class="selection-category">${category}</div>`;
+    html += '<div class="selection-grid">';
+    for (const feeling of feelings) {
+      html += `<button class="selection-btn" onclick="toggleFeeling('${feeling}')" id="feel-${feeling}">${feeling}</button>`;
+    }
+    html += '</div>';
+  }
+
+  html += '<div class="selected-items" id="feelings-summary" style="display: none;"><h4>Selected feelings:</h4><p id="feelings-list"></p></div>';
+
+  content.innerHTML = html;
+}
+
+function toggleFeeling(feeling) {
+  const btn = document.getElementById(`feel-${feeling}`);
+  const idx = selectedFeelings.indexOf(feeling);
+
+  if (idx > -1) {
+    selectedFeelings.splice(idx, 1);
+    btn.classList.remove('selected');
+  } else {
+    selectedFeelings.push(feeling);
+    btn.classList.add('selected');
+  }
+
+  updateFeelingsSummary();
+}
+
+function updateFeelingsSummary() {
+  const summary = document.getElementById('feelings-summary');
+  const list = document.getElementById('feelings-list');
+
+  if (selectedFeelings.length > 0) {
+    summary.style.display = 'block';
+    list.textContent = selectedFeelings.join(', ');
+  } else {
+    summary.style.display = 'none';
+  }
+}
+
+function showNeedsSelection() {
+  selectedNeeds = [];
+  const content = document.getElementById('tool-content');
+
+  let html = '<p style="margin-bottom: 16px; color: var(--text-secondary);">What needs are alive right now? Tap to select.</p>';
+
+  for (const [category, needs] of Object.entries(NEEDS_DATA)) {
+    html += `<div class="selection-category">${category}</div>`;
+    html += '<div class="selection-grid">';
+    for (const need of needs) {
+      html += `<button class="selection-btn" onclick="toggleNeed('${need}')" id="need-${need}">${need}</button>`;
+    }
+    html += '</div>';
+  }
+
+  html += '<div class="selected-items" id="needs-summary" style="display: none;"><h4>Selected needs:</h4><p id="needs-list"></p></div>';
+
+  content.innerHTML = html;
+}
+
+function toggleNeed(need) {
+  const btn = document.getElementById(`need-${need}`);
+  const idx = selectedNeeds.indexOf(need);
+
+  if (idx > -1) {
+    selectedNeeds.splice(idx, 1);
+    btn.classList.remove('selected');
+  } else {
+    selectedNeeds.push(need);
+    btn.classList.add('selected');
+  }
+
+  updateNeedsSummary();
+}
+
+function updateNeedsSummary() {
+  const summary = document.getElementById('needs-summary');
+  const list = document.getElementById('needs-list');
+
+  if (selectedNeeds.length > 0) {
+    summary.style.display = 'block';
+    list.textContent = selectedNeeds.join(', ');
+  } else {
+    summary.style.display = 'none';
+  }
+}
+
 function openRightPanel(tool) {
   const panel = document.getElementById('right-panel');
   const title = document.getElementById('tool-title');
@@ -1052,6 +1241,73 @@
 async function submitToTool() {
   const input = document.getElementById('tool-input');
   const userInput = input.value.trim();
+
+  // Handle FEEL tool submission
+  if (activeTool === 'feel' && selectedFeelings.length > 0) {
+    const feelingsText = selectedFeelings.join(', ');
+    addToolMsg('user', `I'm feeling: ${feelingsText}`);
+
+    // Get reflection from API
+    addToolLoading();
+    const isVerbose = document.getElementById('toggle-verbose')?.checked || false;
+    try {
+      const response = await fetch('/api/tool', {
+        method: 'POST',
+        headers: { 'Content-Type': 'application/json' },
+        body: JSON.stringify({
+          tool: 'feelings_needs',
+          partner_message: lastPartnerMessage,
+          user_draft: '',
+          user_input: `The person has identified these feelings: ${feelingsText}. ${userInput ? 'They added: ' + userInput : ''} Reflect back what these feelings might be pointing to. Help them connect feelings to needs.`,
+          verbose: isVerbose
+        })
+      });
+      const data = await response.json();
+      removeToolLoading();
+      if (data.response) {
+        addToolMsg('tool', data.response, 'Feelings Reflection');
+      }
+    } catch (e) {
+      removeToolLoading();
+      addToolMsg('tool', 'Connection error.', 'Error');
+    }
+    input.value = '';
+    return;
+  }
+
+  // Handle NEED tool submission
+  if (activeTool === 'need' && selectedNeeds.length > 0) {
+    const needsText = selectedNeeds.join(', ');
+    addToolMsg('user', `I'm needing: ${needsText}`);
+
+    // Get reflection from API
+    addToolLoading();
+    const isVerbose = document.getElementById('toggle-verbose')?.checked || false;
+    try {
+      const response = await fetch('/api/tool', {
+        method: 'POST',
+        headers: { 'Content-Type': 'application/json' },
+        body: JSON.stringify({
+          tool: 'feelings_needs',
+          partner_message: lastPartnerMessage,
+          user_draft: '',
+          user_input: `The person has identified these needs: ${needsText}. ${userInput ? 'They added: ' + userInput : ''} Reflect back these needs. Help them think about what request might help meet these needs in the relationship.`,
+          verbose: isVerbose
+        })
+      });
+      const data = await response.json();
+      removeToolLoading();
+      if (data.response) {
+        addToolMsg('tool', data.response, 'Needs Reflection');
+      }
+    } catch (e) {
+      removeToolLoading();
+      addToolMsg('tool', 'Connection error.', 'Error');
+    }
+    input.value = '';
+    return;
+  }
+
   if (!userInput || !activeTool) return;
 
   input.value = '';
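The FEEL submission in this diff reduces to posting a `feelings_needs` payload to `/api/tool`. A sketch of that payload as a pure builder function; `buildFeelingsPayload` is a hypothetical name for illustration, since the commit inlines the object literal inside `submitToTool()`:

```javascript
// Hypothetical helper: assembles the /api/tool request body that the
// FEEL submission in this commit constructs inline inside submitToTool().
function buildFeelingsPayload(feelings, userInput, lastPartnerMessage, verbose) {
  const feelingsText = feelings.join(', ');
  return {
    tool: 'feelings_needs',
    partner_message: lastPartnerMessage,
    user_draft: '',
    // Prompt string taken from the commit; the user's optional free-text
    // note is spliced in only when they typed one.
    user_input: `The person has identified these feelings: ${feelingsText}. ${userInput ? 'They added: ' + userInput : ''} Reflect back what these feelings might be pointing to. Help them connect feelings to needs.`,
    verbose: verbose
  };
}
```

Keeping the payload construction pure like this would make the FEEL and NEED branches easier to test, since only the `fetch` call and DOM updates would remain inside `submitToTool()`.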