ndurner committed
Commit 4353f5a · 1 Parent(s): 3203a57

trying dedicated GH README

.github/README.md ADDED
@@ -0,0 +1,59 @@
+ # Aileen 3 Core
+ ## Introduction
+ Large Language Models (LLMs) rely on **tools** - sometimes provided by **MCP servers** - to interact with the outside world. Aileen 3 Core is an MCP server that advances the goals of **Information Foraging**: mining novel insights from high-noise sources to create dense briefings for time-efficient consumption by the user.
+
+ ### Competition submission
+ Aileen 3 Core is [a contender](https://huggingface.co/spaces/ndurner/aileen3-core) in the [MCP's 1st Birthday - Hosted by Anthropic and Gradio](https://huggingface.co/MCP-1st-Birthday) hackathon. [Aileen 3 Agent](https://github.com/ndurner/aileen3-agent), an agentic system built on this MCP server, is a [capstone project](https://ndurner.de/links/aileen3-kaggle-writeup) for the [AI Agents Intensive Course with Google](https://www.kaggle.com/learn-guide/5-day-agents).
+
+ ## Using
+ ### Using in Claude Desktop
+ #### Installing
+ 1. Optionally, create a new Python virtual environment
+    * example: `python3 -m venv .venv-claude`
+ 2. Install the Aileen MCP: `pip install ./mcp`
+    * (or `pip install -e ./mcp` if you want to make live changes to this source tree)
+ 3. Obtain a Google Gemini API key from [Google AI Studio](https://aistudio.google.com)
+ 4. Add a reference to `claude_desktop_config.json`. The Gemini API key is read from the environment, so it can be set here:
+    ```json
+    {
+      ...
+      "mcpServers": {
+        "aileen3-mcp": {
+          "command": "/Users/.../aileen3-core/.venv-claude/bin/python",
+          "args": [
+            "-m",
+            "aileen3_mcp.server"
+          ],
+          "env": {
+            "GEMINI_API_KEY": "AI..."
+          }
+        }
+      }
+    }
+    ```
+ 5. Restart Claude
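If Claude does not pick the server up after a restart, a common cause is malformed JSON in the config file. A quick sanity check is to parse the file; the sketch below uses an inlined, minimal stand-in for the config (the command path and API key are placeholders, and the `...` in the snippet above stands for your other top-level keys, which must themselves be valid JSON):

```python
import json

# Minimal stand-in for claude_desktop_config.json; the command path
# and API key are placeholders, not real values.
config_text = """
{
  "mcpServers": {
    "aileen3-mcp": {
      "command": "/path/to/.venv-claude/bin/python",
      "args": ["-m", "aileen3_mcp.server"],
      "env": {"GEMINI_API_KEY": "AI-placeholder"}
    }
  }
}
"""

# json.loads raises json.JSONDecodeError on syntax slips such as
# trailing commas or unquoted keys.
config = json.loads(config_text)
server = config["mcpServers"]["aileen3-mcp"]
print(server["args"])  # -> ['-m', 'aileen3_mcp.server']
```

To check your real file, replace `config_text` with `open(path).read()` pointed at your `claude_desktop_config.json`.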
+
+ #### Using the MCP server
+ The Haiku 4.5 model is sufficient for basic tasks. To make your plans fully transparent to the LLM, refer to "aileen3" explicitly in the prompt, e.g.:
+ > Use aileen3 to translate slide 3 from YouTube video reference eXP-PvKcI9A to German.
+
+ ![Screenshot of Claude Desktop: slide translation with Aileen 3 Core](../readme-assets/claude-slide-translation.webp)
+
+ #### Debugging
+ The message exchange and Claude-facing error messages can be read from Claude's log files:
+ ```bash
+ tail -n 20 -F ~/Library/Logs/Claude/mcp*.log
+ ```
+
+ ## Local development
+
+ Build and run the Docker Space image locally:
+
+ ```bash
+ docker build -t aileen3-core .
+ docker run -it -p 7860:7860 aileen3-core
+ ```
+
+ The Docker build passes `--pre` to `pip install` so it can grab the latest Gradio 6 pre-release builds referenced in `demo/requirements.txt` (which currently specifies `gradio>=6.0.0.dev0`).
+
+ The blank Gradio 6 demo lives in `demo/app.py`.
.github/generate_welcome_readme.py ADDED
@@ -0,0 +1,86 @@
+ #!/usr/bin/env python3
+
+ import argparse
+ import os
+ from pathlib import Path
+ import re
+
+
+ FRONTMATTER_BOUNDARY = re.compile(r"^---\s*$")
+ MARKDOWN_LINK = re.compile(r"(!?)\[(?P<text>[^\]]*)\]\((?P<target>[^)]+)\)")
+ SKIP_PREFIXES = ("http://", "https://", "mailto:", "#")
+
+
+ def strip_frontmatter(raw: str) -> str:
+     lines = raw.splitlines()
+     if not lines or not FRONTMATTER_BOUNDARY.match(lines[0]):
+         return raw
+     end_index = None
+     for i in range(1, len(lines)):
+         if FRONTMATTER_BOUNDARY.match(lines[i]):
+             end_index = i
+             break
+     if end_index is None:
+         return raw
+     body_lines = lines[end_index + 1 :]
+     return "\n".join(body_lines).lstrip("\n")
+
+
+ def rewrite_links(raw: str, readme_path: Path, output_path: Path) -> str:
+     readme_dir = readme_path.resolve().parent
+     output_dir = output_path.resolve().parent
+
+     def replace(match: re.Match) -> str:
+         bang = match.group(1)
+         text = match.group("text")
+         target = match.group("target").strip()
+
+         if target.startswith(SKIP_PREFIXES) or target.startswith("/"):
+             return match.group(0)
+
+         source_target = (readme_dir / target).resolve()
+         try:
+             new_target = os.path.relpath(source_target, start=output_dir)
+         except ValueError:
+             new_target = target
+
+         return f"{bang}[{text}]({new_target})"
+
+     return MARKDOWN_LINK.sub(replace, raw)
+
+
+ def transform(readme_path: Path, output_path: Path) -> None:
+     raw = readme_path.read_text(encoding="utf-8")
+     no_frontmatter = strip_frontmatter(raw)
+     rewritten = rewrite_links(no_frontmatter, readme_path, output_path)
+     output_path.parent.mkdir(parents=True, exist_ok=True)
+     output_path.write_text(rewritten, encoding="utf-8")
+
+
+ def main() -> None:
+     parser = argparse.ArgumentParser(
+         description=(
+             "Generate a welcome-page-friendly README by stripping frontmatter "
+             "and rewriting relative links."
+         )
+     )
+     parser.add_argument(
+         "--input",
+         type=Path,
+         default=Path("README.md"),
+         help="Source README file (default: README.md)",
+     )
+     parser.add_argument(
+         "--output",
+         type=Path,
+         default=Path(".github/README.md"),
+         help="Output Markdown file (default: .github/README.md)",
+     )
+     args = parser.parse_args()
+
+     transform(args.input, args.output)
+
+
+ if __name__ == "__main__":
+     main()
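The script performs two transformations: it strips the Hugging Face Space YAML frontmatter (which GitHub would render verbatim) and re-expresses relative links so they resolve from `.github/` instead of the repo root. Here is a self-contained sketch mirroring that logic, with made-up example inputs:

```python
import os
import re

FRONTMATTER_BOUNDARY = re.compile(r"^---\s*$")

def strip_frontmatter(raw: str) -> str:
    # Drop a leading `--- ... ---` YAML block, as the script above does.
    lines = raw.splitlines()
    if not lines or not FRONTMATTER_BOUNDARY.match(lines[0]):
        return raw
    for i in range(1, len(lines)):
        if FRONTMATTER_BOUNDARY.match(lines[i]):
            return "\n".join(lines[i + 1:]).lstrip("\n")
    return raw

# Space READMEs open with YAML frontmatter; stripping it leaves the body.
raw = "---\ntitle: Aileen 3 Core\nsdk: docker\n---\n\n# Aileen 3 Core\nBody."
print(strip_frontmatter(raw))  # -> "# Aileen 3 Core\nBody."

# Link rewriting boils down to os.path.relpath: a target resolved against
# the README's directory is re-expressed relative to the output directory.
print(os.path.relpath("/repo/readme-assets/shot.webp", start="/repo/.github"))
# -> ../readme-assets/shot.webp
```

This is why the README above can reference `../readme-assets/...` even though the source file lives at the repo root.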