ykaitao committed on
Commit b046e5d · verified · 1 Parent(s): ea7eb83

Upload folder using huggingface_hub
Dockerfile ADDED
@@ -0,0 +1,81 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ # Multi-stage build using openenv-base.
+ # This Dockerfile is flexible and works for both:
+ # - In-repo environments (with local OpenEnv sources)
+ # - Standalone environments (with openenv from PyPI/Git)
+ # The build script (openenv build) handles context detection and sets appropriate build args.
+
+ ARG BASE_IMAGE=ghcr.io/meta-pytorch/openenv-base:latest
+ FROM ${BASE_IMAGE} AS builder
+
+ WORKDIR /app
+
+ # Ensure git is available (required for installing dependencies from VCS)
+ RUN apt-get update && \
+     apt-get install -y --no-install-recommends git && \
+     rm -rf /var/lib/apt/lists/*
+
+ # Build arguments to control whether we're building standalone or in-repo
+ ARG BUILD_MODE=in-repo
+ ARG ENV_NAME=lingua_env
+
+ # Copy environment code (always at the root of the build context)
+ COPY . /app/env
+
+ # For in-repo builds, openenv is already vendored in the build context.
+ # For standalone builds, openenv will be installed via pyproject.toml.
+ WORKDIR /app/env
+
+ # Ensure uv is available (for local builds where the base image lacks it)
+ RUN if ! command -v uv >/dev/null 2>&1; then \
+         curl -LsSf https://astral.sh/uv/install.sh | sh && \
+         mv /root/.local/bin/uv /usr/local/bin/uv && \
+         mv /root/.local/bin/uvx /usr/local/bin/uvx; \
+     fi
+
+ # Install dependencies using uv sync.
+ # If uv.lock exists, use it; otherwise resolve on the fly.
+ RUN --mount=type=cache,target=/root/.cache/uv \
+     if [ -f uv.lock ]; then \
+         uv sync --frozen --no-install-project --no-editable; \
+     else \
+         uv sync --no-install-project --no-editable; \
+     fi
+
+ RUN --mount=type=cache,target=/root/.cache/uv \
+     if [ -f uv.lock ]; then \
+         uv sync --frozen --no-editable; \
+     else \
+         uv sync --no-editable; \
+     fi
+
+ # Final runtime stage
+ FROM ${BASE_IMAGE}
+
+ WORKDIR /app
+
+ # Copy the virtual environment from the builder
+ COPY --from=builder /app/env/.venv /app/.venv
+
+ # Copy the environment code
+ COPY --from=builder /app/env /app/env
+
+ # Set PATH to use the virtual environment
+ ENV PATH="/app/.venv/bin:$PATH"
+
+ # Set PYTHONPATH so imports work correctly
+ ENV PYTHONPATH="/app/env:$PYTHONPATH"
+
+ # Health check
+ HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
+     CMD curl -f http://localhost:8000/health || exit 1
+
+ # Run the FastAPI server.
+ # The module path is constructed to work with the /app/env structure.
+ ENV ENABLE_WEB_INTERFACE=true
+ CMD ["sh", "-c", "cd /app/env && uvicorn server.app:app --host 0.0.0.0 --port 8000"]
README.md CHANGED
@@ -1,10 +1,255 @@
  ---
- title: Lingua Env
- emoji: 🐢
- colorFrom: green
  colorTo: red
  sdk: docker
  pinned: false
  ---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
  ---
+ title: Lingua Env Environment Server
+ emoji: 📟
+ colorFrom: yellow
  colorTo: red
  sdk: docker
  pinned: false
+ app_port: 8000
+ base_path: /web
+ tags:
+   - openenv
  ---

+ # Lingua Env Environment
+
+ A simple test environment that echoes back messages. Useful for testing the env APIs and for demonstrating environment usage patterns.
+
+ ## Quick Start
+
+ The simplest way to use the Lingua Env environment is through the `LinguaEnv` class:
+
+ ```python
+ from lingua_env import LinguaAction, LinguaEnv
+
+ # Create environment from Docker image
+ env = LinguaEnv.from_docker_image("lingua_env-env:latest")
+
+ try:
+     # Reset
+     result = env.reset()
+     print(f"Reset: {result.observation.echoed_message}")
+
+     # Send multiple messages
+     messages = ["Hello, World!", "Testing echo", "Final message"]
+
+     for msg in messages:
+         result = env.step(LinguaAction(message=msg))
+         print(f"Sent: '{msg}'")
+         print(f"  → Echoed: '{result.observation.echoed_message}'")
+         print(f"  → Length: {result.observation.message_length}")
+         print(f"  → Reward: {result.reward}")
+ finally:
+     # Always clean up
+     env.close()
+ ```
+
+ That's it! The `LinguaEnv.from_docker_image()` method handles:
+ - Starting the Docker container
+ - Waiting for the server to be ready
+ - Connecting to the environment
+ - Container cleanup when you call `close()`
+
+ ## Building the Docker Image
+
+ Before using the environment, you need to build the Docker image:
+
+ ```bash
+ # From the project root
+ docker build -t lingua_env-env:latest -f server/Dockerfile .
+ ```
+
+ ## Deploying to Hugging Face Spaces
+
+ You can deploy your OpenEnv environment to Hugging Face Spaces with the `openenv push` command:
+
+ ```bash
+ # From the environment directory (where openenv.yaml is located)
+ openenv push
+
+ # Or specify options
+ openenv push --namespace my-org --private
+ ```
+
+ The `openenv push` command will:
+ 1. Validate that the directory is an OpenEnv environment (checks for `openenv.yaml`)
+ 2. Prepare a custom build for the Hugging Face Docker Space (enables the web interface)
+ 3. Upload to Hugging Face (ensuring you're logged in)
+
+ ### Prerequisites
+
+ - Authenticate with Hugging Face: the command will prompt for login if you are not already authenticated
+
+ ### Options
+
+ - `--directory`, `-d`: Directory containing the OpenEnv environment (defaults to the current directory)
+ - `--repo-id`, `-r`: Repository ID in the format `username/repo-name` (defaults to `username/env-name` from openenv.yaml)
+ - `--base-image`, `-b`: Base Docker image to use (overrides the Dockerfile `FROM`)
+ - `--private`: Deploy the space as private (default: public)
+
+ ### Examples
+
+ ```bash
+ # Push to your personal namespace (defaults to username/env-name from openenv.yaml)
+ openenv push
+
+ # Push to a specific repository
+ openenv push --repo-id my-org/my-env
+
+ # Push with a custom base image
+ openenv push --base-image ghcr.io/meta-pytorch/openenv-base:latest
+
+ # Push as a private space
+ openenv push --private
+
+ # Combine options
+ openenv push --repo-id my-org/my-env --base-image custom-base:latest --private
+ ```
+
+ After deployment, your space will be available at:
+ `https://huggingface.co/spaces/<repo-id>`
+
+ The deployed space includes:
+ - **Web Interface** at `/web` - Interactive UI for exploring the environment
+ - **API Documentation** at `/docs` - Full OpenAPI/Swagger interface
+ - **Health Check** at `/health` - Container health monitoring
+ - **WebSocket** at `/ws` - Persistent session endpoint for low-latency interactions
+
+ ## Environment Details
+
+ ### Action
+ **LinguaAction**: Contains a single field
+ - `message` (str) - The message to echo back
+
+ ### Observation
+ **LinguaObservation**: Contains the echo response and metadata
+ - `echoed_message` (str) - The message echoed back
+ - `message_length` (int) - Length of the message
+ - `reward` (float) - Reward based on message length (length × 0.1)
+ - `done` (bool) - Always False for the echo environment
+ - `metadata` (dict) - Additional info such as the step count
+
+ ### Reward
+ The reward is calculated as `message_length × 0.1`:
+ - "Hi" → reward: 0.2
+ - "Hello, World!" → reward: 1.3
+ - Empty message → reward: 0.0
+
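The reward rule above is simple enough to check by hand. A minimal sketch (the `compute_reward` helper is hypothetical, not part of `lingua_env`; `round()` is used here only to tidy floating-point display):

```python
# Sketch of the documented reward rule: reward = message_length * 0.1.
# compute_reward is a hypothetical helper, not part of the package.
def compute_reward(message: str) -> float:
    return round(len(message) * 0.1, 2)

print(compute_reward("Hi"))             # 0.2
print(compute_reward("Hello, World!"))  # 1.3  (13 characters)
print(compute_reward(""))               # 0.0
```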
+ ## Advanced Usage
+
+ ### Connecting to an Existing Server
+
+ If you already have a Lingua Env environment server running, you can connect to it directly:
+
+ ```python
+ from lingua_env import LinguaAction, LinguaEnv
+
+ # Connect to an existing server
+ env = LinguaEnv(base_url="<ENV_HTTP_URL_HERE>")
+
+ # Use as normal
+ result = env.reset()
+ result = env.step(LinguaAction(message="Hello!"))
+ ```
+
+ Note: when connecting to an existing server, `env.close()` will NOT stop the server.
+
+ ### Using the Context Manager
+
+ The client supports context-manager usage for automatic connection management:
+
+ ```python
+ from lingua_env import LinguaAction, LinguaEnv
+
+ # Connect with a context manager (auto-connects and closes)
+ with LinguaEnv(base_url="http://localhost:8000") as env:
+     result = env.reset()
+     print(f"Reset: {result.observation.echoed_message}")
+     # Multiple steps with low latency
+     for msg in ["Hello", "World", "!"]:
+         result = env.step(LinguaAction(message=msg))
+         print(f"Echoed: {result.observation.echoed_message}")
+ ```
+
+ The client uses WebSocket connections for:
+ - **Lower latency**: No HTTP connection overhead per request
+ - **Persistent session**: The server maintains your environment state
+ - **Efficient episodes**: Better for many sequential steps
+
+ ### Concurrent WebSocket Sessions
+
+ The server supports multiple concurrent WebSocket connections. To enable this,
+ modify `server/app.py` to use factory mode:
+
+ ```python
+ # In server/app.py - use factory mode for concurrent sessions
+ app = create_app(
+     LinguaEnvironment,  # Pass the class, not an instance
+     LinguaAction,
+     LinguaObservation,
+     max_concurrent_envs=4,  # Allow 4 concurrent sessions
+ )
+ ```
+
+ Then multiple clients can connect simultaneously:
+
+ ```python
+ from concurrent.futures import ThreadPoolExecutor
+
+ from lingua_env import LinguaAction, LinguaEnv
+
+ def run_episode(client_id: int):
+     with LinguaEnv(base_url="http://localhost:8000") as env:
+         result = env.reset()
+         for i in range(10):
+             result = env.step(LinguaAction(message=f"Client {client_id}, step {i}"))
+         return client_id, result.observation.message_length
+
+ # Run 4 episodes concurrently
+ with ThreadPoolExecutor(max_workers=4) as executor:
+     results = list(executor.map(run_episode, range(4)))
+ ```
+
+ ## Development & Testing
+
+ ### Direct Environment Testing
+
+ Test the environment logic directly, without starting the HTTP server:
+
+ ```bash
+ # From the environment root directory
+ python3 server/lingua_env_environment.py
+ ```
+
+ This verifies that:
+ - The environment resets correctly
+ - Step executes actions properly
+ - State tracking works
+ - Rewards are calculated correctly
+
+ ### Running Locally
+
+ Run the server locally for development:
+
+ ```bash
+ uvicorn server.app:app --reload
+ ```
+
+ ## Project Structure
+
+ ```
+ lingua_env/
+ ├── .dockerignore                  # Docker build exclusions
+ ├── __init__.py                    # Module exports
+ ├── README.md                      # This file
+ ├── openenv.yaml                   # OpenEnv manifest
+ ├── pyproject.toml                 # Project metadata and dependencies
+ ├── uv.lock                        # Locked dependencies (generated)
+ ├── client.py                      # LinguaEnv client
+ ├── models.py                      # Action and Observation models
+ └── server/
+     ├── __init__.py                # Server module exports
+     ├── lingua_env_environment.py  # Core environment logic
+     ├── app.py                     # FastAPI application (HTTP + WebSocket endpoints)
+     └── Dockerfile                 # Container image definition
+ ```
__init__.py ADDED
@@ -0,0 +1,16 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """Lingua Env Environment."""
+
+ from .client import LinguaEnv
+ from .models import LinguaAction, LinguaObservation
+
+ __all__ = [
+     "LinguaAction",
+     "LinguaObservation",
+     "LinguaEnv",
+ ]
client.py ADDED
@@ -0,0 +1,99 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """Lingua Env Environment Client."""
+
+ from typing import Dict
+
+ from openenv.core.client_types import StepResult
+ from openenv.core.env_server.types import State
+ from openenv.core import EnvClient
+
+ from .models import LinguaAction, LinguaObservation
+
+
+ class LinguaEnv(EnvClient[LinguaAction, LinguaObservation, dict]):
+     """
+     Client for the Lingua Env Environment.
+
+     This client maintains a persistent WebSocket connection to the environment server,
+     enabling efficient multi-step interactions with lower latency.
+     Each client instance has its own dedicated environment session on the server.
+
+     Example:
+         >>> # Connect to a running server
+         >>> with LinguaEnv(base_url="http://localhost:8000") as client:
+         ...     result = client.reset()
+         ...     print(result.observation.echoed_message)
+         ...
+         ...     result = client.step(LinguaAction(message="Hello!"))
+         ...     print(result.observation.echoed_message)
+
+     Example with Docker:
+         >>> # Automatically start container and connect
+         >>> client = LinguaEnv.from_docker_image("lingua_env-env:latest")
+         >>> try:
+         ...     result = client.reset()
+         ...     result = client.step(LinguaAction(message="Test"))
+         ... finally:
+         ...     client.close()
+     """
+
+     def _step_payload(self, action: LinguaAction) -> Dict:
+         """
+         Convert LinguaAction to a JSON payload for the step message.
+
+         Args:
+             action: LinguaAction instance
+
+         Returns:
+             Dictionary representation suitable for JSON encoding
+         """
+         return {
+             "message": action.message,
+         }
+
+     def _parse_result(self, payload: Dict) -> StepResult[LinguaObservation]:
+         """
+         Parse the server response into StepResult[LinguaObservation].
+
+         Args:
+             payload: JSON response data from the server
+
+         Returns:
+             StepResult with LinguaObservation
+         """
+         obs_data = payload.get("observation", {})
+         observation = LinguaObservation(
+             echoed_message=obs_data.get("echoed_message", ""),
+             message_length=obs_data.get("message_length", 0),
+             done=payload.get("done", False),
+             reward=payload.get("reward"),
+             metadata=obs_data.get("metadata", {}),
+         )
+
+         return StepResult(
+             observation=observation,
+             reward=payload.get("reward"),
+             done=payload.get("done", False),
+         )
+
+     def _parse_state(self, payload: Dict) -> State:
+         """
+         Parse the server response into a State object.
+
+         Args:
+             payload: JSON response from the state request
+
+         Returns:
+             State object with episode_id and step_count
+         """
+         return State(
+             episode_id=payload.get("episode_id"),
+             step_count=payload.get("step_count", 0),
+         )
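The payload shape `_parse_result` consumes can be illustrated without openenv installed. A plain-dict sketch (the `parse_result_fields` helper is hypothetical) mirroring the field extraction and defaults used above:

```python
# Plain-dict sketch of the server payload that _parse_result expects.
# Mirrors the same defaults (empty string / 0 / False) without openenv.
def parse_result_fields(payload: dict) -> dict:
    obs = payload.get("observation", {})
    return {
        "echoed_message": obs.get("echoed_message", ""),
        "message_length": obs.get("message_length", 0),
        "reward": payload.get("reward"),
        "done": payload.get("done", False),
    }

example = {
    "observation": {"echoed_message": "Hi", "message_length": 2},
    "reward": 0.2,
    "done": False,
}
print(parse_result_fields(example)["echoed_message"])  # Hi
```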
data/ambiguous_examples.jsonl ADDED
@@ -0,0 +1,10 @@
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 1, "word": "областного", "candidate_parses": [{"lemma": "областной", "tag": "ADJF neut,sing,gent", "score": 0.727272}, {"lemma": "областной", "tag": "ADJF masc,sing,gent", "score": 0.181818}, {"lemma": "областной", "tag": "ADJF anim,masc,sing,accs", "score": 0.090909}], "gold": {"lemma": "областной", "upos": "ADJ", "feats": {"Case": "Gen", "Degree": "Pos", "Gender": "Neut", "Number": "Sing"}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 2, "word": "управления", "candidate_parses": [{"lemma": "управление", "tag": "NOUN,inan,neut sing,gent", "score": 0.989071}, {"lemma": "управление", "tag": "NOUN,inan,neut plur,nomn", "score": 0.005464}, {"lemma": "управление", "tag": "NOUN,inan,neut plur,accs", "score": 0.005464}], "gold": {"lemma": "управление", "upos": "NOUN", "feats": {"Animacy": "Inan", "Case": "Gen", "Gender": "Neut", "Number": "Sing"}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 3, "word": "связи", "candidate_parses": [{"lemma": "связь", "tag": "NOUN,inan,femn sing,gent", "score": 0.759562}, {"lemma": "связь", "tag": "NOUN,inan,femn plur,accs", "score": 0.103825}, {"lemma": "связь", "tag": "NOUN,inan,femn plur,nomn", "score": 0.087431}, {"lemma": "связь", "tag": "NOUN,inan,femn sing,datv", "score": 0.032786}, {"lemma": "связь", "tag": "NOUN,inan,femn sing,loc2", "score": 0.010928}], "gold": {"lemma": "связь", "upos": "NOUN", "feats": {"Animacy": "Inan", "Case": "Gen", "Gender": "Fem", "Number": "Sing"}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 7, "word": "человек", "candidate_parses": [{"lemma": "человек", "tag": "NOUN,anim,masc sing,nomn", "score": 0.569458}, {"lemma": "человек", "tag": "NOUN,anim,masc plur,gent", "score": 0.430541}], "gold": {"lemma": "человек", "upos": "NOUN", "feats": {"Animacy": "Anim", "Case": "Nom", "Gender": "Masc", "Number": "Sing"}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 8, "word": "простой", "candidate_parses": [{"lemma": "простой", "tag": "ADJF,Qual femn,sing,gent", "score": 0.225806}, {"lemma": "простой", "tag": "ADJF,Qual femn,sing,datv", "score": 0.225806}, {"lemma": "простой", "tag": "ADJF,Qual femn,sing,ablt", "score": 0.193548}, {"lemma": "простой", "tag": "ADJF,Qual femn,sing,loct", "score": 0.129032}, {"lemma": "простой", "tag": "NOUN,inan,masc sing,nomn", "score": 0.064516}], "gold": {"lemma": "простой", "upos": "ADJ", "feats": {"Case": "Nom", "Degree": "Pos", "Gender": "Masc", "Number": "Sing"}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 11, "word": "на", "candidate_parses": [{"lemma": "на", "tag": "PREP", "score": 0.998961}, {"lemma": "на", "tag": "PRCL", "score": 0.000849}, {"lemma": "на", "tag": "INTJ", "score": 0.000188}], "gold": {"lemma": "на", "upos": "ADP", "feats": {}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 17, "word": "с", "candidate_parses": [{"lemma": "с", "tag": "PREP", "score": 0.997628}, {"lemma": "с", "tag": "NOUN,inan,femn,Fixd,Abbr plur,nomn", "score": 0.000406}, {"lemma": "с", "tag": "PRCL", "score": 0.000271}, {"lemma": "с", "tag": "NOUN,inan,femn,Fixd,Abbr sing,nomn", "score": 6.7e-05}, {"lemma": "с", "tag": "NOUN,inan,femn,Fixd,Abbr sing,gent", "score": 6.7e-05}], "gold": {"lemma": "с", "upos": "ADP", "feats": {}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 21, "word": "и", "candidate_parses": [{"lemma": "и", "tag": "CONJ", "score": 0.998263}, {"lemma": "и", "tag": "PRCL", "score": 0.000306}, {"lemma": "и", "tag": "INTJ", "score": 0.000204}, {"lemma": "и", "tag": "NOUN,anim,masc,Fixd,Abbr sing,nomn", "score": 0.000102}, {"lemma": "и", "tag": "NOUN,anim,masc,Fixd,Abbr sing,gent", "score": 0.000102}], "gold": {"lemma": "и", "upos": "CCONJ", "feats": {}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 24, "word": "писал", "candidate_parses": [{"lemma": "писать", "tag": "VERB,impf,tran masc,sing,past,indc", "score": 0.99115}, {"lemma": "писать", "tag": "VERB,impf,intr masc,sing,past,indc", "score": 0.008849}], "gold": {"lemma": "писать", "upos": "VERB", "feats": {"Aspect": "Imp", "Gender": "Masc", "Mood": "Ind", "Number": "Sing", "Tense": "Past", "VerbForm": "Fin", "Voice": "Act"}}}
+ {"sentence": ["Начальник", "областного", "управления", "связи", "Семен", "Еремеевич", "был", "человек", "простой", ",", "приходил", "на", "работу", "всегда", "вовремя", ",", "здоровался", "с", "секретаршей", "за", "руку", "и", "иногда", "даже", "писал", "в", "стенгазету", "заметки", "под", "псевдонимом", "\"", "Муха", "\"", "."], "target_index": 25, "word": "в", "candidate_parses": [{"lemma": "в", "tag": "PREP", "score": 0.999327}, {"lemma": "в", "tag": "NOUN,inan,masc,Fixd,Abbr sing,gent", "score": 0.000249}, {"lemma": "в", "tag": "NOUN,inan,masc,Fixd,Abbr sing,loct", "score": 5.7e-05}, {"lemma": "в", "tag": "NOUN,inan,masc,Fixd,Abbr sing,nomn", "score": 1.9e-05}, {"lemma": "в", "tag": "NOUN,inan,masc,Fixd,Abbr sing,datv", "score": 1.9e-05}], "gold": {"lemma": "в", "upos": "ADP", "feats": {}}}
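Each line above is a standalone JSON object: the tokenized sentence, a target index and word, candidate pymorphy2 parses with scores, and the gold UD analysis. A minimal loading sketch, using an abridged copy of one record (the `sentence` field is omitted here for brevity):

```python
import json

# Abridged copy of one record from data/ambiguous_examples.jsonl
# (the "sentence" field is omitted for brevity).
line = ('{"target_index": 11, "word": "на", '
        '"candidate_parses": [{"lemma": "на", "tag": "PREP", "score": 0.998961}, '
        '{"lemma": "на", "tag": "PRCL", "score": 0.000849}], '
        '"gold": {"lemma": "на", "upos": "ADP", "feats": {}}}')
record = json.loads(line)

# Pick the highest-scoring candidate parse.
best = max(record["candidate_parses"], key=lambda p: p["score"])
print(best["tag"])  # PREP
```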
models.py ADDED
@@ -0,0 +1,32 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """
+ Data models for the Lingua Env Environment.
+
+ The lingua_env environment presents morphological disambiguation examples:
+ the agent chooses one of several candidate parses for a target word.
+ """
+
+ from typing import Any
+
+ from pydantic import Field
+ from openenv.core.env_server import Action, Observation, State
+
+
+ class LinguaAction(Action):
+     choice: int
+
+
+ class LinguaObservation(Observation):
+     sentence: list[str]
+     target_index: int
+     word: str
+     candidate_parses: list[dict[str, Any]]
+
+
+ class LinguaState(State):
+     gold: dict[str, Any] = Field(default_factory=dict)
+
+
openenv.yaml ADDED
@@ -0,0 +1,7 @@
+ spec_version: 1
+ name: lingua_env
+ type: space
+ runtime: fastapi
+ app: server.app:app
+ port: 8000
+
openenv_lingua_env.egg-info/PKG-INFO ADDED
@@ -0,0 +1,9 @@
+ Metadata-Version: 2.4
+ Name: openenv-lingua_env
+ Version: 0.1.0
+ Summary: Lingua Env environment for OpenEnv
+ Requires-Python: >=3.10
+ Requires-Dist: openenv-core[core]>=0.2.0
+ Provides-Extra: dev
+ Requires-Dist: pytest>=8.0.0; extra == "dev"
+ Requires-Dist: pytest-cov>=4.0.0; extra == "dev"
openenv_lingua_env.egg-info/SOURCES.txt ADDED
@@ -0,0 +1,14 @@
+ README.md
+ pyproject.toml
+ ./__init__.py
+ ./client.py
+ ./models.py
+ openenv_lingua_env.egg-info/PKG-INFO
+ openenv_lingua_env.egg-info/SOURCES.txt
+ openenv_lingua_env.egg-info/dependency_links.txt
+ openenv_lingua_env.egg-info/entry_points.txt
+ openenv_lingua_env.egg-info/requires.txt
+ openenv_lingua_env.egg-info/top_level.txt
+ server/__init__.py
+ server/app.py
+ server/lingua_env_environment.py
openenv_lingua_env.egg-info/dependency_links.txt ADDED
@@ -0,0 +1 @@
+
openenv_lingua_env.egg-info/entry_points.txt ADDED
@@ -0,0 +1,2 @@
+ [console_scripts]
+ server = lingua_env.server.app:main
openenv_lingua_env.egg-info/requires.txt ADDED
@@ -0,0 +1,5 @@
+ openenv-core[core]>=0.2.0
+
+ [dev]
+ pytest>=8.0.0
+ pytest-cov>=4.0.0
openenv_lingua_env.egg-info/top_level.txt ADDED
@@ -0,0 +1 @@
+ lingua_env
pyproject.toml ADDED
@@ -0,0 +1,45 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ [build-system]
+ requires = ["setuptools>=45", "wheel"]
+ build-backend = "setuptools.build_meta"
+
+ [project]
+ name = "openenv-lingua_env"
+ version = "0.1.0"
+ description = "Lingua Env environment for OpenEnv"
+ requires-python = ">=3.10"
+ dependencies = [
+     # Core OpenEnv runtime (provides FastAPI server + HTTP client types)
+     # Install from GitHub:
+     # "openenv-core[core] @ git+https://github.com/meta-pytorch/OpenEnv.git",
+     "openenv-core[core]>=0.2.0",
+     # Environment-specific dependencies
+     # Add all dependencies needed for your environment here
+     # Examples:
+     # "numpy>=1.19.0",
+     # "torch>=2.0.0",
+     # "gymnasium>=0.29.0",
+     # "openspiel>=1.0.0",
+     # "smolagents>=1.22.0,<2",
+ ]
+
+ [project.optional-dependencies]
+ dev = [
+     "pytest>=8.0.0",
+     "pytest-cov>=4.0.0",
+ ]
+
+ [project.scripts]
+ # Server entry point - enables running via: uv run --project . server
+ # or: python -m lingua_env.server.app
+ server = "lingua_env.server.app:main"
+
+ [tool.setuptools]
+ include-package-data = true
+ packages = ["lingua_env", "lingua_env.server"]
+ package-dir = { "lingua_env" = ".", "lingua_env.server" = "server" }
server/.ipynb_checkpoints/app-checkpoint.py ADDED
@@ -0,0 +1,60 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """
+ FastAPI application for the Lingua Env Environment.
+
+ This module creates an HTTP server that exposes the LinguaEnvironment
+ over HTTP and WebSocket endpoints, compatible with EnvClient.
+
+ Endpoints:
+     - POST /reset: Reset the environment
+     - POST /step: Execute an action
+     - GET /state: Get current environment state
+     - GET /schema: Get action/observation schemas
+     - WS /ws: WebSocket endpoint for persistent sessions
+
+ Usage:
+     # Development (with auto-reload):
+     uvicorn server.app:app --reload --host 0.0.0.0 --port 8000
+
+     # Production:
+     uvicorn server.app:app --host 0.0.0.0 --port 8000 --workers 4
+
+     # Or run directly:
+     python -m server.app
+ """
+
+ try:
+     from openenv.core.env_server.http_server import create_app
+ except Exception as e:  # pragma: no cover
+     raise ImportError(
+         "openenv is required for the web interface. Install dependencies with:\n  uv sync\n"
+     ) from e
+
+ from ..models import LinguaAction, LinguaObservation
+ from .lingua_env_environment import LinguaEnvironment
+
+ app = create_app(
+     LinguaEnvironment,
+     LinguaAction,
+     LinguaObservation,
+     env_name="lingua",
+     max_concurrent_envs=1,
+ )
+
+
+ def main() -> None:
+     import os
+
+     import uvicorn
+
+     host = os.getenv("HOST", "0.0.0.0")
+     port = int(os.getenv("PORT", "8000"))
+     uvicorn.run(app, host=host, port=port)
+
+
+ if __name__ == "__main__":
+     main()
server/.ipynb_checkpoints/lingua_env_environment-checkpoint.py ADDED
@@ -0,0 +1,177 @@
1
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
2
+ # All rights reserved.
3
+ #
4
+ # This source code is licensed under the BSD-style license found in the
5
+ # LICENSE file in the root directory of this source tree.
6
+
7
+ """
8
+ Lingua Env Environment Implementation.
9
+
10
+ A simple test environment that echoes back messages sent to it.
11
+ Perfect for testing HTTP server infrastructure.
12
+ """
13
+
14
+ import json
15
+ import random
16
+ import uuid
17
+ from pathlib import Path
18
+
19
+ from openenv.core.env_server import Environment
20
+
21
+ from ..models import LinguaAction, LinguaObservation, LinguaState
22
+
23
+
24
+ PYMORPHY_FEATURE_MAP = {
25
+ "nomn": ("Case", "Nom"),
26
+ "gent": ("Case", "Gen"),
27
+ "datv": ("Case", "Dat"),
28
+ "accs": ("Case", "Acc"),
29
+ "ablt": ("Case", "Ins"),
30
+ "loct": ("Case", "Loc"),
31
+ "loc2": ("Case", "Loc"),
32
+ "sing": ("Number", "Sing"),
33
+ "plur": ("Number", "Plur"),
34
+ "masc": ("Gender", "Masc"),
35
+ "femn": ("Gender", "Fem"),
36
+ "neut": ("Gender", "Neut"),
37
+ "anim": ("Animacy", "Anim"),
38
+ "inan": ("Animacy", "Inan"),
39
+ "past": ("Tense", "Past"),
40
+ "pres": ("Tense", "Pres"),
41
+ "futr": ("Tense", "Fut"),
42
+ "1per": ("Person", "1"),
43
+ "2per": ("Person", "2"),
44
+ "3per": ("Person", "3"),
45
+ "COMP": ("Degree", "Cmp"),
46
+ "Supr": ("Degree", "Sup"),
47
+ }
48
+
49
+ POS_MAP = {
50
+ "NOUN": ["NOUN"],
51
+ "ADJ": ["ADJF", "ADJS"],
52
+ "VERB": ["VERB", "INFN", "GRND"],
53
+ "AUX": ["VERB", "INFN"],
54
+ "ADV": ["ADVB", "COMP", "PRED"],
55
+ "PRON": ["NPRO"],
56
+ "DET": ["ADJF", "NPRO"],
57
+ "NUM": ["NUMR"],
58
+ "ADP": ["PREP"],
59
+ "CCONJ": ["CONJ"],
60
+ "SCONJ": ["CONJ"],
61
+ "PART": ["PRCL"],
62
+ "INTJ": ["INTJ"],
63
+ }
64
+
65
+
66
+ def pymorphy_tag_to_ud_feats(tag_str: str) -> dict[str, str]:
67
+ feats = {}
68
+ for tok in tag_str.replace(",", " ").split():
69
+ if tok in PYMORPHY_FEATURE_MAP:
70
+ k, v = PYMORPHY_FEATURE_MAP[tok]
71
+ feats[k] = v
72
+ return feats
73
+
74
+
75
+ def parse_pymorphy_pos(tag_str: str) -> str:
76
+ return tag_str.split()[0].split(",")[0]
77
+
78
+
79
+ def score_candidate(candidate: dict, gold: dict) -> dict:
80
+ cand_lemma = candidate["lemma"]
81
+ cand_tag = candidate["tag"]
82
+
83
+ lemma_match = cand_lemma == gold["lemma"]
84
+
85
+ pymorphy_pos = parse_pymorphy_pos(cand_tag)
86
+ allowed_pos = POS_MAP.get(gold["upos"], [gold["upos"]])
87
+ pos_match = pymorphy_pos in allowed_pos
88
+
89
+ cand_feats = pymorphy_tag_to_ud_feats(cand_tag)
90
+ gold_feats = gold.get("feats", {})
91
+
92
+ reward = 0
93
+ if lemma_match:
94
+ reward += 1
95
+ if pos_match:
96
+ reward += 1
97
+
98
+ feature_matches = {}
99
+ for feat_name, gold_value in gold_feats.items():
100
+ cand_value = cand_feats.get(feat_name)
101
+ match = cand_value == gold_value
102
+ feature_matches[feat_name] = {
103
+ "candidate": cand_value,
104
+ "gold": gold_value,
105
+ "match": match,
106
+ }
107
+ if match:
108
+ reward += 1
109
+
110
+ return {
111
+ "reward": reward,
112
+ "lemma_match": lemma_match,
113
+ "pos_match": pos_match,
114
+ "candidate_feats": cand_feats,
115
+ "gold_feats": gold_feats,
116
+ "feature_matches": feature_matches,
117
+ }
118
+
119
+
120
+ class LinguaEnvironment(Environment):
121
+ def __init__(self):
122
+ super().__init__()
123
+ data_path = Path(__file__).resolve().parents[1] / "data" / "ambiguous_examples.jsonl"
124
+ with open(data_path, "r", encoding="utf-8") as f:
125
+ self.examples = [json.loads(line) for line in f]
126
+
127
+ self.current = None
128
+ self._state = LinguaState(gold={})
129
+
130
+ def reset(self) -> LinguaObservation:
131
+ self.current = random.choice(self.examples)
132
+ self._state = LinguaState(
133
+ episode_id=str(uuid.uuid4()),
134
+ step_count=0,
135
+ gold=self.current["gold"],
136
+ )
137
+
138
+ return LinguaObservation(
139
+ sentence=list(self.current["sentence"]),
140
+ target_index=int(self.current["target_index"]),
141
+ word=str(self.current["word"]),
142
+ candidate_parses=list(self.current["candidate_parses"]),
143
+ )
144
+
145
+ def step(self, action: LinguaAction) -> LinguaObservation:
146
+ if self.current is None:
147
+ raise RuntimeError("Environment must be reset before step().")
148
+
149
+ choice = int(action.choice)
150
+ if choice < 0 or choice >= len(self.current["candidate_parses"]):
151
+ raise ValueError(f"Invalid choice index: {choice}")
152
+
153
+ self._state.step_count += 1
154
+
155
+ chosen = self.current["candidate_parses"][choice]
156
+ details = score_candidate(chosen, self.current["gold"])
157
+
158
+ self._state.gold = self.current["gold"]
159
+
160
+ obs = LinguaObservation(
161
+ sentence=list(self.current["sentence"]),
162
+ target_index=int(self.current["target_index"]),
163
+ word=str(self.current["word"]),
164
+ candidate_parses=list(self.current["candidate_parses"]),
165
+ )
166
+ obs.reward = float(details["reward"])
167
+ obs.done = True
168
+ obs.metadata = {
169
+ "chosen": chosen,
170
+ "gold": self.current["gold"],
171
+ "reward_breakdown": details,
172
+ }
173
+ return obs
174
+
175
+ @property
176
+ def state(self) -> LinguaState:
177
+ return self._state
server/__init__.py ADDED
@@ -0,0 +1,11 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """Lingua Env environment server components."""
+
+ from .lingua_env_environment import LinguaEnvironment
+
+ __all__ = ["LinguaEnvironment"]
server/app.py ADDED
@@ -0,0 +1,60 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """
+ FastAPI application for the Lingua Env Environment.
+
+ This module creates an HTTP server that exposes the LinguaEnvironment
+ over HTTP and WebSocket endpoints, compatible with EnvClient.
+
+ Endpoints:
+     - POST /reset: Reset the environment
+     - POST /step: Execute an action
+     - GET /state: Get current environment state
+     - GET /schema: Get action/observation schemas
+     - WS /ws: WebSocket endpoint for persistent sessions
+
+ Usage:
+     # Development (with auto-reload):
+     uvicorn server.app:app --reload --host 0.0.0.0 --port 8000
+
+     # Production:
+     uvicorn server.app:app --host 0.0.0.0 --port 8000 --workers 4
+
+     # Or run directly:
+     python -m server.app
+ """
+
+ try:
+     from openenv.core.env_server.http_server import create_app
+ except Exception as e:  # pragma: no cover
+     raise ImportError(
+         "openenv is required for the web interface. Install dependencies with:\n  uv sync\n"
+     ) from e
+
+ from ..models import LinguaAction, LinguaObservation
+ from .lingua_env_environment import LinguaEnvironment
+
+ app = create_app(
+     LinguaEnvironment,
+     LinguaAction,
+     LinguaObservation,
+     env_name="lingua",
+     max_concurrent_envs=1,
+ )
+
+
+ def main() -> None:
+     import os
+
+     import uvicorn
+
+     host = os.getenv("HOST", "0.0.0.0")
+     port = int(os.getenv("PORT", "8000"))
+     uvicorn.run(app, host=host, port=port)
+
+
+ if __name__ == "__main__":
+     main()
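The `main()` entrypoint above resolves its bind address from `HOST`/`PORT` environment variables with fallback defaults. A minimal standalone sketch of that resolution logic (the `resolve_bind` helper is illustrative, not part of the module):

```python
import os


def resolve_bind(environ=None):
    """Mirror main()'s HOST/PORT lookup: env vars win, else the defaults."""
    env = os.environ if environ is None else environ
    host = env.get("HOST", "0.0.0.0")
    port = int(env.get("PORT", "8000"))  # non-numeric values raise ValueError
    return host, port


# Defaults apply when neither variable is set:
print(resolve_bind({}))                # ('0.0.0.0', 8000)
# An override, e.g. injected by a container runtime:
print(resolve_bind({"PORT": "9090"}))  # ('0.0.0.0', 9090)
```

Reading the port as a string and converting with `int()` keeps the default in one place and fails fast on a malformed value.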
server/lingua_env_environment.py ADDED
@@ -0,0 +1,177 @@
+ # Copyright (c) Meta Platforms, Inc. and affiliates.
+ # All rights reserved.
+ #
+ # This source code is licensed under the BSD-style license found in the
+ # LICENSE file in the root directory of this source tree.
+
+ """
+ Lingua Env Environment Implementation.
+
+ A single-step morphological disambiguation environment: each episode
+ presents a sentence with an ambiguous target word and its candidate
+ pymorphy parses, and rewards the agent for matching the gold lemma,
+ part of speech, and grammatical features.
+ """
+
+ import json
+ import random
+ import uuid
+ from pathlib import Path
+
+ from openenv.core.env_server import Environment
+
+ from ..models import LinguaAction, LinguaObservation, LinguaState
+
+
+ PYMORPHY_FEATURE_MAP = {
+     "nomn": ("Case", "Nom"),
+     "gent": ("Case", "Gen"),
+     "datv": ("Case", "Dat"),
+     "accs": ("Case", "Acc"),
+     "ablt": ("Case", "Ins"),
+     "loct": ("Case", "Loc"),
+     "loc2": ("Case", "Loc"),
+     "sing": ("Number", "Sing"),
+     "plur": ("Number", "Plur"),
+     "masc": ("Gender", "Masc"),
+     "femn": ("Gender", "Fem"),
+     "neut": ("Gender", "Neut"),
+     "anim": ("Animacy", "Anim"),
+     "inan": ("Animacy", "Inan"),
+     "past": ("Tense", "Past"),
+     "pres": ("Tense", "Pres"),
+     "futr": ("Tense", "Fut"),
+     "1per": ("Person", "1"),
+     "2per": ("Person", "2"),
+     "3per": ("Person", "3"),
+     "COMP": ("Degree", "Cmp"),
+     "Supr": ("Degree", "Sup"),
+ }
+
+ POS_MAP = {
+     "NOUN": ["NOUN"],
+     "ADJ": ["ADJF", "ADJS"],
+     "VERB": ["VERB", "INFN", "GRND"],
+     "AUX": ["VERB", "INFN"],
+     "ADV": ["ADVB", "COMP", "PRED"],
+     "PRON": ["NPRO"],
+     "DET": ["ADJF", "NPRO"],
+     "NUM": ["NUMR"],
+     "ADP": ["PREP"],
+     "CCONJ": ["CONJ"],
+     "SCONJ": ["CONJ"],
+     "PART": ["PRCL"],
+     "INTJ": ["INTJ"],
+ }
+
+
+ def pymorphy_tag_to_ud_feats(tag_str: str) -> dict[str, str]:
+     feats = {}
+     for tok in tag_str.replace(",", " ").split():
+         if tok in PYMORPHY_FEATURE_MAP:
+             k, v = PYMORPHY_FEATURE_MAP[tok]
+             feats[k] = v
+     return feats
+
+
+ def parse_pymorphy_pos(tag_str: str) -> str:
+     return tag_str.split()[0].split(",")[0]
+
+
+ def score_candidate(candidate: dict, gold: dict) -> dict:
+     cand_lemma = candidate["lemma"]
+     cand_tag = candidate["tag"]
+
+     lemma_match = cand_lemma == gold["lemma"]
+
+     pymorphy_pos = parse_pymorphy_pos(cand_tag)
+     allowed_pos = POS_MAP.get(gold["upos"], [gold["upos"]])
+     pos_match = pymorphy_pos in allowed_pos
+
+     cand_feats = pymorphy_tag_to_ud_feats(cand_tag)
+     gold_feats = gold.get("feats", {})
+
+     reward = 0
+     if lemma_match:
+         reward += 1
+     if pos_match:
+         reward += 1
+
+     feature_matches = {}
+     for feat_name, gold_value in gold_feats.items():
+         cand_value = cand_feats.get(feat_name)
+         match = cand_value == gold_value
+         feature_matches[feat_name] = {
+             "candidate": cand_value,
+             "gold": gold_value,
+             "match": match,
+         }
+         if match:
+             reward += 1
+
+     return {
+         "reward": reward,
+         "lemma_match": lemma_match,
+         "pos_match": pos_match,
+         "candidate_feats": cand_feats,
+         "gold_feats": gold_feats,
+         "feature_matches": feature_matches,
+     }
+
+
+ class LinguaEnvironment(Environment):
+     def __init__(self):
+         super().__init__()
+         data_path = Path(__file__).resolve().parents[1] / "data" / "ambiguous_examples.jsonl"
+         with open(data_path, "r", encoding="utf-8") as f:
+             self.examples = [json.loads(line) for line in f]
+
+         self.current = None
+         self._state = LinguaState(gold={})
+
+     def reset(self) -> LinguaObservation:
+         self.current = random.choice(self.examples)
+         self._state = LinguaState(
+             episode_id=str(uuid.uuid4()),
+             step_count=0,
+             gold=self.current["gold"],
+         )
+
+         return LinguaObservation(
+             sentence=list(self.current["sentence"]),
+             target_index=int(self.current["target_index"]),
+             word=str(self.current["word"]),
+             candidate_parses=list(self.current["candidate_parses"]),
+         )
+
+     def step(self, action: LinguaAction) -> LinguaObservation:
+         if self.current is None:
+             raise RuntimeError("Environment must be reset before step().")
+
+         choice = int(action.choice)
+         if choice < 0 or choice >= len(self.current["candidate_parses"]):
+             raise ValueError(f"Invalid choice index: {choice}")
+
+         self._state.step_count += 1
+
+         chosen = self.current["candidate_parses"][choice]
+         details = score_candidate(chosen, self.current["gold"])
+
+         self._state.gold = self.current["gold"]
+
+         obs = LinguaObservation(
+             sentence=list(self.current["sentence"]),
+             target_index=int(self.current["target_index"]),
+             word=str(self.current["word"]),
+             candidate_parses=list(self.current["candidate_parses"]),
+         )
+         obs.reward = float(details["reward"])
+         obs.done = True
+         obs.metadata = {
+             "chosen": chosen,
+             "gold": self.current["gold"],
+             "reward_breakdown": details,
+         }
+         return obs
+
+     @property
+     def state(self) -> LinguaState:
+         return self._state
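The reward scheme in `score_candidate` is additive: one point for matching the gold lemma, one for a compatible part of speech, and one per matching Universal Dependencies feature. A condensed standalone sketch of that logic (trimmed maps; the sample parse and gold entry are hypothetical):

```python
# Condensed version of the scoring above: +1 lemma, +1 POS, +1 per UD feature.
FEATURE_MAP = {
    "gent": ("Case", "Gen"),
    "sing": ("Number", "Sing"),
    "femn": ("Gender", "Fem"),
}
POS_MAP = {"NOUN": ["NOUN"]}


def tag_to_feats(tag: str) -> dict:
    """Translate pymorphy grammemes into UD feature/value pairs."""
    feats = {}
    for tok in tag.replace(",", " ").split():
        if tok in FEATURE_MAP:
            key, value = FEATURE_MAP[tok]
            feats[key] = value
    return feats


def score(candidate: dict, gold: dict) -> int:
    reward = 0
    if candidate["lemma"] == gold["lemma"]:
        reward += 1
    pos = candidate["tag"].split()[0].split(",")[0]  # first grammeme is the POS
    if pos in POS_MAP.get(gold["upos"], [gold["upos"]]):
        reward += 1
    feats = tag_to_feats(candidate["tag"])
    for name, value in gold.get("feats", {}).items():
        if feats.get(name) == value:
            reward += 1
    return reward


gold = {"lemma": "сталь", "upos": "NOUN",
        "feats": {"Case": "Gen", "Number": "Sing"}}
candidate = {"lemma": "сталь", "tag": "NOUN,inan,femn sing,gent"}
print(score(candidate, gold))  # 4: lemma + POS + Case + Number
```

Because features are counted individually, the maximum reward varies per example with the size of the gold feature set; a parse with the right lemma but wrong inflection still earns partial credit.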
server/requirements.txt ADDED
@@ -0,0 +1,6 @@
+ openenv[core]>=0.2.0
+ fastapi>=0.115.0
+ uvicorn>=0.24.0
+
+
+
uv.lock ADDED
The diff for this file is too large to render. See raw diff