koichi12 committed
Commit 48635e5 · verified · 1 Parent(s): 3eb4a70

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_array_from_pyobj.cpython-311.pyc +0 -0
  2. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_callback.cpython-311.pyc +0 -0
  3. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_crackfortran.cpython-311.pyc +0 -0
  4. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_f2py2e.cpython-311.pyc +0 -0
  5. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_module_doc.cpython-311.pyc +0 -0
  6. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_pyf_src.cpython-311.pyc +0 -0
  7. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_quoted_character.cpython-311.pyc +0 -0
  8. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_return_character.cpython-311.pyc +0 -0
  9. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_return_complex.cpython-311.pyc +0 -0
  10. .venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_return_logical.cpython-311.pyc +0 -0
  11. .venv/lib/python3.11/site-packages/ray/dashboard/modules/__pycache__/__init__.cpython-311.pyc +0 -0
  12. .venv/lib/python3.11/site-packages/ray/dashboard/modules/__pycache__/dashboard_sdk.cpython-311.pyc +0 -0
  13. .venv/lib/python3.11/site-packages/ray/dashboard/modules/__pycache__/version.cpython-311.pyc +0 -0
  14. .venv/lib/python3.11/site-packages/ray/dashboard/modules/event/event_utils.py +208 -0
  15. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__init__.py +0 -0
  16. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/__init__.cpython-311.pyc +0 -0
  17. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/healthz_agent.cpython-311.pyc +0 -0
  18. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/healthz_head.cpython-311.pyc +0 -0
  19. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/utils.cpython-311.pyc +0 -0
  20. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/healthz_agent.py +54 -0
  21. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/healthz_head.py +41 -0
  22. .venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/utils.py +24 -0
  23. .venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/__init__.cpython-311.pyc +0 -0
  24. .venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/common.cpython-311.pyc +0 -0
  25. .venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/job_log_storage_client.cpython-311.pyc +0 -0
  26. .venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/sdk.cpython-311.pyc +0 -0
  27. .venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/utils.cpython-311.pyc +0 -0
  28. .venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/__init__.py +0 -0
  29. .venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/__pycache__/__init__.cpython-311.pyc +0 -0
  30. .venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/__pycache__/snapshot_head.cpython-311.pyc +0 -0
  31. .venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/snapshot_head.py +235 -0
  32. .venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/__init__.py +0 -0
  33. .venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/__pycache__/transform_polars.cpython-311.pyc +0 -0
  34. .venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/__pycache__/transform_pyarrow.cpython-311.pyc +0 -0
  35. .venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/transform_polars.py +40 -0
  36. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__init__.py +0 -0
  37. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/__init__.cpython-311.pyc +0 -0
  38. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/audio_datasource.cpython-311.pyc +0 -0
  39. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/avro_datasource.cpython-311.pyc +0 -0
  40. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/bigquery_datasink.cpython-311.pyc +0 -0
  41. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/bigquery_datasource.cpython-311.pyc +0 -0
  42. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/binary_datasource.cpython-311.pyc +0 -0
  43. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/clickhouse_datasource.cpython-311.pyc +0 -0
  44. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/csv_datasink.cpython-311.pyc +0 -0
  45. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/csv_datasource.cpython-311.pyc +0 -0
  46. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/databricks_uc_datasource.cpython-311.pyc +0 -0
  47. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/delta_sharing_datasource.cpython-311.pyc +0 -0
  48. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/hudi_datasource.cpython-311.pyc +0 -0
  49. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/huggingface_datasource.cpython-311.pyc +0 -0
  50. .venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/iceberg_datasource.cpython-311.pyc +0 -0
.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_array_from_pyobj.cpython-311.pyc ADDED
Binary file (43.5 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_callback.cpython-311.pyc ADDED
Binary file (14.6 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_crackfortran.cpython-311.pyc ADDED
Binary file (24 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_f2py2e.cpython-311.pyc ADDED
Binary file (47.9 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_module_doc.cpython-311.pyc ADDED
Binary file (1.81 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_pyf_src.cpython-311.pyc ADDED
Binary file (1.68 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_quoted_character.cpython-311.pyc ADDED
Binary file (1.33 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_return_character.cpython-311.pyc ADDED
Binary file (3.43 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_return_complex.cpython-311.pyc ADDED
Binary file (5.76 kB).

.venv/lib/python3.11/site-packages/numpy/f2py/tests/__pycache__/test_return_logical.cpython-311.pyc ADDED
Binary file (5.38 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/__pycache__/__init__.cpython-311.pyc ADDED
Binary file (194 Bytes).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/__pycache__/dashboard_sdk.cpython-311.pyc ADDED
Binary file (18.5 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/__pycache__/version.cpython-311.pyc ADDED
Binary file (700 Bytes).
.venv/lib/python3.11/site-packages/ray/dashboard/modules/event/event_utils.py ADDED
@@ -0,0 +1,208 @@
+import asyncio
+import collections
+import fnmatch
+import itertools
+import json
+import logging.handlers
+import mmap
+import os
+import time
+from concurrent.futures import ThreadPoolExecutor
+
+from ray._private.utils import get_or_create_event_loop, run_background_task
+from ray.dashboard.modules.event import event_consts
+from ray.dashboard.utils import async_loop_forever
+
+logger = logging.getLogger(__name__)
+
+
+def _get_source_files(event_dir, source_types=None, event_file_filter=None):
+    event_log_names = os.listdir(event_dir)
+    source_files = {}
+    all_source_types = set(event_consts.EVENT_SOURCE_ALL)
+    for source_type in source_types or event_consts.EVENT_SOURCE_ALL:
+        assert source_type in all_source_types, f"Invalid source type: {source_type}"
+        files = []
+        for n in event_log_names:
+            if fnmatch.fnmatch(n, f"*{source_type}*"):
+                f = os.path.join(event_dir, n)
+                if event_file_filter is not None and not event_file_filter(f):
+                    continue
+                files.append(f)
+        if files:
+            source_files[source_type] = files
+    return source_files
+
+
+def _restore_newline(event_dict):
+    try:
+        event_dict["message"] = (
+            event_dict["message"].replace("\\n", "\n").replace("\\r", "\n")
+        )
+    except Exception:
+        logger.exception("Restore newline for event failed: %s", event_dict)
+    return event_dict
+
+
+def _parse_line(event_str):
+    return _restore_newline(json.loads(event_str))
+
+
+def parse_event_strings(event_string_list):
+    events = []
+    for data in event_string_list:
+        if not data:
+            continue
+        try:
+            event = _parse_line(data)
+            events.append(event)
+        except Exception:
+            logger.exception("Parse event line failed: %s", repr(data))
+    return events
+
+
+ReadFileResult = collections.namedtuple(
+    "ReadFileResult", ["fid", "size", "mtime", "position", "lines"]
+)
+
+
+def _read_file(
+    file, pos, n_lines=event_consts.EVENT_READ_LINE_COUNT_LIMIT, closefd=True
+):
+    with open(file, "rb", closefd=closefd) as f:
+        # The ino may be 0 on Windows.
+        stat = os.stat(f.fileno())
+        fid = stat.st_ino or file
+        lines = []
+        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
+            start = pos
+            for _ in range(n_lines):
+                sep = mm.find(b"\n", start)
+                if sep == -1:
+                    break
+                if sep - start <= event_consts.EVENT_READ_LINE_LENGTH_LIMIT:
+                    lines.append(mm[start:sep].decode("utf-8"))
+                else:
+                    truncated_size = min(100, event_consts.EVENT_READ_LINE_LENGTH_LIMIT)
+                    logger.warning(
+                        "Ignored long string: %s...(%s chars)",
+                        mm[start : start + truncated_size].decode("utf-8"),
+                        sep - start,
+                    )
+                start = sep + 1
+        return ReadFileResult(fid, stat.st_size, stat.st_mtime, start, lines)
+
+
+def monitor_events(
+    event_dir,
+    callback,
+    monitor_thread_pool_executor: ThreadPoolExecutor,
+    scan_interval_seconds=event_consts.SCAN_EVENT_DIR_INTERVAL_SECONDS,
+    start_mtime=time.time() + event_consts.SCAN_EVENT_START_OFFSET_SECONDS,
+    monitor_files=None,
+    source_types=None,
+):
+    """Monitor events in directory. New events will be read and passed to the
+    callback.
+
+    Args:
+        event_dir: The event log directory.
+        callback (def callback(List[str]): pass): A callback that accepts a
+            list of event strings.
+        monitor_thread_pool_executor: A thread pool executor used to
+            monitor/update events. None means it will use the default
+            executor, which uses num_cpus of the machine * 5 threads (before
+            Python 3.8) or min(32, num_cpus + 5) (from Python 3.8).
+        scan_interval_seconds: The interval in seconds between two scans.
+        start_mtime: Only event log files whose last modification
+            time is greater than start_mtime are monitored.
+        monitor_files (Dict[int, MonitorFile]): The map from event log file id
+            to MonitorFile object. Monitor all files from the beginning
+            if the value is None.
+        source_types (List[str]): A list of source type names from
+            event_pb2.Event.SourceType.keys(). Monitor all source types if the
+            value is None.
+    """
+    loop = get_or_create_event_loop()
+    if monitor_files is None:
+        monitor_files = {}
+
+    logger.info(
+        "Monitor events logs modified after %s on %s, the source types are %s.",
+        start_mtime,
+        event_dir,
+        "all" if source_types is None else source_types,
+    )
+
+    MonitorFile = collections.namedtuple("MonitorFile", ["size", "mtime", "position"])
+
+    def _source_file_filter(source_file):
+        stat = os.stat(source_file)
+        return stat.st_mtime > start_mtime
+
+    def _read_monitor_file(file, pos):
+        assert isinstance(
+            file, str
+        ), f"File should be a str, but a {type(file)}({file}) found"
+        fd = os.open(file, os.O_RDONLY)
+        try:
+            stat = os.stat(fd)
+            # Check the file size to avoid raising the exception
+            # ValueError: cannot mmap an empty file
+            if stat.st_size <= 0:
+                return []
+            fid = stat.st_ino or file
+            monitor_file = monitor_files.get(fid)
+            if monitor_file:
+                if (
+                    monitor_file.position == monitor_file.size
+                    and monitor_file.size == stat.st_size
+                    and monitor_file.mtime == stat.st_mtime
+                ):
+                    logger.debug(
+                        "Skip reading the file because there is no change: %s", file
+                    )
+                    return []
+                position = monitor_file.position
+            else:
+                logger.info("Found new event log file: %s", file)
+                position = pos
+            # Close the fd in finally.
+            r = _read_file(fd, position, closefd=False)
+            # It should be fine to update the dict in executor thread.
+            monitor_files[r.fid] = MonitorFile(r.size, r.mtime, r.position)
+            loop.call_soon_threadsafe(callback, r.lines)
+        except Exception as e:
+            raise Exception(f"Read event file failed: {file}") from e
+        finally:
+            os.close(fd)
+
+    @async_loop_forever(scan_interval_seconds, cancellable=True)
+    async def _scan_event_log_files():
+        # Scan event files.
+        source_files = await loop.run_in_executor(
+            monitor_thread_pool_executor,
+            _get_source_files,
+            event_dir,
+            source_types,
+            _source_file_filter,
+        )
+
+        # Limit concurrent read to avoid fd exhaustion.
+        semaphore = asyncio.Semaphore(event_consts.CONCURRENT_READ_LIMIT)
+
+        async def _concurrent_coro(filename):
+            async with semaphore:
+                return await loop.run_in_executor(
+                    monitor_thread_pool_executor, _read_monitor_file, filename, 0
+                )
+
+        # Read files.
+        await asyncio.gather(
+            *[
+                _concurrent_coro(filename)
+                for filename in list(itertools.chain(*source_files.values()))
+            ]
+        )
+
+    return run_background_task(_scan_event_log_files())
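The parse-and-restore step in the file above can be exercised without Ray installed. The following is a minimal, dependency-free sketch of the same behavior; the function names mirror the module, but this is illustrative code, not Ray's API:

```python
import json

def restore_newline(event_dict):
    # Event messages are written with escaped newlines; undo that here,
    # mirroring _restore_newline above.
    msg = event_dict.get("message", "")
    event_dict["message"] = msg.replace("\\n", "\n").replace("\\r", "\n")
    return event_dict

def parse_event_strings(event_string_list):
    events = []
    for data in event_string_list:
        if not data:
            continue  # blank lines are skipped, as in the dashboard module
        try:
            events.append(restore_newline(json.loads(data)))
        except Exception:
            pass  # a malformed line is logged and skipped upstream
    return events

lines = ['{"message": "hello\\\\nworld"}', "", "not json"]
events = parse_event_strings(lines)
print(events)  # [{'message': 'hello\nworld'}]
```

Note that blank and malformed lines are dropped rather than failing the whole batch, which matches the module's best-effort handling of partially written log files.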
.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__init__.py ADDED
File without changes
.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/__init__.cpython-311.pyc ADDED
Binary file (202 Bytes).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/healthz_agent.cpython-311.pyc ADDED
Binary file (3.25 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/healthz_head.cpython-311.pyc ADDED
Binary file (2.83 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/__pycache__/utils.cpython-311.pyc ADDED
Binary file (1.73 kB).
.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/healthz_agent.py ADDED
@@ -0,0 +1,54 @@
+from aiohttp.web import Request, Response
+
+import ray.dashboard.optional_utils as optional_utils
+import ray.dashboard.utils as dashboard_utils
+import ray.exceptions
+from ray.dashboard.modules.healthz.utils import HealthChecker
+
+routes = optional_utils.DashboardAgentRouteTable
+
+
+class HealthzAgent(dashboard_utils.DashboardAgentModule):
+    """Health check in the agent.
+
+    This module adds a health check related endpoint to the agent to check
+    local components' health.
+    """
+
+    def __init__(self, dashboard_agent):
+        super().__init__(dashboard_agent)
+        self._health_checker = HealthChecker(
+            dashboard_agent.gcs_aio_client,
+            f"{dashboard_agent.ip}:{dashboard_agent.node_manager_port}",
+        )
+
+    @routes.get("/api/local_raylet_healthz")
+    async def health_check(self, req: Request) -> Response:
+        try:
+            alive = await self._health_checker.check_local_raylet_liveness()
+            if alive is False:
+                return Response(status=503, text="Local Raylet failed")
+        except ray.exceptions.RpcError as e:
+            # Only errors other than "GCS unreachable" are considered raylet
+            # failure, to avoid false positives.
+            # In case GCS failed, Raylet will crash eventually if GCS is not
+            # back within a given time, and the check will fail since the
+            # agent can't live without a local raylet.
+            if e.rpc_code not in (
+                ray._raylet.GRPC_STATUS_CODE_UNAVAILABLE,
+                ray._raylet.GRPC_STATUS_CODE_UNKNOWN,
+                ray._raylet.GRPC_STATUS_CODE_DEADLINE_EXCEEDED,
+            ):
+                return Response(status=503, text=f"Health check failed due to: {e}")
+
+        return Response(
+            text="success",
+            content_type="application/text",
+        )
+
+    async def run(self, server):
+        pass
+
+    @staticmethod
+    def is_minimal_module():
+        return False
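The endpoint above maps the liveness probe result and any RPC error code to an HTTP status. That decision logic can be sketched as a pure function; `healthz_status`, `IGNORED_RPC_CODES`, and the string codes are illustrative stand-ins, not Ray identifiers:

```python
# Illustrative stand-ins for the gRPC status codes the agent tolerates.
IGNORED_RPC_CODES = {"UNAVAILABLE", "UNKNOWN", "DEADLINE_EXCEEDED"}

def healthz_status(alive, rpc_error_code=None):
    """Return the HTTP status the endpoint above would report.

    alive: result of the liveness probe (ignored if the probe raised).
    rpc_error_code: RPC status code when the probe raised, else None.
    """
    if rpc_error_code is not None:
        # GCS-unreachable style errors are not treated as raylet failure.
        return 200 if rpc_error_code in IGNORED_RPC_CODES else 503
    return 503 if alive is False else 200

print(healthz_status(True))                 # 200: healthy raylet
print(healthz_status(False))                # 503: raylet reported dead
print(healthz_status(None, "UNAVAILABLE"))  # 200: GCS unreachable is tolerated
print(healthz_status(None, "INTERNAL"))     # 503: any other RPC error fails
```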
.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/healthz_head.py ADDED
@@ -0,0 +1,41 @@
+from aiohttp.web import HTTPServiceUnavailable, Request, Response
+
+import ray.dashboard.optional_utils as optional_utils
+import ray.dashboard.utils as dashboard_utils
+from ray.dashboard.modules.healthz.utils import HealthChecker
+
+routes = optional_utils.DashboardHeadRouteTable
+
+
+class HealthzHead(dashboard_utils.DashboardHeadModule):
+    """Health check in the head.
+
+    This module adds a health check related endpoint to the head to check
+    GCS's health.
+    """
+
+    def __init__(self, config: dashboard_utils.DashboardHeadModuleConfig):
+        super().__init__(config)
+        self._health_checker = HealthChecker(self.gcs_aio_client)
+
+    @routes.get("/api/gcs_healthz")
+    async def health_check(self, req: Request) -> Response:
+        alive = False
+        try:
+            alive = await self._health_checker.check_gcs_liveness()
+            if alive is True:
+                return Response(
+                    text="success",
+                    content_type="application/text",
+                )
+        except Exception as e:
+            return HTTPServiceUnavailable(reason=f"Health check failed: {e}")
+
+        return HTTPServiceUnavailable(reason="Health check failed")
+
+    async def run(self, server):
+        pass
+
+    @staticmethod
+    def is_minimal_module():
+        return True
.venv/lib/python3.11/site-packages/ray/dashboard/modules/healthz/utils.py ADDED
@@ -0,0 +1,24 @@
+from typing import Optional
+
+from ray._private.gcs_utils import GcsAioClient
+
+
+class HealthChecker:
+    def __init__(
+        self, gcs_aio_client: GcsAioClient, local_node_address: Optional[str] = None
+    ):
+        self._gcs_aio_client = gcs_aio_client
+        self._local_node_address = local_node_address
+
+    async def check_local_raylet_liveness(self) -> bool:
+        if self._local_node_address is None:
+            return False
+
+        liveness = await self._gcs_aio_client.check_alive(
+            [self._local_node_address.encode()], 0.1
+        )
+        return liveness[0]
+
+    async def check_gcs_liveness(self) -> bool:
+        await self._gcs_aio_client.check_alive([], 0.1)
+        return True
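`HealthChecker` above is shared by the agent and head modules. A usage sketch with a stubbed GCS client follows; `FakeGcsClient` is hypothetical (the real `GcsAioClient` is constructed from a GCS address, not a node set), and only the `check_alive` call shape mirrors the code above:

```python
import asyncio
from typing import Optional

class FakeGcsClient:
    """Stub standing in for Ray's GcsAioClient; this constructor is
    invented for the sketch."""

    def __init__(self, alive_nodes):
        self._alive = alive_nodes

    async def check_alive(self, node_addresses, timeout):
        # Report per-address liveness, like GcsAioClient.check_alive.
        return [addr in self._alive for addr in node_addresses]

class HealthChecker:
    # Same shape as the class above, minus the Ray imports.
    def __init__(self, gcs_client, local_node_address: Optional[str] = None):
        self._gcs = gcs_client
        self._local_node_address = local_node_address

    async def check_local_raylet_liveness(self) -> bool:
        if self._local_node_address is None:
            return False
        liveness = await self._gcs.check_alive(
            [self._local_node_address.encode()], 0.1
        )
        return liveness[0]

checker = HealthChecker(FakeGcsClient({b"10.0.0.1:6379"}), "10.0.0.1:6379")
print(asyncio.run(checker.check_local_raylet_liveness()))  # True
```

When no local node address is configured, the probe reports not-alive rather than raising, which is why the agent constructs the checker with the raylet address and the head constructs it without one.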
.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/__init__.cpython-311.pyc ADDED
Binary file (198 Bytes).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/common.cpython-311.pyc ADDED
Binary file (25.1 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/job_log_storage_client.cpython-311.pyc ADDED
Binary file (3.63 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/sdk.cpython-311.pyc ADDED
Binary file (24 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/job/__pycache__/utils.cpython-311.pyc ADDED
Binary file (13.4 kB).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/__init__.py ADDED
File without changes
.venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/__pycache__/__init__.cpython-311.pyc ADDED
Binary file (203 Bytes).

.venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/__pycache__/snapshot_head.cpython-311.pyc ADDED
Binary file (11.4 kB).
.venv/lib/python3.11/site-packages/ray/dashboard/modules/snapshot/snapshot_head.py ADDED
@@ -0,0 +1,235 @@
+import concurrent.futures
+import enum
+import json
+import logging
+import os
+from datetime import datetime
+from typing import Optional
+
+import aiohttp.web
+
+import ray.dashboard.optional_utils as dashboard_optional_utils
+import ray.dashboard.utils as dashboard_utils
+from ray import ActorID
+from ray._private.pydantic_compat import BaseModel, Extra, Field, validator
+from ray._private.storage import _load_class
+from ray.dashboard.consts import RAY_CLUSTER_ACTIVITY_HOOK
+
+logger = logging.getLogger(__name__)
+logger.setLevel(logging.INFO)
+
+routes = dashboard_optional_utils.DashboardHeadRouteTable
+
+SNAPSHOT_API_TIMEOUT_SECONDS = 30
+
+
+class RayActivityStatus(str, enum.Enum):
+    ACTIVE = "ACTIVE"
+    INACTIVE = "INACTIVE"
+    ERROR = "ERROR"
+
+
+class RayActivityResponse(BaseModel, extra=Extra.allow):
+    """
+    Pydantic model used to inform if a particular Ray component can be
+    considered active, with metadata about the observation.
+    """
+
+    is_active: RayActivityStatus = Field(
+        ...,
+        description=(
+            "Whether the corresponding Ray component is considered active or inactive, "
+            "or if there was an error while collecting this observation."
+        ),
+    )
+    reason: Optional[str] = Field(
+        None, description="Reason if Ray component is considered active or errored."
+    )
+    timestamp: float = Field(
+        ...,
+        description=(
+            "Timestamp of when this observation about the Ray component was made. "
+            "This is in the format of seconds since unix epoch."
+        ),
+    )
+    last_activity_at: Optional[float] = Field(
+        None,
+        description=(
+            "Timestamp when the last activity of this Ray component finished, in "
+            "seconds since unix epoch. This field does not need to be populated "
+            "for Ray components where it is not meaningful."
+        ),
+    )
+
+    @validator("reason", always=True)
+    def reason_required(cls, v, values, **kwargs):
+        if "is_active" in values and values["is_active"] != RayActivityStatus.INACTIVE:
+            if v is None:
+                raise ValueError(
+                    'Reason is required if is_active is "active" or "error"'
+                )
+        return v
+
+
+class APIHead(dashboard_utils.DashboardHeadModule):
+    def __init__(self, config: dashboard_utils.DashboardHeadModuleConfig):
+        super().__init__(config)
+        # For offloading CPU intensive work.
+        self._thread_pool = concurrent.futures.ThreadPoolExecutor(
+            max_workers=2, thread_name_prefix="api_head"
+        )
+
+    @routes.get("/api/actors/kill")
+    async def kill_actor_gcs(self, req) -> aiohttp.web.Response:
+        actor_id = req.query.get("actor_id")
+        force_kill = req.query.get("force_kill", False) in ("true", "True")
+        no_restart = req.query.get("no_restart", False) in ("true", "True")
+        if not actor_id:
+            return dashboard_optional_utils.rest_response(
+                success=False, message="actor_id is required."
+            )
+
+        await self.gcs_aio_client.kill_actor(
+            ActorID.from_hex(actor_id),
+            force_kill,
+            no_restart,
+            timeout=SNAPSHOT_API_TIMEOUT_SECONDS,
+        )
+
+        message = (
+            f"Force killed actor with id {actor_id}"
+            if force_kill
+            else f"Requested actor with id {actor_id} to terminate. "
+            + "It will exit once running tasks complete"
+        )
+
+        return dashboard_optional_utils.rest_response(success=True, message=message)
+
+    @routes.get("/api/component_activities")
+    async def get_component_activities(self, req) -> aiohttp.web.Response:
+        timeout = req.query.get("timeout", None)
+        if timeout and timeout.isdigit():
+            timeout = int(timeout)
+        else:
+            timeout = SNAPSHOT_API_TIMEOUT_SECONDS
+
+        # Get activity information for the driver.
+        driver_activity_info = await self._get_job_activity_info(timeout=timeout)
+        resp = {"driver": dict(driver_activity_info)}
+
+        if RAY_CLUSTER_ACTIVITY_HOOK in os.environ:
+            try:
+                cluster_activity_callable = _load_class(
+                    os.environ[RAY_CLUSTER_ACTIVITY_HOOK]
+                )
+                external_activity_output = cluster_activity_callable()
+                assert isinstance(external_activity_output, dict), (
+                    f"Output of hook {os.environ[RAY_CLUSTER_ACTIVITY_HOOK]} "
+                    "should be Dict[str, RayActivityResponse]. Got "
+                    f"output: {external_activity_output}"
+                )
+                for component_type in external_activity_output:
+                    try:
+                        component_activity_output = external_activity_output[
+                            component_type
+                        ]
+                        # Parse and validate output to type RayActivityResponse.
+                        component_activity_output = RayActivityResponse(
+                            **dict(component_activity_output)
+                        )
+                        resp[component_type] = dict(component_activity_output)
+                    except Exception as e:
+                        logger.exception(
+                            f"Failed to get activity status of {component_type} "
+                            f"from user hook {os.environ[RAY_CLUSTER_ACTIVITY_HOOK]}."
+                        )
+                        resp[component_type] = {
+                            "is_active": RayActivityStatus.ERROR,
+                            "reason": repr(e),
+                            "timestamp": datetime.now().timestamp(),
+                        }
+            except Exception as e:
+                logger.exception(
+                    "Failed to get activity status from user "
+                    f"hook {os.environ[RAY_CLUSTER_ACTIVITY_HOOK]}."
+                )
+                resp["external_component"] = {
+                    "is_active": RayActivityStatus.ERROR,
+                    "reason": repr(e),
+                    "timestamp": datetime.now().timestamp(),
+                }
+
+        return aiohttp.web.Response(
+            text=json.dumps(resp),
+            content_type="application/json",
+            status=aiohttp.web.HTTPOk.status_code,
+        )
+
+    async def _get_job_activity_info(self, timeout: int) -> RayActivityResponse:
+        # Returns whether there is Ray activity from drivers (jobs).
+        # Drivers in namespaces that start with _ray_internal_ are not
+        # considered activity.
+        # This includes the _ray_internal_dashboard job that gets automatically
+        # created with every cluster.
+        try:
+            reply = await self.gcs_aio_client.get_all_job_info(
+                skip_submission_job_info_field=True,
+                skip_is_running_tasks_field=True,
+                timeout=timeout,
+            )
+
+            num_active_drivers = 0
+            latest_job_end_time = 0
+            for job_table_entry in reply.values():
+                is_dead = bool(job_table_entry.is_dead)
+                in_internal_namespace = job_table_entry.config.ray_namespace.startswith(
+                    "_ray_internal_"
+                )
+                latest_job_end_time = (
+                    max(latest_job_end_time, job_table_entry.end_time)
+                    if job_table_entry.end_time
+                    else latest_job_end_time
+                )
+                if not is_dead and not in_internal_namespace:
+                    num_active_drivers += 1
+
+            current_timestamp = datetime.now().timestamp()
+            # The latest job end time must be before or equal to the current
+            # timestamp. Job end times may be provided in epoch milliseconds;
+            # check for this and convert to seconds.
+            if latest_job_end_time > current_timestamp:
+                latest_job_end_time = latest_job_end_time / 1000
+                assert current_timestamp >= latest_job_end_time, (
+                    f"Most recent job end time {latest_job_end_time} must be "
+                    f"before or equal to the current timestamp {current_timestamp}"
+                )
+
+            is_active = (
+                RayActivityStatus.ACTIVE
+                if num_active_drivers > 0
+                else RayActivityStatus.INACTIVE
+            )
+            return RayActivityResponse(
+                is_active=is_active,
+                reason=f"Number of active drivers: {num_active_drivers}"
+                if num_active_drivers
+                else None,
+                timestamp=current_timestamp,
+                # If latest_job_end_time == 0, no jobs have finished yet, so
+                # don't populate last_activity_at.
+                last_activity_at=latest_job_end_time if latest_job_end_time else None,
+            )
+        except Exception as e:
+            logger.exception("Failed to get activity status of Ray drivers.")
+            return RayActivityResponse(
+                is_active=RayActivityStatus.ERROR,
+                reason=repr(e),
+                timestamp=datetime.now().timestamp(),
+            )
+
+    async def run(self, server):
+        pass
+
+    @staticmethod
+    def is_minimal_module():
+        return False
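`_get_job_activity_info` above counts live non-internal drivers and normalizes job end times that arrive in epoch milliseconds. A self-contained sketch of that summary logic; `summarize_driver_activity` and the plain-dict job records are illustrative, not Ray's types:

```python
from datetime import datetime

def summarize_driver_activity(jobs, now=None):
    """Sketch of the driver-activity summary above; job records are plain
    dicts with "is_dead", "namespace", and "end_time" keys."""
    now = now if now is not None else datetime.now().timestamp()
    num_active = 0
    latest_end = 0
    for job in jobs:
        if job["end_time"]:
            latest_end = max(latest_end, job["end_time"])
        if not job["is_dead"] and not job["namespace"].startswith("_ray_internal_"):
            num_active += 1
    # End times reported in epoch milliseconds are normalized to seconds.
    if latest_end > now:
        latest_end /= 1000
    return {
        "is_active": "ACTIVE" if num_active else "INACTIVE",
        "reason": f"Number of active drivers: {num_active}" if num_active else None,
        "timestamp": now,
        "last_activity_at": latest_end or None,
    }

jobs = [
    {"is_dead": False, "namespace": "default", "end_time": 0},
    {"is_dead": True, "namespace": "_ray_internal_dashboard",
     "end_time": 1_700_000_000_000},  # epoch milliseconds
]
summary = summarize_driver_activity(jobs, now=1_700_000_100.0)
print(summary["is_active"], summary["last_activity_at"])  # ACTIVE 1700000000.0
```

The milliseconds-vs-seconds heuristic (an end time in the future must be in milliseconds) is the same one the module uses before asserting the converted value is in the past.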
.venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/__init__.py ADDED
File without changes
.venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/__pycache__/transform_polars.cpython-311.pyc ADDED
Binary file (2.41 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/__pycache__/transform_pyarrow.cpython-311.pyc ADDED
Binary file (21.8 kB).
.venv/lib/python3.11/site-packages/ray/data/_internal/arrow_ops/transform_polars.py ADDED
@@ -0,0 +1,40 @@
+from typing import TYPE_CHECKING, List
+
+try:
+    import pyarrow
+except ImportError:
+    pyarrow = None
+
+
+if TYPE_CHECKING:
+    from ray.data._internal.planner.exchange.sort_task_spec import SortKey
+
+pl = None
+
+
+def check_polars_installed():
+    try:
+        global pl
+        import polars as pl
+    except ImportError:
+        raise ImportError(
+            "polars not installed. Install with `pip install polars` or set "
+            "`DataContext.use_polars = False` to fall back to pyarrow"
+        )
+
+
+def sort(table: "pyarrow.Table", sort_key: "SortKey") -> "pyarrow.Table":
+    check_polars_installed()
+    df = pl.from_arrow(table)
+    return df.sort(sort_key.get_columns(), reverse=sort_key.get_descending()).to_arrow()
+
+
+def concat_and_sort(
+    blocks: List["pyarrow.Table"], sort_key: "SortKey"
+) -> "pyarrow.Table":
+    check_polars_installed()
+    blocks = [pl.from_arrow(block) for block in blocks]
+    df = pl.concat(blocks).sort(
+        sort_key.get_columns(), reverse=sort_key.get_descending()
+    )
+    return df.to_arrow()
.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__init__.py ADDED
File without changes
.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/__init__.cpython-311.pyc ADDED
Binary file (202 Bytes).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/audio_datasource.cpython-311.pyc ADDED
Binary file (2.52 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/avro_datasource.cpython-311.pyc ADDED
Binary file (2.76 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/bigquery_datasink.cpython-311.pyc ADDED
Binary file (7.82 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/bigquery_datasource.cpython-311.pyc ADDED
Binary file (7.81 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/binary_datasource.cpython-311.pyc ADDED
Binary file (1.54 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/clickhouse_datasource.cpython-311.pyc ADDED
Binary file (16.4 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/csv_datasink.cpython-311.pyc ADDED
Binary file (2.35 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/csv_datasource.cpython-311.pyc ADDED
Binary file (3.49 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/databricks_uc_datasource.cpython-311.pyc ADDED
Binary file (10.2 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/delta_sharing_datasource.cpython-311.pyc ADDED
Binary file (6.95 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/hudi_datasource.cpython-311.pyc ADDED
Binary file (4.52 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/huggingface_datasource.cpython-311.pyc ADDED
Binary file (7.31 kB).

.venv/lib/python3.11/site-packages/ray/data/_internal/datasource/__pycache__/iceberg_datasource.cpython-311.pyc ADDED
Binary file (12.7 kB).