topguy committed on
Commit d03dff9 · 1 Parent(s): e60bcbb

feat: Implement merged, time-sorted log view


Refactors the application to display a merged and time-sorted view of all uploaded log files in a table.

Key changes:
- Replaced the single-file text view with a multi-file `gr.DataFrame`.
- Implemented timestamp parsing on file load, storing parsed timestamps in the application state.
- Created a new `generate_merged_view` function to handle filtering, merging, and sorting of logs.
- Added `pandas` as a dependency for data manipulation.
- Updated the design document to reflect the new architecture and user flow.
- Added a help icon for regular expression syntax.

Files changed (5)
  1. README.md +165 -165
  2. design.md +44 -54
  3. logviewer/app.py +177 -98
  4. logviewer/timestamp_utils.py +8 -10
  5. requirements.txt +2 -1
README.md CHANGED
---
title: LogViewer
license: mit
sdk: gradio
app_file: logviewer/app.py
colorFrom: blue
colorTo: green
sdk_version: 5.35.0
---

# LogViewer

LogViewer is a versatile tool designed for efficient viewing and filtering of log files. It offers both a user-friendly web interface built with Gradio and a powerful command-line interface (CLI) for automated processing.

## Features

* **Web Interface (Gradio):**
  * Upload and display log files.
  * Apply multiple filters (include/exclude text, include/exclude regex).
  * Case-sensitive filtering option.
  * Reorder applied filters to control processing order.
  * Save and load filter configurations (JSON format).
  * Download filtered log content.
* **Command-Line Interface (CLI):**
  * Process log files using pre-defined filter configurations.
  * Automate log analysis workflows.

## Installation

To set up LogViewer on your local machine, follow these steps:

### Prerequisites

* Python 3.8 or higher
* pip (Python package installer)
* `venv` (Python's built-in virtual environment module)

### Steps

1. **Clone the repository:**
   ```bash
   git clone https://github.com/your-username/LogViewer.git
   cd LogViewer
   ```
   *(Note: Replace `your-username` with the actual GitHub username once the repository is created.)*

2. **Create and activate a virtual environment:**
   It's highly recommended to use a virtual environment to manage project dependencies.
   ```bash
   python -m venv venv
   # On Linux/macOS:
   source venv/bin/activate
   # On Windows:
   .\venv\Scripts\activate
   ```

3. **Install dependencies:**
   ```bash
   pip install -r requirements.txt
   ```

## Usage

### Running with Docker Compose

Docker Compose provides an easy way to set up and run the LogViewer application in a containerized environment.

#### Prerequisites

* Docker and Docker Compose installed on your system.

#### Steps

1. **Build and run the services:**
   Navigate to the root directory of the cloned repository and run:
   ```bash
   docker compose up --build -d
   ```
   This command builds the Docker image (if not already built) and starts the LogViewer service in detached mode.

2. **Access the Web UI:**
   Open your web browser and navigate to `http://localhost:7860`.

3. **Using the CLI with Docker Compose:**
   If you are running the web interface via Docker Compose, the CLI tool (`cli_app.py`) needs to connect to the Dockerized service. You will need to modify the `Client` URL in `cli_app.py` to point to the Docker container's exposed port (which is `http://localhost:7860` by default).

   Locate the line `client = Client("http://localhost:7860/")` in `cli_app.py` and ensure it points to the correct address where your Docker container is accessible.

   Then, you can run the CLI as usual:
   ```bash
   python cli_app.py <log_file_path> <filter_file_path> [-o <output_file_path>]
   ```

4. **Stopping the services:**
   To stop the running Docker containers, use:
   ```bash
   docker compose down
   ```

### Web Interface (Gradio App)

To launch the interactive web interface:

1. **Start the application:**
   Ensure your virtual environment is activated.
   ```bash
   python app.py
   ```
2. **Access the UI:**
   Open your web browser and navigate to the URL displayed in your terminal (usually `http://127.0.0.1:7860`).

**How to use:**
* **Upload Log File:** Click the "Upload Log File" button to select your log file.
* **Add Filters:** Choose a "Filter Type" (e.g., "Include Text", "Exclude Regex"), enter a "Filter Value", and check "Case Sensitive" if needed. Click "Add Filter".
* **Manage Filters:** Applied filters will appear in the "Applied Filters" list. You can select a filter and use "Remove Selected Filter", "Move Up", or "Move Down" to adjust them.
* **Save/Load Filters:** Use "Save Filters" to download your current filter configuration as a JSON file, or "Load Filters (.json)" to upload a previously saved configuration.
* **Save Filtered Log:** After applying filters, click "Save Filtered Log" to download the processed log content.

### Command-Line Interface (CLI)

To process log files using the CLI, the Gradio web application (`app.py`) **must be running** in the background, as the CLI interacts with its API.

1. **Ensure the Gradio app is running:**
   Open a separate terminal, activate your virtual environment, and run:
   ```bash
   python app.py
   ```
   Keep this terminal open.

2. **Run the CLI tool:**
   In a new terminal, activate your virtual environment and use `cli_app.py`:
   ```bash
   python cli_app.py <log_file_path> <filter_file_path> [-o <output_file_path>]
   ```
   * `<log_file_path>`: The path to the input log file you want to process.
   * `<filter_file_path>`: The path to a JSON file containing your filter configurations (e.g., `filters.json` saved from the web UI).
   * `-o <output_file_path>` (optional): The path where the filtered log content will be saved. If not provided, a default name will be generated (e.g., `your_log_filters_filtered.txt`).

**Example:**

```bash
python cli_app.py my_application.log my_filters.json -o processed_log.txt
```

## Filter File Format

Filter configurations are saved and loaded as JSON files. Each filter is an object within a list, with the following structure:

```json
[
  {
    "type": "Include Text",
    "value": "ERROR",
    "case_sensitive": true
  },
  {
    "type": "Exclude Regex",
    "value": "DEBUG|INFO",
    "case_sensitive": false
  }
]
```

* `type`: Can be "Include Text", "Exclude Text", "Include Regex", or "Exclude Regex".
* `value`: The string or regex pattern for the filter.
* `case_sensitive`: A boolean (`true` or `false`) indicating whether the filter should be case-sensitive.
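A filter file in the format above can also be applied outside the app with a short script. The following is an illustrative sketch of the documented filter semantics, not code from the repository; `apply_filter` is a hypothetical helper:

```python
import json
import re

def apply_filter(lines, flt):
    # Hypothetical helper illustrating the documented filter semantics.
    value = flt["value"]
    include = flt["type"].startswith("Include")
    if "Regex" in flt["type"]:
        flags = 0 if flt["case_sensitive"] else re.IGNORECASE
        pattern = re.compile(value, flags)
        matches = lambda line: bool(pattern.search(line))
    elif flt["case_sensitive"]:
        matches = lambda line: value in line
    else:
        matches = lambda line: value.lower() in line.lower()
    # Include filters keep matching lines; Exclude filters drop them.
    return [line for line in lines if matches(line) == include]

filters = json.loads("""
[
  {"type": "Include Text", "value": "ERROR", "case_sensitive": true},
  {"type": "Exclude Regex", "value": "DEBUG|INFO", "case_sensitive": false}
]
""")

# Filters are applied in list order, each one to the previous result.
lines = ["ERROR disk full", "info ERROR ignored", "DEBUG noise", "plain line"]
for flt in filters:
    lines = apply_filter(lines, flt)
print(lines)  # → ['ERROR disk full']
```

Note how the second filter removes `"info ERROR ignored"` even though the first filter kept it: the case-insensitive regex `DEBUG|INFO` matches the lowercase `info`.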
 
design.md CHANGED
@@ -2,73 +2,64 @@

  ## 1. Overview

- LogViewer is a web-based application built with Python and Gradio that allows users to upload and inspect log files. It provides dynamic filtering capabilities to help users analyze log data by including or excluding lines based on text or regular expressions.

  ## 2. Components

- The application consists of two main Python files:

  ### 2.1. `app.py`

  This is the main application file that creates the user interface and handles all user interactions.

  * **UI Structure:** The UI is built using `gradio.Blocks` with the `Soft` theme. It includes:
-     * A file upload component (`gr.File`) for loading log files.
-     * A large, non-interactive textbox (`gr.Textbox`) to display the processed log content.
      * A section for adding new filters, containing:
          * A dropdown (`gr.Dropdown`) to select the filter type (Include/Exclude Text, Include/Exclude Regex).
-         * A textbox (`gr.Textbox`) to input the filter pattern.
          * A checkbox (`gr.Checkbox`) to control case sensitivity for the filter.
          * An "Add Filter" button (`gr.Button`).
-     * "Save Filters" button (`gr.Button`) to download the current filter set as a JSON file.
-     * "Load Filters" button (`gr.UploadButton`) to upload a JSON file and apply saved filters.
-     * A section to display and manage active filters:
-         * A radio button group (`gr.Radio`) that lists the currently applied filters, showing their type, value, and case sensitivity.
          * A "Remove Selected Filter" button (`gr.Button`).
-         * "Move Up" button (`gr.Button`) to move the selected filter up in the list.
-         * "Move Down" button (`gr.Button`) to move the selected filter down in the list.
-     * A "Save Filtered Log" button (`gr.Button`) to download the currently displayed filtered log content.

  * **State Management:**
-     * A `gr.State` object (`filters_state`) is used to maintain the list of active filters as a list of dictionaries. Each dictionary represents a single filter.

  * **Core Logic:**
-     * **`add_filter()`:** Adds a new filter dictionary to the `filters_state` list.
-     * **`remove_filter()`:** Removes a filter from the `filters_state` list based on the user's selection.
-     * **`move_filter_up()`:** Moves the selected filter up in the `filters_state` list.
-     * **`move_filter_down()`:** Moves the selected filter down in the `filters_state` list.
-     * **`apply_filters()`:** This is the main processing function. It reads the uploaded file and applies the sequence of filters stored in `filters_state` by calling the `filter_lines` utility function for each filter.
-     * **`update_filter_list()`:** Generates a list of strings from the `filters_state` list to display it in the UI.
-     * **`save_filters()`:** Saves the current `filters_state` to a JSON file.
-     * **`load_filters()`:** Loads filters from a JSON file into `filters_state`.
-     * **`save_filtered_log()`:** Saves the current content of the log display to a text file.

  * **Event Handling:**
-     * Uploading a file triggers `apply_filters`.
-     * Clicking "Add Filter" triggers `add_filter`, then `update_filter_list`, and finally `apply_filters`.
-     * Clicking "Remove Selected Filter" triggers `remove_filter`, then `update_filter_list`, and finally `apply_filters`.
-     * Clicking "Move Up" triggers `move_filter_up`, then `update_filter_list`, and finally `apply_filters`.
-     * Clicking "Move Down" triggers `move_filter_down`, then `update_filter_list`, and finally `apply_filters`.
-     * Clicking "Save Filters" triggers `save_filters`.
-     * Uploading a file to "Load Filters" triggers `load_filters`, then `update_filter_list`, and finally `apply_filters`.
-     * Clicking "Save Filtered Log" triggers `save_filtered_log`.

- * **API Endpoints:**
-     * The Gradio application is mounted on a FastAPI app, which exposes all public functions as API endpoints. The endpoints are accessible at `http://<host>:<port>/api/<function_name>/`.

  ### 2.2. `filter_utils.py`

  This module provides the core filtering logic.

- * **`filter_lines()`:** A pure function that takes a list of text lines and applies a single filtering criterion (e.g., include text, exclude regex). It handles both plain text and regular expression matching, with an option for case sensitivity. It returns a new list of filtered lines.

  ### 2.3. `timestamp_utils.py`

- This module provides utilities for parsing and filtering log lines based on timestamps.

- * **`parse_timestamp()`:** Attempts to parse a timestamp from a log line. Designed to be extensible for various timestamp formats.
- * **`get_timestamp_range()`:** Extracts the earliest and latest timestamps from a list of log lines.
- * **`filter_by_time_range()`:** Filters log lines to include only those within a specified time range, based on parsed timestamps.

  ### 2.4. `cli_app.py`

@@ -76,23 +67,22 @@ This is a command-line interface (CLI) application that interacts with the runni

  * **Functionality:**
      * Takes a log file path and a filter JSON file path as arguments.
-     * Optionally takes an output file path; otherwise, it generates one.
-     * Uses the `requests` library to send POST requests to the `/api/apply_filters/` endpoint of the Gradio app.
-     * Saves the filtered log content to the specified output file.

  ## 3. User Interaction Flow

  1. The user opens the application in their browser.
- 2. The user uploads a log file using the file upload component. The log content is immediately displayed.
- 3. To filter the log, the user selects a filter type, enters a value, and clicks "Add Filter".
- 4. The filter is added to the "Applied Filters" list, and the log view is automatically updated to reflect the new filter.
- 5. The user can add multiple filters sequentially. Each new filter is applied to the result of the previous ones.
- 6. To remove a filter, the user selects it from the "Applied Filters" radio group.
- 7. The user then clicks the "Remove Selected Filter" button.
- 8. The filter is removed from the list, and the log view is updated to reflect the change.
- 9. The user can reorder filters by selecting a filter and clicking "Move Up" or "Move Down" buttons.
- 10. The user can save the current set of filters by clicking the "Save Filters" button, which will prompt a file download.
- 11. The user can load a previously saved set of filters by clicking the "Load Filters" button and selecting a JSON file.
- 12. The user can save the currently displayed filtered log content by clicking the "Save Filtered Log" button, which will prompt a file download.
-
- This design allows for a flexible and interactive way to analyze log files by building a chain of filters.
 
  ## 1. Overview

+ LogViewer is a web-based application built with Python, Gradio, and Pandas that allows users to upload and inspect multiple log files. It provides a merged, time-sorted view of all logs and allows for dynamic filtering with separate filter sets for each file. This helps users analyze log data by including or excluding lines based on text or regular expressions.

  ## 2. Components

+ The application consists of three main Python files:

  ### 2.1. `app.py`

  This is the main application file that creates the user interface and handles all user interactions.

  * **UI Structure:** The UI is built using `gradio.Blocks` with the `Soft` theme. It includes:
+     * A file upload component (`gr.File`) for loading multiple log files.
+     * A dropdown (`gr.Dropdown`) to select the log file whose filters you want to manage.
+     * A `gr.DataFrame` to display the merged, filtered, and time-sorted log content in a table with columns for "File", "Timestamp", and "Log Entry".
      * A section for adding new filters, containing:
          * A dropdown (`gr.Dropdown`) to select the filter type (Include/Exclude Text, Include/Exclude Regex).
+         * A textbox (`gr.Textbox`) to input the filter pattern, accompanied by a help button (`gr.Button`) that provides a popup with a regex guide.
          * A checkbox (`gr.Checkbox`) to control case sensitivity for the filter.
          * An "Add Filter" button (`gr.Button`).
+     * "Save Filters" button (`gr.Button`) to download the current filter set for the selected file as a JSON file.
+     * "Load Filters" button (`gr.UploadButton`) to upload a JSON file and apply saved filters to the selected file.
+     * A section to display and manage active filters for the selected file:
+         * A radio button group (`gr.Radio`) that lists the currently applied filters.
          * A "Remove Selected Filter" button (`gr.Button`).
+         * "Move Up" and "Move Down" buttons to reorder filters.
+     * A "Save Filtered Log" button (`gr.Button`) to download the currently displayed merged log content.

  * **State Management:**
+     * A `gr.State` object (`files_state`) maintains the application's state. It is a dictionary where keys are filenames and values are dictionaries containing:
+         * `lines`: A list of objects, where each object represents a line and stores its parsed timestamp, the raw content, and the source filename.
+         * `filters`: A list of active filter dictionaries for that file.
37
  * **Core Logic:**
38
+ * **`add_file()`:** Handles file uploads. For each line in a new file, it calls `timestamp_utils.parse_timestamp` and stores the result along with the line content in `files_state`.
39
+ * **`select_file()`:** Triggered when a user selects a file from the dropdown. It updates the UI to show the filters for the selected file.
40
+ * **`add_filter()`, `remove_filter()`, `move_filter_up()`, `move_filter_down()`:** These functions manage the filter list for the currently selected file.
41
+ * **`generate_merged_view()`:** This is the main processing function. It iterates through all files in `files_state`, applies the respective filters to each file's lines, combines the filtered lines into a single list, sorts them by timestamp, and returns a Pandas DataFrame for display.
42
+ * **`update_filter_list()`:** Generates a list of strings from the filter list of the selected file to display it in the UI.
43
+ * **`save_filters()` & `load_filters()`:** Handle saving and loading of filter sets.
44
+ * **`save_filtered_log()`:** Saves the content of the DataFrame to a text file.
45
+ * **`show_regex_help()`:** Displays an informational popup with a guide to using regular expressions.
 
46
 
47
  * **Event Handling:**
48
+ * Uploading a file triggers `add_file` and then `generate_merged_view`.
49
+ * Changing the file selection in the dropdown triggers `select_file` to update the displayed filter list.
50
+ * Any action that modifies filters (add, remove, move, load) triggers `generate_merged_view` to refresh the log table.
 
 
 
 
 
 
 
 
51
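The filter-merge-sort pipeline described for `generate_merged_view` can be sketched as below. This is a minimal illustration under an assumed per-line layout (`ts`, `raw`), showing only an include-text filter branch; the real function delegates matching to `filter_lines`:

```python
from datetime import datetime

import pandas as pd

def generate_merged_view(files_state):
    """Merge each file's filtered lines into one time-sorted DataFrame."""
    rows = []
    for filename, entry in files_state.items():
        lines = entry["lines"]
        # Apply the file's own filters in order (include-text only here).
        for flt in entry["filters"]:
            needle = flt["value"] if flt["case_sensitive"] else flt["value"].lower()
            lines = [
                ln for ln in lines
                if needle in (ln["raw"] if flt["case_sensitive"] else ln["raw"].lower())
            ]
        for ln in lines:
            rows.append({"File": filename, "Timestamp": ln["ts"], "Log Entry": ln["raw"]})
    df = pd.DataFrame(rows, columns=["File", "Timestamp", "Log Entry"])
    # A stable sort keeps each file's original order for equal timestamps.
    return df.sort_values("Timestamp", kind="stable").reset_index(drop=True)

files_state = {
    "a.log": {
        "lines": [
            {"ts": datetime(2024, 1, 1, 10, 0, 2), "raw": "ERROR disk full"},
            {"ts": datetime(2024, 1, 1, 10, 0, 0), "raw": "INFO started"},
        ],
        "filters": [{"type": "Include Text", "value": "error", "case_sensitive": False}],
    },
    "b.log": {
        "lines": [{"ts": datetime(2024, 1, 1, 10, 0, 1), "raw": "ERROR timeout"}],
        "filters": [],
    },
}
df = generate_merged_view(files_state)
print(df["Log Entry"].tolist())  # → ['ERROR timeout', 'ERROR disk full']
```

Filtering happens per file before the merge, so one file's filters never hide another file's lines; the final `sort_values` is what interleaves the sources by time.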
 
  ### 2.2. `filter_utils.py`

  This module provides the core filtering logic.

+ * **`filter_lines()`:** A pure function that takes a list of text lines and applies a single filtering criterion.

  ### 2.3. `timestamp_utils.py`

+ This module provides utilities for parsing timestamps from log lines.

+ * **`parse_timestamp()`:** Attempts to parse a timestamp from a log line, supporting multiple formats.
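The multi-format parsing can be sketched as a list of candidate patterns tried in order; the two formats below are illustrative, not the module's actual format list:

```python
import re
from datetime import datetime

# Candidate (regex, strptime format) pairs; the real module may support more.
_PATTERNS = [
    (re.compile(r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}"), "%Y-%m-%d %H:%M:%S"),
    (re.compile(r"\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}"), "%d/%m/%Y %H:%M:%S"),
]

def parse_timestamp(line):
    """Return the first timestamp found in `line`, or None if no
    known format matches (a sketch of timestamp_utils.parse_timestamp)."""
    for pattern, fmt in _PATTERNS:
        match = pattern.search(line)
        if match:
            return datetime.strptime(match.group(0), fmt)
    return None

parse_timestamp("2024-01-01 10:00:02 ERROR disk full")
# → datetime(2024, 1, 1, 10, 0, 2)
```

Returning `None` for unparseable lines matters downstream: the merged view has to decide where lines without timestamps sort.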
  ### 2.4. `cli_app.py`

  * **Functionality:**
      * Takes a log file path and a filter JSON file path as arguments.
+     * Uses the `requests` library to send POST requests to the Gradio app's API endpoints.
+     * Saves the filtered log content to an output file.

  ## 3. User Interaction Flow

  1. The user opens the application in their browser.
+ 2. The user uploads one or more log files. The application immediately processes them, parses timestamps, and displays a merged, time-sorted view of all log entries in a table.
+ 3. The filenames appear in a dropdown. The user can select a file from this dropdown to manage its specific filters.
+ 4. When a file is selected, its associated filters are displayed in the "Applied Filters" list.
+ 5. To filter a log, the user selects the corresponding file, defines a filter (e.g., "Include Text", "Exclude Regex"), and clicks "Add Filter".
+ 6. The filter is added to the selected file's filter list, and the main log table automatically updates to reflect the change.
+ 7. The user can add multiple filters to each file. Filters are applied sequentially.
+ 8. The user can reorder or remove filters for the selected file, and the view will update accordingly.
+ 9. The user can save the filter set for a selected file to a JSON file for later use.
+ 10. The user can load a saved filter set and apply it to the selected file.
+ 11. The user can click the "?" button for help with writing regular expressions.
+ 12. The user can save the currently displayed merged and filtered log content to a text file at any time.
+
+ This design provides a powerful and interactive way to analyze multiple log files simultaneously in a unified, time-ordered view.
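The CLI flow described for `cli_app.py` can be sketched as follows. The `/api/apply_filters/` endpoint comes from the previous revision of this document, and the payload shape and the `default_output_path` naming (taken from the README's `your_log_filters_filtered.txt` example) are assumptions:

```python
import json
import os

import requests  # cli_app.py is described as using requests for its API calls

def default_output_path(log_path, filter_path):
    """README example: my_application.log + my_filters.json
    -> my_application_my_filters_filtered.txt"""
    log_stem = os.path.splitext(os.path.basename(log_path))[0]
    filter_stem = os.path.splitext(os.path.basename(filter_path))[0]
    return f"{log_stem}_{filter_stem}_filtered.txt"

def run_cli(log_path, filter_path, output_path=None, base_url="http://localhost:7860"):
    """Send the log text and filter list to the running app and save the result."""
    with open(log_path) as f:
        log_text = f.read()
    with open(filter_path) as f:
        filters = json.load(f)
    # Assumed endpoint and payload shape; requires app.py to be running.
    resp = requests.post(f"{base_url}/api/apply_filters/",
                         json={"data": [log_text, filters]})
    resp.raise_for_status()
    filtered = resp.json()["data"][0]
    output_path = output_path or default_output_path(log_path, filter_path)
    with open(output_path, "w") as f:
        f.write(filtered)
    return output_path
```

Because all filtering happens server-side, the CLI stays a thin client and automatically picks up any filtering features added to the web app.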
logviewer/app.py CHANGED
@@ -1,122 +1,199 @@
  import gradio as gr
  import json
  import os
  from filter_utils import filter_lines

- def add_filter(filters, filter_type, filter_value, case_sensitive):
-     if filter_value:
-         filters.append({"type": filter_type, "value": filter_value, "case_sensitive": case_sensitive})
-     return filters, ""

- def remove_filter(filters, selected_filter):
-     if selected_filter:
          parts = selected_filter.split(" - ")
          if len(parts) == 3:
              filter_type, filter_value, case_str = parts
              case_sensitive = case_str == "Case Sensitive: True"
              for i, f in enumerate(filters):
                  if f["type"] == filter_type and f["value"] == filter_value and f["case_sensitive"] == case_sensitive:
                      filters.pop(i)
                      break
-     return filters, None

- def move_filter_up(filters, selected_filter):
-     if selected_filter:
          parts = selected_filter.split(" - ")
          if len(parts) == 3:
              filter_type, filter_value, case_str = parts
              case_sensitive = case_str == "Case Sensitive: True"
              for i, f in enumerate(filters):
                  if f["type"] == filter_type and f["value"] == filter_value and f["case_sensitive"] == case_sensitive:
                      if i > 0:
                          filters[i], filters[i-1] = filters[i-1], filters[i]
                      break
-     return filters, selected_filter

- def move_filter_down(filters, selected_filter):
-     if selected_filter:
          parts = selected_filter.split(" - ")
          if len(parts) == 3:
              filter_type, filter_value, case_str = parts
              case_sensitive = case_str == "Case Sensitive: True"
              for i, f in enumerate(filters):
                  if f["type"] == filter_type and f["value"] == filter_value and f["case_sensitive"] == case_sensitive:
                      if i < len(filters) - 1:
                          filters[i], filters[i+1] = filters[i+1], filters[i]
                      break
-     return filters, selected_filter

- def apply_filters(file, filters):
-     if file is not None:
-         with open(file.name) as f:
-             lines = f.readlines()
-
-         if not filters:
-             return "".join(lines)

          processed_lines = lines
          for f in filters:
              filter_type = f["type"]
              value = f["value"]
              case = f["case_sensitive"]

              if filter_type == "Include Text":
-                 processed_lines = filter_lines(processed_lines, include_text=value, case_sensitive=case)
              elif filter_type == "Exclude Text":
-                 processed_lines = filter_lines(processed_lines, exclude_text=value, case_sensitive=case)
              elif filter_type == "Include Regex":
-                 processed_lines = filter_lines(processed_lines, include_regex=value, case_sensitive=case)
              elif filter_type == "Exclude Regex":
-                 processed_lines = filter_lines(processed_lines, exclude_regex=value, case_sensitive=case)

-         return "".join(processed_lines)
-     return ""

- def update_filter_list(filters):
-     filter_strings = [f'{f["type"]} - {f["value"]} - Case Sensitive: {f["case_sensitive"]}' for f in filters]
-     return gr.update(choices=filter_strings)

- def save_filters(filters):
-     with open("filters.json", "w") as f:
-         json.dump(filters, f)
-     return "filters.json"

- def load_filters(filter_file):
-     if filter_file is not None:
-         with open(filter_file.name) as f:
-             return json.load(f)
-     return []

- def save_filtered_log(log_content):
-     if log_content:
          with open("filtered_log.txt", "w") as f:
              f.write(log_content)
          return "filtered_log.txt"
      return None

- with gr.Blocks(theme=gr.themes.Soft(), css="#log_content textarea { font-family: monospace; }") as demo:
-     filters_state = gr.State([])

      gr.Markdown("## Log File Viewer")

      with gr.Row():
-         file_input = gr.File(label="Upload Log File")

      gr.Markdown("### Filters")

-     with gr.Row():
          filter_type = gr.Dropdown([
              "Include Text", "Exclude Text", "Include Regex", "Exclude Regex"
          ], label="Filter Type")
-         filter_value = gr.Textbox(label="Filter Value")
          case_sensitive_checkbox = gr.Checkbox(label="Case Sensitive", value=True)
          with gr.Column(scale=0):
              add_filter_button = gr.Button("Add Filter")
              save_filters_button = gr.Button("Save Filters")
              load_filters_file = gr.UploadButton("Load Filters (.json)", file_types=[".json"])

      with gr.Row():
          applied_filters_list = gr.Radio(label="Applied Filters", interactive=True)
          remove_filter_button = gr.Button("Remove Selected Filter")
@@ -125,107 +202,109 @@ with gr.Blocks(theme=gr.themes.Soft(), css="#log_content textarea { font-family:

      with gr.Row():
          save_filtered_log_button = gr.Button("Save Filtered Log")
-     text_output = gr.Textbox(label="Log Content", lines=20, max_lines=1000, interactive=False, elem_id="log_content")

      # Event Handlers
      add_filter_button.click(
          add_filter,
-         inputs=[filters_state, filter_type, filter_value, case_sensitive_checkbox],
-         outputs=[filters_state, filter_value]
      ).then(
          update_filter_list,
-         inputs=filters_state,
          outputs=applied_filters_list
      ).then(
-         apply_filters,
-         inputs=[file_input, filters_state],
-         outputs=text_output
      )

      remove_filter_button.click(
          remove_filter,
-         inputs=[filters_state, applied_filters_list],
-         outputs=[filters_state, applied_filters_list]
      ).then(
          update_filter_list,
-         inputs=filters_state,
          outputs=applied_filters_list
      ).then(
-         apply_filters,
-         inputs=[file_input, filters_state],
-         outputs=text_output
      )

      move_up_button.click(
          move_filter_up,
-         inputs=[filters_state, applied_filters_list],
-         outputs=[filters_state, applied_filters_list]
      ).then(
          update_filter_list,
-         inputs=filters_state,
          outputs=applied_filters_list
      ).then(
-         apply_filters,
-         inputs=[file_input, filters_state],
-         outputs=text_output
      )

      move_down_button.click(
          move_filter_down,
-         inputs=[filters_state, applied_filters_list],
-         outputs=[filters_state, applied_filters_list]
      ).then(
          update_filter_list,
-         inputs=filters_state,
          outputs=applied_filters_list
      ).then(
-         apply_filters,
-         inputs=[file_input, filters_state],
-         outputs=text_output
      )

      save_filters_button.click(
          save_filters,
-         inputs=filters_state,
          outputs=gr.File(label="Download Filter File")
      )

      load_filters_file.upload(
          load_filters,
-         inputs=load_filters_file,
-         outputs=filters_state
      ).then(
          update_filter_list,
-         inputs=filters_state,
          outputs=applied_filters_list
      ).then(
-         apply_filters,
-         inputs=[file_input, filters_state],
-         outputs=text_output
-     )
-
-     file_input.upload(
-         apply_filters,
-         inputs=[file_input, filters_state],
-         outputs=text_output
      )
 
213
  save_filtered_log_button.click(
214
  save_filtered_log,
215
- inputs=text_output,
216
  outputs=gr.File(label="Download Filtered Log")
217
  )
218
 
219
- from fastapi import FastAPI
220
-
221
- app = FastAPI()
222
- app = gr.mount_gradio_app(app, demo, path="/", root_path=os.environ.get("GRADIO_ROOT_PATH", ""))
223
-
224
  if __name__ == "__main__":
225
- import uvicorn
226
- uvicorn.run(
227
- app,
228
- host="0.0.0.0",
229
- port=7860,
230
- root_path=os.environ.get("GRADIO_ROOT_PATH", "")
231
- )
 
 import gradio as gr
 import json
 import os
+import pandas as pd
 from filter_utils import filter_lines
+from timestamp_utils import parse_timestamp
+
+def add_file(files, state):
+    if files:
+        for file in files:
+            filename = os.path.basename(file.name)
+            if filename not in state:
+                with open(file.name) as f:
+                    lines = f.readlines()
+
+                parsed_lines = []
+                for line in lines:
+                    parsed_lines.append({
+                        "timestamp": parse_timestamp(line),
+                        "content": line,
+                        "source": filename
+                    })
+                state[filename] = {"lines": parsed_lines, "filters": []}
+
+    file_selector_update = gr.update(choices=list(state.keys()), value=list(state.keys())[0] if state else None)
+    selected_file = list(state.keys())[0] if state else None
+
+    table_output_update = generate_merged_view(state)
+    filter_list_update = update_filter_list(selected_file, state)
+
+    return state, file_selector_update, table_output_update, filter_list_update
+
+def select_file(selected_file, state):
+    if selected_file and selected_file in state:
+        filter_list_update = update_filter_list(selected_file, state)
+        return filter_list_update
+    return gr.update(choices=[])

+def add_filter(state, selected_file, filter_type, filter_value, case_sensitive):
+    if selected_file and filter_value:
+        state[selected_file]["filters"].append({"type": filter_type, "value": filter_value, "case_sensitive": case_sensitive})
+    return state, ""

+def remove_filter(state, selected_file, selected_filter):
+    if selected_file and selected_filter:
         parts = selected_filter.split(" - ")
         if len(parts) == 3:
             filter_type, filter_value, case_str = parts
             case_sensitive = case_str == "Case Sensitive: True"
+            filters = state[selected_file]["filters"]
             for i, f in enumerate(filters):
                 if f["type"] == filter_type and f["value"] == filter_value and f["case_sensitive"] == case_sensitive:
                     filters.pop(i)
                     break
+    return state, None

+def move_filter_up(state, selected_file, selected_filter):
+    if selected_file and selected_filter:
         parts = selected_filter.split(" - ")
         if len(parts) == 3:
             filter_type, filter_value, case_str = parts
             case_sensitive = case_str == "Case Sensitive: True"
+            filters = state[selected_file]["filters"]
             for i, f in enumerate(filters):
                 if f["type"] == filter_type and f["value"] == filter_value and f["case_sensitive"] == case_sensitive:
                     if i > 0:
                         filters[i], filters[i-1] = filters[i-1], filters[i]
                     break
+    return state, selected_filter

+def move_filter_down(state, selected_file, selected_filter):
+    if selected_file and selected_filter:
         parts = selected_filter.split(" - ")
         if len(parts) == 3:
             filter_type, filter_value, case_str = parts
             case_sensitive = case_str == "Case Sensitive: True"
+            filters = state[selected_file]["filters"]
             for i, f in enumerate(filters):
                 if f["type"] == filter_type and f["value"] == filter_value and f["case_sensitive"] == case_sensitive:
                     if i < len(filters) - 1:
                         filters[i], filters[i+1] = filters[i+1], filters[i]
                     break
+    return state, selected_filter

+def generate_merged_view(state):
+    all_lines = []
+    for filename, data in state.items():
+        lines = data["lines"]
+        filters = data["filters"]

         processed_lines = lines
         for f in filters:
             filter_type = f["type"]
             value = f["value"]
             case = f["case_sensitive"]
+
+            content_lines = [line["content"] for line in processed_lines]

             if filter_type == "Include Text":
+                filtered_content = filter_lines(content_lines, include_text=value, case_sensitive=case)
             elif filter_type == "Exclude Text":
+                filtered_content = filter_lines(content_lines, exclude_text=value, case_sensitive=case)
             elif filter_type == "Include Regex":
+                filtered_content = filter_lines(content_lines, include_regex=value, case_sensitive=case)
             elif filter_type == "Exclude Regex":
+                filtered_content = filter_lines(content_lines, exclude_regex=value, case_sensitive=case)
+
+            processed_lines = [line for line in processed_lines if line["content"] in filtered_content]

+        all_lines.extend(processed_lines)

+    if not all_lines:
+        return pd.DataFrame(columns=["File", "Timestamp", "Log Entry"])

+    # Sort lines by timestamp
+    all_lines.sort(key=lambda x: x["timestamp"] if x["timestamp"] is not None else pd.Timestamp.min)

+    df = pd.DataFrame(all_lines)
+    df["Timestamp"] = df["timestamp"].apply(lambda x: x.strftime('%Y-%m-%d %H:%M:%S') if pd.notnull(x) else "")
+    df = df.rename(columns={"source": "File", "content": "Log Entry"})
+
+    return df[["File", "Timestamp", "Log Entry"]]
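The merge-and-sort step in `generate_merged_view` can be sketched in isolation. This is a minimal standalone version with invented sample rows (no Gradio state or filters involved); it shows how rows without a parsable timestamp sort first via `pd.Timestamp.min`:

```python
import pandas as pd
from datetime import datetime

# Hypothetical parsed rows, shaped like the entries stored per file in the app state
rows = [
    {"timestamp": datetime(2024, 6, 29, 14, 22, 28), "content": "b.log line\n", "source": "b.log"},
    {"timestamp": datetime(2024, 6, 29, 14, 22, 27), "content": "a.log line\n", "source": "a.log"},
    {"timestamp": None, "content": "no timestamp\n", "source": "a.log"},
]

# Rows with no timestamp fall back to pd.Timestamp.min, so they sort to the top
rows.sort(key=lambda x: x["timestamp"] if x["timestamp"] is not None else pd.Timestamp.min)

df = pd.DataFrame(rows)
df["Timestamp"] = df["timestamp"].apply(lambda x: x.strftime("%Y-%m-%d %H:%M:%S") if pd.notnull(x) else "")
df = df.rename(columns={"source": "File", "content": "Log Entry"})
merged = df[["File", "Timestamp", "Log Entry"]]
print(merged["Timestamp"].tolist())
```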
+
+def update_filter_list(selected_file, state):
+    if selected_file and selected_file in state:
+        filters = state[selected_file]["filters"]
+        filter_strings = [f'{f["type"]} - {f["value"]} - Case Sensitive: {f["case_sensitive"]}' for f in filters]
+        return gr.update(choices=filter_strings)
+    return gr.update(choices=[])
+
+def save_filters(selected_file, state):
+    if selected_file and selected_file in state:
+        filters = state[selected_file]["filters"]
+        with open(f"{selected_file}_filters.json", "w") as f:
+            json.dump(filters, f)
+        return f"{selected_file}_filters.json"
+    return None

+def load_filters(selected_file, state, filter_file):
+    if selected_file and filter_file is not None:
+        with open(filter_file.name) as f:
+            state[selected_file]["filters"] = json.load(f)
+    return state
+
+def save_filtered_log(log_data):
+    if not log_data.empty:
+        log_content = ""
+        for index, row in log_data.iterrows():
+            log_content += f"[{row['File']}] [{row['Timestamp']}] {row['Log Entry']}"
+
         with open("filtered_log.txt", "w") as f:
             f.write(log_content)
         return "filtered_log.txt"
     return None

+def show_regex_help():
+    gr.Info("""
+    **Regular Expression Quick Guide**
+
+    - `.` : Matches any single character.
+    - `*` : Matches the preceding character zero or more times.
+    - `+` : Matches the preceding character one or more times.
+    - `^` : Matches the start of a line.
+    - `$` : Matches the end of a line.
+    - `|` : Acts as an OR operator (e.g., `error|warn`).
+    - `[...]`: Matches any single character within the brackets (e.g., `[aeiou]`).
+    - `(...)`: Groups expressions.
+
+    **Example:** `^ERROR.*database` will find lines starting with "ERROR" that also contain "database".
+    """)
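The example pattern from the quick guide can be checked directly with Python's `re` module (sample log lines invented for illustration):

```python
import re

lines = [
    "ERROR failed to reach database\n",  # starts with ERROR and contains database
    "WARN database slow\n",              # does not start with ERROR
    "ERROR disk full\n",                 # does not contain database
]

# Same pattern as the help text: anchored at line start, "database" anywhere after
pattern = re.compile(r"^ERROR.*database")
matches = [line for line in lines if pattern.search(line)]
print(matches)  # only the first line survives
```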
+
+with gr.Blocks(theme=gr.themes.Soft(), css="#log_content .gr-dataframe { font-family: monospace; } .gradio-toast { max-width: 500px !important; }") as demo:
+    files_state = gr.State({})

     gr.Markdown("## Log File Viewer")

     with gr.Row():
+        file_input = gr.File(label="Upload Log File(s)", file_count="multiple")
+        file_selector = gr.Dropdown(label="Select Log File")

     gr.Markdown("### Filters")

+    with gr.Row(elem_id="filter_row"):
         filter_type = gr.Dropdown([
             "Include Text", "Exclude Text", "Include Regex", "Exclude Regex"
         ], label="Filter Type")
+        filter_value = gr.Textbox(label="Filter Value", scale=4)
+        with gr.Column(scale=0, min_width=50):
+            help_button = gr.Button("?", scale=0)
         case_sensitive_checkbox = gr.Checkbox(label="Case Sensitive", value=True)
         with gr.Column(scale=0):
             add_filter_button = gr.Button("Add Filter")
             save_filters_button = gr.Button("Save Filters")
             load_filters_file = gr.UploadButton("Load Filters (.json)", file_types=[".json"])

     with gr.Row():
         applied_filters_list = gr.Radio(label="Applied Filters", interactive=True)
         remove_filter_button = gr.Button("Remove Selected Filter")

     with gr.Row():
         save_filtered_log_button = gr.Button("Save Filtered Log")
+
+    log_table = gr.DataFrame(headers=["File", "Timestamp", "Log Entry"], interactive=False, elem_id="log_content")

     # Event Handlers
+    help_button.click(show_regex_help, inputs=None, outputs=None)
+
+    file_input.upload(
+        add_file,
+        inputs=[file_input, files_state],
+        outputs=[files_state, file_selector, log_table, applied_filters_list]
+    )
+
+    file_selector.change(
+        select_file,
+        inputs=[file_selector, files_state],
+        outputs=[applied_filters_list]
+    ).then(
+        generate_merged_view,
+        inputs=files_state,
+        outputs=log_table
+    )
+
     add_filter_button.click(
         add_filter,
+        inputs=[files_state, file_selector, filter_type, filter_value, case_sensitive_checkbox],
+        outputs=[files_state, filter_value]
     ).then(
         update_filter_list,
+        inputs=[file_selector, files_state],
         outputs=applied_filters_list
     ).then(
+        generate_merged_view,
+        inputs=files_state,
+        outputs=log_table
     )

     remove_filter_button.click(
         remove_filter,
+        inputs=[files_state, file_selector, applied_filters_list],
+        outputs=[files_state, applied_filters_list]
     ).then(
         update_filter_list,
+        inputs=[file_selector, files_state],
         outputs=applied_filters_list
     ).then(
+        generate_merged_view,
+        inputs=files_state,
+        outputs=log_table
     )

     move_up_button.click(
         move_filter_up,
+        inputs=[files_state, file_selector, applied_filters_list],
+        outputs=[files_state, applied_filters_list]
     ).then(
         update_filter_list,
+        inputs=[file_selector, files_state],
         outputs=applied_filters_list
     ).then(
+        generate_merged_view,
+        inputs=files_state,
+        outputs=log_table
     )

     move_down_button.click(
         move_filter_down,
+        inputs=[files_state, file_selector, applied_filters_list],
+        outputs=[files_state, applied_filters_list]
     ).then(
         update_filter_list,
+        inputs=[file_selector, files_state],
         outputs=applied_filters_list
     ).then(
+        generate_merged_view,
+        inputs=files_state,
+        outputs=log_table
     )

     save_filters_button.click(
         save_filters,
+        inputs=[file_selector, files_state],
         outputs=gr.File(label="Download Filter File")
     )

     load_filters_file.upload(
         load_filters,
+        inputs=[file_selector, files_state, load_filters_file],
+        outputs=files_state
     ).then(
         update_filter_list,
+        inputs=[file_selector, files_state],
         outputs=applied_filters_list
     ).then(
+        generate_merged_view,
+        inputs=files_state,
+        outputs=log_table
     )

     save_filtered_log_button.click(
         save_filtered_log,
+        inputs=log_table,
         outputs=gr.File(label="Download Filtered Log")
     )

 if __name__ == "__main__":
+    demo.launch(server_name="0.0.0.0", server_port=7860)
logviewer/timestamp_utils.py CHANGED
@@ -4,26 +4,24 @@ import re
 def parse_timestamp(log_line: str) -> datetime.datetime | None:
     """
     Attempts to parse a timestamp from the beginning of a log line.
-    This is a placeholder and should be extended to handle various timestamp formats.
+    This function supports multiple timestamp formats.
     """
-    # Example: YYYY-MM-DD HH:MM:SS.milliseconds or YYYY-MM-DDTHH:MM:SS,SSS
-    # This regex is a starting point and might need to be adjusted for specific log formats.
+    # Format 1: YYYY-MM-DD HH:MM:SS.milliseconds or YYYY-MM-DDTHH:MM:SS,SSS
     match = re.match(r'^(\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}[.,]\d{3})', log_line)
     if match:
         try:
-            # Try parsing with milliseconds
             return datetime.datetime.strptime(match.group(1).replace('T', ' ').replace(',', '.'), '%Y-%m-%d %H:%M:%S.%f')
         except ValueError:
             pass

-    # Add more parsing attempts for other common formats here
-    # Example: "MMM DD HH:MM:SS" (e.g., "Jun 28 10:30:00")
+    # Format 2: "MMM DD HH:MM:SS" (e.g., "Jun 29 14:22:27")
     match = re.match(r'^(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)\s+\d{1,2}\s+\d{2}:\d{2}:\d{2}', log_line)
     if match:
         try:
-            # For this format, we might need to assume the current year or pass it in
-            # For simplicity, let's just return None for now if it's not a full datetime
-            pass
+            # This format doesn't have a year, so we assume the current year.
+            current_year = datetime.datetime.now().year
+            timestamp_str = f"{current_year} {match.group(0)}"
+            return datetime.datetime.strptime(timestamp_str, '%Y %b %d %H:%M:%S')
         except ValueError:
             pass
27
 
requirements.txt CHANGED
@@ -1,3 +1,4 @@
 gradio
 uvicorn
 requests
+pandas