---
title: 'Court Case Scheduling Dashboard: Code4Change'
sdk: docker
pinned: true
---
# Code4Change: Intelligent Court Scheduling System

Built for hackathon evaluation, this repository runs out of the box via the Streamlit dashboard and the uv tool. It can run locally, in Docker, or on Hugging Face Spaces (Docker runtime).

## Requirements

- Python 3.11+
- uv (required)
  - macOS/Linux: `curl -LsSf https://astral.sh/uv/install.sh | sh`
  - Windows (PowerShell): `irm https://astral.sh/uv/install.ps1 | iex`
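
Once installed, a quick version check confirms uv is on your PATH (optional; assumes the install script completed and you have restarted your shell):

```bash
# Optional sanity check: both commands should succeed before continuing
uv --version
python --version   # expect 3.11 or newer (use python3 on macOS/Linux)
```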

## Quick Start (Dashboard)

1. Install uv (see above) and ensure Python 3.11+ is available.
2. Clone this repository.
3. Navigate to the repo root:
```bash
cd path/to/repo
```
4. Install dependencies (uv creates and manages the virtual environment automatically):
```bash
uv sync
```
5. Launch the dashboard:
```bash
uv run streamlit run app.py
```

Then open http://localhost:8501 in your browser.

The dashboard provides:
- Run EDA pipeline (process raw data and extract parameters)
- Explore data and parameters
- Generate cases and run simulations
- Review cause lists and judge overrides
- Compare performance and export reports

## Command Line (optional)

All operations are available via CLI as well:

```bash
uv run court-scheduler --help

# End-to-end workflow
uv run court-scheduler workflow --cases 10000 --days 384
```

For a detailed walkthrough tailored for judges, see `docs/HACKATHON_SUBMISSION.md`.

## Run with Docker (recommended for judges)

If you prefer not to install Python or uv locally, use the provided Docker image.

1) Build the image (run in repo root):

```bash
docker build -t code4change-analysis .
```
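
As an optional sanity check, confirm the image was built before moving on:

```bash
# Lists the image if the build succeeded; no output rows means the build failed
docker images code4change-analysis
```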

2) Show CLI help (Windows PowerShell example with volume mounts):

```powershell
docker run --rm `
  -v ${PWD}\Data:/app/Data `
  -v ${PWD}\outputs:/app/outputs `
  code4change-analysis court-scheduler --help
```

3) Example CLI workflow:

```powershell
docker run --rm `
  -v ${PWD}\Data:/app/Data `
  -v ${PWD}\outputs:/app/outputs `
  code4change-analysis court-scheduler workflow --cases 10000 --days 384
```

4) Run the Streamlit dashboard:

```powershell
docker run --rm -p 7860:7860 `
  -v ${PWD}\Data:/app/Data `
  -v ${PWD}\outputs:/app/outputs `
  code4change-analysis
```

Then open http://localhost:7860.

Note for Windows CMD: use `^` for line continuation and replace `${PWD}` with the full repository path.
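
For example, the dashboard command above translates to Windows CMD as follows (the repository path shown is a placeholder; substitute your actual checkout location):

```cmd
docker run --rm -p 7860:7860 ^
  -v C:\path\to\repo\Data:/app/Data ^
  -v C:\path\to\repo\outputs:/app/outputs ^
  code4change-analysis
```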

## Deploy on Hugging Face Spaces (Docker)

This repository is ready for Hugging Face Spaces using the Docker runtime.

View the live demo at: https://royaalekh-hackathon-code4change.hf.space/


## Data (Parquet format)

This repository uses the Parquet data format for efficient loading and processing.
The provided Excel and CSV files have been pre-converted to Parquet and stored in the `Data/` folder.

No manual pre-processing is required; launch the dashboard and click “Run EDA Pipeline.”

## Project Structure

Key paths updated to reflect recent refactor:

- `app.py` — Streamlit entrypoint at the repository root (replaces previous nested path)
- `src/` — all scheduler, simulation, dashboard, and core modules (migrated from `scheduler/`)
- `pages/` and `src/dashboard/pages/` — Streamlit multipage content
- `Data/` — input data in Parquet/CSV
- `outputs/` — generated artifacts (cause lists, reports)
- `docs/` — documentation and hackathon submission details
- `Dockerfile` — Docker image definition for local and Hugging Face deployment