**File:** `streamlit/.claude/hooks/pre_bash_redirect.py`
```python
#!/usr/bin/env python3
# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""PreToolUse(Bash) hook: block pytest commands for e2e_playwright and
redirect to make targets.

Exit code semantics (as of Claude Code hooks):
- exit 0: allow tool call
- exit 2: BLOCK; stderr is fed back to Claude so it corrects its plan
  automatically
"""

import json
import re
import sys

# Pattern to match pytest commands, including:
# - pytest
# - python -m pytest
# - python3 -m pytest
# - with optional whitespace variations
PYTEST_PATTERN = re.compile(
    r"""
    ^           # start of string
    (?:         # non-capturing group for optional python invocation
        python  # 'python'
        (?:3)?  # optional '3'
        \s+     # whitespace
        -m      # '-m'
        \s+     # whitespace
    )?
    pytest      # 'pytest'
    \b          # word boundary
    """,
    re.IGNORECASE | re.VERBOSE,
)


def main():
    try:
        payload = json.load(sys.stdin)
    except Exception as e:
        # Fail secure: block (exit 2) if we can't parse input to verify safety.
        print(  # noqa: T201
            f"Policy: Failed to parse hook input ({type(e).__name__}: {e}). "
            f"Blocking tool call for safety.",
            file=sys.stderr,
        )
        sys.exit(2)

    if payload.get("hook_event_name") != "PreToolUse":
        sys.exit(0)
    if payload.get("tool_name") != "Bash":
        sys.exit(0)

    cmd = (payload.get("tool_input") or {}).get("command", "") or ""
    norm = re.sub(r"\s+", " ", cmd).strip()

    # Check if this is a pytest command targeting e2e_playwright
    if PYTEST_PATTERN.search(norm) and "e2e_playwright" in norm:
        print(  # noqa: T201
            f"Policy: Bash('{norm}') is blocked.\n"
            f"E2E tests should use make commands instead:\n"
            f"  - Run 'make help' to see available targets\n"
            f"  - Use 'make run-e2e-test <filename>' for e2e tests\n",
            file=sys.stderr,
        )
        sys.exit(2)

    sys.exit(0)


if __name__ == "__main__":
    main()
```
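The hook's blocking logic can be exercised in isolation. This sketch reproduces the regex from `pre_bash_redirect.py` in compact (non-verbose) form and checks a few representative commands:

```python
import re

# Same pattern as PYTEST_PATTERN in pre_bash_redirect.py, written compactly.
PYTEST_PATTERN = re.compile(r"^(?:python(?:3)?\s+-m\s+)?pytest\b", re.IGNORECASE)


def is_blocked(cmd: str) -> bool:
    """Return True if the hook would block this Bash command."""
    norm = re.sub(r"\s+", " ", cmd).strip()
    return bool(PYTEST_PATTERN.search(norm)) and "e2e_playwright" in norm


# Blocked: pytest invocations targeting e2e_playwright files.
assert is_blocked("pytest e2e_playwright/st_button_test.py")
assert is_blocked("python3 -m pytest e2e_playwright/st_button_test.py")
# Allowed: unit-test runs and the sanctioned make target.
assert not is_blocked("pytest lib/tests/streamlit/my_example_test.py")
assert not is_blocked("make run-e2e-test e2e_playwright/st_button_test.py")
```

Because only commands that both start with a pytest invocation and mention `e2e_playwright` are blocked, regular unit-test runs stay unaffected.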
**File:** `streamlit/.claude/settings.json`
```json
{
  "hooks": {
    "PreToolUse": [
      {
        "matcher": "Bash",
        "hooks": [
          {
            "type": "command",
            "command": "$CLAUDE_PROJECT_DIR/.claude/hooks/pre_bash_redirect.py",
            "timeout": 10
          }
        ]
      }
    ]
  }
}
```
**File:** `streamlit/.cursor/commands/py-docstring-fix.md`
# Python Docstrings - Fix

## Overview

Add or correct Numpydoc-style docstrings for public modules, classes, functions, and methods without changing code behavior. Refer to the Docstrings guidance in `.cursor/rules/python.mdc`.

## Success Criteria

- All public APIs have clear, accurate Numpydoc docstrings.
- Module-level docstrings exist for user-facing modules.
- Parameter names, types, defaults, and behavior align with the implementation and type hints.
- Relevant exceptions are documented in a `Raises` section.
- Nontrivial functions/classes include a minimal, runnable `Examples` section.
- `make python-lint` and `make python-tests` pass.

## What to do

1. Discover targets
   - Find public callables and classes missing docstrings or with incomplete/incorrect content.
   - Add module-level docstrings where users are expected to import from the module.
2. Write/repair docstrings (Numpydoc)
   - Start with a one-sentence summary line (imperative mood), followed by a short paragraph if useful.
   - Common sections to use:
     - `Parameters`: names, types, and concise descriptions; include defaults (e.g., `float, default=1e-9`).
     - `Returns` or `Yields`: type and meaning; describe shape/units if applicable.
     - `Raises`: list exceptions and the conditions that trigger them.
     - `Examples`: simple doctest-style examples for nontrivial behavior.
     - Optionally: `See Also`, `Notes`, `Warnings` for important context.
   - For classes, document constructor arguments in the class docstring rather than `__init__`.
   - Keep terminology consistent with code; align with type hints and actual behavior.
3. Validate
   - Ensure no code behavior or signatures change.
   - Run `make python-lint` and `make python-tests` and fix any issues.

## Do not

- Modify production logic, public APIs, or test behavior.
- Reformat unrelated code.
- Add speculative sections (e.g., `Raises`) that do not reflect actual behavior.

## Example (format only)

```python
def normalize(values: Sequence[float], *, epsilon: float = 1e-9) -> list[float]:
    """Normalize values so they sum to 1.

    Parameters
    ----------
    values : Sequence[float]
        Input numbers; must be finite and non-empty.
    epsilon : float, default=1e-9
        Small constant to avoid division by zero.

    Returns
    -------
    list[float]
        Normalized weights summing to 1.

    Raises
    ------
    ValueError
        If `values` is empty or contains non-finite numbers.

    Examples
    --------
    >>> normalize([1, 1, 2])
    [0.25, 0.25, 0.5]
    """
```

## Checklist

- [ ] Public modules/classes/functions/methods documented with Numpydoc.
- [ ] Module-level docstrings added where applicable.
- [ ] Parameter names, types, and defaults aligned with code and hints.
- [ ] Relevant exceptions listed in `Raises`.
- [ ] Examples added for nontrivial APIs.
- [ ] `make python-lint` and `make python-tests` pass.
**File:** `streamlit/.cursor/commands/py-parameterize-test.md`
# Parameterize Tests

## Overview

Convert repetitive `pytest` tests into parameterized tests using `pytest.mark.parametrize` to reduce duplication and improve maintainability, without changing behavior or reducing test coverage.

## Success Criteria

- Coverage is maintained or increased.
- All tests pass via `pytest`.
- Each new or updated test includes a brief numpydoc-style docstring.
- Provide meaningful `ids` for parameter sets to aid readability.
- Preserve existing markers and fixtures; use `pytest.param(..., marks=...)` for case-specific marks (e.g., `xfail`, `skip`).

## What to do

1. Discover candidates
   - Identify tests that only vary input values, expected outputs, or expected exceptions.
   - Include existing edge and boundary cases; do not remove unique scenarios.
2. Design parameters
   - Choose clear argument names and create tuples for each case.
   - Use `pytest.param` when individual cases require marks.
   - Add `ids=[...]` for stable, human-readable case names.
3. Implement
   - Replace duplicated tests with a single parameterized test.
   - Keep the assertion logic equivalent; do not change runtime behavior.
   - Deduplicate or promote fixtures only if they are reused (2+ call sites); otherwise keep them local.
4. Validate
   - Run `pytest` on applicable files and ensure all tests pass.
   - If grouping introduces ambiguity or flakiness, split or adjust the parameterization.

## Do not

- Modify production (non-test) code.
- Remove or weaken unique test scenarios.
- Over-parameterize when it reduces readability or obscures intent.

## Checklist

- [ ] Parameterizable tests consolidated with `pytest.mark.parametrize`.
- [ ] All tests pass (`pytest`).
- [ ] Coverage maintained or increased.
- [ ] Brief numpydoc docstring on new/updated tests.
- [ ] Meaningful `ids` added; per-case marks preserved with `pytest.param`.
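The target shape the guide describes can be sketched as follows. The `total` function and the case ids are hypothetical placeholders, not from the Streamlit codebase:

```python
import pytest


def total(values: list[int]) -> int:
    """Hypothetical function under test."""
    return sum(values)


@pytest.mark.parametrize(
    ("values", "expected"),
    [
        pytest.param([1, 2, 3], 6, id="positive"),
        pytest.param([], 0, id="empty"),
        pytest.param([-1, 1], 0, id="mixed-signs"),
    ],
)
def test_total(values: list[int], expected: int) -> None:
    """Check that total() sums the given values."""
    assert total(values) == expected
```

Each `pytest.param(..., id=...)` gives the case a stable, human-readable name in test output, and per-case marks such as `xfail` can be attached via the `marks=` argument without touching the other cases.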
**File:** `streamlit/.cursor/rules/e2e_playwright.mdc`
---
description:
globs: e2e_playwright/**/*.py
alwaysApply: false
---

# Streamlit E2E Tests

We use Playwright with pytest to e2e test the Streamlit library. E2E tests verify the complete Streamlit system (frontend, backend, communication, state, visual appearance) from a user's perspective (black-box). They complement Python/JS unit tests, which are faster and focus on internal logic, input/output validation, and specific message sequences. Use E2E tests when testing behavior that requires the full stack or visual verification, especially for new elements or significant changes to existing ones.

## Test Structure

- Located in `e2e_playwright/`
- Each test consists of two files:
  - `*.py`: Streamlit app script that's being tested
  - `*_test.py`: Playwright pytest file that runs the app and tests it
- If the test is specific to a Streamlit element, prefix the filename with `st_<element_name>`
- Tests can use screenshot comparisons for visual verification
  - All screenshots are stored in `e2e_playwright/__snapshots__/<os>/`
- Other e2e test results are stored in `e2e_playwright/test_results/`, which includes:
  - `e2e_playwright/test_results/<test_name>/`: Video and traces related to the failed test.
  - `e2e_playwright/test_results/snapshot-tests-failures/<os>/<test_script>/<test_name>/`: Expected, actual, and diff screenshots of the failed snapshot test.
  - `e2e_playwright/test_results/snapshot-updates/<os>/<test_script>/<test_name>/`: All updated screenshots of the failed test.

## Key Fixtures and Utilities

Import from `conftest.py`:

- `app: Page` - Light mode app fixture
- `themed_app: Page` - Light & dark mode app fixture
- `assert_snapshot` - Screenshot testing fixture. Ensure the element is stable before calling.
- `wait_for_app_run(app)` - Wait for app run to finish
- `wait_for_app_loaded(app)` - Wait for initial app load
- `rerun_app(app)` - Trigger app rerun and wait
- `wait_until(app, fn)` - Run test function until True or timeout

## Best Practices

- As a guiding principle, tests should resemble how users interact with the UI.
- Use `expect` for auto-wait assertions, not `assert` (reduces flakiness).
- If `expect` is insufficient, use the `wait_until` utility. Never use `wait_for_timeout`.
- Prefer label- or key-based locators over index-based access (e.g. `get_by_test_id().nth(0)`). The recommended order of priority is:
  1. Get elements by label (see `app_utils` methods, e.g. `get_text_input`).
  2. For elements that don't support `label` but support `key`: get elements by a unique key (`get_element_by_key`).
  3. If the element doesn't support key or label, you can wrap it with an `st.container(key="my_key")` to better target it via `get_element_by_key`, e.g. `get_element_by_key("my_key").get_by_test_id("stComponent")`.
- Prefer stable locators like `get_by_test_id`, `get_by_text` or `get_by_role` over CSS / XPath selectors via `.locator`.
- Group related tests into single, logical test files (e.g., by widget or feature) for CI efficiency.
- Minimize screenshot surface area; screenshot specific elements, not the whole page unless necessary.
- Use descriptive test names.
- Ensure screenshotted elements are under 640px in height to avoid clipping by the header.
- Naming convention for command-related snapshots: `st_command-test_description`
- Take a look at other tests in `e2e_playwright/` for inspiration.
- E2E tests are expensive; please test each aspect only once.
- Make use of shared `app_utils` methods (import from `e2e_playwright.shared.app_utils`) where applicable.
- Make sure that tests mix different ways of interacting (e.g. fill and type for input fields) for increased coverage.
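The retry behavior behind the `wait_until` recommendation can be approximated with a plain-Python sketch. This is a simplified illustration only; the real fixture takes the Playwright `app` page and is imported from `conftest.py`:

```python
import time
from typing import Callable


def wait_until(fn: Callable[[], bool], timeout: float = 5.0, interval: float = 0.1) -> None:
    """Poll fn until it returns True, raising TimeoutError after the deadline."""
    deadline = time.monotonic() + timeout
    while True:
        if fn():
            return
        if time.monotonic() >= deadline:
            raise TimeoutError("Condition was not met within the timeout.")
        time.sleep(interval)


# The condition becomes true on the third poll; wait_until returns
# as soon as that happens instead of sleeping a fixed amount.
counter = {"n": 0}


def condition() -> bool:
    counter["n"] += 1
    return counter["n"] >= 3


wait_until(condition, timeout=2.0, interval=0.01)
```

Unlike a fixed `wait_for_timeout`, polling returns as soon as the condition holds and fails loudly if it never does, which is why it is the preferred fallback when `expect` is insufficient.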
## Writing Tests & Common Scenarios

When adding or modifying tests for an element, ensure the following are covered:

- **Visuals:** Snapshot tests for both normal and `disabled` states.
- **Interactivity:** Test user interactions and verify the resulting app state or output (e.g., checking text written via `st.write`, potentially using helpers like `expect_markdown` from `shared/app_utils.py`).
- **Common Contexts:** Verify behavior within:
  - A `@st.fragment`.
  - An `st.form`.
- **Core Behavior:**
  - State persistence (widget value is retained) if the element is temporarily unmounted and remounted.
  - The element cannot be interacted with when `disabled=True`.
  - If the element uses the `help` parameter, verify the tooltip appears correctly on hover.
  - If the element uses the `key` parameter, verify a corresponding CSS class or attribute is set.
  - If the element is a widget, make sure to test that its identity is kept stable when `key` is provided.
- **Custom Config:** Use module-scoped fixtures with `@pytest.mark.early` for tests requiring specific Streamlit configuration options.

## Running tests

- Single test: `make run-e2e-test e2e_playwright/name_of_the_test.py`
- Debug test (needs manual interactions): `make debug-e2e-test e2e_playwright/name_of_the_test.py`
- If frontend logic was changed, run `make frontend-fast` to update the frontend.
- You can ignore missing or mismatched snapshot errors; these need to be updated manually.
**File:** `streamlit/.cursor/rules/make_commands.mdc`
---
description: List of all available make commands
globs:
alwaysApply: false
---

# Available `make` commands

List of all `make` commands that are available for execution from the repository root folder:

- `help`: Show all available make commands.
- `all`: Install all dependencies, build frontend, and install editable Streamlit.
- `all-dev`: Install all dependencies and editable Streamlit, but do not build the frontend.
- `init`: Install all dependencies and build protobufs.
- `clean`: Remove all generated files.
- `protobuf`: Recompile Protobufs for Python and the frontend.
- `python-init`: Install Python dependencies and Streamlit in editable mode.
- `python-lint`: Lint and check formatting of Python files.
- `python-format`: Format Python files.
- `python-tests`: Run Python unit tests.
- `python-performance-tests`: Run Python performance tests.
- `python-integration-tests`: Run Python integration tests. Requires `integration-requirements.txt` to be installed.
- `python-types`: Run the Python type checker.
- `frontend-init`: Install all frontend dependencies.
- `frontend`: Build the frontend.
- `frontend-with-profiler`: Build the frontend with the profiler enabled.
- `frontend-fast`: Build the frontend (as fast as possible).
- `frontend-dev`: Start the frontend development server.
- `frontend-lint`: Lint and check formatting of frontend files.
- `frontend-types`: Run the frontend type checker.
- `frontend-format`: Format frontend files.
- `frontend-tests`: Run frontend unit tests and generate coverage report.
- `frontend-typesync`: Check for unsynced frontend types.
- `update-frontend-typesync`: Install missing TypeScript typings for dependencies.
- `update-snapshots`: Update e2e playwright snapshots based on the latest completed CI run.
- `update-snapshots-changed`: Update e2e playwright snapshots of changed e2e files based on the latest completed CI run.
- `update-material-icons`: Update material icons based on the latest Google material symbol version.
- `update-emojis`: Update emojis based on the latest emoji version.
- `update-notices`: Update the notices file (licenses of frontend assets and dependencies).
- `update-headers`: Update all license headers.
- `update-min-deps`: Update the minimum dependency constraints file.
- `debug-e2e-test`: Run a playwright e2e test in debug mode. Use it via `make debug-e2e-test st_command_test.py`.
- `run-e2e-test`: Run a playwright e2e test. Use it via `make run-e2e-test st_command_test.py`.
- `trace-e2e-test`: Run an e2e test with tracing and view it. Use via `make trace-e2e-test <test_file.py>::<test_func>`.
- `lighthouse-tests`: Run Lighthouse performance tests.
- `bare-execution-tests`: Run all e2e tests in bare mode.
- `cli-smoke-tests`: Run CLI smoke tests.
- `autofix`: Autofix linting and formatting errors.
- `package`: Create Python wheel files in `dist/`.
- `conda-package`: Create conda distribution files.
**File:** `streamlit/.cursor/rules/new_feature.mdc`
---
description: Implementation guide for new features
globs:
alwaysApply: false
---

# New Feature - Implementation Guide

- Most features need to be implemented in the backend in `lib/streamlit/` and the frontend in `frontend/`, and will need changes to our protobuf definitions in `proto/`.
- New features should be covered by Python unit tests in `lib/tests`, Vitest unit tests, and e2e playwright tests in `e2e_playwright/`.
- Implementing new element commands requires additional steps to correctly register the element (see notes below).

## Order of implementation

1. Implement protobuf changes in `proto/` & run `make protobuf` (-> @protobuf.mdc)
   - Note: new elements need to be added to `proto/streamlit/proto/Element.proto`.
2. Implement the backend in `lib/streamlit/` (-> @python_lib.mdc)
   - Note: new elements need to be added to `lib/streamlit/__init__.py`.
3. Implement Python unit tests in `lib/tests` & run via `PYTHONPATH=lib pytest lib/tests/streamlit/the_test_name.py` (-> @python_tests.mdc)
   - Note: new elements need to be added to `lib/tests/streamlit/element_mocks.py`.
4. Implement frontend changes in `frontend/` (-> @typescript.mdc)
   - Note: new elements need to be added to `frontend/lib/src/components/core/Block/ElementNodeRenderer.tsx`.
5. Implement Vitest unit tests in `*.test.tsx` & run via `cd frontend && yarn vitest lib/src/components/elements/NewElement/NewElement.test.tsx` (-> @typescript_tests.mdc)
6. Implement an e2e playwright test in `e2e_playwright/` & run via `make run-e2e-test e2e_playwright/name_of_the_test.py` (-> @e2e_playwright.mdc)
7. Run `make autofix` to auto-fix linting and formatting issues.
**File:** `streamlit/.cursor/rules/overview.mdc`
---
description:
globs:
alwaysApply: true
---

# Streamlit Repo Overview

Streamlit is an open-source (Apache 2.0) Python library for creating interactive web applications and dashboards, with a focus on data apps and internal tools.

## Tech Stack

- **Backend (Server):** Python, Tornado server, pytest
- **Frontend (Web UI):** TypeScript, React, Emotion (CSS-in-JS), Vite, Vitest
- **Communication:** Protocol Buffers (protobuf) over WebSocket.

## Folder Structure

- `lib/`: All backend code and assets.
  - `streamlit/`: The main Streamlit library package.
    - `streamlit/elements/`: Backend code of elements and widgets.
    - `streamlit/runtime/`: App runtime and execution logic.
    - `streamlit/web/`: Web server and CLI implementation.
  - `tests`: Python unit tests (pytest).
- `frontend/`: All frontend code and assets.
  - `app/`: Streamlit application UI.
  - `lib/`: Shared TypeScript library that contains elements, widgets, and layouts.
  - `connection/`: WebSocket connection handling logic.
  - `utils/`: Shared utilities.
- `proto/streamlit/proto/`: Protobuf definitions for client-server communication.
- `e2e_playwright/`: E2E tests using playwright (via pytest).
- `scripts/`: Utility scripts for development and CI/CD.
- `component-lib/`: Library for building Streamlit custom components.
- `.github/workflows/`: GitHub Actions workflows used for CI/CD.

### Shell & Build Policy (AI Agents)

- Prefer `make` targets for all dev tasks (tests, lint, format, builds).
- For Python unit tests: `pytest` commands are allowed and encouraged for running specific tests during development.
- For E2E tests: `pytest` commands targeting `e2e_playwright/` files are blocked by policy. Use `make run-e2e-test <filename>` instead.

## `make` commands

Selection of `make` commands for development (run in the repo root):

- `help`: Show all available make commands.
- `protobuf`: Recompile Protobufs for Python and the frontend.
- `autofix`: Autofix linting and formatting errors.

**Backend Development (Python):**

- `python-lint`: Lint and check formatting of Python files (ruff).
- `python-tests`: Run all Python unit tests (pytest).
- `python-types`: Run the Python type checker (mypy & ty).
- `python-format`: Format Python files (ruff).

**Frontend Development (TypeScript):**

- `frontend-fast`: Build the frontend (vite).
- `frontend-dev`: Start the frontend development server (hot-reload).
- `frontend-lint`: Lint and check formatting of frontend files (eslint).
- `frontend-types`: Run the TypeScript type checker (tsc).
- `frontend-format`: Format frontend files (eslint).
- `frontend-tests`: Run all frontend unit tests (vitest).

**E2E Testing (Playwright):**

- `debug-e2e-test`: Run an e2e test in debug mode, via `make debug-e2e-test st_command_test.py`.
- `run-e2e-test`: Run an e2e test, via `make run-e2e-test st_command_test.py`.

### Development Tips

- **Follow existing patterns**: Check neighboring files for conventions.
- You can use the `work-tmp` directory to store temporary files, specs, and scripts.
- If a `make` command fails, remember to run it from the root / top-level directory.
- The hot-reload dev server for the frontend will be available at <http://localhost:3000>.

## Testing Strategy

- **Python Unit Tests**: Test internal behavior without the frontend.
- **Frontend Unit Tests**: Test React components, hooks, and related functionality with Vitest and React Testing Library.
- **E2E Tests**: Test the entire app logic end-to-end with Playwright.
- **(Python) Type Tests**: Verify public API typing with mypy `assert_type`.
- Prefer running specific tests / test scripts for newly added tests instead of the entire test suite.
**File:** `streamlit/.cursor/rules/protobuf.mdc`
---
description:
globs: **/*.proto
alwaysApply: false
---

# Protobuf

Protobuf messages are used for communication between the Streamlit backend and frontend via WebSocket connections.

**Note**: Messages are "released" once shipped in any Streamlit version.

## Protobuf Compatibility

Always keep protobuf messages backwards compatible. New versions must work with older Streamlit versions.

**Incompatible changes to avoid:**

- Removing a field → Add a `// DEPRECATED` comment and mark `[deprecated=true]`
- Renaming a field → Deprecate the old field, add a new field with the next available number
- Changing field numbers → Keep all existing numbers unchanged
- Adding/removing `optional` → Deprecate and create a new field
- Changing field types incompatibly → Use a new field with a compatible type

**Compatible changes (safe to make):**

- Adding new optional fields
- Adding comments
- Marking fields as deprecated
- Modifying/removing unreleased fields

## Compile Protobuf

Changes requiring compilation:

- Adding a field
- **Unreleased messages only:** Modifying/removing fields

Changes not requiring compilation:

- Comments and the `[deprecated=true]` notation

Run this command to recompile the protobufs:

```bash
make protobuf
```

## Important Files

- `ForwardMsg.proto`: Root message used to send information from the server to the frontend/browser.
- `BackMsg.proto`: Root message sent from the browser to the server, e.g. script rerun requests.
- `NewSession.proto`: First message that is sent to the browser on every rerun.
- `Block.proto`: Contains all block types. A block is a layout container for elements (e.g. columns, tabs, popovers, etc.).
- `Element.proto`: Contains all element types. An element is a UI component.
**File:** `streamlit/.cursor/rules/python.mdc`
---
description:
globs: **/*.py
alwaysApply: false
---

# Python Development Guide

- Supported Python versions: 3.10 - 3.14
- Docstrings: Numpy style
- Linter: Ruff 0.x (`.ruff.toml`)
- Formatter: Ruff 0.x (`.ruff.toml`)
- Type Checker: mypy 1.x (`mypy.ini`)
- Testing: pytest 8.x (`lib/pytest.ini`)

## Key Principles

- PEP 8 Compliance: Adhere to PEP 8 guidelines for code style, with Ruff as the primary linter and formatter.
- Elegance and Readability: Strive for elegant and Pythonic code that is easy to understand and maintain.
- Zen of Python: Keep the Zen of Python in mind when making design decisions.
- Avoid inheritance (prefer composition).
- Avoid methods (prefer non-class functions, or static).
- Name functions and variables in such a way that you don't need comments to explain the code.
- Python folder and file names should all be snake_cased, regardless of what they contain.
- Prefer importing entire modules instead of single functions: `from streamlit import mymodule` over `from streamlit.mymodule import internal_function`.
- Prefer keyword arguments; use positional values only for required values that frame the API. Enhancing arguments should be keyword-only.
- Capitalize comments, use proper grammar and punctuation, and no cursing.
- Inside a module, anything declared at the root level MUST be prefixed with `_` if it's only used inside that module (anything private).
- Prioritize new features in Python 3.10+.

## Docstrings

- Use Numpydoc style.
- Docstrings are meant for users of a function, not developers who may edit the internals of that function in the future. If you want to talk to future developers, use comments.
- All modules that we expect users to interact with must have top-level docstrings. If a user is not meant to interact with a module, docstrings are optional.

## Package Structure

- `streamlit/`: The main Streamlit library package.
  - `streamlit/elements`: Backend code of elements and widgets.
  - `streamlit/runtime`: App runtime and execution logic.
  - `streamlit/web`: Web server and CLI implementation.
  - `streamlit/commands`: `st` commands that don't add UI elements.
  - `streamlit/components`: Backend implementation of custom components.
  - `streamlit/hello`: `streamlit hello` app implementation.
  - `streamlit/navigation`: Multi-page app implementation.
  - `streamlit/proto`: Generated protobuf definitions for client-server communication.
  - `streamlit/testing`: AppTest v1 implementation.
  - `streamlit/vendor`: Vendored dependencies.
  - `streamlit/watcher`: File-watcher implementations.
  - `streamlit/__init__.py`: Defines all commands in the `st` namespace.
- `setup.py`: Setup configuration of the Streamlit library.
- `tests`: Python unit tests (pytest).

## Typing

- Add typing annotations to every new function, method, or class member.
- Use `typing_extensions` for back-porting newer typing features.
- Use future annotations via `from __future__ import annotations`.

## Relevant `make` commands

Run from the repo root:

- `make python-lint`: Lint and check formatting of Python files (ruff).
- `make python-tests`: Run all Python unit tests (pytest).
- `make python-types`: Run the Python type checker (mypy & ty).
- `make python-format`: Format Python files (ruff).
**File:** `streamlit/.cursor/rules/python_lib.mdc`
---
description:
globs: lib/streamlit/**/*.py
alwaysApply: false
---

# Streamlit Lib Python Guide

Tips and guidelines specific to the development of the Streamlit Python library; not applicable to scripts and e2e tests.

## Logging

If something needs to be logged, please use our logger - which returns a default Python logger - with an appropriate logging level:

```python
from streamlit.logger import get_logger

_LOGGER: Final = get_logger(__name__)
```

## Unit Tests

We use unit tests to cover internal behavior that can work without the web / backend counterpart, and e2e tests to test the entire system. We aim for high unit test coverage (90% or higher) of our Python code in `lib/streamlit`.

- Under `lib/tests/streamlit`, add a new test file
  - Preferably in the mirrored directory structure as the non-test files.
  - Naming: `my_example_test.py`

### Typing Tests

We also have typing tests in `lib/tests/streamlit/typing` for our public API to catch typing errors in parameters or return types by using mypy and `assert_type`. Check other typing tests in the `lib/tests/streamlit/typing` directory for inspiration.
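A minimal sketch of what such an `assert_type` check looks like. The `scale` function is a hypothetical stand-in for a public API; the real tests live in `lib/tests/streamlit/typing`:

```python
from __future__ import annotations

try:
    from typing import assert_type  # Python 3.11+
except ImportError:  # pragma: no cover
    from typing_extensions import assert_type


def scale(value: float, factor: float = 2.0) -> float:
    """Hypothetical public API under test."""
    return value * factor


# mypy checks the asserted type statically; at runtime assert_type
# simply returns its first argument unchanged.
result = assert_type(scale(1.5), float)
```

If `scale` were ever changed to return, say, `float | None`, mypy would flag the `assert_type` call, catching the API-typing regression without executing anything.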
**File:** `streamlit/.cursor/rules/python_tests.mdc`
---
description:
globs: lib/tests/**/*.py
alwaysApply: false
---

# Python Unit Test Guide

We use unit tests to cover internal behavior that can work without the web / backend counterpart. We aim for high unit test coverage (90% or higher) of our Python code in `lib/streamlit`.

## Key Principles

- Prefer pytest or pytest plugins over unittest.
- For every new test function, please add a brief docstring comment (numpydoc style).
- New tests should be fully annotated with types.
- Skip tests (via `pytest.mark.skipif`) requiring CI secrets if the environment variables are not set.
- Parameterized Tests: Use `@parameterized.expand` whenever it is possible to combine overlapping tests with varying inputs.

## Running tests

- Run all tests (execute from the repo root):

```bash
make python-tests
```

- Run a specific test file:

```bash
PYTHONPATH=lib pytest lib/tests/streamlit/my_example_test.py
```

- Run a specific test inside a test file:

```bash
PYTHONPATH=lib pytest lib/tests/streamlit/my_example_test.py -k test_that_something_works
```
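The skip-on-missing-secrets principle can be sketched like this. The `EXAMPLE_API_KEY` variable name is a hypothetical placeholder; real tests should check the actual CI secret's name:

```python
import os

import pytest

# Hypothetical secret name used purely for illustration.
requires_api_key = pytest.mark.skipif(
    "EXAMPLE_API_KEY" not in os.environ,
    reason="EXAMPLE_API_KEY is not set in this environment",
)


@requires_api_key
def test_api_key_is_nonempty() -> None:
    """Check that the configured API key is a non-empty string."""
    assert os.environ["EXAMPLE_API_KEY"]
```

Defining the marker once as `requires_api_key` lets several tests share the same skip condition without repeating the environment check.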
**File:** `streamlit/.cursor/rules/typescript.mdc`
--- description: globs: **/*.ts, **/*.tsx alwaysApply: false --- # TypeScript Development Guide - TypeScript: v5 - Linter: eslint v9 - Formatter: prettier v3 - Framework: React v18 - Styling: @emotion/styled v11 - Build tool: vite v7 - Testing: vitest v3 & react testing library v16 - Package manager: yarn v4 with workspaces ## Key TypeScript Principles - Prefer functional, declarative programming. - Prefer iteration and modularization over duplication. - Use descriptive variable names with auxiliary verbs (e.g., isLoading). - Use the Receive an Object, Return an Object (RORO) pattern. - Ensure functions have explicit return types. - **Omit trivially inferred types**: Do not add type annotations when TypeScript can trivially infer them (e.g., `const count = 0` not `const count: number = 0`). Add explicit types only when they improve clarity or are required. - **Prefer optional chaining**: Use optional chaining (`?.`) instead of `&&` chains for property access. This is enforced by the `@typescript-eslint/prefer-optional-chain` rule. ## Key Frontend Principles - Leverage all of best practices of React 18. - Follow the [Rules of React](https://react.dev/reference/rules): pure components and hooks, immutable props and state, and call hooks at the top level of React functions. - Write performant frontend code. - Ensure referential stability by leveraging React Hooks. - **Updater functions must be pure**: `setState(prev => newState)` updaters must not mutate `prev` or have side effects—return a new object. See [useState](https://react.dev/reference/react/useState#setstate-parameters). - Prefix event handlers with "handle" (e.g., handleClick, handleSubmit). - Favor leveraging @emotion/styled instead of inline styles. - Leverage object style notation in Emotion. - All styled components begin with the word `Styled` to indicate it's a styled component. - Utilize props in styled components to display elements that may have some interactivity. 
- Avoid the need to target other components. - When using BaseWeb, be sure to import our theme via `useEmotionTheme` and use those values in overrides. - Use the following pattern for naming custom CSS classes and test IDs: `stComponentSubcomponent`, for example: `stTextInputIcon`. - Avoid using pixel sizes for styling, always use rem, em, percentage, or other relative units. ## Static Data Structures - Extract static lookup maps and constants to module-level scope outside functions/components - Static data recreated on every call/render wastes memory and CPU - Exception: Keep inside function only if data depends on parameters, props, or state <good-example> ```tsx // ✅ Module-level - created once const ALIGNMENT_MAP: Record<Alignment, CSSProperties["textAlign"]> = { [Alignment.LEFT]: "left", [Alignment.CENTER]: "center", // ... } as const function getAlignment(config: AlignmentConfig) { return ALIGNMENT_MAP[config.alignment] } ``` </good-example> <bad-example> ```tsx // ❌ Recreated every call function getAlignment(config: AlignmentConfig) { const alignmentMap = { /* same data */ } return alignmentMap[config.alignment] } ``` </bad-example> ## Yarn Workspaces - Project Structure: Monorepo managed with Yarn Workspaces. - Packages: - `app` - Main application UI. - `connection` - WebSocket handling - `lib` - Shared UI components. - `utils` - Shared TypeScript utilities. - `protobuf` - Generated Protocol definitions. - `typescript-config` - Configuration for TypeScript. - `eslint-plugin-streamlit-custom` - ESLint plugin with custom rules. - Package-specific scripts are executed within their respective directories. ## Relevant `make` commands Run from the repo root: - `make frontend-fast`: Build the frontend (vite). - `make frontend-dev`: Start the frontend development server (hot-reload). - `make frontend-lint`: Lint and check formatting of frontend files (eslint). - `make frontend-types`: Run the TypeScript type checker (tsc). 
- `make frontend-format`: Format frontend files (eslint).
- `make frontend-tests`: Run all frontend unit tests (vitest).

## TypeScript Test Guide

- Test Framework: Vitest
- UI Testing Library: React Testing Library (RTL)

### Key Principles

- Coverage: Implement both unit and integration tests (using RTL where applicable).
- Robustness: Test edge cases and error handling scenarios.
- Accessibility: Validate component accessibility compliance.
- Parameterized Tests: Use `it.each` for repeated tests with varying inputs.
- Framework Exclusivity: Only use Vitest syntax; do not use Jest.

### Running Tests

- Yarn test commands must be run from the `<GIT_ROOT>/frontend` directory.
- Run All Tests: `yarn test`
- Run Specific File: `yarn test lib/src/components/path/component.test.tsx`
- Run Specific Test: `yarn test -t "the test name" lib/src/components/path/component.test.tsx`

### React Testing Library best practices

Cheat sheet for queries from RTL:

|            | No Match | 1 Match | 1+ Match | Await? |
| ---------- | -------- | ------- | -------- | ------ |
| getBy      | throw    | return  | throw    | No     |
| findBy     | throw    | return  | throw    | Yes    |
| queryBy    | null     | return  | throw    | No     |
| getAllBy   | throw    | array   | array    | No     |
| findAllBy  | throw    | array   | array    | Yes    |
| queryAllBy | []       | array   | array    | No     |

- Utilizing any query that throws if not found AND asserting using `toBeInTheDocument` is redundant and must be avoided. Prefer `toBeVisible` instead.
- User interactions should utilize the `userEvent` library.
- Tests should be written in a way that asserts user behavior, not implementation details.

#### Query Priority Order

Based on the Guiding Principles, your test should resemble how users interact with your code (component, page, etc.) as much as possible. With this in mind, we recommend this order of priority:

1. Queries Accessible to Everyone

   Queries that reflect the experience of visual/mouse users as well as those that use assistive technology.
   - getByRole, getByLabelText, getByPlaceholderText, getByText, getByDisplayValue

2. Semantic Queries

   HTML5 and ARIA compliant selectors. Note that the user experience of interacting with these attributes varies greatly across browsers and assistive technology.

   - getByAltText, getByTitle

3. Test IDs

   - getByTestId
   - The user cannot see (or hear) these, so this is only recommended for cases where you can't match by role or text or it doesn't make sense (e.g. the text is dynamic).

## You Might Not Need an Effect

Authoritative rules for deciding when to use or remove `useEffect` in React components. Based on React docs: [You Might Not Need an Effect](https://react.dev/learn/you-might-not-need-an-effect).

### Purpose

- Write React that is simpler, faster, and less error‑prone by avoiding unnecessary Effects.
- Prefer deriving values during render and handling actions in event handlers.
- Only use Effects to synchronize with external systems.

### Core Principles (enforce by default)

- Derive during render: If a value can be computed from props/state, compute it inline. Do NOT mirror it in state or set it in an Effect.
- Events, not Effects: Handle user interactions in event handlers. Do NOT move event-driven logic into Effects.
- Memoize expensive pure work: Use `useMemo` for heavy pure computations, not `useEffect` + state.
- One responsibility per Effect: Each Effect should sync exactly one external concern.
- Clean up or don’t ship: Any Effect that subscribes, starts timers, or allocates resources must return a cleanup.
- Stable refs over re-subscribes: Keep subscriptions stable; avoid recreating on every render.

### When an Effect IS appropriate

Use an Effect only to synchronize the component with an external system:

- Subscriptions: event listeners, WebSocket, ResizeObserver, etc. Ensure cleanup.
- Imperative APIs: integrating non-React widgets or DOM APIs.
- Network sync: Keep local UI in sync with remote data for given parameters (with race handling and cleanup).
- Scheduling/out-of-React side effects: analytics pings, imperative focus when tied to visibility.

If no external system is involved, you likely don’t need an Effect.

### Anti-patterns to reject (rewrite required)

- Setting derived state in an Effect (e.g., `setFullName(firstName + ' ' + lastName)`). Compute during render instead.
- Filtering/sorting data via Effect + state. Compute during render; use `useMemo` only if the computation is expensive.
- Using Effects to respond to user actions that can be handled in the event handler.
- Chains of Effects where each triggers the next via `setState`. Compute the final value directly during render or perform the action in the initiating handler.
- Missing cleanup for listeners, intervals, timeouts, blob URLs, or object URLs.
- Data fetching without race protection (stale response can overwrite newer data).

### Rewrite patterns (preferred code)

- Derived value during render:

```tsx
// ❌ Avoid
// const [fullName, setFullName] = useState("")
// useEffect(() => setFullName(`${firstName} ${lastName}`), [firstName, lastName])

// ✅ Do
const fullName = `${firstName} ${lastName}`
```

- Expensive pure computation:

```tsx
// ❌ Effect + state
// const [visibleTodos, setVisibleTodos] = useState<Todo[]>([])
// useEffect(() => setVisibleTodos(filterTodos(todos, filter)), [todos, filter])

// ✅ Render-time compute (useMemo only if slow)
const visibleTodos = useMemo(() => filterTodos(todos, filter), [todos, filter])
```

- Handle user action in the handler:

```tsx
// ❌ useEffect that watches flag and then performs action
// useEffect(() => { if (shouldBuy) buy() }, [shouldBuy])

// ✅ Directly do the work in response to the event
function handleBuyClick() {
  void buy()
}
```

- Data fetching with race protection and cleanup:

```tsx
useEffect(() => {
  const abort = new AbortController()
  let ignore = false
  fetch(makeUrl(query, page), { signal: abort.signal })
    .then(r => r.json())
    .then(data => {
      if (!ignore) setResults(data)
    })
    .catch(err => {
      if (!ignore && (err as any)?.name !== "AbortError") setError(err)
    })
  return () => {
    ignore = true
    abort.abort()
  }
}, [query, page])
```

- Reset state without Effects:

```tsx
// ✅ Keyed reset when parent prop changes
;<Child key={userId} userId={userId} />

// ✅ Set initial controlled values from props during render
const initialTab = props.defaultTab ?? "overview"
```

### Effect checklist (must pass all)

1. External sync: Does this Effect synchronize with an external system? If no, remove it.
2. Single responsibility: Exactly one external concern in this Effect.
3. Complete deps: Dependency array is complete and correct; avoid stale closures.
4. Cleanup: All listeners, timers, object URLs, and subscriptions are cleaned in the return function.
5. Race safety: For network requests, ignore/cancel stale responses (AbortController or an `ignore` flag).
6. Referential stability: Use `useCallback`/`useMemo` to prevent unnecessary effect re-runs caused by unstable references.
7. Error handling: Async paths have error handling; no silent failures.

### Decision guide

- Can it be computed from existing state/props? Compute during render.
- Is the logic triggered by a user event? Put it in the event handler.
- Is it an expensive pure calculation? Use `useMemo` (not Effect) to cache.
- Are you integrating with something outside React? Use an Effect with cleanup.
- Are you fetching data tied to inputs/visibility? Use an Effect with race handling and cleanup.

### Review heuristics (quick scans)

- Search for `useEffect` followed by immediate `setState` of values derivable from render inputs.
- Effects that only read props/state and don’t touch external systems are candidates for removal.
- Multiple Effects updating each other’s state in a chain indicate a missing render-time computation or misplaced event logic.
- Effects creating event listeners/timers without `return` cleanup are bugs.
### Performance notes

- Prefer render-time computation; add `useMemo` only for provably expensive pure work.
- Avoid creating new objects/arrays inline in JSX props each render; memoize when it affects memoized children.
- Keep dependency arrays minimal but complete. Split Effects if different concerns require different deps.

### Testing guidance (see [TypeScript Test Guide](#typescript-test-guide))

- Unit test render-time derivations directly (no Effects involved).
- For Effects that fetch or subscribe, test cleanup and race handling (use fake timers/abort signals).
- RTL: assert behavior and results, not internal hook usage.

### References

- React docs: [You Might Not Need an Effect](https://react.dev/learn/you-might-not-need-an-effect)
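To illustrate the first point under Testing guidance: a value derived during render (rather than via Effect + state) is a pure function, so it can be unit-tested with no component mounted. A minimal sketch, with illustrative names (`Todo`, `visibleTodos` are hypothetical; in the codebase this would live in a Vitest `it`/`expect` block):

```typescript
// Illustrative render-time derivation, extracted as a pure function.
interface Todo {
  text: string
  done: boolean
}

function visibleTodos(todos: Todo[], showDone: boolean): Todo[] {
  return showDone ? todos : todos.filter(t => !t.done)
}

// Tested directly: no render, no Effects, no fake timers needed.
const todos: Todo[] = [
  { text: "write tests", done: true },
  { text: "ship", done: false },
]
console.log(visibleTodos(todos, false).length) // 1
console.log(visibleTodos(todos, true).length) // 2
```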
streamlit/.cursor/worktrees.json
{ "setup-worktree": [ "cd \"$(git rev-parse --show-toplevel)\"; [ -d .venv ] || uv venv; VIRTUAL_ENV=.venv PATH=\".venv/bin:.venv/Scripts:$PATH\" make all-dev frontend" ] }
streamlit/.devcontainer/Dockerfile
# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

FROM mcr.microsoft.com/devcontainers/python:3.12

ENV PYTHONUNBUFFERED=1
ENV COREPACK_ENABLE_DOWNLOAD_PROMPT=0

# Install system dependencies
RUN apt-get update && apt-get install -y \
    build-essential \
    curl \
    git \
    rsync \
    unzip \
    protobuf-compiler \
    && rm -rf /var/lib/apt/lists/*

# Install Node.js
RUN curl -sL https://deb.nodesource.com/setup_22.x | bash - \
    && apt-get install -y nodejs

# Enable corepack
RUN corepack enable

# Update pip
RUN pip install --upgrade pip
streamlit/.devcontainer/devcontainer.json
{ "name": "streamlit-dev", "build": { "dockerfile": "Dockerfile" }, "forwardPorts": [3000], "portsAttributes": { "3000": { "label": "Dev Server", "onAutoForward": "notify" } }, "customizations": { "vscode": { "extensions": [ "dbaeumer.vscode-eslint", "esbenp.prettier-vscode", "ms-python.mypy-type-checker", "ms-python.python", "ms-python.debugpy", "ms-python.vscode-pylance", "charliermarsh.ruff", "EditorConfig.EditorConfig", "vitest.explorer", "tamasfe.even-better-toml" ], "settings": { "[javascript]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true }, "[json]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true }, "[jsonc]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true }, "[python]": { "editor.defaultFormatter": "charliermarsh.ruff", "editor.codeActionsOnSave": { "source.fixAll.ruff": "explicit", "source.organizeImports.ruff": "explicit" }, "editor.formatOnSave": true, "editor.formatOnPaste": true, "editor.formatOnType": true }, "[html]": { "editor.defaultFormatter": "esbenp.prettier-vscode" }, "[typescriptreact]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true }, "[typescript]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true, "editor.codeActionsOnSave": { "source.organizeImports": "explicit" } }, "[yaml]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true }, "[yml]": { "editor.defaultFormatter": "esbenp.prettier-vscode", "editor.formatOnSave": true }, "[toml]": { "editor.defaultFormatter": "tamasfe.even-better-toml", "editor.formatOnSave": true }, "files.trimTrailingWhitespace": true, "files.trimFinalNewlines": true, "files.insertFinalNewline": true, "search.exclude": { "lib/build/**": true }, "makefile.configureOnOpen": false, "eslint.workingDirectories": [ { "mode": "auto" } ], "eslint.lintTask.enable": true, "editor.codeActionsOnSave": { "source.fixAll": 
"always" }, "vitest.maximumConfigs": 5, "mypy-type-checker.importStrategy": "fromEnvironment", "mypy-type-checker.preferDaemon": false, "mypy-type-checker.args": ["--config-file=mypy.ini"], "ruff.nativeServer": true, "ruff.organizeImports": true, "ruff.fixAll": true, "python.analysis.exclude": ["lib/build"], "python.analysis.include": [ "lib/**/*", "scripts/**/*", "e2e_playwright/**/*" ], "python.analysis.extraPaths": ["./lib"], "python.analysis.typeCheckingMode": "off", "python.terminal.activateEnvInCurrentTerminal": true, "python.testing.pytestEnabled": true, "python.testing.autoTestDiscoverOnSaveEnabled": true, "cursorpyright.analysis.exclude": ["lib/build"], "cursorpyright.analysis.include": [ "lib/**/*", "scripts/**/*", "e2e_playwright/**/*" ], "cursorpyright.analysis.extraPaths": ["./lib"], "cursorpyright.analysis.typeCheckingMode": "off", "cursorpyright.analysis.useTypingExtensions": true, "cursorpyright.shouldImportPylanceSettings": "always", "cursorpyright.analysis.logLevel": "Information", "terminal.integrated.profiles.linux": { "frontend-dev": { "path": "bash", "args": ["-c", "make frontend-dev"] } }, "typescript.tsdk": "frontend/node_modules/typescript/lib", "typescript.enablePromptUseWorkspaceTsdk": true, "git.branchProtection": ["develop"] } } }, "updateContentCommand": "make all-dev", "postStartCommand": "make frontend-dev" }
streamlit/.editorconfig
# https://editorconfig.org/
root = true

[*]
indent_style = space
indent_size = 2
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf
charset = utf-8
max_line_length = 79

[*.py]
indent_size = 4
max_line_length = 88

# The JSON files contain newlines inconsistently
[*.json]
insert_final_newline = ignore

[**/node_modules/**]
indent_style = ignore
indent_size = ignore

[**.min.js]
indent_style = ignore
insert_final_newline = ignore

[Makefile]
indent_style = tab
streamlit/.nvmrc
v24
streamlit/.pre-commit-config.yaml
# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025) # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # Pre-commit configuration file, # when Streamlit's pre-commit detects that one of the linters has failed, # it automatically lints the files and does not allow the commit to pass. # Please review the changes after lint has failed and commit them again, # the second commit should pass, # because the files were linted after trying to do the first commit. repos: - repo: https://github.com/astral-sh/ruff-pre-commit # We fix ruff to a version to be in sync with the dev-requirements: rev: v0.14.7 hooks: # Run the linter. - id: ruff args: [--fix] files: \.py$|\.pyi$ # Run the formatter. 
- id: ruff-format files: \.py$|\.pyi$ - repo: local hooks: # Script ./scripts/run_in_subdirectory.py was used to work around a # known problem with hooks in subdirectories when pass_filenames option # is set to true # See: https://github.com/pre-commit/pre-commit/issues/1417 - id: prettier-app name: Prettier App # NOTE: This hook currently does not work on Windows due to "yarn" not being an executable and win32api.CreateProcess # turning `subprocess.run(["yarn", "prettier", "--write"])` into a call to `yarn.exe prettier --write` which does not exist entry: ./scripts/run_in_subdirectory.py frontend/app yarn run format files: ^frontend/app/.*\.(js|jsx|ts|tsx)$ exclude: /vendor/ language: node pass_filenames: true - id: prettier-lib name: Prettier Lib # NOTE: This hook currently does not work on Windows due to "yarn" not being an executable and win32api.CreateProcess # turning `subprocess.run(["yarn", "prettier", "--write"])` into a call to `yarn.exe prettier --write` which does not exist entry: ./scripts/run_in_subdirectory.py frontend/lib yarn run format files: ^frontend/lib/.*\.(js|jsx|ts|tsx)$ exclude: /vendor/ language: node pass_filenames: true - id: prettier-connection name: Prettier Connection # NOTE: This hook currently does not work on Windows due to "yarn" not being an executable and win32api.CreateProcess # turning `subprocess.run(["yarn", "prettier", "--write"])` into a call to `yarn.exe prettier --write` which does not exist entry: ./scripts/run_in_subdirectory.py frontend/connection yarn run format files: ^frontend/connection/.*\.(js|jsx|ts|tsx)$ exclude: /vendor/ language: node pass_filenames: true - id: prettier-utils name: Prettier Utils # NOTE: This hook currently does not work on Windows due to "yarn" not being an executable and win32api.CreateProcess # turning `subprocess.run(["yarn", "prettier", "--write"])` into a call to `yarn.exe prettier --write` which does not exist entry: ./scripts/run_in_subdirectory.py frontend/utils yarn run format 
files: ^frontend/utils/.*\.(js|jsx|ts|tsx)$ exclude: /vendor/ language: node pass_filenames: true - id: prettier-yaml name: Prettier-yaml # NOTE: This hook currently does not work on Windows due to "yarn" not being an executable and win32api.CreateProcess # turning `subprocess.run(["yarn", "prettier", "--write"])` into a call to `yarn.exe prettier --write` which does not exist # We perform this in the app directory because prettier is installed there. TODO: Break this out to a new package entry: ./scripts/run_in_subdirectory.py frontend/app yarn prettier "../../.github/**/*.{yml,yaml}" --write files: ^.github/.*\.(yml|yaml)$ language: node pass_filenames: false - id: prettier-vscode-devcontainer-json name: Prettier VSCode/devcontainer JSON # NOTE: This hook currently does not work on Windows due to "yarn" not being an executable and win32api.CreateProcess # turning `subprocess.run(["yarn", "prettier", "--write"])` into a call to `yarn.exe prettier --write` which does not exist # We perform this in the app directory because prettier is installed there. 
TODO: Break this out to a new package entry: ./scripts/run_in_subdirectory.py frontend/app yarn prettier "../../.vscode/*.json" "../../.devcontainer/*.json" --write --config ../.prettierrc files: ^(.vscode/.*\.json|\.devcontainer/.*\.json)$ language: node pass_filenames: false - id: license-headers name: Checks license headers entry: ./scripts/check_license_headers.py language: system always_run: true pass_filenames: false - id: generate-agent-rules name: Check generated agent rules entry: ./scripts/generate_agent_rules.py language: system pass_filenames: false files: | (?x) (^|.*/)AGENTS\.md$ |^scripts/generate_agent_rules\.py$ |^Makefile$ - id: vscode-devcontainer-sync name: Check VSCode/devcontainer sync entry: ./scripts/sync_vscode_devcontainer.py --check language: system files: ^(.vscode/(settings|extensions)\.json|\.devcontainer/devcontainer\.json)$ pass_filenames: false - id: yarn-dedupe name: Check frontend yarn.lock dedupe entry: ./scripts/check_yarn_dedupe.sh frontend language: system pass_filenames: false files: ^frontend/yarn\.lock$ - repo: https://github.com/Lucas-C/pre-commit-hooks rev: v1.5.5 hooks: - id: insert-license name: Add license for all (S)CSS/JS(X)/TS(X) files files: \.(s?css|jsx?|tsx?)$ args: - --comment-style - "/**| *| */" - --license-filepath - scripts/assets/license-template.txt - --fuzzy-match-generates-todo exclude: | (?x) /vendor/ |^vendor/ |^component-lib/declarations/apache-arrow |^frontend/app/src/assets/css/variables\.scss |^lib/tests/streamlit/elements/test_html\.js |^lib/tests/streamlit/elements/test_html\.css |^e2e_playwright/test_assets/ - id: insert-license name: Add license for all Proto files files: \.proto$ args: - --comment-style - "/**!| *| */" - --license-filepath - scripts/assets/license-template.txt - --fuzzy-match-generates-todo exclude: | (?x) /vendor/ |^vendor/ |^component-lib/declarations/apache-arrow |^proto/streamlit/proto/openmetrics_data_model\.proto - id: insert-license name: Add license for all shell files 
files: \.sh$ args: - --comment-style - "|#|" - --license-filepath - scripts/assets/license-template.txt - --fuzzy-match-generates-todo exclude: | (?x) /vendor/ |^vendor/ |^component-lib/declarations/apache-arrow - id: insert-license name: Add license for all Python files files: \.py$|\.pyi$ args: - --comment-style - "|#|" - --license-filepath - scripts/assets/license-template.txt - --fuzzy-match-generates-todo exclude: | (?x) /vendor/ |^vendor/ |^component-lib/declarations/apache-arrow - id: insert-license name: Add license for all HTML files files: \.html$ args: - --comment-style - "<!--||-->" - --license-filepath - scripts/assets/license-template.txt - --fuzzy-match-generates-todo exclude: | (?x) /vendor/ |^vendor/ |^component-lib/declarations/apache-arrow - repo: https://gitlab.com/bmares/check-json5 rev: v1.0.0 hooks: # Check JSON files that are allowed to have comments - id: check-json5 # Note that this should be the same as the check-json hook's exclude # property files: | (?x) ^\.vscode/.*\.json$ |tsconfig.*\.json$ |.*tsconfig\.json$ - repo: https://github.com/pre-commit/pre-commit-hooks rev: v5.0.0 hooks: - id: trailing-whitespace exclude: | (?x) ^frontend/app/src/assets/ |^NOTICES$ |^proto/streamlit/proto/openmetrics_data_model.proto$ |\.snap$ - id: check-added-large-files - id: check-json # Note that this should be the same as the check-json5 hook's files # property exclude: | (?x) ^\.vscode/.*\.json$ |tsconfig.*\.json$ |.*tsconfig\.json$ - id: check-toml - id: check-yaml exclude: lib/conda-recipe/meta.yaml - id: check-symlinks - id: check-case-conflict - id: check-merge-conflict - id: fix-byte-order-marker - id: end-of-file-fixer exclude: | (?x) /vendor/ |^NOTICES$ |^e2e_playwright/test_assets/ |^LICENSE$ - id: mixed-line-ending args: [--fix=lf] exclude: | (?x) ^NOTICES$
streamlit/.ruff.toml
# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025) # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # In addition to the standard set of exclusions, omit all tests: extend-exclude = [ # Autogenerated files: "lib/streamlit/proto", "lib/streamlit/emojis.py", "lib/streamlit/material_icon_names.py", # File with an expected compilation error: "e2e_playwright/compilation_error_dialog.py", # Ignore frontend directory: "frontend/**", ] target-version = 'py310' line-length = 88 [format] docstring-code-format = true docstring-code-line-length = "dynamic" line-ending = "lf" [lint] # We activate all rules and only ignore the rules that we don't want to enforce # or that are not relevant for our codebase. select = ["ALL"] ignore = [ # Rules planned to be supported in the future: "EM101", # Checks for the use of string literals in exception constructors. "EM102", # Checks for the use of f-strings in exception constructors. "TRY003", # Checks for exception messages that are not defined in the exception class itself. # Ignored rule sets: "DTZ", # Checks for usage of unsafe naive datetime class. "PTH", # Enforces usage of pathlib. "C90", # Checks for McCabe complexity. "FBT", # Forbids boolean positional arguments. "SLF", # Checks unexpected for private member access. "BLE", # Checks for blind except statements. "CPY", # Checks for copyright statement. "DOC", # Checks for correct docstring format. 
# Ignored rules: "B904", # Checks for raise statements in exception handlers that lack a from clause. "PYI041", # Checks for parameter annotations that contain redundant unions between builtin numeric types (e.g., int | float). "PYI051", # Checks for redundant unions between a Literal and a builtin supertype of that Literal. "PD009", # Checks for usage of pd.DataFrame.iat. "PIE790", # Checks for unnecessary pass statements. "TD003", # Checks for missing issue link in TODO comment. "TD002", # Checks for missing author in TODO comment. "D100", # Checks for missing docstring in public module. "D101", # Checks for missing docstring in public class. "D102", # Checks for missing docstring in public method. "D103", # Checks for missing docstring in public function. "D104", # Checks for missing docstring in public package. "D105", # Checks for missing docstring in magic method. "D106", # Checks for missing docstring in public nested class. "D107", # Checks for missing docstring in __init__. "D205", # Checks for missing blank line after docstring summary. "D401", # Checks for docstring to start with imperative mood. "D202", # Checks for no-blank line after docstring. "FIX002", # Checks todo comments (which we want to allow). "PGH003", # Checks for blanket type ignore. "PLR2004", # Checks for numerical values that could be put into a constant. "PLR0904", # Checks for classes with too many public methods. "PLR0911", # Checks for functions with too many return statements. "PLR0912", # Checks for functions with too many branches. "PLR0913", # Checks for functions with too many arguments. "PLR0914", # Checks for functions with too many local variables. "PLR0915", # Checks for functions with too many statements. "PLR0916", # Checks for too many boolean expressions in if statements. "A002", # Checks if function argument shadows a Python builtin. "TRY300", # Checks for return statements in try blocks. "ANN401", # Checks that `any` is not used as an annotation. 
"SIM105", # Enforces use of contextlib.suppress. "SIM108", # Enforces ternary operators instead of if-else. "SIM115", # Enforces context manager for opening files. "SIM117", # Enforces single with statement with multiple contexts. "SIM905", # Enforces list instead of st.split. "COM812", # Checks for absence of trailing commas. Not recommended with formatter. "COM819", # Checks for presence of prohibited trailing commas. Not recommended with formatter. "RSE102", # Checks for unnecessary parentheses on raised exceptions. "RET504", # Checks for assignments that immediately precede a return of the assigned variable (too opinionated). "PT012", # Checks for pytest.raises context managers with multiple statements. "PT019", # Checks for tests that should use @pytest.mark.usefixtures (incompatible with unittest.patch) "PLC0415", # Enforces imports to be at the top-level of the file. ] exclude = [ # pympler is a vendored dependency that doesn't conform to our linting rules: "lib/streamlit/vendor/**", ] extend-safe-fixes = ["TC002", "TC003"] [lint.per-file-ignores] # Only add ignores for entire folders here. To ignore rules in individual # files, use the noqa ignore comment on top of the given file. "e2e_playwright/**" = [ "T20", "B018", "PD", "PERF", "N999", "S", "TRY", "NPY002", ] "lib/tests/**" = [ "PD", "D", "INP", "PERF", "N", "ARG", "S", "TRY", "ANN", "NPY002", ] "scripts/**" = ["T20", "INP", "PERF", "S", "TRY002"] [lint.flake8-tidy-imports] # Disallow all relative imports. ban-relative-imports = "all" [lint.isort] known-first-party = ["streamlit", "shared", "tests", "e2e_playwright"] [lint.pydocstyle] convention = "numpy" [lint.pycodestyle] # Allow lines (e.g. comments) up to 120 characters instead of 88, # which works well with Github. max-line-length = 120 [lint.flake8-comprehensions] # Allow dict calls that make use of keyword arguments (e.g., dict(a=1, b=2)). 
allow-dict-calls-with-keyword-arguments = true [lint.flake8-unused-arguments] # Ignore unused variadic argument, like *args and **kwargs. ignore-variadic-names = true [lint.flake8-annotations] # Allow missing return annotation on functions that return None: suppress-none-returning = true
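As a concrete illustration of the `extend-safe-fixes = ["TC002", "TC003"]` setting above: Ruff's TC fixes move typing-only imports behind `TYPE_CHECKING` so they cost nothing at runtime. The resulting module shape looks roughly like this (a hypothetical example, not a file from the repo):

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # A stdlib import used only in annotations (the TC003 case); it is
    # never imported at runtime.
    from collections.abc import Sequence


def total(values: Sequence[int]) -> int:
    # The annotation resolves lazily thanks to `from __future__ import
    # annotations`, so Sequence does not need to exist at runtime.
    return sum(values)


print(total([1, 2, 3]))  # 6
```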
streamlit/.vscode/extensions.json
{ "recommendations": [ "dbaeumer.vscode-eslint", "esbenp.prettier-vscode", "ms-python.mypy-type-checker", "ms-python.python", "ms-python.debugpy", "ms-python.vscode-pylance", "charliermarsh.ruff", "EditorConfig.EditorConfig", "vitest.explorer", "tamasfe.even-better-toml", "anysphere.cursorpyright" ] }
streamlit/.vscode/launch.json
{ // Use IntelliSense to learn about possible attributes. // Hover to view descriptions of existing attributes. // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387 "version": "0.2.0", "configurations": [ { "name": "Debug Streamlit App", "type": "debugpy", "request": "launch", "module": "streamlit", "args": ["run", "${file}"] // allows to set breakpoints in installed packages, such as the Streamlit lib itself or custom component code etc. // "justMyCode": false, // allows to use a different installed version of streamlit in another venv folder, e.g. // "program": "~/Projects/streamlit/lib/venv/bin/pytest" } ] }
streamlit/CLAUDE.md
@./AGENTS.md
streamlit/CODE_OF_CONDUCT.md
# Contributor Covenant Code of Conduct ## Our Pledge In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation. ## Our Standards Examples of behavior that contributes to creating a positive environment include: - Using welcoming and inclusive language - Being respectful of differing viewpoints and experiences - Gracefully accepting constructive criticism - Focusing on what is best for the community - Showing empathy towards other community members Examples of unacceptable behavior by participants include: - The use of sexualized language or imagery and unwelcome sexual attention or advances - Trolling, insulting/derogatory comments, and personal or political attacks - Public or private harassment - Publishing others' private information, such as a physical or electronic address, without explicit permission - Other conduct which could reasonably be considered inappropriate in a professional setting ## Our Responsibilities Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior. Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. 
## Scope This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers. ## Enforcement Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at hello@streamlit.io. All complaints will be reviewed and investigated and will result in a response that is deemed necessary and appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately. Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership. ## Attribution This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html [homepage]: https://www.contributor-covenant.org For answers to common questions about this code of conduct, see https://www.contributor-covenant.org/faq
streamlit
streamlit/CONTRIBUTING.md
Thanks for your interest in helping improve Streamlit! 🎉

**If you are looking for Streamlit's documentation, go here instead: <https://docs.streamlit.io>**

This wiki is for people who want to contribute code to Streamlit. There are also other ways to contribute, such as [reporting bugs](https://github.com/streamlit/streamlit/issues/new?template=bug_report.yml), creating [feature requests](https://github.com/streamlit/streamlit/issues/new?template=feature_request.yml), helping other users [in our forums](https://discuss.streamlit.io), Stack Overflow, etc., or just being an awesome member of the community!

## Before contributing

**If your contribution is more than a few lines of code, please post in the relevant issue saying you want to volunteer, and wait for a positive response before you start coding.** If there is no issue for it yet, create one first.

This helps make sure:

1. Two people aren't working on the same thing
2. This is something Streamlit's maintainers believe should be implemented/fixed
3. Any API, UI, or deeper architectural changes that need to be implemented have been fully thought through by Streamlit's maintainers
4. Your time is well spent!

> [!TIP]
> To be clear: if you open a PR that adds a new feature (and isn't just a bug fix or similar) *without* prior support from the Streamlit team, the chances of getting it merged are *extremely low*. Adding a new feature comes with a lot of baggage, such as thinking through the exact API, making sure it fulfills our standards, and maintaining it in the future – even if it's just a small parameter.

## Style Guide

Check out [Streamlit's style guide](./wiki/code-style-guide.md). We use [Prettier](https://prettier.io), [Ruff](https://github.com/astral-sh/ruff), and [ESLint](https://eslint.org/) to format and lint code, but some things go beyond what auto-formatters and linters can do. So please take a look!

## How to build Streamlit

### 1. Set up your base environment

#### macOS

```bash
# Some Apple dev tools (developer.apple.com/downloads)
$ xcode-select --install

# Install Homebrew
$ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install the Protobuf compiler
$ brew install protobuf
```

**Installing Node.js and Yarn**

We recommend that you [manage your Node.js installation with nvm](https://github.com/nvm-sh/nvm#install--update-script). After following the instructions linked above to install `nvm`, use the following command to install the latest supported Node version:

```bash
# Install node
nvm install node
```

**Note:** Node has added Corepack, which is a manager of package managers 🥳. It supports Yarn! You can enable it by running the following:

```bash
corepack enable
```

You may need to `brew install corepack` depending on how you installed Node.

#### Ubuntu

```bash
# Install some essentials
$ sudo apt-get update
$ sudo apt-get install -y sudo make build-essential curl git rsync unzip protobuf-compiler

# Set up frontend dependencies:
$ curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.3/install.sh | bash
$ source ~/.bashrc
$ nvm install node
$ corepack enable

# Install uv for Python
$ curl -LsSf https://astral.sh/uv/install.sh | sh

# Install virtual environment in lib:
$ cd lib/
$ uv venv --python 3.12
$ source .venv/bin/activate
```

#### Windows

Streamlit's development setup is pretty Mac- and Linux-centric. If you're doing Streamlit development on Windows, we suggest using our [devcontainer](./.devcontainer) via GitHub Codespaces or locally via VS Code. Alternatively, you can spin up a Linux VM (e.g. via [VirtualBox](https://www.virtualbox.org/), which is free), use your own Linux Docker image, or use Microsoft's WSL ("Windows Subsystem for Linux").

### 2. Grab the code

*(You probably already know how to do this, but just in case...)*

First fork [the repo](https://github.com/streamlit/streamlit) via the UI on GitHub and then do the following:

```bash
git clone https://github.com/${YOUR_NAME}/streamlit.git
cd streamlit
git remote add upstream https://github.com/streamlit/streamlit.git
git checkout develop
git submodule update --init
git checkout -b ${BRANCH_NAME}
```

### 3. Create a new Python environment

Create a virtual environment for Streamlit using your favorite tool (`virtualenv`, `pipenv`, etc.) and activate it. Here's how we do it with [`venv`](https://docs.python.org/3/library/venv.html):

```bash
cd lib
python -m venv venv
```

Note that with `venv` this can be done from any directory, but we recommend doing it from the `lib/` directory to keep all Python files in one place.

```bash
source ./venv/bin/activate
```

## How to develop Streamlit

The basic developer workflow is that you run a React development server on port 3000 in one terminal and run Streamlit CLI commands in another terminal.

### 1. One-time setup

```bash
make all-dev
```

### 2. Build the frontend

```bash
make frontend
```

### 3. Start the dev server (hot-reloading)

The easiest way to start the dev server from the terminal is to run:

```bash
make frontend-dev
```

> [!NOTE]
> This server listens on port `3000` rather than `8501` (i.e. Streamlit's production port). Normally you don't have to worry about this, but it may matter when you're developing certain features.

The server automatically picks up the changes you apply to the frontend code (hot-reloading).

### 4. Run Streamlit

Open another terminal, activate your Python environment, and run Streamlit. If you're using `venv`, that's:

```bash
$ cd lib
$ source ./venv/bin/activate
$ cd ..

# Now run any Streamlit command you want, such as:
$ streamlit hello
```

### 5. What to do when you modify some code

#### When you modify JS or CSS code

Since we use that awesome dev server above, when you change any JS/CSS code everything should automatically *just work* without the need to restart any of the servers.

#### When you modify Python code

When you modify Python code, kill the old Streamlit server, if any (<kbd>Ctrl-C</kbd> in the terminal), and then restart it.

#### When you update protobufs

If you ever modify our protobufs, you'll need to run the command below to compile the protos into libraries that can be used in Python and JS:

```bash
make protobuf
```

#### When JavaScript or Python dependencies change

```bash
make init
```

> [!IMPORTANT]
> If your change updates `frontend/yarn.lock` (for example, after adding or upgrading dependencies), run `cd frontend && yarn dedupe` before committing. Our `scripts/check_yarn_dedupe.sh` hook enforces this locally (via pre-commit) and in CI, so handling it upfront keeps your PR green.

### 6. Running tests

You should always write unit tests and end-to-end tests! This is true for new features, but also for bugs; this way, when you fix a bug you can be sure it won't show up again. So bug-fixing is actually a great way to increase our test coverage where it actually matters.

#### Python unit tests

- Run all with:

  ```bash
  make python-tests
  ```

- Run a specific test file with:

  ```bash
  PYTHONPATH=lib pytest lib/tests/streamlit/the_test_name.py
  ```

- Run a specific test inside a test file with:

  ```bash
  PYTHONPATH=lib pytest lib/tests/streamlit/the_test_name.py -k test_that_something_works
  ```

- Some tests require you to set up credentials to connect to Snowflake and install [the `snowflake-snowpark-python` package](https://pypi.org/project/snowflake-snowpark-python/). Information on how the Snowflake environment is set up is in our [test utils](./lib/tests/testutil.py), including the environment variables to set. These tests are skipped by default. To enable them and disable all others, pass the `--require-integration` flag to `pytest`:

  ```bash
  PYTHONPATH=lib pytest --require-integration
  ```

#### JS unit tests

- Run all with:

  ```bash
  make frontend-tests
  ```

- Run specific tests:

  ```bash
  cd frontend
  yarn workspace @streamlit/lib test src/path/to/test_file.test.ts
  ```

NOTE: Making changes to a React component may cause unit snapshot tests (which are designed to catch unintended changes to jsx/tsx components) to fail. Once you've double-checked that all of the changes in the failing snapshot test are expected, you can follow the prompts that appear after running `make frontend-tests` to update the snapshots, check them into source control, and include them in your PR.

#### End-to-end tests

You can find information about our e2e testing setup [here](./wiki/running-e2e-tests.md).

### 7. Formatting, linting, and type-checking

We've set up various formatting, linting, and type-checking rules that our Continuous Integration checks to maintain code quality and consistency. Before merging a pull request, all formatting and linting rules must pass.

### Python

For Python, we use [ruff](https://github.com/astral-sh/ruff) for formatting & linting and [mypy](https://github.com/python/mypy) for type-checking.

#### Formatting

To format all Python code & sort the imports, run the following command:

```bash
make python-format
```

Alternatively, you can use the `ruff format` command directly.

#### Linting

To run the linter, use the command below:

```bash
make python-lint
```

Alternatively, you can use the `ruff check` command directly.

#### Type-checking

For type-checking, run:

```bash
make python-types
```

### JavaScript / TypeScript

For JavaScript/TypeScript, we use Prettier and ESLint.

#### Formatting

To format your code, run this command:

```bash
make frontend-format
```

#### Linting

To initiate the linting process, use this command:

```bash
make frontend-lint
```

#### Type-checking

For type-checking, run:

```bash
make frontend-types
```

### VS Code / Cursor Setup

For development in VS Code, we recommend installing the extensions listed in [`.vscode/extensions.json`](./.vscode/extensions.json). For an optimized configuration, you can use the VS Code settings from [`.devcontainer/devcontainer.json`](./.devcontainer/devcontainer.json).

### Pre-commit hooks

When Streamlit's pre-commit hook detects that one of the linters has failed, it automatically lints the affected files and rejects the commit. Review the changes the hook made, stage them, and commit again; the second commit should pass, because the files were linted after the first attempt. You can also run the pre-commit hooks manually as needed.

- Run all checks on your staged files by using:

  ```shell
  pre-commit run
  ```

- Run all checks on all files by using:

  ```shell
  pre-commit run --all-files
  ```

## Troubleshooting

#### Test `test_streamlit_version` fails

```python
def test_streamlit_version(self):
    """Test streamlit.__version__."""
    self.assertEqual(__version__, get_version())

AssertionError: '1.11.0' != '1.11.1'
- 1.11.0
?      ^
+ 1.11.1
?      ^
```

To fix this, make sure you have set up your Python venv correctly; upgrade your dependencies, or recreate your environment and repeat the setup. You might accidentally have two mismatched environments, for example one venv created inside the `streamlit` repository and a second one inside `streamlit/lib`. Remove the extra ones.

#### `protoc` command fails because of version mismatch

If the `protoc` command fails and a version mismatch is reported, install the matching version:

- Go to [Protobuf releases](https://github.com/protocolbuffers/protobuf/releases)
- Choose the [Protobuf tag](https://github.com/protocolbuffers/protobuf/tags) which matches your Python environment's Protobuf version, for example [3.20.0](https://github.com/protocolbuffers/protobuf/releases/tag/v3.20.0). Call `pip show protobuf` or equivalent to find this out.
- Download the zip containing `protoc` for your system, for example: [protoc-3.20.0-osx-x86_64.zip](https://github.com/protocolbuffers/protobuf/releases/download/v3.20.0/protoc-3.20.0-osx-x86_64.zip)

<details>
<summary>Example for macOS</summary>

```bash
curl -OL https://github.com/protocolbuffers/protobuf/releases/download/v3.20.0/protoc-3.20.0-osx-x86_64.zip
sudo unzip -o protoc-3.20.0-osx-x86_64.zip -d /usr/local bin/protoc
sudo unzip -o protoc-3.20.0-osx-x86_64.zip -d /usr/local 'include/*'

# Print out your system's protoc version
protoc --version
```

</details>

<details>
<summary>Example for Linux (ARM)</summary>

```bash
curl -OL https://github.com/protocolbuffers/protobuf/releases/download/v3.20.0/protoc-3.20.0-linux-aarch_64.zip
sudo unzip -o protoc-3.20.0-linux-aarch_64.zip -d /usr/local bin/protoc
sudo unzip -o protoc-3.20.0-linux-aarch_64.zip -d /usr/local 'include/*'

# (optional) remove old version
rm /usr/bin/protoc
ln -s /usr/local/bin/protoc /usr/bin/protoc

# Print out your system's protoc version
protoc --version
```

</details>

#### Installing conda and conda-build

If you want to build Streamlit as a conda package on your local machine (needing to do this should be rare), you'll need to install a few extra dependencies so that the `make conda-package` target works.

1. First, install `conda` using your favorite package manager or by following [these instructions](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html). Both `anaconda` and `miniconda` will work.
2. Then, run `conda install conda-build`.
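The manual check above (compare `protoc --version` against `pip show protobuf`) can be scripted. The sketch below is purely illustrative and not part of the repo's tooling; it assumes `protoc` is on your PATH and compares only major.minor, which fits the 3.x example above (newer Protobuf releases use different version schemes for the compiler and the Python package).

```python
"""Illustrative helper (not part of the repo's tooling): compare the system
protoc version with the installed Python protobuf package, as described above."""

import shutil
import subprocess
from importlib.metadata import PackageNotFoundError, version


def major_minor(v: str) -> tuple[str, ...]:
    """'3.20.1' -> ('3', '20')."""
    return tuple(v.split(".")[:2])


def versions_match(protoc_v: str, protobuf_v: str) -> bool:
    """True when protoc and python-protobuf agree on major.minor."""
    return major_minor(protoc_v) == major_minor(protobuf_v)


if __name__ == "__main__" and shutil.which("protoc"):
    # `protoc --version` prints something like "libprotoc 3.20.0"
    protoc_v = subprocess.run(
        ["protoc", "--version"], capture_output=True, text=True, check=True
    ).stdout.split()[-1]
    try:
        protobuf_v = version("protobuf")  # same info as `pip show protobuf`
    except PackageNotFoundError:
        protobuf_v = "0.0"
    status = "match" if versions_match(protoc_v, protobuf_v) else "MISMATCH"
    print(f"protoc={protoc_v} python-protobuf={protobuf_v} -> {status}")
```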
## Introducing dependencies

We aim to only introduce dependencies in this project that have reasonable restrictions and comply with various laws.

![Views](https://api.views-badge.org/badge/st-wiki-contributing)
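Related to the Troubleshooting section above: duplicate virtual environments are a common cause of the `test_streamlit_version` failure. Here is a hypothetical helper (not part of the repo) that lists every venv under a checkout, relying on the fact that each venv contains a `pyvenv.cfg` file (PEP 405):

```python
"""Hypothetical helper (not part of the repo): list virtual environments under
a directory tree, to spot the duplicate-venv situation described above."""

from pathlib import Path


def find_venvs(repo_root: str) -> list[str]:
    # PEP 405: every venv directory contains a pyvenv.cfg file.
    root = Path(repo_root)
    return sorted(str(p.parent) for p in root.glob("**/pyvenv.cfg"))


if __name__ == "__main__":
    # If more than one venv shows up, remove the extras.
    for venv in find_venvs("."):
        print(venv)
```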
streamlit/LICENSE
Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. 
For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the 
Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. 
Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONS APPENDIX: How to apply the Apache License to your work. 
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information. (Don't include the brackets!) The text should be enclosed in the appropriate comment syntax for the file format. We also recommend that a file or class name and description of purpose be included on the same "printed page" as the copyright notice for easier identification within third-party archives. Copyright [yyyy] [name of copyright owner] Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
streamlit/Makefile
# Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025) # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # Make uses /bin/sh by default, but we are using some bash features. On Ubuntu # /bin/sh is POSIX compliant, ie it's not bash. So let's be explicit: SHELL=/bin/bash INSTALL_DEV_REQS ?= true INSTALL_TEST_REQS ?= true INSTALL_PLAYWRIGHT ?= true # Flags: # - INSTALL_DEV_REQS: install dev requirements (default: true) # - INSTALL_TEST_REQS: install test requirements (default: true) # - INSTALL_PLAYWRIGHT: install Playwright browsers during python-init (default: true) # CI uses a dedicated action to install browsers and typically sets this to false. # Local dev can opt out when not needed: `INSTALL_PLAYWRIGHT=false make init` PYTHON_VERSION := $(shell python --version | cut -d " " -f 2 | cut -d "." -f 1-2) MIN_PROTOC_VERSION = 3.20 # Check if Python is installed and can be executed, otherwise show an error message in red (but continue) ifeq ($(PYTHON_VERSION),) error_message="Error: Python version is not detected. Please ensure Python is installed and accessible in your PATH." error_message_red_colored=$(shell echo -e "\033[0;31m ${error_message} \033[0m") $(warning ${error_message_red_colored}) endif .PHONY: help # Show all available make commands. help: @# Magic line used to create self-documenting makefiles. 
@# Note that this means the documenting comment just before the command (but after the .PHONY) must be all one line, and should begin with a capital letter and end with a period. @# See https://stackoverflow.com/a/35730928 @awk '/^#/{c=substr($$0,3);next}c&&/^[[:alpha:]][[:alnum:]_-]+:/{print substr($$1,1,index($$1,":")-1) ";" c}1{c=0}' Makefile | column -s';' -t .PHONY: all # Install all dependencies, build frontend, and install editable Streamlit. all: init frontend .PHONY: all-dev # Install all dependencies and editable Streamlit, but do not build the frontend. all-dev: init pre-commit install @echo "" @echo " The frontend has *not* been rebuilt." @echo " If you need to make a wheel file, run:" @echo "" @echo " make frontend" @echo "" .PHONY: init # Install all dependencies and build protobufs. init: python-init frontend-init protobuf .PHONY: clean # Remove all generated files. clean: cd lib; rm -rf build dist .eggs *.egg-info rm -rf lib/conda-recipe/dist find . -name '*.pyc' -type f -delete || true find . -name __pycache__ -type d -delete || true find . -name .pytest_cache -exec rm -rfv {} \; || true find . -name '.benchmarks' -type d -exec rm -rfv {} \; || true rm -rf .mypy_cache rm -rf .ruff_cache rm -f lib/streamlit/proto/*_pb2.py* rm -rf lib/streamlit/static rm -f lib/Pipfile.lock rm -rf frontend/app/build rm -rf frontend/node_modules rm -rf frontend/app/performance/lighthouse/reports rm -rf frontend/app/node_modules rm -rf frontend/lib/node_modules rm -rf frontend/connection/node_modules rm -rf frontend/test_results rm -f frontend/protobuf/proto.js rm -f frontend/protobuf/proto.d.ts rm -rf frontend/public/reports rm -rf frontend/lib/dist rm -rf frontend/connection/dist rm -rf frontend/component-v2-lib/dist rm -rf ~/.cache/pre-commit rm -rf e2e_playwright/test-results rm -rf e2e_playwright/performance-results find . 
-name .streamlit -not \( -path './e2e_playwright/.streamlit' -o -path './e2e_playwright/config/.streamlit' \) -type d -exec rm -rfv {} \; || true cd lib; rm -rf .coverage .coverage\.* .PHONY: protobuf # Recompile Protobufs for Python and the frontend. protobuf: # Ensure protoc is installed and is >= MIN_PROTOC_VERSION. @if ! command -v protoc &> /dev/null ; then \ echo "protoc not installed."; \ exit 1; \ fi; \ \ PROTOC_VERSION=$$(protoc --version | cut -d ' ' -f 2); \ \ if [[ $$(echo -e "$$PROTOC_VERSION\n$(MIN_PROTOC_VERSION)" | sort -V | head -n1) != $(MIN_PROTOC_VERSION) ]]; then \ echo "Error: protoc version $${PROTOC_VERSION} is < $(MIN_PROTOC_VERSION)"; \ exit 1; \ else \ echo "protoc version $${PROTOC_VERSION} is >= than $(MIN_PROTOC_VERSION)"; \ fi; \ protoc \ --proto_path=proto \ --python_out=lib \ --mypy_out=lib \ proto/streamlit/proto/*.proto @# JS/TS protobuf generation cd frontend/ ; yarn workspace @streamlit/protobuf run generate-protobuf .PHONY: python-init # Install Python dependencies and Streamlit in editable mode. python-init: pip_args=("--editable" "./lib");\ if [ "${INSTALL_DEV_REQS}" = "true" ] ; then\ pip_args+=("--requirement" "lib/dev-requirements.txt"); \ fi;\ if [ "${INSTALL_TEST_REQS}" = "true" ] ; then\ pip_args+=("--requirement" "lib/test-requirements.txt"); \ fi;\ if command -v "uv" > /dev/null; then \ echo "Running command: uv pip install $${pip_args[@]}"; \ uv pip install $${pip_args[@]}; \ else \ echo "Running command: pip install $${pip_args[@]}"; \ pip install $${pip_args[@]}; \ fi;\ if [ "${INSTALL_TEST_REQS}" = "true" ] && [ "${INSTALL_PLAYWRIGHT}" = "true" ] ; then\ python -m playwright install --with-deps; \ fi; .PHONY: python-lint # Lint and check formatting of Python files. python-lint: # Checks if the formatting is correct: ruff format --check # Run linter: ruff check .PHONY: python-format # Format Python files. 
python-format: # Sort imports ( see https://docs.astral.sh/ruff/formatter/#sorting-imports ) ruff check --select I --fix # Run code formatter ruff format .PHONY: python-tests # Run Python unit tests. python-tests: cd lib; \ PYTHONPATH=. \ pytest -v -l \ -m "not performance" \ tests/ .PHONY: python-performance-tests # Run Python performance tests. python-performance-tests: cd lib; \ PYTHONPATH=. \ pytest -v -l \ -m "performance" \ --benchmark-autosave \ --benchmark-storage file://../.benchmarks/pytest \ tests/ .PHONY: python-integration-tests # Run Python integration tests. Requires `integration-requirements.txt` to be installed. python-integration-tests: cd lib; \ PYTHONPATH=. \ pytest -v -l \ --require-integration \ tests/ .PHONY: python-types # Run the Python type checker. python-types: # Run ty type checker: ty check # Run mypy type checker: mypy --config-file=mypy.ini .PHONY: frontend-init # Install all frontend dependencies. frontend-init: @cd frontend/ && { \ corepack enable yarn; \ if [ $$? -ne 0 ]; then \ echo "Error: 'corepack' command not found or failed to enable."; \ echo "Please ensure you are running the expected version of Node.js as defined in '.nvmrc'."; \ exit 1; \ fi; \ corepack install && yarn install --immutable; \ } .PHONY: frontend # Build the frontend. frontend: cd frontend/ ; yarn workspaces foreach --all --topological run build rsync -av --delete --delete-excluded --exclude=reports \ frontend/app/build/ lib/streamlit/static/ # Move manifest.json to a location that can actually be served by the Tornado # server's static asset handler. mv lib/streamlit/static/.vite/manifest.json lib/streamlit/static .PHONY: frontend-with-profiler # Build the frontend with the profiler enabled. 
frontend-with-profiler: # Build frontend dependent libraries (excluding app and lib): cd frontend/ ; yarn workspaces foreach --all --exclude @streamlit/app --exclude @streamlit/lib --topological run build # Build the app with the profiler enabled: cd frontend/ ; yarn workspace @streamlit/app buildWithProfiler rsync -av --delete --delete-excluded --exclude=reports \ frontend/app/build/ lib/streamlit/static/ .PHONY: frontend-fast # Build the frontend (as fast as possible). frontend-fast: cd frontend/ ; yarn workspaces foreach --recursive --topological --from @streamlit/app --exclude @streamlit/lib run build rsync -av --delete --delete-excluded --exclude=reports \ frontend/app/build/ lib/streamlit/static/ .PHONY: frontend-dev # Start the frontend development server. frontend-dev: cd frontend/ ; yarn start .PHONY: frontend-lint # Lint and check formatting of frontend files. frontend-lint: cd frontend/ ; yarn workspaces foreach --all run formatCheck cd frontend/ ; yarn workspaces foreach --all run lint .PHONY: frontend-types # Run the frontend type checker. frontend-types: cd frontend/ ; yarn workspaces foreach --all run typecheck .PHONY: frontend-format # Format frontend files. frontend-format: cd frontend/ ; yarn workspaces foreach --all run format .PHONY: frontend-tests # Run frontend unit tests and generate coverage report. frontend-tests: cd frontend; TESTPATH=$(TESTPATH) yarn testCoverage .PHONY: frontend-typesync # Check for unsynced frontend types. frontend-typesync: cd frontend/ ; yarn workspaces foreach --all --exclude @streamlit/typescript-config run typesync:ci --dry=fail || (\ echo -e "\033[0;31mTypesync check failed. Run 'make update-frontend-typesync' to fix.\033[0m"; \ exit 1 \ ) .PHONY: update-frontend-typesync # Installs missing typescript typings for dependencies. 
update-frontend-typesync:
	cd frontend/ ; yarn workspaces foreach --all --exclude @streamlit/typescript-config run typesync
	cd frontend/ ; yarn
	cd component-lib/ ; yarn typesync
	cd component-lib/ ; yarn

.PHONY: update-snapshots # Update e2e playwright snapshots based on the latest completed CI run.
update-snapshots:
	python ./scripts/update_e2e_snapshots.py

.PHONY: update-snapshots-changed # Update e2e playwright snapshots of changed e2e files based on the latest completed CI run.
update-snapshots-changed:
	python ./scripts/update_e2e_snapshots.py --changed

.PHONY: update-material-icons # Update material icons based on latest Google material symbol version.
update-material-icons:
	python ./scripts/update_material_icon_font_and_names.py

.PHONY: update-emojis # Update emojis based on latest emoji version.
update-emojis:
	python ./scripts/update_emojis.py

.PHONY: update-notices # Update the notices file (licenses of frontend assets and dependencies).
update-notices:
	cd frontend; \
	yarn licenses generate-disclaimer --production --recursive > ../NOTICES
	./scripts/append_license.sh frontend/app/src/assets/fonts/Source_Code/Source-Code.LICENSE
	./scripts/append_license.sh frontend/app/src/assets/fonts/Source_Sans/Source-Sans.LICENSE
	./scripts/append_license.sh frontend/app/src/assets/fonts/Source_Serif/Source-Serif.LICENSE
	./scripts/append_license.sh frontend/app/src/assets/img/Material-Icons.LICENSE
	./scripts/append_license.sh frontend/app/src/assets/img/Open-Iconic.LICENSE
	./scripts/append_license.sh frontend/lib/src/vendor/react-bootstrap-LICENSE.txt
	./scripts/append_license.sh frontend/lib/src/vendor/fzy.js/fzyjs-LICENSE.txt

.PHONY: update-headers # Update all license headers.
update-headers:
	pre-commit run insert-license --all-files --hook-stage manual
	pre-commit run license-headers --all-files --hook-stage manual

.PHONY: update-min-deps # Update minimum dependency constraints file.
update-min-deps:
	INSTALL_DEV_REQS=false INSTALL_TEST_REQS=false make python-init >/dev/null
	python scripts/get_min_versions.py >scripts/assets/min-constraints-gen.txt

.PHONY: debug-e2e-test # Run a playwright e2e test in debug mode. Use it via `make debug-e2e-test st_command_test.py`.
debug-e2e-test:
	@if [[ ! "$(filter-out $@,$(MAKECMDGOALS))" == *"_test"* ]]; then \
		echo "Error: Test script name must contain '_test' in the filename"; \
		exit 1; \
	fi
	@echo "Running test: $(filter-out $@,$(MAKECMDGOALS)) in debug mode."
	@TEST_SCRIPT=$$(echo $(filter-out $@,$(MAKECMDGOALS)) | sed 's|^e2e_playwright/||'); \
	cd e2e_playwright && PWDEBUG=1 pytest $$TEST_SCRIPT --tracing on || ( \
		echo "If you implemented changes in the frontend, make sure to call \`make frontend-fast\` to use the up-to-date frontend build in the test."; \
		echo "You can find test-results in ./e2e_playwright/test-results"; \
		exit 1 \
	)

.PHONY: run-e2e-test # Run a playwright e2e test. Use it via `make run-e2e-test st_command_test.py`.
run-e2e-test:
	@if [[ ! "$(filter-out $@,$(MAKECMDGOALS))" == *"_test"* ]]; then \
		echo "Error: Test script name must contain '_test' in the filename"; \
		exit 1; \
	fi
	@echo "Running test: $(filter-out $@,$(MAKECMDGOALS))"
	@TEST_SCRIPT=$$(echo $(filter-out $@,$(MAKECMDGOALS)) | sed 's|^e2e_playwright/||'); \
	cd e2e_playwright && pytest $$TEST_SCRIPT --tracing retain-on-failure --reruns 0 || ( \
		echo "If you implemented changes in the frontend, make sure to call \`make frontend-fast\` to use the up-to-date frontend build in the test."; \
		echo "You can find test-results in ./e2e_playwright/test-results"; \
		exit 1 \
	)

.PHONY: trace-e2e-test # Run e2e test with tracing and view it. Use via `make trace-e2e-test <test_file.py>::<test_func>`.
trace-e2e-test:
	@if [[ -z "$(filter-out $@,$(MAKECMDGOALS))" ]]; then \
		echo "Error: Please specify a single test to run"; \
		echo "Usage: make trace-e2e-test <test_file.py>::<test_function>"; \
		echo "Example: make trace-e2e-test st_audio_input_test.py::test_audio_input_renders"; \
		exit 1; \
	fi
	@TEST_ARG=$$(echo $(filter-out $@,$(MAKECMDGOALS)) | sed 's|^e2e_playwright/||'); \
	if [[ ! "$$TEST_ARG" == *"::"* ]]; then \
		echo "Error: You must specify a single test function, not an entire test file"; \
		echo "Usage: make trace-e2e-test <test_file.py>::<test_function>"; \
		echo "Example: make trace-e2e-test st_audio_input_test.py::test_audio_input_renders"; \
		exit 1; \
	fi; \
	echo "Clearing previous traces..."; \
	rm -rf e2e_playwright/test-results/traces; \
	mkdir -p e2e_playwright/test-results/traces; \
	echo "Running test with tracing: $$TEST_ARG"; \
	(cd e2e_playwright && pytest $$TEST_ARG --tracing=on --output=test-results/traces || true); \
	echo ""; \
	echo "Launching trace viewer..."; \
	TRACE_FILE=$$(find e2e_playwright/test-results/traces -name "trace.zip" -type f 2>/dev/null | head -n 1); \
	if [[ -n "$$TRACE_FILE" ]]; then \
		python -m playwright show-trace "$$TRACE_FILE"; \
	else \
		echo "No trace file found. Check e2e_playwright/test-results/traces/ directory."; \
	fi

.PHONY: lighthouse-tests # Run Lighthouse performance tests.
lighthouse-tests:
	cd frontend/app; \
	yarn run lighthouse:run

.PHONY: bare-execution-tests # Run all e2e tests in bare mode.
bare-execution-tests:
	PYTHONPATH=. \
	python3 scripts/run_bare_execution_tests.py

.PHONY: cli-smoke-tests # Run CLI smoke tests.
cli-smoke-tests:
	python3 scripts/cli_smoke_tests.py

.PHONY: autofix # Autofix linting and formatting errors.
autofix:
	# Python fixes:
	make python-format
	ruff check --fix
	# JS fixes:
	make frontend-init
	make frontend-format
	cd frontend/ ; yarn workspaces foreach --all run lint --fix
	# Dedupe yarn.lock
	cd frontend ; yarn dedupe
	# Other fixes:
	make update-notices
	# Run all pre-commit fixes, but don't fail if any of them don't work.
	pre-commit run --all-files --hook-stage manual || true

.PHONY: package # Create Python wheel files in `dist/`.
package: init frontend
	# Get rid of the old build and dist folders to make sure that we clean old js and css.
	rm -rfv lib/build lib/dist
	cd lib ; python3 setup.py bdist_wheel sdist

.PHONY: conda-package # Create conda distribution files.
conda-package: init
	if [ "${SNOWPARK_CONDA_BUILD}" = "1" ] ; then \
		echo "Creating Snowpark conda build, so skipping building frontend assets."; \
	else \
		make frontend; \
	fi
	rm -rf lib/conda-recipe/dist
	mkdir lib/conda-recipe/dist
	# This can take upwards of 20 minutes to complete in a fresh conda installation! (Dependency solving is slow.)
	# NOTE: Running the following command requires both conda and conda-build to be installed.
	GIT_HASH=$$(git rev-parse --short HEAD) conda build lib/conda-recipe --output-folder lib/conda-recipe/dist
streamlit/README.md
<br>
<img src="https://user-images.githubusercontent.com/7164864/217935870-c0bc60a3-6fc0-4047-b011-7b4c59488c91.png" alt="Streamlit logo" style="margin-top:50px"></img>

# Welcome to Streamlit 👋

**A faster way to build and share data apps.**

## What is Streamlit?

Streamlit lets you transform Python scripts into interactive web apps in minutes, instead of weeks. Build dashboards, generate reports, or create chat apps. Once you’ve created an app, you can use our [Community Cloud platform](https://streamlit.io/cloud) to deploy, manage, and share your app.

### Why choose Streamlit?

- **Simple and Pythonic:** Write beautiful, easy-to-read code.
- **Fast, interactive prototyping:** Let others interact with your data and provide feedback quickly.
- **Live editing:** See your app update instantly as you edit your script.
- **Open-source and free:** Join a vibrant community and contribute to Streamlit's future.

## Installation

Open a terminal and run:

```bash
$ pip install streamlit
$ streamlit hello
```

If this opens our sweet _Streamlit Hello_ app in your browser, you're all set! If not, head over to [our docs](https://docs.streamlit.io/get-started) for specific installs.

The app features a bunch of examples of what you can do with Streamlit. Jump to the [quickstart](#quickstart) section to understand how that all works.

<img src="https://user-images.githubusercontent.com/7164864/217936487-1017784e-68ec-4e0d-a7f6-6b97525ddf88.gif" alt="Streamlit Hello" width=500 href="none"></img>

## Quickstart

### A little example

Create a new file named `streamlit_app.py` in your project directory with the following code:

```python
import streamlit as st

x = st.slider("Select a value")
st.write(x, "squared is", x * x)
```

Now run it to open the app!

```
$ streamlit run streamlit_app.py
```

<img src="https://user-images.githubusercontent.com/7164864/215172915-cf087c56-e7ae-449a-83a4-b5fa0328d954.gif" width=300 alt="Little example"></img>

### Give me more!
Streamlit comes with [a ton of additional powerful elements](https://docs.streamlit.io/develop/api-reference) to spice up your data apps and delight your viewers. Some examples:

<table border="0">
  <tr>
    <td>
      <a target="_blank" href="https://docs.streamlit.io/develop/api-reference/widgets">
        <img src="https://user-images.githubusercontent.com/7164864/217936099-12c16f8c-7fe4-44b1-889a-1ac9ee6a1b44.png" style="max-height:150px; width:auto; display:block;">
      </a>
    </td>
    <td>
      <a target="_blank" href="https://docs.streamlit.io/develop/api-reference/data/st.dataframe">
        <img src="https://user-images.githubusercontent.com/7164864/215110064-5eb4e294-8f30-4933-9563-0275230e52b5.gif" style="max-height:150px; width:auto; display:block;">
      </a>
    </td>
    <td>
      <a target="_blank" href="https://docs.streamlit.io/develop/api-reference/charts">
        <img src="https://user-images.githubusercontent.com/7164864/215174472-bca8a0d7-cf4b-4268-9c3b-8c03dad50bcd.gif" style="max-height:150px; width:auto; display:block;">
      </a>
    </td>
    <td>
      <a target="_blank" href="https://docs.streamlit.io/develop/api-reference/layout">
        <img src="https://user-images.githubusercontent.com/7164864/217936149-a35c35be-0d96-4c63-8c6a-1c4b52aa8f60.png" style="max-height:150px; width:auto; display:block;">
      </a>
    </td>
    <td>
      <a target="_blank" href="https://docs.streamlit.io/develop/concepts/multipage-apps">
        <img src="https://user-images.githubusercontent.com/7164864/215173883-eae0de69-7c1d-4d78-97d0-3bc1ab865e5b.gif" style="max-height:150px; width:auto; display:block;">
      </a>
    </td>
    <td>
      <a target="_blank" href="https://streamlit.io/gallery">
        <img src="https://user-images.githubusercontent.com/7164864/215109229-6ae9111f-e5c1-4f0b-b3a2-87a79268ccc9.gif" style="max-height:150px; width:auto; display:block;">
      </a>
    </td>
  </tr>
  <tr>
    <td>Input widgets</td>
    <td>Dataframes</td>
    <td>Charts</td>
    <td>Layout</td>
    <td>Multi-page apps</td>
    <td>Fun</td>
  </tr>
</table>

Our vibrant creators community also extends Streamlit capabilities using 🧩
[Streamlit Components](https://streamlit.io/components).

## Get inspired

There's so much you can build with Streamlit:

- 🤖 [LLMs & chatbot apps](https://streamlit.io/gallery?category=llms)
- 🧬 [Science & technology apps](https://streamlit.io/gallery?category=science-technology)
- 💬 [NLP & language apps](https://streamlit.io/gallery?category=nlp-language)
- 🏦 [Finance & business apps](https://streamlit.io/gallery?category=finance-business)
- 🗺 [Geography & society apps](https://streamlit.io/gallery?category=geography-society)
- and more!

**Check out [our gallery!](https://streamlit.io/gallery)** 🎈

## Community Cloud

Deploy, manage, and share your apps for free using our [Community Cloud](https://streamlit.io/cloud)! Sign up [here](https://share.streamlit.io/signup).

<br><br>

<img src="https://user-images.githubusercontent.com/7164864/214965336-64500db3-0d79-4a20-8052-2dda883902d2.gif" width="400"></img>

## Resources

- Explore our [docs](https://docs.streamlit.io) to learn how Streamlit works.
- Ask questions and get help in our [community forum](https://discuss.streamlit.io).
- Read our [blog](https://blog.streamlit.io) for tips from developers and creators.
- Extend Streamlit's capabilities by installing or creating your own [Streamlit Components](https://streamlit.io/components).
- Help others find and play with your app by using the Streamlit GitHub badge in your repository:

```markdown
[![Streamlit App](https://static.streamlit.io/badges/streamlit_badge_black_white.svg)](URL_TO_YOUR_APP)
```

[![Streamlit App](https://static.streamlit.io/badges/streamlit_badge_black_white.svg)](https://share.streamlit.io/streamlit/roadmap)

## Contribute

🎉 Thanks for your interest in helping improve Streamlit! 🎉 Before contributing, please read our guidelines here: https://github.com/streamlit/streamlit/wiki/Contributing

## License

Streamlit is completely free, open-source, and licensed under the [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) license.
streamlit/SECURITY.md
# Security Policy

## Supported Versions

| Version   | Supported |
| --------- | --------- |
| >= 1.11.1 | ✅        |

<br>

Please refer to the Snowflake [HackerOne program](https://hackerone.com/snowflake?type=team) for our security policies and for reporting any security vulnerabilities.
streamlit/agent-knowledge/INDEX.md
---
last_updated: 2025-11-05
---

# Agent Knowledge Base Index

Central registry of resources available to AI agents working on Streamlit.

## Available Resources

### [Processes](processes/)

Development workflows and procedures:

- [pr-creation/](processes/pr-creation/) - Complete PR creation workflow with guidance on filling `.github/pull_request_template.md` (see README.md for details)

### [Guides](references/guides/)

System overviews and comprehensive documentation (e.g., layout system, caching, state management)

### [Features](references/features/)

Feature development artifacts (specs + implementation plans) organized by area. Not tracked in git.

## Contributing

See [README.md](README.md#contributing) for contribution guidelines.

## See Also

- [README.md](README.md) - Purpose and usage guide for this directory
- [../AGENTS.md](../AGENTS.md) - Top-level agent repo overview

---
streamlit/agent-knowledge/README.md
# Agent Knowledge Base

This directory contains documentation and guides designed for AI agents working on the Streamlit codebase.

## Purpose

Tool-agnostic knowledge base for AI-assisted development. Works with any AI tool (Cursor, Cline, Aider, etc.) and any task type (commands, pipelines, ad-hoc prompts).

## Quick Start

📖 **[See INDEX.md for a catalog of available resources](INDEX.md)**

## How to Use

1. **Starting a task?** Check [INDEX.md](INDEX.md) for relevant resources
2. **Creating a PR?** Use `.github/pull_request_template.md` and fill it in according to [processes/pr-creation/](processes/pr-creation/)
3. **Need help with a specific workflow?** Browse the [processes/](processes/) directory

## Contributing

### Local Experimentation

Keep files local while developing by using these patterns (automatically ignored by git):

- `*.local.md` - Individual local files (e.g., `draft-guide.local.md`, `notes.local.md`)
- `local/` - Directory for local experiments and work in progress

### Adding Shared Resources

To add a resource for the team:

1. Create the resource following existing patterns (if any exist)
2. Add YAML frontmatter to set team expectations:

   ```yaml
   ---
   status: stable | experimental
   last_updated: YYYY-MM-DD
   ---
   ```

   - `status: experimental` - Workflow being developed, team feedback welcome
   - `status: stable` - Established, reviewed workflow

3. Update INDEX.md to make it discoverable

## Relationship to Other Agent Resources

| Resource                    | When Loaded                 | Scope                   | Tool Support    |
| --------------------------- | --------------------------- | ----------------------- | --------------- |
| **AGENTS.md**               | Every prompt (always-on)    | Succinct, universal     | All tools       |
| **agent-knowledge/** (here) | On-demand (when referenced) | Detailed, task-specific | All tools       |
| **.cursor/commands/**       | Executed by user            | Executable workflows    | Cursor-specific |

**Key Distinctions:**

- **AGENTS.md**: Always injected → must be brief
- **agent-knowledge/**: Referenced on-demand → can be comprehensive
- **.cursor/commands/**: Executable workflows → tool-specific
streamlit/agent-knowledge/processes/pr-creation/README.md
---
status: stable
last_updated: 2025-11-19
---

# PR Creation Process

Reference materials for creating pull requests in the Streamlit repository.

## The PR Template

**Location**: `.github/pull_request_template.md` (canonical source)

## Available Guides

- [writing-principles.md](./writing-principles.md) - General writing style (brevity, commit messages, PR titles)
- [describe-changes-guide.md](./describe-changes-guide.md) - How to describe what changed (selectivity, what to include/omit)
- [testing-plan-guide.md](./testing-plan-guide.md) - How to document tests in PRs
- [branch-naming.md](./branch-naming.md) - Branch naming conventions
- [labeling-guide.md](./labeling-guide.md) - Required PR labels

## See Also

- [../../INDEX.md](../../INDEX.md) - Agent knowledge base index
- [.github/pull_request_template.md](../../../.github/pull_request_template.md) - The canonical PR template
streamlit/agent-knowledge/processes/pr-creation/branch-naming.md
---
status: experimental
last_updated: 2025-11-05
---

# Branch Naming - Best Practices

## Format

```
{type}/{brief-description}
```

## Types

- `feature` - New features or functionality
- `fix` - Bug fixes
- `refactor` - Code refactoring
- `chore` - Maintenance tasks (dependencies, tooling, etc.)
- `docs` - Documentation changes

## Guidelines

**Good branch names:**

- Descriptive and specific (3-6 words typical, up to 8 for complex changes)
- Use kebab-case (hyphens between words)
- Clearly indicate what is being changed and why
- Include the component/area being modified when helpful
- Avoid ticket/issue numbers (use the PR description for that)

**Format patterns:**

- `{type}/{action}-{component}-{detail}` - e.g., `feature/add-height-plotly-charts`
- `{type}/{area}-{specific-change}` - e.g., `fix/dataframe-memory-leak-scrolling`
- `{type}/{what-is-changing}` - e.g., `refactor/arrow-table-conversion-logic`

## Examples

- `feature/add-height-parameter-plotly-charts` - Adding a new parameter to a specific component
- `fix/dataframe-memory-leak-large-datasets` - Fixing a specific bug with context
- `refactor/element-width-height-logic` - Refactoring a specific area of the codebase
- `chore/update-react-dependencies` - Maintenance task with a specific scope
- `docs/api-reference-layout-parameters` - Documentation update for a specific section
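The convention above can also be checked mechanically. A minimal sketch (the `BRANCH_RE` pattern and `is_valid_branch_name` helper are hypothetical illustrations, not tooling that exists in the repo):

```python
import re

# Matches {type}/{brief-description} with an allowed type prefix and a
# kebab-case description (lowercase words separated by hyphens).
BRANCH_RE = re.compile(
    r"^(feature|fix|refactor|chore|docs)/"  # allowed types
    r"[a-z0-9]+(-[a-z0-9]+)*$"              # kebab-case description
)


def is_valid_branch_name(name: str) -> bool:
    """Return True if the branch name follows the convention above."""
    return BRANCH_RE.fullmatch(name) is not None
```

For example, `is_valid_branch_name("feature/add-height-parameter-plotly-charts")` passes, while camel-case or underscore-separated names do not.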
streamlit/agent-knowledge/processes/pr-creation/describe-changes-guide.md
---
status: stable
last_updated: 2025-12-02
---

# Guide: Describing Changes

Use this guide for PR template sections that ask what changed (e.g., "Describe your changes").

## Content Guidelines

**Keep it brief:** 2-4 bullets maximum for listing changes.

**List only impactful changes** - not every file touched.

**Omit obvious details** - don't explain what's clear from reading the code.

**Explain non-obvious decisions** - only include implementation details that aren't obvious.

## Selectivity Checklist

Before including a change, ask:

1. **Is this obvious?** → Omit it
2. **Is this a standard pattern?** (tests, types, validation) → Omit it
3. **Is this the most impactful change?** → Include it
4. **Does this involve a non-obvious decision?** → Include it with explanation

## Example: Feature Adding a Parameter

**What to include:**

- ✓ New parameter added (the main change)
- ✓ Deprecation of old parameter (impacts users)
- ✓ Non-obvious behavior (special handling, fallbacks)

**What to omit:**

- ✗ Added tests (obvious)
- ✗ Updated types (obvious)
- ✗ Added validation (obvious)
- ✗ Updated proto (implementation detail)
- ✗ Fixed linting (housekeeping)

## Good vs Bad Examples

**Good (selective, highlights what matters):**

> Adds `height` parameter to `st.plotly_chart()` using `Height` type system.
>
> - Added `height` parameter with default `"stretch"`
> - Deprecates `use_container_height` (removed after 2025-12-31)

**Bad (lists every change):**

> - Added `height: Height = "stretch"` parameter to st.plotly_chart signature
> - Updated layout config dataclass to accept height parameter
> - Added validation for height parameter values
> - Updated proto message to include height field
> - Added unit tests for height parameter
> - Added E2E tests for height visual behavior
> - Updated type hints in plotly_chart.py

**Why bad:** Most of these are obvious. Only list what's non-obvious or impactful.

## Implementation Notes

If the change involves non-obvious behavior, add a brief explanation:

**Good:**

> When `height="content"`, extracts height from the native Plotly figure if specified, falls back to `"stretch"` otherwise.

**Bad:**

> The height parameter is validated to ensure only valid values are accepted.

**Why bad:** This is obvious - of course parameters are validated.
streamlit/agent-knowledge/processes/pr-creation/labeling-guide.md
---
status: experimental
last_updated: 2025-11-05
---

# PR Labeling Guide

All Streamlit PRs must have the following labels applied.

## Required Labels

### 1. Security Assessment

- `security-assessment-completed` - Required for all PRs

### 2. Impact Classification

Choose **one**:

- `impact:users` - Changes will affect behavior for users
- `impact:internal` - Changes will not affect user behavior

### 3. Change Type

Choose **one**:

- `change:feature` - New features or feature enhancements
- `change:bugfix` - Bug fixes
- `change:chore` - Small changes for repo maintenance
- `change:refactor` - Refactoring changes to improve code quality
- `change:other` - Things that don't fit other categories
- `change:docs` - Documentation updates, e.g. docstring-only changes

## Label Combination Examples

**For new features:**

- `security-assessment-completed`
- `impact:users` (changes user-facing API)
- `change:feature` (adds new functionality)

**For internal refactoring:**

- `security-assessment-completed`
- `impact:internal` (no user behavior change)
- `change:refactor` (improves code quality)

**For bug fixes:**

- `security-assessment-completed`
- `impact:users` (fixes user-facing issue)
- `change:bugfix` (fixes a bug)

**For documentation updates:**

- `security-assessment-completed`
- `impact:internal` (no behavior change)
- `change:docs` (documentation)
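The combinations above follow a fixed shape (one security label, one impact label, one change label), so an agent could assemble them mechanically. A sketch in the spirit of the detection-logic examples elsewhere in these guides; `pr_labels` is a hypothetical helper, not repo tooling:

```python
def pr_labels(impact_users: bool, change_type: str) -> list[str]:
    """Assemble the three required labels described above as plain strings."""
    allowed = {"feature", "bugfix", "chore", "refactor", "other", "docs"}
    if change_type not in allowed:
        raise ValueError(f"Unknown change type: {change_type!r}")
    return [
        "security-assessment-completed",          # required for all PRs
        "impact:users" if impact_users else "impact:internal",
        f"change:{change_type}",
    ]
```

For example, a new user-facing feature would get `pr_labels(True, "feature")`.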
streamlit/agent-knowledge/processes/pr-creation/testing-plan-guide.md
---
status: experimental
last_updated: 2025-11-05
---

# How to Fill In the Testing Plan

Guide for documenting tests in PR descriptions.

## Detect Test Changes in Git Diff

Check for modified/added test files:

**Python unit tests:**

- Pattern: `lib/tests/**/*.py`
- Example: `lib/tests/streamlit/elements/plotly_chart_test.py`

**Frontend unit tests:**

- Pattern: `frontend/**/*.test.ts` or `frontend/**/*.test.tsx`
- Example: `frontend/lib/src/components/elements/PlotlyChart/PlotlyChart.test.tsx`

**E2E tests:**

- Pattern: `e2e_playwright/**/*_test.py`
- Example: `e2e_playwright/st_plotly_chart_test.py`

## Fill In PR Template Checklist

Based on files changed:

```markdown
- [x] Unit Tests (JS and/or Python) - If lib/tests/ or frontend/**/*.test.* files changed
- [x] E2E Tests - If e2e_playwright/ files changed
- [ ] Manual testing completed - Leave unchecked (user will complete if applicable)
- [x] Explanation of why no additional tests are needed - If no test files changed
```

## Describe Testing in PR

**If tests were added/modified:** List the test files and what they cover:

```markdown
**Testing:**

- `lib/tests/streamlit/elements/plotly_chart_test.py` - Tests height parameter functionality
- `e2e_playwright/st_plotly_chart_test.py` - Visual regression tests for height behavior
```

**If no tests needed:** Explain why:

```markdown
**No Additional Tests:**

Documentation-only changes, no behavior modifications.
```

## Detection Logic for AI Agents

```python
# Pseudo-code for detecting test types from git diff
has_python_tests = any("lib/tests/" in file for file in changed_files)
has_frontend_tests = any(file.endswith(('.test.ts', '.test.tsx')) for file in changed_files)
has_e2e_tests = any("e2e_playwright/" in file for file in changed_files)

# Check boxes based on what can be detected
checklist = {
    "unit_tests": has_python_tests or has_frontend_tests,
    "e2e_tests": has_e2e_tests,
    "manual_testing": False,  # Always leave unchecked - user fills in
    "no_tests_needed": not (has_python_tests or has_frontend_tests or has_e2e_tests),
}
```
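The pseudo-code above is nearly runnable Python already; wrapped into a reusable helper it might look like the following sketch (the function name is illustrative, not actual repo tooling):

```python
from typing import Iterable


def detect_test_checklist(changed_files: Iterable[str]) -> dict[str, bool]:
    """Map a list of changed file paths onto the PR template checklist."""
    files = list(changed_files)
    has_python_tests = any("lib/tests/" in f for f in files)
    has_frontend_tests = any(f.endswith((".test.ts", ".test.tsx")) for f in files)
    has_e2e_tests = any("e2e_playwright/" in f for f in files)
    return {
        "unit_tests": has_python_tests or has_frontend_tests,
        "e2e_tests": has_e2e_tests,
        "manual_testing": False,  # always left for the PR author
        "no_tests_needed": not (has_python_tests or has_frontend_tests or has_e2e_tests),
    }
```

For example, a diff touching `lib/tests/streamlit/elements/plotly_chart_test.py` and `e2e_playwright/st_plotly_chart_test.py` would check both the unit-test and e2e-test boxes.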
streamlit/agent-knowledge/processes/pr-creation/writing-principles.md
---
status: stable
last_updated: 2025-12-02
---

# Writing Principles for PRs

**Core principle: Highlight what matters. Omit the obvious.**

Don't list every change - focus on the most impactful. Don't explain what's obvious from reading the code - only explain non-obvious decisions.

## Commit Messages

**Format:**

```
<imperative verb> <what> <where>

Optional body with technical details.
```

**Rules:**

- First line: ≤50 characters
- Use imperative mood ("Add" not "Added" or "Adds")
- No periods at the end of the first line
- Body: ≤72 characters per line (if needed)

**Good examples:**

```
Add height parameter to plotly charts
Fix memory leak in dataframe scrolling
Refactor layout config validation logic
```

**Bad examples (too verbose):**

```
✗ Added a new height parameter feature to the plotly chart component to enable users to control chart dimensions
✗ This commit fixes the memory leak that was occurring when users scrolled through large dataframes
```

## PR Titles

**Format:**

```
[type] lowercase description of change
```

**Rules:**

- Start with the change type in brackets: `[feature]`, `[fix]`, `[refactor]`, `[chore]`, `[docs]`
- ≤80 characters total
- Lowercase after the bracket
- Descriptive, not marketing
- Match commit message content if single commit

**Good examples:**

```
[feature] add height parameter to plotly charts
[fix] extra padding on button
[refactor] layout config validation logic
[chore] update dependencies
[docs] clarify st.cache_data usage
```

**Bad examples:**

```
✗ Exciting new feature: height parameter support for beautiful plotly charts!
✗ This PR fixes a critical memory leak issue that users were experiencing
✗ Add height parameter (missing [type] prefix)
```

## General Content Principles

**What NOT to include:**

- ✗ "Added tests" (obvious)
- ✗ "Updated type hints" (obvious)
- ✗ "Added validation" (obvious)
- ✗ "Updated documentation" (obvious)
- ✗ "Fixed linting errors" (obvious)

**Don't explain obvious behavior:**

- ✗ "Parameters are validated to ensure correctness"
- ✗ "Added error handling for edge cases"
- ✗ "Code follows existing patterns"

**DO explain non-obvious decisions:**

- ✓ "Deprecates `use_container_height` (removed after 2025-12-31)"
- ✓ "When `height="content"`, extracts from native figure if specified"
- ✓ "Uses `rem` units instead of `px` for responsive sizing"

## No Meta-Commentary

Skip phrases like:

- "This PR..."
- "We have..."
- "I added..."

Just state what changed directly.
streamlit/agent-knowledge/references/guides/streamlit-layout-feature.md
--- status: stable last_updated: 2025-11-05 --- # Overview This document includes details of the implementation of the layout feature for Streamlit. The layout feature includes: - Width parameters for widgets and elements. - Height parameters for widgets and elements. - A container element for flexbox layouts (st.container). The layout system is still under development and some elements may still use an older style or may not have all of the described features implemented. # Description of Expected Behaviour ## Width All Streamlit elements have a width parameter that allows users developing layouts to configure its width. Elements support a subset of the modes or all of them. There are three modes: 1. Stretch When the width on an element is set to "stretch", the element should expand to fill available horizontal space according to these rules: - The element's display width should not exceed the width of its parent container. Examples: ```python import streamlit as st import numpy as np from numpy import typing as npt img: npt.NDArray[np.int64] = np.repeat(0, 75000).reshape(300, 250) with st.container(horizontal=True, key="horizontal_parent_container", width=300): # The width of this image should expand to fill the horizontal_parent_container (minus some padding). # The width of this image should never be more than 300px because that is the width of the parent. st.image(img, width="stretch") ``` - When the element is in a row inside a horizontal container, space should be shared with other elements. Examples: ```python import streamlit as st import numpy as np from numpy import typing as npt img: npt.NDArray[np.int64] = np.repeat(0, 75000).reshape(300, 250) with st.container(horizontal=True, key="horizontal_parent_container"): # The image and the markdown element should share the space equally. At typical screen widths, we expect to see # both elements on one row. st.image(img, width="stretch") # st.markdown will have internal whitespace on wide screens. 
st.markdown("HELLO DARLING", width="stretch") ``` 2. Content When the width on an element is set to "content", the width of the element should be based on the contents of the element. 3. Integer When an integer width is provided, the element will be that width in pixels. ## Height All Streamlit elements have a height parameter that allows users developing layouts to configure its height. Elements support a subset of the available modes or all of them. The modes are: 1. Stretch When the height on an element is set to "stretch", the element should expand to fill available vertical space according to these rules: - The element's display height should not exceed the height of its parent container. Examples: ```python with st.container(key="vertical_parent_container", height=300): # The height of the text area should expand to fill the parent container (minus some padding). # The height of the text area should never be more than 300px because that is the height of the parent container. st.text_area("enter your message here", height="stretch") ``` - When the element is in a column inside a vertical container, space should be shared with other elements. Examples: ```python with st.container(key="vertical_parent_container", height=400): # The text area and code block should share the vertical space equally. st.text_area("Enter your message here", height="stretch") # The code block will also stretch to share remaining space. st.code("print('Additional content below')", height="stretch") ``` 2. Content When the height on an element is set to "content", the height of the element should be based on the contents of the element. 3. Integer When an integer height is provided, the element will be that height in pixels. 4. Auto Some elements have this mode which indicates customized behavior for the element. This mode is only available for specific elements that display data tables. Examples: - `st.dataframe` and `st.data_editor` support this height mode. 
When `height="auto"` (the default), these elements automatically size their height to show at most 10 rows of data, optimizing the display for the dataset size. # Technical Implementation Details ## Architecture Flow - Python API → Proto messages → Frontend CSS **Key Implementation Layers:** - **Python**: `lib/streamlit/elements/`, `lib/streamlit/delta_generator.py` - **Proto**: `proto/streamlit/proto/{Element,WidthConfig,HeightConfig}.proto` - **Frontend**: `frontend/lib/src/components/core/{Block,Layout}/` (ElementNodeRenderer, StyledElementContainerLayoutWrapper, useLayoutStyles) ## Python Layer The API is provided to the user in the python function corresponding to the element. Example: ```python def metric( # ... other parameters ... *, # ... other keyword-only parameters ... width: Width = "stretch", height: Height = "content", # ... other keyword-only parameters ... ) -> DeltaGenerator: ``` The width and height are validated using common utility functions, then a LayoutConfig object is created and provided to the `_enqueue` method on the Delta Generator. validate_height(height, allow_content=True) validate_width(width, allow_content=True) layout_config = LayoutConfig(width=width, height=height) return self.dg._enqueue("metric", metric_proto, layout_config=layout_config) In `_enqueue`, the layout config is converted to proto messages: ```python if layout_config: if layout_config.height is not None: msg.delta.new_element.height_config.CopyFrom( get_height_config(layout_config.height) ) if layout_config.width is not None: msg.delta.new_element.width_config.CopyFrom( get_width_config(layout_config.width) ) ``` The `get_height_config` and `get_width_config` utility functions convert the Python string/int values into the appropriate proto message structure. ## Proto Messages Proto messages communicate layout preferences from Python to frontend. 
The layout system uses three related messages:

```protobuf
// Element contains layout config fields
message Element {
  optional streamlit.HeightConfig height_config = 57;
  optional streamlit.WidthConfig width_config = 58;
  // ... other element types ...
}

// Layout configuration messages (see CSS conversion table in useLayoutStyles section)
message WidthConfig {
  oneof width_spec {
    bool use_stretch = 1;
    bool use_content = 2;
    uint32 pixel_width = 3;
    float rem_width = 4; // Used for literal sizes (e.g., st.space)
  }
}

message HeightConfig {
  oneof height_spec {
    bool use_stretch = 1;
    bool use_content = 2;
    uint32 pixel_height = 3;
    float rem_height = 4; // Used for literal sizes (e.g., st.space)
  }
}
```

## Frontend Styling

Element node rendering is performed in `ElementNodeRenderer.tsx`. During rendering, each element is wrapped in a `StyledElementContainerLayoutWrapper`, and the layout styles are primarily applied to this wrapper container. Some elements require further styling in their React components to implement the different height/width modes, but keeping the styling in the shared layers is preferred.

Styles are computed in the `useLayoutStyles` hook based on the information provided in the proto messages. This hook also implements backwards-compatibility logic for previous versions of the proto messages.

### Element Rendering Flow

In `ElementNodeRenderer.tsx`, the layout config is passed to elements and they are wrapped in the layout container:

```typescript
// Layout config passed to elements for custom internal styling
const elementProps = {
  widthConfig: node.element.widthConfig,
  heightConfig: node.element.heightConfig,
  // ... other props
}

// Elements wrapped in layout wrapper
<StyledElementContainerLayoutWrapper node={node}>
  <RawElementNodeRenderer {...props} />
</StyledElementContainerLayoutWrapper>
```

### Layout Wrapper Implementation

The `StyledElementContainerLayoutWrapper` applies element categories and overrides:

```typescript
// Element categories determine minimum widths in horizontal layouts
if (LARGE_STRETCH_BEHAVIOR.includes(node.element.type)) {
  minStretchBehavior = "14rem"; // Complex elements: charts, dataframes, media
} else if (MEDIUM_STRETCH_BEHAVIOR.includes(node.element.type)) {
  minStretchBehavior = "8rem"; // Form inputs: textInput, selectbox, slider
}
// Default: "fit-content" - elements shrink to natural content size

// Element-specific overrides (example: textArea stretch)
if (
  node.element.type === "textArea" &&
  node.element.heightConfig?.useStretch
) {
  styleOverrides = { height: "100%", flex: "1 1 8rem" };
}

// Apply layout styles
const styles = useLayoutStyles({
  element,
  styleOverrides,
  minStretchBehavior,
});
```

**Element Categories:**

- **LARGE_STRETCH_BEHAVIOR** (14rem minimum): Complex elements like `arrowDataFrame`, `plotlyChart`, `graphvizChart`, `video`, `fileUploader`, etc.
- **MEDIUM_STRETCH_BEHAVIOR** (8rem minimum): Form inputs like `textInput`, `selectbox`, `slider`, `textArea`, `numberInput`, etc.

These categories control the minimum flex-basis in horizontal layouts to prevent elements from shrinking too small when sharing space. They also provide min-width protection when elements are inside content-width containers.
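The category lookup described above can be modeled in Python. This is a sketch of the TypeScript logic in `StyledElementContainerLayoutWrapper`, not a literal port, and the element-type sets are abbreviated examples rather than the full lists:

```python
# Abbreviated example sets; the real lists live in the frontend and are longer.
LARGE_STRETCH_BEHAVIOR = {
    "arrowDataFrame", "plotlyChart", "graphvizChart", "video", "fileUploader",
}
MEDIUM_STRETCH_BEHAVIOR = {
    "textInput", "selectbox", "slider", "textArea", "numberInput",
}


def min_stretch_behavior(element_type: str) -> str:
    """Minimum flex-basis an element of this type gets in horizontal layouts."""
    if element_type in LARGE_STRETCH_BEHAVIOR:
        return "14rem"  # complex elements: charts, dataframes, media
    if element_type in MEDIUM_STRETCH_BEHAVIOR:
        return "8rem"  # form inputs
    return "fit-content"  # default: shrink to natural content size


assert min_stretch_behavior("plotlyChart") == "14rem"
assert min_stretch_behavior("selectbox") == "8rem"
assert min_stretch_behavior("markdown") == "fit-content"
```

The two-tier lookup keeps the policy in one place: adding a new element type only means adding it to the right set.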
### useLayoutStyles Hook

The `useLayoutStyles` hook (in `useLayoutStyles.ts`) converts proto config to CSS properties using these core patterns:

**Complete Proto to CSS Conversion (Default Behavior):**

| Proto Field                | Default CSS Properties                       | Context                                               |
| -------------------------- | -------------------------------------------- | ----------------------------------------------------- |
| `widthConfig.useStretch`   | `width: "100%"` + `flex: "1 1 ${minWidth}"`  | Horizontal layouts apply flex with element categories |
| `widthConfig.useContent`   | `width: "fit-content"`                       | Element shrinks to natural content size               |
| `widthConfig.pixelWidth`   | `width: "${pixels}px"`                       | Fixed width in pixels                                 |
| `heightConfig.useStretch`  | `height: "100%"` + `flex: "1 1 auto"`        | Vertical layouts apply flex properties                |
| `heightConfig.useContent`  | `height: "auto"`                             | Element uses natural content height                   |
| `heightConfig.pixelHeight` | `height: "${pixels}px"` + `overflow: "auto"` | Fixed height with scroll if needed                    |

**Min-Width Protection in Content-Width Containers:**

When an element with `width="stretch"` is inside a content-width container (tracked via `FlexContext.isInContentWidthContainer`), a min-width is automatically applied using the `minStretchBehavior` value. This prevents elements from becoming too narrow when the container shrinks to fit its content. The min-width respects parent container constraints (via `calculateMinWidthWithParentConstraint`) to avoid overflow issues.

**Element-Specific Overrides:**

Some elements modify these defaults in `StyledElementContainerLayoutWrapper` (see the Layout Wrapper examples above for specific cases like `textArea` stretch mode).

**Backwards Compatibility:**

Older proto formats are supported through the `subElement` parameter for cached messages.

# Debugging

## Common Failure Modes

1. Height not stretching correctly: the element height does not fill the parent container height and instead fits the content.

   Look for:

   - HTML elements interior to the component that may need `height: 100%` to stretch.
   - If it is a graph, the container height may need to be provided to the graphing library. The `useCalculatedDimensions` hook (in `frontend/lib/src/hooks/useCalculatedDimensions.ts`) can be used to measure the container height.
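The proto-to-CSS conversion table in the useLayoutStyles section can be condensed into a small pure function. Below is a sketch with plain dicts standing in for the proto messages; the function name is hypothetical and the flex interplay between the two axes is simplified compared to the real hook:

```python
from typing import Optional


def layout_css(
    width_config: Optional[dict],
    height_config: Optional[dict],
    min_width: str = "fit-content",
) -> dict:
    """Model the default proto-to-CSS mapping from the conversion table."""
    css: dict = {}
    if width_config:
        if width_config.get("use_stretch"):
            css["width"] = "100%"
            css["flex"] = f"1 1 {min_width}"  # horizontal layouts apply flex
        elif width_config.get("use_content"):
            css["width"] = "fit-content"
        elif "pixel_width" in width_config:
            css["width"] = f"{width_config['pixel_width']}px"
    if height_config:
        if height_config.get("use_stretch"):
            css["height"] = "100%"  # vertical layouts also apply flex: 1 1 auto
        elif height_config.get("use_content"):
            css["height"] = "auto"
        elif "pixel_height" in height_config:
            css["height"] = f"{height_config['pixel_height']}px"
            css["overflow"] = "auto"  # fixed height scrolls if needed
    return css


assert layout_css({"use_stretch": True}, None, "8rem") == {
    "width": "100%",
    "flex": "1 1 8rem",
}
```

Modeling the mapping as a pure function of the two configs matches the hook's role: given the same proto fields, it always yields the same CSS.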
streamlit/component-lib/.yarnrc.yml
nodeLinker: node-modules
streamlit/component-lib/README.md
# Streamlit Component Library

An npm package that provides support code for creating [Streamlit Components](https://docs.streamlit.io/develop/concepts/custom-components).

The fastest way to start writing a Streamlit Component is to use our [Component Template repo](https://github.com/streamlit/component-template), which contains templates and example code.
streamlit/component-lib/RELEASE_NOTES.md
<!--
Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

# Release notes

# 2.0.0

This release has no significant changes to our API, but we bump the major version as the [`apache-arrow`](https://www.npmjs.com/package/apache-arrow) library is updated, which may affect users of the library. For details, see: [Apache Arrow Releases](https://arrow.apache.org/release/).

Moreover, it is worth adding that:

- The new version of `apache-arrow` requires a newer version of TypeScript to work, but thanks to that you can also use `create-react-script` 5 and newer and the latest versions of NodeJS.
- We dropped use of [`event-target-shim`](https://www.npmjs.com/package/event-target-shim) as modern browsers no longer need it.
## List of commits

- [`21e7beeae`](https://github.com/streamlit/streamlit/commit/21e7beeae) Bump dependencies of component-lib (#6830)
- [`1e6a3e45e`](https://github.com/streamlit/streamlit/commit/1e6a3e45e) Add tests for component-lib (#6580)
- [`e43f64c72`](https://github.com/streamlit/streamlit/commit/e43f64c72) fix: upgrade command-line-args from 5.0.2 to 5.2.1 (#6258)
- [`3bb2243ec`](https://github.com/streamlit/streamlit/commit/3bb2243ec) fix: upgrade flatbuffers from 1.11.0 to 1.12.0 (#6259)
- [`fe8fd4f5c`](https://github.com/streamlit/streamlit/commit/fe8fd4f5c) fix: upgrade multiple dependencies with Snyk (#6262)
- [`0dfd31940`](https://github.com/streamlit/streamlit/commit/0dfd31940) Update license headers (#5143)
- [`76859d67b`](https://github.com/streamlit/streamlit/commit/76859d67b) fix: Allow renderData.args to be typed (#5205)
- [`c8f2db61f`](https://github.com/streamlit/streamlit/commit/c8f2db61f) Fix typos (#5082)
- [`f85a0feac`](https://github.com/streamlit/streamlit/commit/f85a0feac) Fix build issues due to linting errors (#4637)
- [`a91272018`](https://github.com/streamlit/streamlit/commit/a91272018) Bump ansi-regex from 4.1.0 to 4.1.1 in /component-lib (#4558)
- [`d44b16290`](https://github.com/streamlit/streamlit/commit/d44b16290) Update years in all license headers (#4291)
streamlit/component-lib/jest.config.ts
/**
 * Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

/*
 * For a detailed explanation regarding each configuration property and type check, visit:
 * https://jestjs.io/docs/configuration
 */
export default {
  // All imported modules in your tests should be mocked automatically
  // automock: false,

  // Stop running tests after `n` failures
  // bail: 0,

  // The directory where Jest should store its cached dependency information
  // cacheDirectory: "/private/var/folders/_7/mm9vqb4d5jsbb9v9xsfxy4600000gn/T/jest_dx",

  // Automatically clear mock calls, instances, contexts and results before every test
  clearMocks: true,

  // Indicates whether the coverage information should be collected while executing the test
  collectCoverage: true,

  // An array of glob patterns indicating a set of files for which coverage information should be collected
  collectCoverageFrom: ["src/**/*.{ts,tsx}"],

  // The directory where Jest should output its coverage files
  coverageDirectory: "coverage",

  // An array of regexp pattern strings used to skip coverage collection
  // coveragePathIgnorePatterns: [
  //   "/node_modules/"
  // ],

  // Indicates which provider should be used to instrument code for coverage
  coverageProvider: "v8",

  // A list of reporter names that Jest uses when writing coverage reports
  // coverageReporters: [
  //   "json",
  //   "text",
  //   "lcov",
  //   "clover"
  // ],

  // An object that configures minimum threshold enforcement for coverage results
  // coverageThreshold: undefined,

  // A path to a custom dependency extractor
  // dependencyExtractor: undefined,

  // Make calling deprecated APIs throw helpful error messages
  // errorOnDeprecated: false,

  // The default configuration for fake timers
  // fakeTimers: {
  //   "enableGlobally": false
  // },

  // Force coverage collection from ignored files using an array of glob patterns
  // forceCoverageMatch: [],

  // A path to a module which exports an async function that is triggered once before all test suites
  // globalSetup: undefined,

  // A path to a module which exports an async function that is triggered once after all test suites
  // globalTeardown: undefined,

  // A set of global variables that need to be available in all test environments
  // globals: {},

  // The maximum amount of workers used to run your tests. Can be specified as % or a number. E.g. maxWorkers: 10% will use 10% of your CPU amount + 1 as the maximum worker number. maxWorkers: 2 will use a maximum of 2 workers.
  // maxWorkers: "50%",

  // An array of directory names to be searched recursively up from the requiring module's location
  // moduleDirectories: [
  //   "node_modules"
  // ],

  // An array of file extensions your modules use
  // moduleFileExtensions: [
  //   "js",
  //   "mjs",
  //   "cjs",
  //   "jsx",
  //   "ts",
  //   "tsx",
  //   "json",
  //   "node"
  // ],

  // A map from regular expressions to module names or to arrays of module names that allow to stub out resources with a single module
  // moduleNameMapper: {},

  // An array of regexp pattern strings, matched against all module paths before considered 'visible' to the module loader
  // modulePathIgnorePatterns: [],

  // Activates notifications for test results
  // notify: false,

  // An enum that specifies notification mode. Requires { notify: true }
  // notifyMode: "failure-change",

  // A preset that is used as a base for Jest's configuration
  // preset: undefined,

  // Run tests from one or more projects
  // projects: undefined,

  // Use this configuration option to add custom reporters to Jest
  // reporters: undefined,

  // Automatically reset mock state before every test
  // resetMocks: false,

  // Reset the module registry before running each individual test
  // resetModules: false,

  // A path to a custom resolver
  // resolver: undefined,

  // Automatically restore mock state and implementation before every test
  // restoreMocks: false,

  // The root directory that Jest should scan for tests and modules within
  // rootDir: undefined,

  // A list of paths to directories that Jest should use to search for files in
  // roots: [
  //   "<rootDir>"
  // ],

  // Allows you to use a custom runner instead of Jest's default test runner
  // runner: "jest-runner",

  // The paths to modules that run some code to configure or set up the testing environment before each test
  // setupFiles: [],

  // A list of paths to modules that run some code to configure or set up the testing framework before each test
  setupFilesAfterEnv: [
    "<rootDir>/src/setupTests.ts",
    "@testing-library/jest-dom/extend-expect",
  ],

  // The number of seconds after which a test is considered as slow and reported as such in the results.
  // slowTestThreshold: 5,

  // A list of paths to snapshot serializer modules Jest should use for snapshot testing
  // snapshotSerializers: [],

  // The test environment that will be used for testing
  testEnvironment: "jsdom",

  // Options that will be passed to the testEnvironment
  // testEnvironmentOptions: {},

  // Adds a location field to test results
  // testLocationInResults: false,

  // The glob patterns Jest uses to detect test files
  // testMatch: [
  //   "**/__tests__/**/*.[jt]s?(x)",
  //   "**/?(*.)+(spec|test).[tj]s?(x)"
  // ],

  // An array of regexp pattern strings that are matched against all test paths, matched tests are skipped
  // testPathIgnorePatterns: [
  //   "/node_modules/"
  // ],

  // The regexp pattern or array of patterns that Jest uses to detect test files
  // testRegex: [],

  // This option allows the use of a custom results processor
  // testResultsProcessor: undefined,

  // This option allows use of a custom test runner
  // testRunner: "jest-circus/runner",

  // A map from regular expressions to paths to transformers
  // transform: undefined,

  // An array of regexp pattern strings that are matched against all source file paths, matched files will skip transformation
  // transformIgnorePatterns: [
  //   "/node_modules/",
  //   "\\.pnp\\.[^\\/]+$"
  // ],

  // An array of regexp pattern strings that are matched against all modules before the module loader will automatically return a mock for them
  // unmockedModulePathPatterns: undefined,

  // Indicates whether each individual test should be reported during the run
  // verbose: undefined,

  // An array of regexp patterns that are matched against all source file paths before re-running tests in watch mode
  // watchPathIgnorePatterns: [],

  // Whether to use watchman for file crawling
  // watchman: true,
};
streamlit/component-lib/package.json
{
  "name": "streamlit-component-lib",
  "version": "2.0.0",
  "description": "Support code for Streamlit Components",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "files": ["dist"],
  "scripts": {
    "build": "tsc",
    "test": "jest ./src",
    "typesync": "typesync",
    "typesync:ci": "typesync --dry=fail"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/streamlit/streamlit.git"
  },
  "keywords": ["streamlit"],
  "author": "Streamlit",
  "license": "Apache-2.0",
  "bugs": {
    "url": "https://github.com/streamlit/streamlit/issues"
  },
  "homepage": "https://github.com/streamlit/streamlit#readme",
  "dependencies": {
    "apache-arrow": "^11.0.0",
    "hoist-non-react-statics": "^3.3.2",
    "react": "^16.14.0",
    "react-dom": "^16.14.0"
  },
  "devDependencies": {
    "@babel/preset-env": "^7.21.4",
    "@babel/preset-react": "^7.18.6",
    "@babel/preset-typescript": "^7.21.4",
    "@testing-library/jest-dom": "^5.16.5",
    "@testing-library/react": "<13",
    "@types/babel__preset-env": "~7.10.0",
    "@types/hoist-non-react-statics": "^3.3.1",
    "@types/jest": "^29.5.0",
    "@types/node": "^12.0.0",
    "@types/react": "^16.14.41",
    "@types/react-dom": "^16.9.24",
    "babel-jest": "^29.5.0",
    "jest": "^29.5.0",
    "jest-environment-jsdom": "^29.5.0",
    "ts-node": "^10.9.1",
    "typescript": "^5.0.4",
    "typesync": "^0.14.3"
  },
  "resolutions": {
    "@types/react": "^16.14.41",
    "@types/react-dom": "^16.9.24"
  },
  "packageManager": "yarn@4.5.3"
}
streamlit/component-lib/src/ArrowTable.test.ts
/**
 * Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import { ArrowTable } from "./ArrowTable";
import { EXAMPLE_DF } from "./mock_data";

const range = (startAt = 0, endAt = 0) =>
  Array(endAt - startAt)
    .fill(0)
    .map((_, i) => i + startAt);

describe("ArrowTable", () => {
  const table = new ArrowTable(
    EXAMPLE_DF.data,
    EXAMPLE_DF.index,
    EXAMPLE_DF.columns
  );

  test("basic getters should return values for basic table", () => {
    expect(table.rows).toEqual(6);
    expect(table.columns).toEqual(4);
    expect(table.headerRows).toEqual(1);
    expect(table.headerColumns).toEqual(1);
    expect(table.dataRows).toEqual(5);
    expect(table.dataColumns).toEqual(3);
    expect(table.uuid).toEqual(undefined);
    expect(table.caption).toEqual(undefined);
    expect(table.styles).toEqual(undefined);
    expect(table.table).toBeDefined();
    expect(table.index).toBeDefined();
    expect(table.columnTable).toBeDefined();
  });

  test.each([
    {
      rowIndex: 0,
      columnIndex: 0,
      expectedResult: {
        classNames: "blank",
        content: "",
        type: "blank",
      },
    },
    {
      rowIndex: 0,
      columnIndex: 1,
      expectedResult: {
        classNames: "col_heading level0 col0",
        content: "First Name",
        type: "columns",
      },
    },
    {
      rowIndex: 1,
      columnIndex: 0,
      expectedResult: {
        classNames: "row_heading level0 row0",
        content: BigInt(0),
        id: "T_undefinedlevel0_row0",
        type: "index",
      },
    },
    {
      rowIndex: 1,
      columnIndex: 1,
      expectedResult: {
        classNames: "data row0 col0",
        content: "Jason",
        id: "T_undefinedrow0_col0",
        type: "data",
      },
    },
    {
      rowIndex: 5,
      columnIndex: 3,
      expectedResult: {
        classNames: "data row4 col2",
        content: BigInt(73),
        id: "T_undefinedrow4_col2",
        type: "data",
      },
    },
  ])(
    "getCell should return cell metadata",
    ({ rowIndex, columnIndex, expectedResult }) => {
      expect(table.getCell(rowIndex, columnIndex)).toEqual(expectedResult);
    }
  );

  test("getCell should return cell content", () => {
    const cellContents = range(0, table.rows).map((rowIndex) =>
      range(0, table.columns).map(
        (columnIndex) => table.getCell(rowIndex, columnIndex).content
      )
    );

    expect(cellContents).toEqual([
      ["", "First Name", "Last Name", "Age"],
      [BigInt(0), "Jason", "Miller", BigInt(42)],
      [BigInt(1), "Molly", "Jacobson", BigInt(52)],
      [BigInt(2), "Tina", "Ali", BigInt(36)],
      [BigInt(3), "Jake", "Milner", BigInt(24)],
      [BigInt(4), "Amy", "Smith", BigInt(73)],
    ]);
  });

  test("serialize should return Uint8Array", () => {
    const { data, index, columns } = table.serialize();

    expect(data).toBeInstanceOf(Uint8Array);
    expect(index).toBeInstanceOf(Uint8Array);
    expect(columns).toBeInstanceOf(Uint8Array);

    const new_table = new ArrowTable(data, index, columns);
    expect(new_table.rows).toEqual(6);
    expect(new_table.columns).toEqual(4);
    expect(new_table.headerRows).toEqual(1);
    expect(new_table.headerColumns).toEqual(1);
  });
});
streamlit/component-lib/src/ArrowTable.ts
/**
 * Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import {
  tableToIPC,
  tableFromIPC,
  Table,
  Type,
  Vector,
  StructRow,
} from "apache-arrow";

export type CellType = "blank" | "index" | "columns" | "data";

/** Data types used by ArrowJS. */
export type DataType =
  | null
  | boolean
  | number
  | string
  | Date // datetime
  | Int32Array // int
  | Uint8Array // bytes
  | Vector // arrays
  | StructRow; // interval

export interface ArrowDataframeProto {
  data: ArrowTableProto;
  height: string;
  width: string;
}

export interface ArrowTableProto {
  data: Uint8Array;
  index: Uint8Array;
  columns: Uint8Array;
  styler?: Styler;
}

export interface Cell {
  classNames: string;
  content: DataType;
  id?: string;
  type: CellType;
}

export interface Styler {
  caption?: string;
  displayValuesTable: Table;
  styles?: string;
  uuid: string;
}

export class ArrowTable {
  private readonly dataTable: Table;
  private readonly indexTable: Table;
  private readonly columnsTable: Table;
  private readonly styler?: Styler;

  constructor(
    dataBuffer: Uint8Array,
    indexBuffer: Uint8Array,
    columnsBuffer: Uint8Array,
    styler?: any
  ) {
    this.dataTable = tableFromIPC(dataBuffer);
    this.indexTable = tableFromIPC(indexBuffer);
    this.columnsTable = tableFromIPC(columnsBuffer);
    this.styler = styler
      ? {
          caption: styler.caption,
          displayValuesTable: tableFromIPC(styler.displayValues),
          styles: styler.styles,
          uuid: styler.uuid,
        }
      : undefined;
  }

  get rows(): number {
    return this.indexTable.numRows + this.columnsTable.numCols;
  }

  get columns(): number {
    return this.indexTable.numCols + this.columnsTable.numRows;
  }

  get headerRows(): number {
    return this.rows - this.dataRows;
  }

  get headerColumns(): number {
    return this.columns - this.dataColumns;
  }

  get dataRows(): number {
    return this.dataTable.numRows;
  }

  get dataColumns(): number {
    return this.dataTable.numCols;
  }

  get uuid(): string | undefined {
    return this.styler && this.styler.uuid;
  }

  get caption(): string | undefined {
    return this.styler && this.styler.caption;
  }

  get styles(): string | undefined {
    return this.styler && this.styler.styles;
  }

  get table(): Table {
    return this.dataTable;
  }

  get index(): Table {
    return this.indexTable;
  }

  get columnTable(): Table {
    return this.columnsTable;
  }

  public getCell = (rowIndex: number, columnIndex: number): Cell => {
    const isBlankCell =
      rowIndex < this.headerRows && columnIndex < this.headerColumns;
    const isIndexCell =
      rowIndex >= this.headerRows && columnIndex < this.headerColumns;
    const isColumnsCell =
      rowIndex < this.headerRows && columnIndex >= this.headerColumns;

    if (isBlankCell) {
      const classNames = ["blank"];
      if (columnIndex > 0) {
        classNames.push("level" + rowIndex);
      }
      return {
        type: "blank",
        classNames: classNames.join(" "),
        content: "",
      };
    } else if (isColumnsCell) {
      const dataColumnIndex = columnIndex - this.headerColumns;
      const classNames = [
        "col_heading",
        "level" + rowIndex,
        "col" + dataColumnIndex,
      ];
      return {
        type: "columns",
        classNames: classNames.join(" "),
        content: this.getContent(this.columnsTable, dataColumnIndex, rowIndex),
      };
    } else if (isIndexCell) {
      const dataRowIndex = rowIndex - this.headerRows;
      const classNames = [
        "row_heading",
        "level" + columnIndex,
        "row" + dataRowIndex,
      ];
      return {
        type: "index",
        id: `T_${this.uuid}level${columnIndex}_row${dataRowIndex}`,
        classNames: classNames.join(" "),
        content: this.getContent(this.indexTable, dataRowIndex, columnIndex),
      };
    } else {
      const dataRowIndex = rowIndex - this.headerRows;
      const dataColumnIndex = columnIndex - this.headerColumns;
      const classNames = [
        "data",
        "row" + dataRowIndex,
        "col" + dataColumnIndex,
      ];
      const content = this.styler
        ? this.getContent(
            this.styler.displayValuesTable,
            dataRowIndex,
            dataColumnIndex
          )
        : this.getContent(this.dataTable, dataRowIndex, dataColumnIndex);
      return {
        type: "data",
        id: `T_${this.uuid}row${dataRowIndex}_col${dataColumnIndex}`,
        classNames: classNames.join(" "),
        content,
      };
    }
  };

  public getContent = (
    table: Table,
    rowIndex: number,
    columnIndex: number
  ): DataType => {
    const column = table.getChildAt(columnIndex);
    if (column === null) {
      return "";
    }

    const columnTypeId = this.getColumnTypeId(table, columnIndex);
    switch (columnTypeId) {
      case Type.Timestamp: {
        return this.nanosToDate(column.get(rowIndex));
      }
      default: {
        return column.get(rowIndex);
      }
    }
  };

  /**
   * Serialize arrow table.
   */
  public serialize(): ArrowTableProto {
    return {
      data: tableToIPC(this.dataTable),
      index: tableToIPC(this.indexTable),
      columns: tableToIPC(this.columnsTable),
    };
  }

  /**
   * Returns apache-arrow specific typeId of column.
   */
  private getColumnTypeId(table: Table, columnIndex: number): Type {
    return table.schema.fields[columnIndex].type.typeId;
  }

  private nanosToDate(nanos: number): Date {
    return new Date(nanos / 1e6);
  }
}
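The header/data region arithmetic used by `getCell` above can be sketched in Python. This is an illustrative model of the classification logic only, not part of the library:

```python
def cell_type(row: int, col: int, header_rows: int, header_cols: int) -> str:
    """Classify a (row, col) position in the rendered table grid.

    Mirrors getCell's branching: positions inside both header bands are
    blank, header-row-only positions are column headings, header-column-only
    positions are index cells, and everything else is data.
    """
    if row < header_rows and col < header_cols:
        return "blank"
    if row < header_rows:
        return "columns"  # column-heading cell
    if col < header_cols:
        return "index"  # row-heading cell
    return "data"


# With 1 header row and 1 header column (the common single-level case):
assert cell_type(0, 0, 1, 1) == "blank"
assert cell_type(0, 2, 1, 1) == "columns"
assert cell_type(3, 0, 1, 1) == "index"
assert cell_type(3, 2, 1, 1) == "data"
```

Data coordinates then follow by subtracting the header counts, which is exactly how `getCell` computes `dataRowIndex` and `dataColumnIndex`.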
streamlit/component-lib/src/StreamlitReact.test.tsx
/**
 * Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import {
  ComponentProps,
  StreamlitComponentBase,
  withStreamlitConnection,
} from "./StreamlitReact";
import * as React from "react";
import { render } from "@testing-library/react";
import { tick } from "./test_utils";
import { EXAMPLE_DF } from "./mock_data";
import { ArrowTable } from "./ArrowTable";
import { Streamlit } from "./streamlit";

class StaticComponent extends StreamlitComponentBase {
  render() {
    return <>Static component</>;
  }
}

describe("StreamlitReact", () => {
  test("the component should be empty initially", () => {
    const StreamlitComponent = withStreamlitConnection(StaticComponent);
    expect(document.body.innerHTML).toEqual("");
    render(<StreamlitComponent />);
    expect(document.body.innerHTML).toEqual("<div></div>");
  });

  test("the component should be visible after initialization", async () => {
    const StreamlitComponent = withStreamlitConnection(StaticComponent);
    expect(document.body.innerHTML).toEqual("");
    const { getByText } = render(<StreamlitComponent />);

    window.postMessage({ type: "streamlit:render", args: {} }, "*");
    await tick();

    expect(getByText("Static component")).toBeInTheDocument();
  });

  test("the component should receive arguments from the parent frame", async () => {
    interface ComponentArgument {
      firstArg: string;
    }

    class ComponentWithArguments extends StreamlitComponentBase<
      {},
      ComponentArgument
    > {
      render(): JSX.Element {
        return <p>{this.props.args.firstArg}</p>;
      }
    }

    const Component = withStreamlitConnection(ComponentWithArguments);
    const { getByText } = render(<Component />);

    window.postMessage(
      { type: "streamlit:render", args: { firstArg: "Argument text 123" } },
      "*"
    );
    await tick();

    expect(getByText("Argument text 123")).toBeInTheDocument();
  });

  test("the component should receive dataframe from the parent frame", async () => {
    interface ComponentArgument {
      firstArg: ArrowTable;
    }

    class DataframeComponent extends StreamlitComponentBase<ComponentArgument> {
      render() {
        const firstArg = this.props.args.firstArg;
        const { content } = firstArg.getCell(1, 1);
        return <>{String(content)}</>;
      }
    }

    const Component = withStreamlitConnection(DataframeComponent);
    const { getByText } = render(<Component />);

    window.postMessage(
      {
        type: "streamlit:render",
        args: {},
        dfs: [
          {
            key: "firstArg",
            value: {
              data: {
                data: EXAMPLE_DF.data,
                index: EXAMPLE_DF.index,
                columns: EXAMPLE_DF.columns,
              },
            },
          },
        ],
      },
      "*"
    );
    await tick();

    expect(getByText("Jason")).toBeInTheDocument();
  });

  test("the component error should be visible", async () => {
    class BrokenComponent extends StreamlitComponentBase {
      render(): React.ReactNode {
        throw new Error("Error in component");
      }
    }

    jest.spyOn(console, "error").mockImplementation(() => {});

    const Component = withStreamlitConnection(BrokenComponent);
    const { getByText } = render(<Component />);

    window.postMessage(
      {
        type: "streamlit:render",
        args: {},
      },
      "*"
    );
    await tick();

    expect(getByText("Component Error")).toBeInTheDocument();
    expect(getByText("Error in component")).toBeInTheDocument();
    expect(jest.mocked(console.error).mock.calls).toHaveLength(2);
  });

  test("the component should update the frame height initially", async () => {
    jest.spyOn(Streamlit, "setFrameHeight");

    const Component = withStreamlitConnection(StaticComponent);
    const { getByText } = render(<Component />);

    window.postMessage(
      {
        type: "streamlit:render",
        args: {},
      },
      "*"
    );
    await tick();

    expect(jest.mocked(Streamlit.setFrameHeight).mock.calls).toHaveLength(1);
    expect(getByText("Static component")).toBeInTheDocument();
  });

  test("the component should update the frame height after updating the arguments", async () => {
    jest.spyOn(Streamlit, "setFrameHeight");

    const Component = withStreamlitConnection(StaticComponent);
    render(<Component />);

    for (const value of [1, 2, 3]) {
      window.postMessage(
        {
          type: "streamlit:render",
          args: { value },
        },
        "*"
      );
      await tick();
    }

    expect(jest.mocked(Streamlit.setFrameHeight).mock.calls).toHaveLength(3);
  });
});
streamlit/component-lib/src/StreamlitReact.tsx
/**
 * Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

import hoistNonReactStatics from "hoist-non-react-statics";
import React, { ReactNode } from "react";

import { RenderData, Streamlit, Theme } from "./streamlit";

/**
 * Props passed to custom Streamlit components.
 */
export interface ComponentProps<ArgType = any> {
  /** Named dictionary of arguments passed from Python. */
  args: ArgType;

  /** The component's width. */
  width: number;

  /**
   * True if the component should be disabled.
   * All components get disabled while the app is being re-run,
   * and become re-enabled when the re-run has finished.
   */
  disabled: boolean;

  /** Theme definition dictionary passed from the main client. */
  theme?: Theme;
}

/**
 * Optional Streamlit React-based component base class.
 *
 * You are not required to extend this base class to create a Streamlit
 * component. If you decide not to extend it, you should implement the
 * `componentDidMount` and `componentDidUpdate` functions in your own class,
 * so that your plugin properly resizes.
 */
export class StreamlitComponentBase<
  S = {},
  ArgType = any
> extends React.PureComponent<ComponentProps<ArgType>, S> {
  public componentDidMount(): void {
    // After we're rendered for the first time, tell Streamlit that our height
    // has changed.
    Streamlit.setFrameHeight();
  }

  public componentDidUpdate(): void {
    // After we're updated, tell Streamlit that our height may have changed.
    Streamlit.setFrameHeight();
  }
}

/**
 * Wrapper for React-based Streamlit components.
 *
 * Bootstraps the communication interface between Streamlit and the component.
 */
export function withStreamlitConnection<ArgType = any>(
  WrappedComponent: React.ComponentType<ComponentProps>
): React.ComponentType {
  interface WrapperProps {}

  interface WrapperState {
    renderData?: RenderData<ArgType>;
    componentError?: Error;
  }

  class ComponentWrapper extends React.PureComponent<
    WrapperProps,
    WrapperState
  > {
    public constructor(props: WrapperProps) {
      super(props);
      this.state = {
        renderData: undefined,
        componentError: undefined,
      };
    }

    /**
     * Error boundary function. This will be called if our wrapped
     * component throws an error. We store the caught error in our state,
     * and display it in the next render().
     */
    public static getDerivedStateFromError = (
      error: Error
    ): Partial<WrapperState> => {
      return { componentError: error };
    };

    public componentDidMount = (): void => {
      // Set up event listeners, and signal to Streamlit that we're ready.
      // We won't render the component until we receive the first RENDER_EVENT.
      Streamlit.events.addEventListener(
        Streamlit.RENDER_EVENT,
        this.onRenderEvent as EventListener
      );
      Streamlit.setComponentReady();
    };

    public componentDidUpdate = (): void => {
      // If our child threw an error, we display it in render(). In this
      // case, the child won't be mounted and therefore won't call
      // `setFrameHeight` on its own. We do it here so that the rendered
      // error will be visible.
      if (this.state.componentError != null) {
        Streamlit.setFrameHeight();
      }
    };

    public componentWillUnmount = (): void => {
      Streamlit.events.removeEventListener(
        Streamlit.RENDER_EVENT,
        this.onRenderEvent as EventListener
      );
    };

    /**
     * Streamlit is telling this component to redraw.
     * We save the render data in State, so that it can be passed to the
     * component in our own render() function.
     */
    private onRenderEvent = (
      event: CustomEvent<RenderData<ArgType>>
    ): void => {
      // Update our state with the newest render data.
      this.setState({ renderData: event.detail });
    };

    public render(): ReactNode {
      // If our wrapped component threw an error, display it.
      if (this.state.componentError != null) {
        return (
          <div>
            <h1>Component Error</h1>
            <span>{this.state.componentError.message}</span>
          </div>
        );
      }

      // Don't render until we've gotten our first RENDER_EVENT from Streamlit.
      if (this.state.renderData == null) {
        return null;
      }

      return (
        <WrappedComponent
          width={window.innerWidth}
          disabled={this.state.renderData.disabled}
          args={this.state.renderData.args}
          theme={this.state.renderData.theme}
        />
      );
    }
  }

  return hoistNonReactStatics(ComponentWrapper, WrappedComponent);
}
streamlit/component-lib/src/index.ts
/**
 * Copyright (c) Streamlit Inc. (2018-2022) Snowflake Inc. (2022-2025)
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

// Workaround for type-only exports:
// https://stackoverflow.com/questions/53728230/cannot-re-export-a-type-when-using-the-isolatedmodules-with-ts-3-2-2
import { ComponentProps as ComponentProps_ } from "./StreamlitReact";
import { RenderData as RenderData_, Theme as Theme_ } from "./streamlit";

export {
  StreamlitComponentBase,
  withStreamlitConnection,
} from "./StreamlitReact";
export { ArrowTable } from "./ArrowTable";
export { Streamlit } from "./streamlit";

export type ComponentProps = ComponentProps_;
export type RenderData<ArgType = any> = RenderData_<ArgType>;
export type Theme = Theme_;