# Kernels CLI Reference

The `kernels` CLI provides commands for managing compute kernels.

## Commands

| Command | Description |
| --- | --- |
| `upload` | Upload kernels to the Hub |
| `benchmark` | Run benchmarks for a kernel |
| `check` | Check a kernel for compliance |
| `versions` | Show kernel versions |
| `lock` | Lock kernel revisions |
| `download` | Download locked kernels |
| `skills` | Add skills for AI coding assistants |

## Quick Start

For building and writing kernels, refer to the guides on building kernels and writing kernels.

### Use kernels in your project

#### Directly from the Hub

```python
import torch

from kernels import get_kernel

# Download optimized kernels from the Hugging Face Hub
my_kernel = get_kernel("my-username/my-kernel", version=1)

# Random input tensor
x = torch.randn((10, 10), dtype=torch.float16, device="cuda")

# Run the kernel
y = torch.empty_like(x)
my_kernel.my_kernel_function(y, x)

print(y)
```

or

#### Locked and downloaded

Add to `pyproject.toml`:

```toml
[tool.kernels.dependencies]
"my-username/my-kernel" = "1"
```

Then lock the revisions and download the kernels:

```bash
kernels lock .
kernels download .
```

### See help

```bash
kernels --help
```
