# data_transfer

A Python tool to upload files and directories to cloud backends with a focus on speed and simplicity.

Supported backends:
- Hugging Face Hub
- Google Drive

Features:
- Simple CLI and Python API
- Resumable, chunked uploads
- Parallel transfer where supported
- Credential storage in a file at `~/.config/data_transfer/credentials.json` (macOS & Linux)
- Configurable data root (default `./data`) and remote paths/filenames
- Timing summary with `-v`
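
The resumable, chunked upload behavior listed above can be sketched as follows. The chunk size, the helper names, and the sidecar `.offset` file convention are all illustrative assumptions, not the tool's actual internals:

```python
import os

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB per chunk (illustrative default)


def iter_chunks(path, start=0, chunk_size=CHUNK_SIZE):
    """Yield (offset, bytes) pairs beginning at `start`, so an
    interrupted transfer can resume from the last acknowledged offset."""
    with open(path, "rb") as f:
        f.seek(start)
        while True:
            offset = f.tell()
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield offset, chunk


def resume_offset(path):
    """Read the last saved offset for `path`, or 0 if none exists
    (hypothetical sidecar-file convention for illustration only)."""
    marker = path + ".offset"
    if os.path.exists(marker):
        with open(marker) as f:
            return int(f.read().strip())
    return 0
```

On resume, the client would call `resume_offset(path)` and start iterating chunks from that byte instead of 0, re-sending only the unacknowledged tail of the file.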

## Quickstart

1. Install

```bash
pip install -e .
```

2. Configure credentials

```bash
# For Hugging Face
data-transfer auth hf

# For Google Drive (opens browser for OAuth once)
# Optionally specify path to client_secrets.json
# export GDRIVE_CLIENT_SECRETS=/path/to/client_secrets.json
data-transfer auth gdrive
```
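
Both `auth` subcommands persist tokens to the credential file named in the feature list. A minimal sketch of what such a file-based store could look like; the JSON schema (per-backend keys holding a `token` field) is an assumption for illustration, not the tool's documented format:

```python
import json
import os
from pathlib import Path

# Path from the feature list; the per-backend layout is an assumption.
CRED_PATH = Path("~/.config/data_transfer/credentials.json").expanduser()


def save_token(backend, token, path=CRED_PATH):
    """Store `token` under `backend`, creating the config dir if needed."""
    path.parent.mkdir(parents=True, exist_ok=True)
    creds = json.loads(path.read_text()) if path.exists() else {}
    creds[backend] = {"token": token}
    path.write_text(json.dumps(creds, indent=2))
    os.chmod(path, 0o600)  # keep credentials readable only by the owner


def load_token(backend, path=CRED_PATH):
    """Return the stored token for `backend`, or None if absent."""
    if not path.exists():
        return None
    return json.loads(path.read_text()).get(backend, {}).get("token")
```

Restricting the file to mode `0600` matters here, since the same file holds tokens for every configured backend.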

3. Upload

```bash
# Upload a single file to a Hugging Face Hub repo at path `datasets/mydir/file.bin`
# By default the source is resolved under ./data; override with --data-root

data-transfer upload --backend hf --source test.pt --remote datasets/mydir/file.bin --repo your-username/your-repo --create-repo -v

# Upload a directory recursively to Google Drive under a folder id
# Nested remote folders are created on demand under the given folder id
# Use --workers to parallelize per-file uploads

data-transfer upload --backend gdrive --source ./myfolder --remote nested1/nested2 --parent-id <FOLDER_ID> --workers 8 -v
```
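
The `--workers` flag above corresponds to parallel per-file transfers. A sketch of that pattern using `concurrent.futures`; `upload_one` is a hypothetical stand-in for whatever backend call the tool actually makes:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path


def upload_one(local_path, remote_path):
    """Placeholder for a single-file backend upload (hypothetical);
    a real implementation would transfer bytes here."""
    return remote_path


def upload_dir(source, remote_prefix, workers=8):
    """Walk `source` recursively and upload its files in parallel,
    preserving each file's relative path under `remote_prefix`."""
    source = Path(source)
    files = [p for p in source.rglob("*") if p.is_file()]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [
            pool.submit(
                upload_one, p, f"{remote_prefix}/{p.relative_to(source).as_posix()}"
            )
            for p in files
        ]
        # Collecting results re-raises any per-file upload error.
        return [f.result() for f in futures]
```

Threads are a reasonable fit here because per-file uploads are I/O-bound; the sweet spot for `workers` depends on file sizes and the backend's rate limits.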

See `data_transfer/config.example.toml` for optional defaults.