exdysa committed on
Commit 2eb3740 · verified · 1 Parent(s): 694dd4b

Update README.md

Files changed (1)
  1. README.md +26 -67
README.md CHANGED
@@ -2,7 +2,27 @@
  language:
  - en
  library_name: mir
  ---
  <div align="center"><img src="https://github.com/darkshapes/entity-statement/raw/main/png/mir/mir300_dark.png" width="25%"></div>


@@ -18,79 +38,18 @@ The work is inspired by:
  Example:

  > [!NOTE]
- > # mir : model . transformer . clip-l : stable-diffusion-xl


  ```
- mir : model . lora . hyper : flux-1
  ↑ ↑ ↑ ↑ ↑
  [URI]:[Domain].[Architecture].[Series]:[Compatibility]
  ```

  Code for this project can be found at [darkshapes/MIR on GitHub](https://github.com/darkshapes/MIR)
 
 

-
- ## Definitions:
-
- Like other URI schema, the order of the identifiers roughly indicates their specificity from left (broad) to right (narrow)
-
- ### Domains
-
- - `dev`: Varying local neural network layers; in-training, pre-release, items under evaluation, likely in unexpected formats<br>
- - `model`: Static local neural network layers. Publicly released machine learning models with an identifier in the database<br>
- - `operations`: Varying global neural network attributes; algorithms, optimizations and procedures on models<br>
- - `info`: Static global neural network attributes; metadata with an identifier in the database<br>
-
- ### Architecture
- Broad and general terms for system architectures.
- - `dit`: Diffusion transformer, typically vision synthesis
- - `unet`: UNet diffusion structure
- - `art`: Autoregressive transformer, typically LLMs
- - `lora`: Low-Rank Adapter (may work with dit or transformer)
- - `vae`: Variational Autoencoder
- etc.
-
- ### Series
- Foundational network and technique types.
-
- ### Compatibility
- Implementation details based on version-breaking changes, configuration inconsistencies, or other conflicting indicators that have practical application.
-
- ### Goals
- - A standard identification scheme for **ALL** fields of ML-related development
- - Simplification of code for model-related logistics
- - Rapid retrieval of resources and metadata
- - Efficient and reliable compatibility checks
- - Organized hyperparameter management
-
- > <details> <summary>Why not use `diffusion`/`sgm`/`ldm`/`text`/hf.co folder structure/brand or trade name/preprint paper/development house/algorithm?</summary>
- >
- > - The format here isn't finalized, but overlapping resource definitions and complicated categories that are difficult to narrow have been pruned
- > - Likewise, definitions that are too specific have also been trimmed
- > - HF.CO folder structures are inconsistent across repositories, and metadata enforcement for many important developments is neglected
- > - Development credit is often shared, and the [paper heredity tree](https://www.connectedpapers.com/search?q=generative%20diffusion) is extremely complicated
- > - Algorithms (especially their applications) are less common knowledge, vague, ~~and I'm too smooth-brain.~~
- > - Overall, an attempt at impartiality and neutrality with regard to brand/territory origins
- > </details>
-
- > <details><summary>Why `unet`, `dit`, `lora` over alternatives?</summary>
- >
- > - UNET/DiT/Transformer are shared enough to be genre-ish but not too narrowly specific
- > - Very similar technical processes at this level
- > - Functional and efficient for random lookups
- > - Short to type
- > </details>
-
- > <details><summary>Roadmap</summary>
- >
- > - Decide on `@` or `:` delimiters (like `@8cfg` for an indistinguishable 8-step LoRA that requires CFG)
- >   - A crucial spec element, or an optional, MIR app-determined feature?
- > - Proof-of-concept generative model registry
- > - Ensure compatibility/integration/cross-pollination with [OECD AI Classifications](https://oecd.ai/en/classification)
- > - Ensure compatibility/integration/cross-pollination with [NIST AI 200-1 Trustworthy and Responsible AI](https://www.nist.gov/publications/ai-use-taxonomy-human-centered-approach)
- > </details>
-
- A massive thank you to [@silveroxides](https://huggingface.co/silveroxides) for phenomenal work collecting pristine state dicts and related information
-
- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/65ff1816871b36bf84fc3c37/NWZideVk_pp_4OzQDl96w.png)
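The `[URI]:[Domain].[Architecture].[Series]:[Compatibility]` anatomy and the four domains listed above can be sketched as a small parser. This is an illustrative sketch only: the `parse_mir` and `MirId` names are hypothetical and not part of the MIR spec.

```python
# Illustrative sketch: parse a MIR URI such as
# "mir : model . lora . hyper : flux-1" into its four parts.
# parse_mir and MirId are hypothetical names, not part of the MIR spec.
from dataclasses import dataclass

DOMAINS = {"dev", "model", "operations", "info"}  # from the Domains list above

@dataclass
class MirId:
    domain: str
    architecture: str
    series: str
    compatibility: str

def parse_mir(uri: str) -> MirId:
    compact = uri.replace(" ", "")            # tolerate the spaced-out examples
    scheme, path, compat = compact.split(":")
    if scheme != "mir":
        raise ValueError(f"not a MIR URI: {uri!r}")
    domain, architecture, series = path.split(".")
    if domain not in DOMAINS:
        raise ValueError(f"unknown domain: {domain}")
    return MirId(domain, architecture, series, compat)

print(parse_mir("mir : model . lora . hyper : flux-1"))
```

Reading left to right, each field narrows the previous one, which is what makes a split on the two delimiters sufficient here.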
 
  language:
  - en
  library_name: mir
+ license: bsd
+ tags:
+ - code
+ - weight_maps
+ - layer_metadata
+ size_categories:
+ - n<1K
  ---
+ ```yaml
+ language:
+ - en
+ library_name: mir
+ license: bsd
+ tags:
+ - code
+ - weight_maps
+ - layer_metadata
+ size_categories:
+ - n<1K
+ ```
+
  <div align="center"><img src="https://github.com/darkshapes/entity-statement/raw/main/png/mir/mir300_dark.png" width="25%"></div>


  Example:

  > [!NOTE]
+ > # mir:// model . vit . clip_l : stable_diffusion_xl


  ```
+ mir:// model . lora . hyper : flux_1
  ↑ ↑ ↑ ↑ ↑
  [URI]:[Domain].[Architecture].[Series]:[Compatibility]
  ```

  Code for this project can be found at [darkshapes/MIR on GitHub](https://github.com/darkshapes/MIR)
+ ```py
+ from datasets import load_dataset
+
+ weight_maps = load_dataset("darkshapes/mir", data_dir="weight_maps")
+ ```
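The updated README switches the examples to a `mir://` scheme form. A minimal split of that form might look like the sketch below; `split_mir` is a hypothetical helper, and the delimiter handling is inferred from the `mir:// model . lora . hyper : flux_1` example rather than from a finalized spec.

```python
# Illustrative only: split the updated "mir://" form into its four parts.
# split_mir is a hypothetical helper; delimiter handling is inferred from
# the example "mir:// model . lora . hyper : flux_1" above.
def split_mir(uri: str) -> tuple[str, str, str, str]:
    body = uri.replace(" ", "").removeprefix("mir://")
    path, _, compat = body.partition(":")       # compatibility follows ":"
    domain, architecture, series = path.split(".")
    return domain, architecture, series, compat

print(split_mir("mir:// model . lora . hyper : flux_1"))
# → ('model', 'lora', 'hyper', 'flux_1')
```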