atlaswang committed · verified
Commit 276e0ce · 1 Parent(s): be9d798

Update README.md

Files changed (1): README.md (+3 −7)
README.md CHANGED
@@ -7,20 +7,16 @@ sdk: static
 pinned: false
 ---
 
-# Visual Informatics Group @ University of Texas at Austin ([VITA-Group](https://vita-group.github.io/))
+# [VITA-Group @ UT Austin](https://vita-group.github.io/)
 
-At VITA group, we have unusually broad, and forever-evolving research interests spanning from the theory to the
-application aspects of machine learning (ML). Our current "research keywords" include, but are not limited to:
-sparsity (from classical optimization to modern neural networks); efficient training, inference or transfer
-(especially, of large foundation models); robustness and trustworthiness; learning to optimize (L2O);
-generative AI; graph learning, and more.
+We revisit classical sparse and low-rank optimization through the lens of modern AI, developing theory-driven algorithms that accelerate training and inference in large-scale models. We also investigate how algebraic and logical structures emerge during learning, uncovering the interplay between neural and symbolic computation across streamlined architectures, reasoning pipelines, and agentic systems. See https://www.vita-group.space/research for our latest research efforts.
 
 
 ## Compressed LLM Model Zone
 
 **NOTE: All compressed LLMs have been moved to a new repo at [compressed-llm](https://huggingface.co/compressed-llm).**
 
-The models are prepared by [Visual Informatics Group @ University of Texas at Austin (VITA-group)](https://vita-group.github.io/). Credits to Ajay Jaiswal, Zhenyu Zhang, Zhangheng Li, Lu Yin, Shiwei Liu and Junyuan Hong.
+The models are prepared by [VITA-group](https://vita-group.github.io/). Credits to Ajay Jaiswal, Zhenyu Zhang, Zhangheng Li, Lu Yin, Shiwei Liu and Junyuan Hong.
 
 License: [MIT License](https://opensource.org/license/mit/)