Collections
Discover the best community collections!
Collections trending this week
- Automatic Metadata Extraction Incorporating Visual Features from Scanned Electronic Theses and Dissertations
  Paper • 2107.00516 • Published
- New Methods for Metadata Extraction from Scientific Literature
  Paper • 1710.10201 • Published • 1
- Grobid
  Space • Extract bibliographic data from PDFs
- TheBritishLibrary/EThOS-PhD-metadata
  Dataset • Updated • 267 • 2
- Trellis Networks for Sequence Modeling
  Paper • 1810.06682 • Published • 1
- ProSG: Using Prompt Synthetic Gradients to Alleviate Prompt Forgetting of RNN-like Language Models
  Paper • 2311.01981 • Published • 1
- Gated recurrent neural networks discover attention
  Paper • 2309.01775 • Published • 10
- Inverse Approximation Theory for Nonlinear Recurrent Neural Networks
  Paper • 2305.19190 • Published • 1
- Trellis Networks for Sequence Modeling
  Paper • 1810.06682 • Published • 1
- Pruning Very Deep Neural Network Channels for Efficient Inference
  Paper • 2211.08339 • Published • 1
- LAPP: Layer Adaptive Progressive Pruning for Compressing CNNs from Scratch
  Paper • 2309.14157 • Published • 1
- Mamba: Linear-Time Sequence Modeling with Selective State Spaces
  Paper • 2312.00752 • Published • 150