LCT (Latent Connection Model)
LCT is a Transformer variant that replaces the feed-forward network with an LCM. How does it work? The LCM treats the attention output as a signal to be upgraded, refining it in two stages (step 1, step 2). It then applies a residual connection between the attention output (input to the LCM) and the latent output, producing the enriched attention signal.
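The description above can be sketched as follows. This is a minimal NumPy illustration of the idea, not the actual implementation: the dimensions, random weights, and ReLU nonlinearity are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model = 8                                   # hypothetical model width
attn_out = rng.normal(size=(4, d_model))      # attention output for 4 tokens

# Hypothetical LCM weights for the two refinement steps.
W1 = rng.normal(size=(d_model, d_model)) * 0.1
W2 = rng.normal(size=(d_model, d_model)) * 0.1

step1 = np.maximum(attn_out @ W1, 0.0)        # step 1: first latent refinement
step2 = np.maximum(step1 @ W2, 0.0)           # step 2: second latent refinement

# Residual connection between attention output and latent output.
lcm_out = attn_out + step2
print(lcm_out.shape)  # (4, 8)
```

The residual addition at the end is what lets the latent refinement act as an upgrade to the attention signal rather than a replacement for it.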
The LCT processing pipeline looks like this:
embedding => LCT Block (Attention + LCM) xN => FFN as decoder => Linear
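The pipeline above can be sketched end to end. Again this is a hypothetical NumPy toy, assuming a simple single-head self-attention and a ReLU FFN; the real model's layer sizes and details are not specified in this card.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab, d_model, n_blocks, seq_len = 100, 8, 2, 5

emb = rng.normal(size=(vocab, d_model)) * 0.1
tokens = rng.integers(0, vocab, size=seq_len)
x = emb[tokens]                               # embedding

def self_attention(x):
    # Toy single-head self-attention (Q = K = V = x).
    scores = x @ x.T / np.sqrt(x.shape[-1])
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)
    return w @ x

def lcm(x, W1, W2):
    # Two latent refinement steps plus residual connection.
    s1 = np.maximum(x @ W1, 0.0)
    s2 = np.maximum(s1 @ W2, 0.0)
    return x + s2

for _ in range(n_blocks):                     # LCT block = Attention + LCM
    W1 = rng.normal(size=(d_model, d_model)) * 0.1
    W2 = rng.normal(size=(d_model, d_model)) * 0.1
    x = lcm(self_attention(x), W1, W2)

# FFN as decoder, then final Linear projection to vocabulary logits.
Wf = rng.normal(size=(d_model, d_model)) * 0.1
x = np.maximum(x @ Wf, 0.0)
logits = x @ (rng.normal(size=(d_model, vocab)) * 0.1)
print(logits.shape)  # (5, 100)
```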
LCM benchmark plot: (figure not shown here)
LCT install / usage:

```python
import keras
import LCT_architecture  # makes the custom LCT layers importable

# load_model (not load) is the Keras API; the custom layers from
# LCT_architecture must be available when deserializing the model.
model = keras.models.load_model("LCT-Tiny-version.keras")
```
Support LCT: https://ko-fi.com/alpin92578
Author and researcher of LCT: Candra Alpin Gunawan
Note:
This model was trained on the 3K Conversations Dataset for ChatBot by Kaggle user Kreesh Rajani. Dataset link: https://www.kaggle.com/datasets/kreeshrajani/3k-conversations-dataset-for-chatbot
