
LINKER98/CareerBoostAI

Text Generation
Adapters
Safetensors
Korean
exaone
business
text-generation-inference
finance
South Korean labor law (대한민국 노동법)
Transformers
conversational
custom_code

The links below show how to use LINKER98/CareerBoostAI with libraries, inference providers, notebooks, and local apps.

  • Libraries
  • Adapters

    How to use LINKER98/CareerBoostAI with Adapters (a fuller generation sketch follows this list):

    from adapters import AutoAdapterModel

    # The page rendered the base checkpoint id as "undefined"; the repo metadata
    # does not name it, so substitute the base model this adapter was trained on.
    base_model_id = "<base-model-id>"  # placeholder, not a real Hub id
    model = AutoAdapterModel.from_pretrained(base_model_id)
    model.load_adapter("LINKER98/CareerBoostAI", set_active=True)
  • Notebooks
  • Google Colab
  • Kaggle
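A minimal end-to-end sketch of the Adapters path above, assuming the placeholder base model has been resolved to a causal-LM checkpoint with a standard tokenizer (both assumptions: the repo's tags point to a Korean labor-law assistant, but the README does not document a prompt format):

    from adapters import AutoAdapterModel
    from transformers import AutoTokenizer

    base_model_id = "<base-model-id>"  # hypothetical placeholder, see above

    tokenizer = AutoTokenizer.from_pretrained(base_model_id)
    model = AutoAdapterModel.from_pretrained(base_model_id)
    # Stack the CareerBoostAI adapter on the frozen base weights and activate it.
    model.load_adapter("LINKER98/CareerBoostAI", set_active=True)
    model.eval()

    # Generation assumes the adapter bundle includes a causal-LM prediction head;
    # AutoAdapterModel is headless until one is added or loaded.
    prompt = "How is annual paid leave calculated under South Korean labor law?"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))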
CareerBoostAI (15.6 GB)
  • 1 contributor
History: 7 commits
Latest commit: Update README.md by LINKER98 (d9208d9, verified, over 1 year ago)
  • .gitattributes (1.52 kB): initial commit, over 1 year ago
  • README.md (548 Bytes): Update README.md, over 1 year ago
  • config.json (1.05 kB): Upload ExaoneForCausalLM, over 1 year ago
  • generation_config.json (134 Bytes): Upload ExaoneForCausalLM, over 1 year ago
  • merges.txt (1.22 MB): Upload tokenizer, over 1 year ago
  • model-00001-of-00004.safetensors (4.97 GB, xet): Upload ExaoneForCausalLM, over 1 year ago
  • model-00002-of-00004.safetensors (4.92 GB, xet): Upload ExaoneForCausalLM, over 1 year ago
  • model-00003-of-00004.safetensors (4.92 GB, xet): Upload ExaoneForCausalLM, over 1 year ago
  • model-00004-of-00004.safetensors (839 MB, xet): Upload ExaoneForCausalLM, over 1 year ago
  • model.safetensors.index.json (23.7 kB): Upload ExaoneForCausalLM, over 1 year ago
  • special_tokens_map.json (457 Bytes): Upload tokenizer, over 1 year ago
  • tokenizer.json (7.91 MB): Upload tokenizer, over 1 year ago
  • tokenizer_config.json (70.7 kB): Upload tokenizer, over 1 year ago
  • vocab.json (1.93 MB): Upload tokenizer, over 1 year ago
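The listing above amounts to a complete ExaoneForCausalLM checkpoint (four safetensors shards plus the index) together with its tokenizer files, so the repo can also be loaded directly with transformers rather than through an adapter. A minimal sketch, assuming the custom_code tag means the EXAONE architecture ships as repo code and therefore needs trust_remote_code=True, and that the conversational tag implies a chat template in tokenizer_config.json (both assumptions worth verifying):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "LINKER98/CareerBoostAI"

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,  # ~15.6 GB of shards fits a 16-bit checkpoint
        device_map="auto",           # spread shards across available devices
        trust_remote_code=True,      # ExaoneForCausalLM is defined by repo code
    )

    messages = [{"role": "user", "content": "What does South Korean labor law say about overtime pay?"}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(input_ids, max_new_tokens=256)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))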