
dongbobo/adapter-checkpoint-lora

Tags: PEFT · lora · adapter · causal-lm

Instructions for using dongbobo/adapter-checkpoint-lora with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • PEFT

    How to use dongbobo/adapter-checkpoint-lora with PEFT:

    from peft import PeftModel
    from transformers import AutoModelForCausalLM
    
    # Load the base model first (it must match base_model_name_or_path in
    # adapter_config.json), then attach the LoRA adapter on top of it.
    base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
    model = PeftModel.from_pretrained(base_model, "dongbobo/adapter-checkpoint-lora")
  • Notebooks
  • Google Colab
  • Kaggle
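A LoRA adapter checkpoint like adapter_model.bin stores only low-rank factor matrices per targeted weight; at load time PEFT applies a scaled update ΔW = (α/r)·B·A alongside each frozen base weight. A minimal NumPy sketch of that arithmetic (the shapes, rank, and scaling below are illustrative assumptions, not values read from this adapter's config):

```python
import numpy as np

# Illustrative sizes: a 16x16 base weight with rank-4 LoRA factors.
d, r, alpha = 16, 4, 32
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))          # frozen base weight
A = rng.standard_normal((r, d)) * 0.01   # LoRA "down" projection (trained)
B = rng.standard_normal((d, r)) * 0.01   # LoRA "up" projection (trained)

# Forward pass with the adapter applied on the fly:
x = rng.standard_normal(d)
y_adapted = W @ x + (alpha / r) * (B @ (A @ x))

# Merging the adapter into the base weight gives the same output,
# which is why a ~17 MB adapter can modify a multi-GB model.
W_merged = W + (alpha / r) * (B @ A)
assert np.allclose(y_adapted, W_merged @ x)
```

This is also why PEFT offers merging the adapter into the base weights for deployment: after the merge, inference costs exactly the same as the base model.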
adapter-checkpoint-lora (16.8 MB)
  • 1 contributor
History: 7 commits
Latest commit: dongbobo, "Add YAML frontmatter to README (license, library, tags)" (08ba5de, verified, about 2 months ago)
  • examples
    Add examples/chat/few_shot/prompt.json about 2 months ago
  • .gitattributes
    1.52 kB
    initial commit about 2 months ago
  • README.md
    2.51 kB
    Add YAML frontmatter to README (license, library, tags) about 2 months ago
  • adapter_config.json
    392 Bytes
    Add adapter_config.json about 2 months ago
  • adapter_model.bin

    Detected Pickle imports (4):

    • "_codecs.encode"
    • "numpy.dtype"
    • "numpy._core.multiarray._reconstruct"
    • "numpy.ndarray"


    16.8 MB
    Add adapter_model.bin about 2 months ago
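The "Detected Pickle imports" list above comes from scanning the checkpoint's pickle stream for the globals it would resolve, since loading an untrusted pickle can execute arbitrary code. A minimal stdlib sketch of that idea, not the Hub's actual scanner, which collects `GLOBAL`/`STACK_GLOBAL` references without unpickling anything:

```python
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[str]:
    """List the module.name globals a pickle stream would import, without loading it."""
    imports = set()
    recent_strings = []  # STACK_GLOBAL takes module and name from the two prior string ops
    for op, arg, _pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # Older protocols encode "module name" in a single argument.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            if len(recent_strings) >= 2:
                imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
        elif isinstance(arg, str):
            recent_strings.append(arg)
    return imports

# Plain containers of numbers need no imports at all:
data = pickle.dumps({"weights": [1.0, 2.0]})
print(pickle_imports(data))  # → set()
```

For this repository the scanner found only `numpy` reconstruction helpers and `_codecs.encode`, which are typical of array checkpoints; still, loading with `torch.load(..., weights_only=True)` or converting the adapter to safetensors avoids running the pickle machinery entirely.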