May 2025: This is the LLM used for the genAI scanners in our product.
It has been quantized internally from the original microsoft/phi-4.
Its native context window is 16k tokens.