For those interested in applying LLMs to infer over data iterators with CoT / prompts, this update might be relevant. Delighted to share the new release of bulk-chain, a framework for efficient LLM querying in synthetic data generation scenarios.
It features a no-string framework for querying LLMs in several modes: sync, async, and with optional support for output streaming. The latest 1.2.0 release updates how API parameters are outlined for the inference mode.
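To make the sync / async / streaming distinction concrete, here is a minimal sketch of the three querying patterns using a mock model call; the function names (`query_sync`, `query_async`, `query_stream`) are hypothetical stand-ins and do not reflect bulk-chain's actual API.

```python
import asyncio

# Hypothetical mock of an LLM backend; a real client would do network I/O here.
def query_sync(prompt: str) -> str:
    """Sync mode: block until the full completion is available."""
    return f"answer:{prompt}"

async def query_async(prompt: str) -> str:
    """Async mode: yield control while waiting, so many prompts can run concurrently."""
    await asyncio.sleep(0)  # simulates awaiting network I/O
    return f"answer:{prompt}"

async def query_stream(prompt: str):
    """Streaming mode: emit the completion incrementally, chunk by chunk."""
    for chunk in ("answer", ":", prompt):
        await asyncio.sleep(0)
        yield chunk

async def main() -> None:
    # Fan out a batch of prompts concurrently (useful for synthetic data generation).
    batch = await asyncio.gather(*(query_async(p) for p in ["a", "b"]))
    print(batch)
    # Consume a streamed completion as chunks arrive.
    chunks = [c async for c in query_stream("c")]
    print("".join(chunks))

print(query_sync("x"))
asyncio.run(main())
```

The async variant is what makes bulk querying efficient: a batch of prompts waits on I/O together instead of one at a time.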
VANTA Research is excited to announce a small lab preview of our new 675B fine-tune, Loux-Large. Loux is an AI model with a sophisticated, rebellious edge, designed to assist and collaborate with engineers, builders, and people working on technical projects.
If you enjoy working with Loux and would like full access, let us know by liking the space or opening a discussion in the community!