OpenLLM Core: Core components for OpenLLM.
Project description
📖 Introduction
With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI applications.
To learn more about OpenLLM, please visit OpenLLM's README.md.
This package holds the core components of OpenLLM and is considered internal.
Components include the following (a brief usage sketch appears after the list):
- Configuration generation.
- Utilities for interacting with the OpenLLM server.
- Schema and generation utilities for the OpenLLM server.
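As a rough illustration, the sketch below shows how these components are typically consumed from downstream code. It is a minimal sketch that assumes the 0.4.x layout, in which openllm_core.config exposes an AutoConfig class with a for_model() helper; exact names may differ between releases.

    # Minimal sketch, assuming openllm_core.config.AutoConfig and its
    # for_model() classmethod as in the 0.4.x series (names may differ).
    from openllm_core.config import AutoConfig

    config = AutoConfig.for_model('llama')   # resolve the generated config for a model family
    print(type(config).__name__)             # the model-specific LLMConfig subclass
    print(config)                            # inspect default generation parameters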
📔 Citation
If you use OpenLLM in your research, please cite it as follows:
@software{Pham_OpenLLM_Operating_LLMs_2023,
  author  = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month   = jun,
  title   = {{OpenLLM: Operating LLMs in production}},
  url     = {https://github.com/bentoml/OpenLLM},
  year    = {2023}
}
Download files
Download the file for your platform.
Source Distribution: openllm_core-0.4.44.tar.gz (58.2 kB)
Built Distribution: openllm_core-0.4.44-py3-none-any.whl
Hashes for openllm_core-0.4.44-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 7ee775980b4864b20906a7a6866e022e04ba8103977f88f30c259af4216b9be6
MD5 | 726c1b68c84f9e039f2e0339946b93ff
BLAKE2b-256 | 9c64eb33d7c6da397dd89d9dfa3e12b14f0e3716b0c31b0ed9f4848f178eca3a
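As a quick local check, the digest of a downloaded file can be compared against the table above using only the Python standard library. The sketch below assumes the wheel has been saved to the current working directory under its published name.

    import hashlib

    # Expected SHA256 digest from the table above for openllm_core-0.4.44-py3-none-any.whl.
    expected = '7ee775980b4864b20906a7a6866e022e04ba8103977f88f30c259af4216b9be6'

    # Hash the downloaded wheel and compare against the published digest.
    with open('openllm_core-0.4.44-py3-none-any.whl', 'rb') as f:
        actual = hashlib.sha256(f.read()).hexdigest()

    print('OK' if actual == expected else 'MISMATCH')

pip can enforce the same check at install time when hashes are pinned in a requirements file and installed with --require-hashes.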