OpenELM is an open-source library by CarperAI, designed to enable evolutionary search with language models in both code and natural language.

For QDAIF: the poetry domain is currently implemented in main, while other experiment code with few-shot LMX domains lives in the experimental branch.

OpenELM supports the quality-diversity algorithms MAP-Elites, CVT-MAP-Elites, and Deep Grid MAP-Elites, as well as a simple genetic algorithm baseline.
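As a rough sketch of the MAP-Elites-style loop these algorithms share (a generic illustration, not OpenELM's actual implementation; `mutate`, `evaluate`, and `behavior_descriptor` are placeholder callables you would supply):

```python
# Generic MAP-Elites sketch; not OpenELM's implementation.
import random

def map_elites(init_pop, mutate, evaluate, behavior_descriptor, iterations=1000):
    archive = {}  # discretized behavior cell -> (fitness, solution)
    for _ in range(iterations):
        # Select a parent from the archive, falling back to the initial population.
        elites = [sol for _, sol in archive.values()]
        parent = random.choice(elites or list(init_pop))
        child = mutate(parent)             # e.g. an LLM-driven mutation in OpenELM's setting
        fitness = evaluate(child)
        cell = behavior_descriptor(child)  # which niche of the behavior space the child occupies
        # Keep the child only if its niche is empty or it beats the current elite there.
        if cell not in archive or fitness > archive[cell][0]:
            archive[cell] = (fitness, child)
    return archive
```

CVT-MAP-Elites replaces the fixed grid of niches with centroidal Voronoi tessellation cells, and Deep Grid MAP-Elites keeps a small population per cell rather than a single elite.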

OpenELM’s language models are instantiated as LangChain classes by default, which means that OpenELM can support practically any existing LLM API, as well as models run on your local GPU via HuggingFace Transformers.
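For example, a local model could be wrapped as a LangChain LLM roughly like this (a minimal sketch; the model id and generation settings are illustrative, not OpenELM's defaults):

```python
# Sketch: exposing a local HuggingFace model through LangChain.
# Model id and generation kwargs are illustrative only.
from langchain_community.llms import HuggingFacePipeline

llm = HuggingFacePipeline.from_model_id(
    model_id="Salesforce/codegen-350M-mono",
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 128, "do_sample": True, "temperature": 0.8},
)
print(llm.invoke("def fizzbuzz(n):"))
```

Because the interface is a LangChain LLM, swapping in a hosted API model is a matter of instantiating a different class with the same interface.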

We also provide optional Nvidia Triton Inference Server support, intended for use cases where low latency on 8 or more GPUs is important. Finally, for code generation domains, we provide a sandbox environment, consisting of a container server backed with gVisor (a container runtime that introduces an additional barrier between the host and the container) as well as a heuristic-based safety guard.
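A heuristic safety guard of this kind might look something like the following (an illustrative sketch, not OpenELM's actual check; the set of banned modules is arbitrary here):

```python
# Illustrative heuristic guard for LLM-generated code; not OpenELM's implementation.
import ast

BANNED_MODULES = {"os", "subprocess", "socket", "shutil", "ctypes"}

def looks_unsafe(code: str) -> bool:
    """Reject code that fails to parse or references obviously dangerous modules."""
    try:
        tree = ast.parse(code)
    except SyntaxError:
        return True  # unparseable generations are rejected outright
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            if any(alias.name.split(".")[0] in BANNED_MODULES for alias in node.names):
                return True
        elif isinstance(node, ast.ImportFrom):
            if node.module and node.module.split(".")[0] in BANNED_MODULES:
                return True
        elif isinstance(node, ast.Name) and node.id in BANNED_MODULES:
            return True
    return False
```

In OpenELM's setup, generated code would presumably still execute inside the gVisor-backed container even after passing such a check, with the heuristic acting as a cheap first filter rather than the security boundary itself.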

All options for these classes are defined in configs.py via dataclasses, which are registered as a Hydra config and can be overridden on the command line when running one of the example scripts, such as run_elm.py.
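The pattern is roughly the standard Hydra structured-config setup; in this sketch, the dataclass names and fields are made up for illustration and do not correspond to OpenELM's actual configs:

```python
# Sketch of the dataclass-as-Hydra-config pattern; names and fields are hypothetical.
from dataclasses import dataclass, field

import hydra
from hydra.core.config_store import ConfigStore
from omegaconf import DictConfig, OmegaConf

@dataclass
class ModelConfig:
    model_path: str = "Salesforce/codegen-350M-mono"
    temperature: float = 0.9

@dataclass
class ELMConfig:
    model: ModelConfig = field(default_factory=ModelConfig)
    batch_size: int = 32

cs = ConfigStore.instance()
cs.store(name="elm_config", node=ELMConfig)

@hydra.main(version_base=None, config_path=None, config_name="elm_config")
def main(cfg: DictConfig) -> None:
    print(OmegaConf.to_yaml(cfg))  # any field can be overridden from the CLI

if __name__ == "__main__":
    main()
```

A script written this way accepts dotted overrides on the command line, e.g. `python my_elm_script.py model.temperature=0.7 batch_size=16` (the script name and fields here are hypothetical).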
