A TextProcessing provider using a Large Language Model that runs locally on CPU.

This app is deprecated in favor of llm2. Have a look at [the docs for llm2](https://docs.nextcloud.com/server/latest/admin_manual/ai/app_llm2.html).

The models run completely on your machine. No private data leaves your servers.

After installing this app, you will need to run `occ llm:download-model` (see the example at the end of this page).

Models:

* Llama 2 by Meta
  * Languages: English
  * [LLAMA 2 Community License](https://download.nextcloud.com/server/apps/llm/llama-2-7b-chat-ggml/LICENSE)
* GPT4All Falcon by Nomic AI
  * Languages: English
  * [Apache License 2.0](https://download.nextcloud.com/server/apps/llm/LICENSE)
* Leo HessianAI by LAION LeoLM
  * Languages: English/German
  * [LLAMA 2 Community License](https://download.nextcloud.com/server/apps/llm/leo-hessianai-13B-chat-bilingual-GGUF/LICENSE)

Requirements:

* x86 CPU
* GNU libc (musl is not supported)
* Python 3.10+ (including python-venv)

#### Nextcloud All-in-One:

With Nextcloud AIO, this app is not going to work because AIO uses musl. However, you can use [this community container](https://github.com/nextcloud/all-in-one/tree/main/community-containers/local-ai) as a replacement for this app.

## Ethical AI Rating for "Llama 2" model

### Rating: 🟡

Positive:

* the software for training and inference of this model is open source
* the trained model is freely available, and thus can be run on-premises

Negative:

* the training data is not freely available, limiting the ability of external parties to check and correct for bias or optimise the model's performance and CO2 usage.

## Ethical AI Rating for "GPT4All Falcon" model

### Rating: 🟢

Positive:

* the software for training and inference of this model is open source
* the trained model is freely available, and thus can be run on-premises
* the training data is freely available, making it possible to check or correct for bias or optimise the performance and CO2 usage.

Learn more about the Nextcloud Ethical AI Rating [in our blog](https://nextcloud.com/blog/nextcloud-ethical-ai-rating/).
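
#### Example: downloading the model

A typical invocation of the download command mentioned above. The installation path `/var/www/nextcloud` and the web server user `www-data` are common defaults, not requirements of this app; adjust them to match your setup.

```bash
# Change into the Nextcloud installation directory
# (path is an assumption; use your own install location).
cd /var/www/nextcloud

# Run the occ command as the web server user
# (user name is an assumption; commonly www-data on Debian/Ubuntu).
sudo -u www-data php occ llm:download-model
```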