# The meta list of various /aicg/ guides on running local
->back to the comfy times when we had pyg in our OP<-
->==(Updated 2024-01-08)==<-

***

[TOC2]

***

#####Guides
https://rentry.org/freellamas - the one that started it all anew; even though this great anon doesn't host a proxy, you can get the settings and setup info here
https://rentry.org/hostfreellamas - guide on how to set up a local model for yourself or for others - `free` on your own machine
https://rentry.org/colabfreellamas - guide on how to set up a local model for yourself or for others - `free` on colab (paid available but overpriced: vast\.ai, paperspace or runpod suggested instead)
(~~see https://desuarchive.org/g/thread/95578945#95579604 for the updated colab script until anon updates the rentry~~ updated)
(~~https://desuarchive.org/g/thread/95969617#95970981 premade templates for Mythomax, MLewd, Stheno, Tsukasa~~ included)
https://rentry.org/koboldcpp_colab_guide - guide on how to set up a local model for yourself or for others with KoboldCpp - `free` on colab
https://docs.sillytavern.app/usage/local-llm-guide/how-to-use-a-self-hosted-model - from SillyTavern docs, guide on how to set up a local model for yourself to be used in SillyTavern - `free` on your own machine

***

#####More guides
https://vast.ai/docs/guides/oobabooga - guide on how to set up a local model for yourself or for others - `paid` (anon was shilling this, no idea how much it costs)
https://rentry.org/aicglocal - guide on how to set up a local model for yourself with KoboldCpp - `free` on your own machine
https://rentry.org/llama_v2_sillytavern - from /lmg/, guide on how to set up a local model for yourself with KoboldCpp - `free` on your own machine
https://rentry.org/better-llama-roleplay - stolen from /lmg/, listed there under "LLaMA RP Proxy"
https://rentry.org/stheno-guide - from /lmg/, model-specific guide for Stheno-L2-13B with SimpleProxy, for after you've made it work using the aforementioned guides
https://rentry.org/easylocalnvidia - guide on how to set up a local model for yourself with KoboldCpp - `free` on your own machine
https://rentry.org/ky239 - borrowed from /CHAG/, guide on how to set up a local model for yourself with a lot of additional explanations - `free` on colab
https://rentry.org/MixtralForRetards - model-specific guide on using Mixtral with KoboldCpp - `free` on your own machine (see https://desuarchive.org/g/thread/97942495#97942574 for an explanation of why it might actually be suboptimal)
https://rentry.org/mixtral_vastai_for_dummies - model-specific guide on how to set up Mixtral for yourself or for others with KoboldCpp - `paid`

***

#####Ayumi's LLM Role Play & ERP Ranking
https://rentry.org/ayumi_erp_rating (http://ayumi.m8geil.de/ayumi_bench_v3_results.html) - stolen from /lmg/, a huge table with various models' performance in automated benchmarks, admits being flawed
![pic 23.08.2023, also thanks, Weird Constructor](https://files.catbox.moe/o6cmri.png)
->`pic is very old!`<-

***

#####More rankings
https://rentry.org/lmg-13b-showdown - stolen from /lmg/, a small test of 13B models popular at the time
https://old.reddit.com/user/WolframRavenwolf/submitted/?sort=new - linked [here](https://desuarchive.org/g/thread/96895763#96898926), LLM comparisons with ST roleplay tests
https://snombler.neocities.org/logs - some tests of various LLMs specifically for long-form roleplay, performed by a botmaker
https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard - stolen from /lmg/, looks like a foundation-model ranking
https://rentry.org/thecelltest - a small test of multiple models, checks understanding of a specific scenario

***
***
***

Sister rentries:
- https://rentry.org/aicg_extra_information - /aicg/ OP overflow, additional info missing from the OP

Email for feedback and suggestions:
- aicg2023@proton.me