Replies: 6 comments 6 replies
-
Dude super cool! I'll make an account today.
Edit: Tried just now, but the worldserver might be down, it seems.
-
A bit of a progress update: I'm providing more context to the conversation, including the faction, race, class, level, and spec of both the player and the responding AI. This influences the response based on the chosen personality. For example, if it has a rude/joking demeanor, it could poke fun at you for being a lower level than the bot, or at differences in class, race, or faction. Some bugs remain to iron out, but it now takes that data into perspective when responding, giving some context about location, race, class, faction, etc.
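Roughly, the idea is something like the sketch below. The struct and helper names are illustrative only, not the module's actual code:

```cpp
// Illustrative sketch only (hypothetical names, not the module's actual code):
// fold both participants' details plus the bot's personality into the prompt.
#include <cstdint>
#include <sstream>
#include <string>

struct ChatterInfo
{
    std::string name;
    std::string faction;    // "Alliance" or "Horde"
    std::string race;       // e.g. "Blood Elf"
    std::string className;  // e.g. "Paladin"
    std::string spec;       // e.g. "Protection"
    uint32_t    level;
};

// Builds a short system prompt giving the model both participants' details
// plus the bot's configured personality.
std::string BuildPrompt(ChatterInfo const& player, ChatterInfo const& bot,
                        std::string const& personality, std::string const& message)
{
    std::ostringstream prompt;
    prompt << "You are " << bot.name << ", a level " << bot.level << ' ' << bot.faction
           << ' ' << bot.race << ' ' << bot.spec << ' ' << bot.className
           << " in World of Warcraft. Personality: " << personality << ".\n"
           << "You are replying to " << player.name << ", a level " << player.level
           << ' ' << player.faction << ' ' << player.race << ' ' << player.className << ".\n"
           << "Their message: \"" << message << "\"\n"
           << "Reply in character, in one short sentence.";
    return prompt.str();
}
```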
-
Gotta add more context and test some different models, as well as shorten the messages, but it's looking good so far =)
-
Getting a little bit better with context and responses.

Scenario 1 - Level/Faction Difference:
Player (Level 80 Human Mage): "get out of our territory horde scum"

Scenario 2 - Combat Status:
Player (Level 80 Blood Elf Paladin): "nice tanking in that last raid"
-
Commenting here to chime in just a bit. I'll give the poster the benefit of the doubt and presume this isn't going to be (mis)used for any RMT or other shenanigans. For an RP-focused server, having 'alive' NPCs that are bots (and hopefully clearly marked as such) would be really wonderful. I'd appreciate and support such an endeavor.

I may have misunderstood the original message, but is there a server to connect to for testing this out, or is the OP asking others who run their own servers to host and test the 'playerbots' module locally? I know there's at least one public server that lets players play the game nearly solo and build up a 'team' of hired bots that fight alongside them in questing and dungeons. It would be a lovely idea if they managed to package it in a self-contained single-player format, though probably not in the spirit of an MMO.
-
I like this concept. I have something similar running on cMaNGOS, but it works via Kobold. I tried to compile this mod against the current AzerothCore, but it fails at 98% --> void OnChat(Player* player, uint32 type, uint32 lang, std::string& msg) override; I hope this will be fully implemented, as it makes things more interesting.
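That kind of failure on an override usually means the base-class virtual has changed in the current core. For reference, here is a minimal sketch of a PlayerScript chat hook using exactly the signature quoted above (class and function names are illustrative, not the module's actual files); if the core has since renamed or re-signatured that virtual, an override declared this way refuses to compile in just this manner:

```cpp
// Minimal sketch of an AzerothCore PlayerScript chat hook, using the exact
// signature quoted above. Names are illustrative only.
// 'override' forces the signature to match the base-class virtual; if the
// current core has changed that virtual, this is a hard compile error.
#include "ScriptMgr.h"
#include "Player.h"

#include <string>

class LLMChatHook : public PlayerScript
{
public:
    LLMChatHook() : PlayerScript("LLMChatHook") { }

    void OnChat(Player* /*player*/, uint32 /*type*/, uint32 /*lang*/, std::string& /*msg*/) override
    {
        // Forward the message to the LLM backend here.
    }
};

// Conventional module entry point that registers the script.
void AddSC_LLMChatHook()
{
    new LLMChatHook();
}
```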
-
Hey everyone!
I'm looking for testers for a module I've been working on. It integrates with player bots using a simple LLM setup powered by Ollama, designed to run on a local machine. Right now it's using a 1B model; a 3B or 8B is preferable if you have access to a GPU.
The cool thing is that the 1B model is currently running on CPU, so no GPU is required to use the module. I’m running it on a 6-core server with 24GB of RAM, and it performs well. For reference, the 1B model requires about 8–10GB of memory to run.
The goal is to make the bot responses feel more human-like.
The configuration offers a lot of flexibility. You can:
- Adjust how many player bots respond to each input. (Currently, 2–3 bots respond, with a 75% chance of replying and a 25% chance of ignoring.)
- Customize the personalities of the bots to make their responses more varied and unique.
If no response is generated, the system falls back to the player bots' prebuilt responses (see the sketch after this list for the general idea).
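To give a rough idea of that flow, here is an illustrative sketch only, not the module's actual code (all names are hypothetical):

```cpp
// Illustrative sketch of the responder-selection idea: pick 2-3 nearby bots,
// give each a 75% chance to answer, and fall back to the bots' prebuilt
// replies when the LLM returns nothing. Names here are hypothetical.
#include <algorithm>
#include <random>
#include <string>
#include <vector>

struct BotReply
{
    std::string botName;
    std::string text;
};

std::vector<BotReply> PickResponders(std::vector<std::string> const& nearbyBots)
{
    static std::mt19937 rng{ std::random_device{}() };
    std::uniform_int_distribution<int> howMany(2, 3);   // 2-3 bots respond
    std::bernoulli_distribution willReply(0.75);        // 75% reply, 25% ignore

    std::vector<std::string> candidates = nearbyBots;
    std::shuffle(candidates.begin(), candidates.end(), rng);
    size_t take = std::min(candidates.size(), static_cast<size_t>(howMany(rng)));
    candidates.resize(take);

    std::vector<BotReply> replies;
    for (std::string const& bot : candidates)
    {
        if (!willReply(rng))
            continue; // this bot ignores the message

        std::string text; // would be filled by the LLM call here
        if (text.empty())
            text = "<prebuilt playerbot response>"; // fallback path

        replies.push_back({ bot, text });
    }
    return replies;
}
```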
If you’re interested in testing it out, you can download the module from my GitLab server here:
https://gitlab.realsoftgames.win/krazor/mod_llm_chat
Let me know if you have any feedback or ideas to improve it!
Also, don't forget to install Ollama and pull the default model with:
ollama pull krith/meta-llama-3.2-1b-instruct-uncensored:IQ3_M
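Once the model is pulled, the module talks to the local Ollama server over HTTP (port 11434 by default). As a rough illustration of what such a request looks like, here is a libcurl sketch; it is not the module's actual code, and a real implementation would need to JSON-escape the prompt and parse the "response" field out of the returned JSON:

```cpp
// Rough libcurl sketch of a non-streaming request to a local Ollama instance.
// Not the module's actual code; real code should JSON-escape the prompt and
// parse the "response" field from the returned JSON.
#include <curl/curl.h>
#include <string>

static size_t WriteToString(char* data, size_t size, size_t nmemb, void* userp)
{
    static_cast<std::string*>(userp)->append(data, size * nmemb);
    return size * nmemb;
}

std::string QueryOllama(std::string const& prompt)
{
    std::string response;
    CURL* curl = curl_easy_init();
    if (!curl)
        return response;

    // Non-streaming request: the answer comes back as a single JSON object.
    std::string body =
        R"({"model":"krith/meta-llama-3.2-1b-instruct-uncensored:IQ3_M",)"
        R"("stream":false,"prompt":")" + prompt + R"("})";

    curl_slist* headers = curl_slist_append(nullptr, "Content-Type: application/json");
    curl_easy_setopt(curl, CURLOPT_URL, "http://localhost:11434/api/generate");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteToString);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);
    curl_easy_perform(curl);

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return response; // raw JSON; the generated text is in the "response" field
}
```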
Self-hosted GitLab
Want to test the module running on my WotLK server?
Website Registration
Game version: 3.3.5a
set realmlist realsoftgames.ddns.net
Support Link
https://discord.gg/7SJFb4hx