Ask about setting up Artalk with Redis cache #705
Comments
If you enable the built-in cache, Artalk uses BigCache (an upstream package) as the caching mechanism. Due to BigCache's design and the way Go schedules memory and runs garbage collection, it may appear to have relatively high memory usage.
Yes, I came across that topic before. In this issue I just want to ask whether my Redis configuration is correct, because when I use it I don't notice any benefit from the cache :D I even have the feeling that the cache slows the system down more.
Hello, I'm sorry I missed part of your feedback! Regarding the speed issue with Redis, you might want to try switching the TCP connection to a Unix socket to see if that helps. Artalk uses the upstream packages https://github.com/eko/gocache and https://github.com/redis/go-redis for its caching. Artalk currently uses go-redis version 9.2.1, and from what I've seen in their latest releases there don't seem to be any significant performance improvements. 🤔 I'm not sure about this issue; it still needs further testing and tracking.
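For illustration, a minimal sketch of what switching to a Unix socket might look like in `artalk.yml`; the key names follow my reading of Artalk's cache section, so treat them as assumptions and verify against the Artalk documentation, and the socket path is hypothetical:

```yaml
cache:
  enabled: true
  type: redis
  # Point "server" at the Redis socket path instead of host:port
  # (assumed behavior; verify in the Artalk docs).
  server: "/var/run/redis/redis.sock"
  redis:
    network: unix  # go-redis accepts "tcp" or "unix" here
    username: ""
    password: ""
    db: 0
```

Note that a Unix socket only works when Artalk and Redis share a filesystem, so in a Docker setup the socket directory would have to be mounted into both containers (or Redis run on the same host/namespace).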
Yes, I will go back to the no-cache default; the default performance is good enough in my opinion.
I want to test the Redis cache configuration on Artalk a bit. Specifically, my Docker configuration is as follows:
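(The original compose file was not captured here; the following is a hypothetical sketch of a typical Artalk + Redis setup for context. Service names, image tags, ports, and volume paths are illustrative assumptions, not the reporter's exact file.)

```yaml
version: "3"
services:
  artalk:
    image: artalk/artalk-go   # official Artalk image
    ports:
      - "8080:23366"          # 23366 is Artalk's default port
    volumes:
      - ./data:/data          # holds artalk.yml and the database
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
    volumes:
      - ./redis-data:/data    # persist Redis data across restarts
```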
The configuration inside Artalk:
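(Again a sketch of what such a cache section typically looks like in `artalk.yml`, with `server` pointing at the compose service name; the key layout is an assumption to check against the Artalk docs.)

```yaml
cache:
  enabled: true
  type: redis          # instead of the default "builtin" (BigCache)
  server: "redis:6379" # compose service name + default Redis port
  redis:
    network: tcp
    username: ""
    password: ""       # set if Redis requires AUTH
    db: 0
```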
Next, restart Docker:
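With Compose, that would be something like:

```sh
# Restart Artalk so it reloads artalk.yml
docker compose restart artalk
# or, if the compose file itself changed:
docker compose up -d
```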
According to the messages that appear, everything seems to work normally.
With the default built-in cache I see that it uses a lot of RAM, mostly 300-400 MB; with Redis it uses very little, only 4-5 MB.
Is this behavior correct, or have I configured something wrong?
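One way to compare the two modes is to check per-container memory usage directly, for example:

```sh
# One-shot snapshot of CPU / memory usage per container
docker stats --no-stream
```

With the built-in cache the allocations live inside the Artalk process itself, so a few hundred MB resident there is plausible; with Redis the cached data moves into the redis container instead.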