How does Redis work? Is it scalable?



  • Hello NodeBB,

    I wanted to ask something: I've just read the Wikipedia page about Redis (I understand that NodeBB uses this technology to store its data).

    But since the data is stored in RAM, what happens on small servers (less than 1 GB of RAM)? It will fill up really quickly, won't it?

    Do you think NodeBB would be able to handle a big board forum? 🙂

    Sorry for all these questions, I'm trying to learn more about the web of tomorrow ^^

    Kind regards.


  • Staff Admin

    Hi,

    You are right, Redis stores all data in RAM; that's why it is really fast. As for RAM filling up quickly, that depends on how busy your forum is and how many posts you have. For example, on this forum we have ~350 users and ~1k posts, and Redis is using 7.50 MB right now.

    If you are going to have a really large forum, you will obviously need more RAM. We are also considering adding a disk-based db backend, so people who are too worried about RAM usage can just use the disk-based one.
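The 7.50 MB figure quoted above comes from Redis's INFO command (e.g. `redis-cli INFO memory`), which reports `used_memory` in key:value lines. As a minimal sketch of reading that figure, here is a parser over a sample payload; the sample values are illustrative, not captured from this forum's server.

```python
# Parse the key:value lines returned by Redis's INFO command.
# The sample payload below is illustrative, not real forum data.
sample = (
    "# Memory\r\n"
    "used_memory:7864320\r\n"
    "used_memory_human:7.50M\r\n"
    "used_memory_peak:8912896\r\n"
)

def parse_info(payload: str) -> dict:
    """Turn Redis INFO output into a dict, skipping '#' comment lines."""
    info = {}
    for line in payload.splitlines():
        if line and not line.startswith("#") and ":" in line:
            key, _, value = line.partition(":")
            info[key] = value
    return info

info = parse_info(sample)
print(info["used_memory_human"])  # prints "7.50M"
```

In practice a client library (e.g. node-redis or redis-py) exposes INFO as a ready-made dict, so hand-parsing like this is only needed when reading raw CLI output.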



  • Thank you for your reply.

    Adding a disk backend would be a really great alternative, yes 🙂

    And do you have an idea of how much RAM is needed for a forum with 2,000,000 posts and 400,000 users? 🙂


  • Staff Admin

    It is hard to tell, as we haven't tested with that many posts, but a simple extrapolation from this forum gives a rough value. This forum uses 7.50 MB for 1k posts, so 1 million posts (without any new users) would need at least ~7.5 GB of RAM. Obviously this is a rough estimate and doesn't take into account any Redis overhead.
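The extrapolation above can be written out explicitly; the 7.50 MB per 1k posts ratio is the figure quoted in the thread.

```python
# Linear extrapolation from the forum's observed Redis usage:
# ~1,000 posts currently consume ~7.5 MB.
observed_posts = 1_000
observed_mb = 7.5

per_post_mb = observed_mb / observed_posts  # ~0.0075 MB (~7.5 KB) per post
estimate_mb = per_post_mb * 1_000_000       # scale up to 1 million posts
estimate_gb = estimate_mb / 1000            # decimal GB, as quoted in the post

print(estimate_mb)  # 7500.0
print(estimate_gb)  # 7.5
```

A linear model like this ignores Redis overhead and any structures that grow non-linearly, which is exactly the caveat the post makes.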

    We will create some test data to see how much ram we consume and maybe make a table/graph out of it. That should be useful for everyone.


  • Staff Admin

    I have run some tests to check the memory usage.

    The Redis INFO command reported 811 KB at the beginning, since I was running the tests on db 1. An empty NodeBB installation starts with roughly 405 KB of memory usage.

    start: 811.44 KB

    Users
    100 users → 1.04 MB, diff 254 KB, per user = 2.54 KB
    1,000 users → 3.61 MB, diff 2,885 KB, per user = 2.885 KB
    10,000 users → 29.88 MB, diff 29,786 KB, per user = 2.97 KB

    Posts (500 characters each)
    336 posts → 30.71 MB, diff 849 KB, per post = 2.52 KB
    996 posts → 32.70 MB, diff 2,887 KB, per post = 2.89 KB
    9,995 posts → 54.39 MB, diff 25,098 KB, per post = 2.51 KB

    The memory usage per user is roughly 3 KB at 10,000 users. This value goes up as the user count grows, because other structures grow with it.

    The memory usage per post is about 2.5 KB; keep in mind the posts I created are all 500 characters long.

    So for 2,000,000 posts and 400,000 users:

    2,000,000 × 2.5 KB + 400,000 × 3 KB = 6,200,000 KB ≈ 5.9 GB

    So theoretically 6 GB should be enough.
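As a check on the arithmetic, using the per-item costs measured above and binary gigabytes (1 GiB = 1024 × 1024 KB):

```python
# Estimate total Redis memory for 2M posts and 400k users,
# using the per-item costs from the tests above.
per_post_kb = 2.5
per_user_kb = 3.0

total_kb = 2_000_000 * per_post_kb + 400_000 * per_user_kb
total_gib = total_kb / 1024 / 1024  # KB -> MiB -> GiB

print(total_kb)            # 6200000.0
print(round(total_gib, 1)) # 5.9
```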

    Hope that helps.



  • ^ Wanted to like or heart your post. 😉



  • ^ Wanted to like or heart your post. 😉

    The like button here is the favourite (star) button beside the reply button. I will like your post and you will see it.

