Absolutely lagging



  • Hey there!

    I have successfully ported vBulletin 5.3.5 forum users to NodeBB, but there is an issue.

    After importing them, the forum becomes really unresponsive once there are more than 50,000 users. The main page takes around 10-15 seconds to load, although other pages load within a few seconds and sometimes instantly.

    Am I missing something? Also, the database is MongoDB, and I followed User.create from user.js to import my users.

    Regards.


  • Global Moderator

    Are you hosting yourself or through our service? If self-hosted, how many threads are you running?



  • Hi,

    I am hosting on my own dedicated server.
    I have 3 instances of NodeBB set up. As far as logging goes, htop shows that most of the CPU is being used by MongoDB, and when I take a look at the mongod instance, a lot of log lines tagged "COMMAND" keep coming in and the CPU stays at 100%. I am running mongod with the --quiet option, but I am unsure why this is happening.

    Regards.



    In the mongodb log I see a LOT of spam regarding COLLSCAN, mostly in COMMAND entries, even though I have --quiet enabled.

    Is there any solution to this?



  • huh 6 days and still nobody knows. what a drag


  • Global Moderator

    What are the specs of your VPS? Have you looked in our docs for tips on how to cluster NodeBB processes behind a load balancer like nginx? Can you provide actual log readouts and screenshots so we can see what you're looking at, instead of us trying to figure out what you're describing? We can help you, but we need more info to work with.


  • GNU/Linux Admin

    @nullbyte It sounds like your indices have not been created for NodeBB.

    Please run the following three commands in the mongo command line interface:

    db.objects.createIndex({ _key: 1, score: -1 }, { background: true });
    db.objects.createIndex({ _key: 1, value: -1 }, { background: true, unique: true, sparse: true });
    db.objects.createIndex({ expireAt: 1 }, { expireAfterSeconds: 0, background: true });
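    Once the indices are built, a quick way to confirm the collection scans are gone is to inspect a query plan in the mongo shell (a read-only check against the live database; "uid:1" is just a placeholder key, substitute any real _key value):

    ```js
    // Run in the mongo shell against the NodeBB database.
    db.objects.find({ _key: "uid:1" }).explain("queryPlanner").queryPlanner.winningPlan
    // With the indices in place, the winning plan should contain an IXSCAN
    // stage; a COLLSCAN stage here means the index is still missing.
    ```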
    


  • @PitaJ

    2x Intel Xeon E5-2630v3
    16c/32t - 2.4 GHz/3.2 GHz
    128GB DDR4 ECC 2133 MHz
    

    The server would always stay at 100% CPU due to MongoDB. I don't think it needs a load balancer when only one person is trying to load the website.

    @julian It works perfectly now! Thank you!


  • GNU/Linux

    @julian After the last update to 1.11.2 we have been seeing one of our forums getting pretty slow. Each page load has slowed down a lot (the REST calls too).

    e.g. https://chatrooms.talkwithstranger.com/api/topic/29181/tell-your-dreams-here/8

    It seems that the indices were never created properly. When I run the three commands above, I get stuck on a lot of duplicate key errors, like the ones mentioned in https://github.com/NodeBB/NodeBB/issues/5265, on the second command:

    db.objects.createIndex({ _key: 1, value: -1 }, { background: true, unique: true, sparse: true });
    

    I've been able to delete many of them by using the following, but the duplicates keep coming.

    db.objects.aggregate([
      // Limit the scan to one key; drop this $match stage to cover every key
      { $match: { _key: "tid:38:bookmarks" } },
      { $group: {
        _id: { value: "$value" },       // group on the duplicated value
        dups: { $addToSet: "$_id" },    // collect the _ids in each group
        count: { $sum: 1 }
      }},
      { $match: {
        count: { $gt: 1 }               // keep only groups that contain duplicates
      }}
    ],
    { allowDiskUse: true }              // allow spilling to disk on large result sets
    )
    .forEach(function (doc) {
        doc.dups.shift();               // keep the first document in each group
        db.objects.remove({ _id: { $in: doc.dups } });  // delete the remaining duplicates
    })
    

    I also noticed that while the forum is running, duplicates are created again for the same keys, even after they have been deleted.

    The result of the following is below:

    db.objects.createIndex({ expireAt: 1 }, { expireAfterSeconds: 0, background: true });
    
    { 
        "createdCollectionAutomatically" : false, 
        "numIndexesBefore" : 3.0, 
        "numIndexesAfter" : 3.0, 
        "note" : "all indexes already exist", 
        "ok" : 1.0
    }
    
    

    Bottom line: we're unable to create the indices. Any help would be appreciated.



  • GNU/Linux Admin

    Are you turning off the forum, deleting the duplicates, and then re-creating the indices?


  • GNU/Linux

    @julian I turned off the forum and then started the re-indexing process. It doesn't recreate duplicates for the ones already deleted, but there are just too many keys with duplicate values for removing them manually to be feasible. Is there any way I can update that script to delete the duplicates in one go?
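    One way to handle them in a single pass (a sketch adapted from the script above, assuming the forum stays offline while it runs; this is not an official NodeBB tool, so back up the database first) is to drop the per-key $match stage and group on the same (_key, value) pair the unique index enforces:

    ```js
    // Run in the mongo shell with NodeBB stopped.
    db.objects.aggregate([
      // Only sorted-set style documents carry a value field; hash objects do
      // not, so skip them to avoid grouping unrelated documents together.
      { $match: { value: { $exists: true } } },
      { $group: {
        _id: { _key: "$_key", value: "$value" },  // the pair the unique index covers
        dups: { $addToSet: "$_id" },
        count: { $sum: 1 }
      }},
      { $match: { count: { $gt: 1 } } }
    ], { allowDiskUse: true })
    .forEach(function (doc) {
      doc.dups.shift();                           // keep one document per pair
      db.objects.remove({ _id: { $in: doc.dups } });
    });
    ```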


  • GNU/Linux

    @julian Do you think there could be any other reason why MongoDB is this slow? The specs of the system are:

    32 GB RAM
    8 vCPUs
    320 GB

    No replicas though


  • Global Moderator

    @fais3000 lack of indices will drastically cut into the performance of the database. Likely by orders of magnitude.


  • GNU/Linux

    @PitaJ Makes sense. Is there any way I can speed up the process of finding duplicate keys like the following?

    { 
        "ok" : 0.0, 
        "errmsg" : "E11000 duplicate key error collection: nodebb.objects index: _key_1_value_-1 dup key: { : \"tid:2616:bookmarks\", : \"94099\" }", 
        "code" : 11000.0, 
        "codeName" : "DuplicateKey"
    }
    

    The problem is that it takes about 10 minutes each time before the createIndex command stops and reports that there is a duplicate key. Alternatively, is there a way to skip the indices that have already been created and continue where it left off?
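    One way to avoid rediscovering a single duplicate per failed createIndex run is to collect every offending pair in one pass before rebuilding the index. The selection logic is small enough to sketch in plain JavaScript (a hypothetical findDuplicatePairs helper over exported documents, not part of NodeBB):

    ```javascript
    // Collect every (_key, value) pair that appears more than once, in one pass.
    // Mirrors what the unique index enforces: the first _id per pair is kept,
    // every later _id is reported as safe to delete.
    function findDuplicatePairs(docs) {
      const seen = new Map(); // "(key, value)" -> first _id encountered
      const dups = new Map(); // "(key, value)" -> _ids safe to delete
      for (const doc of docs) {
        if (doc.value === undefined) continue; // hash objects carry no value field
        const pair = JSON.stringify([doc._key, doc.value]);
        if (!seen.has(pair)) {
          seen.set(pair, doc._id);
        } else {
          if (!dups.has(pair)) dups.set(pair, []);
          dups.get(pair).push(doc._id);
        }
      }
      return dups;
    }
    ```

    The returned map can then drive a single batched remove per pair, instead of one ten-minute createIndex round trip per duplicate.
    
    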


  • GNU/Linux Admin

    Right, it depends... Is there a pattern to the type of keys that are considered duplicates?


  • GNU/Linux

    @julian Unfortunately, I lost patience and created the index like this (without the unique parameter):

    db.objects.createIndex({ _key: 1, value: -1 }, { background: true, sparse: true });
    

    Load speed seems to be back.

    Regarding the pattern of duplicates, it's spread all over the keys, e.g. the ones I found:

    • cat last post
    • topic bookmarks
    • user email
