Thanks @julian @psychobunny!
It makes sense not to rely on the JavaScript-loaded comments for SEO.
I think the issue can be addressed by adding multiple sitemap entries to your robots.txt file (if you are sharing a domain), i.e. include links to both your blog's and NodeBB's sitemap.xml files, and Google will have no trouble finding your content.
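For example, here is a minimal sketch of a shared-domain robots.txt. The domain and the blog path are placeholders; NodeBB serves its sitemap at /sitemap.xml relative to the forum's base URL by default, and your blog platform's sitemap path may differ:

```
User-agent: *
Disallow:

# Blog sitemap (hypothetical path; adjust to your blog platform)
Sitemap: https://example.com/blog/sitemap.xml

# NodeBB sitemap (default /sitemap.xml route, here mounted under /forum)
Sitemap: https://example.com/forum/sitemap.xml
```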
PS. Getting started with NodeBB has been super cool! Thanks for the fantastic work @julian @psychobunny!
Reference: How Google Interprets the robots.txt Specification, Google Search Central (developers.google.com)
sitemap
Supported by Google, Ask, Bing, Yahoo; defined on sitemaps.org.
Usage:
sitemap: [absoluteURL]
[absoluteURL] points to a sitemap, sitemap index file, or equivalent URL. The URL does not have to be on the same host as the robots.txt file. Multiple sitemap entries may exist. As non-group-member records, they are not tied to any specific user agent and may be followed by all crawlers, provided crawling is not disallowed.
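To illustrate the cross-host point above, a sketch with hypothetical domains: the robots.txt file and the sitemap it references can live on different hosts, so an entry like the following is valid.

```
# robots.txt served from https://example.com/robots.txt
User-agent: *
Disallow:

# Sitemap hosted on a different host (hypothetical URL)
Sitemap: https://cdn.example.net/sitemaps/sitemap-index.xml
```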