I think the issue can be addressed by listing multiple sitemaps in your robots.txt file (if the blog and NodeBB are sharing a domain), i.e. include links to both your blog's and NodeBB's sitemap.xml files, and Google will have no trouble finding your content.
Supported by Google, Ask, Bing, Yahoo; defined on sitemaps.org.
[absoluteURL] points to a sitemap, a sitemap index file, or an equivalent URL. The URL does not have to be on the same host as the robots.txt file. Multiple Sitemap entries may exist. As non-group-member records, they are not tied to any specific user-agent and may be followed by all crawlers, provided crawling is not disallowed.
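A minimal sketch of what the combined robots.txt could look like; the two sitemap URLs below are hypothetical placeholders for wherever your blog and NodeBB actually publish their sitemaps:

```
User-agent: *
Disallow:

# Sitemap lines are standalone (non-group) records:
# they apply regardless of which User-agent group precedes them.
Sitemap: https://example.com/blog/sitemap.xml
Sitemap: https://example.com/forum/sitemap.xml
```

Each `Sitemap:` line must use an absolute URL; crawlers that support the directive will pick up every entry they find.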