I have somewhere in the range of 5–10 GB of text files saved to disk, including, notably, a large corpus of documents intended primarily for use in NLP applications. One feature worth noting is that many of these documents consist largely of unique words and misspellings, yielding a vast set of distinct strings: a search space that is somewhat resistant to compression (high entropy).
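To make "resistant to compression" concrete, a rough estimate along these lines is what I have in mind (the corpus path and the `*.txt` glob are placeholders for my actual layout):

```python
import zlib
from pathlib import Path

CORPUS_DIR = Path.home() / "corpus"  # hypothetical location; adjust as needed
SAMPLE = 1 << 20                     # sample the first 1 MiB of each file to keep this quick

total_raw = 0
total_compressed = 0
unique_tokens = set()

for path in CORPUS_DIR.rglob("*.txt"):
    with path.open("rb") as f:
        data = f.read(SAMPLE)
    total_raw += len(data)
    # zlib at its default-ish level is a cheap proxy for compressibility
    total_compressed += len(zlib.compress(data, 6))
    unique_tokens.update(data.decode("utf-8", errors="ignore").split())

if total_raw:
    print(f"sampled bytes:     {total_raw:,}")
    print(f"after zlib:        {total_compressed:,}")
    print(f"compression ratio: {total_compressed / total_raw:.2f}")
print(f"unique whitespace-delimited tokens: {len(unique_tokens):,}")
```

A ratio near 1.0 and a token count in the millions is roughly what I mean by high entropy here.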
Could having this much text data on my MBP noticeably degrade Spotlight performance? Spotlight has been unusually slow lately, and I'm unsure whether the issue is memory-related (I have 8 GB of RAM) or simply the size or characteristics of the search space.
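For context, here is a minimal sketch of the diagnostic I'm planning to run, using the standard `mdutil` and `ps` tools via Python; I haven't confirmed this isolates the problem:

```python
import subprocess

# mdutil -s reports per-volume Spotlight indexing status
print(subprocess.run(["mdutil", "-s", "/"],
                     capture_output=True, text=True).stdout)

# Sustained high CPU from the indexer processes (mds, mds_stores)
# would suggest the corpus is being (re)indexed.
ps = subprocess.run(["ps", "-axo", "pcpu,comm"],
                    capture_output=True, text=True).stdout
for line in ps.splitlines():
    if "mds" in line:
        print(line)
```

I know I could exclude the corpus folder via Spotlight's Privacy settings, but I'd first like to understand whether the data's characteristics are actually the cause.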