Hi,
Given that those parameters control the amount of memory available to the Validator, they can negatively impact performance when the data being processed requires a lot to be retained in memory.
For small studies this is generally a non-issue, but large studies, where the datasets referenced by cross-dataset validations are quite big, can run into problems. Typically the Validator will just give up and throw an OutOfMemoryException to tell you that it cannot continue without more memory available to it. In other cases, though, the Validator can enter a sort of "snail's pace" state: it keeps validating, but the work required to free up memory after each new record is processed makes the whole session take many, many hours.
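If you want a more direct way to spot that second state than watching the progress meter, and assuming the Validator runs on the JVM (an assumption on my part; adjust for your runtime), a small monitor like the sketch below can help. The class name and sampling interval are mine, not part of the Validator. When used heap stays pinned near the maximum while cumulative GC time climbs steadily, the process is likely spending most of its time freeing memory rather than validating.

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

// Minimal sketch (hypothetical helper, not part of the Validator): sample
// heap usage and cumulative GC time once per second. Heap pinned near max
// plus steadily climbing GC time suggests the slow "free memory after every
// record" state described above.
public class GcPressureMonitor {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        while (true) {
            MemoryUsage heap = mem.getHeapMemoryUsage();
            long gcMillis = 0;
            for (GarbageCollectorMXBean gc
                    : ManagementFactory.getGarbageCollectorMXBeans()) {
                gcMillis += gc.getCollectionTime();
            }
            System.out.printf("heap %d/%d MB, total GC time %d ms%n",
                    heap.getUsed() / (1024 * 1024),
                    heap.getMax() / (1024 * 1024),
                    gcMillis);
            Thread.sleep(1000);
        }
    }
}
```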
We’re working on finding a way to automatically address that second issue, though if you’re monitoring the process using the desktop client you can typically tell when it’s entered this state by watching how quickly the progress meter advances.
In any case, those parameters should work fine on a server of that size, since the Validator will use at most 1/12 of your available RAM. If the server has a lot of other processes running on it, however, they may consume enough memory that it’s no longer possible to allocate the default amount to the Validator.
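If you want to sanity-check what actually got allocated, and again assuming a JVM-based Validator (my assumption, and the class name is mine), a one-off check like this, run with the same memory settings, will print the heap ceiling the JVM was really granted:

```java
// Minimal sketch (hypothetical helper): print the heap ceiling this JVM
// received, so you can confirm the memory parameters took effect.
public class HeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available to this JVM: " + maxMb + " MB");
        // For example, 1/12 of a 48 GB server would be a 4 GB ceiling.
    }
}
```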
Regards,
Tim