It won't advance until you resolve the validation issue. You can test for the failure, revert the record if it fails, and then move on.
Set Error Capture [ On ]
Set Field [ Field ; Value ]
If [ Get ( LastError ) > 0 ]
    Revert Record/Request [ With dialog: Off ]
End If
Set Error Capture [ Off ]
Go to Record/Request/Page [ Next ; Exit after last: On ]
You could also add a test that checks whether the user is authorized, based on the current user's permission group; setting the script to run with Full Access might help here as well.
Taylor's rearrangement of the error capture steps should help as well. It's usually most reliable to set error capture the way you want it immediately before the step that may generate the error -- in this case, inside the loop.
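For what it's worth, here's a minimal sketch of that arrangement with the error capture inside the loop (the field name and value are placeholders -- substitute your own):

Go to Record/Request/Page [ First ]
Loop
    Set Error Capture [ On ]
    Set Field [ MyTable::MyField ; $cleanedValue ]
    If [ Get ( LastError ) > 0 ]
        Revert Record/Request [ With dialog: Off ]
    End If
    Set Error Capture [ Off ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop

That way any steps outside the loop still report errors normally, and only the Set Field's validation failure is suppressed.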
Sounds like Validation is the wrong option for the field. What problem are you trying to solve by using Validation when you don't actually need it on thousands of records?
Thanks all for your ideas. I'm using the script to "import" data, using Set Field rather than Import to allow for some calculated clean-up of rather dirty data. But the calculations cannot clean up the data in every case, and I want the dirty data brought in even if it cannot be cleaned, so users can manually clean up the data as necessary rather than having to figure out what's missing from the source file somehow. So e.g. reverting in case of bad data is not a good solution in this situation.
I'll share what I ended up doing in case someone else has this problem... I started my script by setting a local variable, $skip_validation, to true. Then I changed the validation calculations for the fields to $skip_validation or (original validation calc). That allowed me to effectively disable (or really, override) validation while the script was running; once the script finished, the local variable would stop existing and validation would be re-enabled. (Error capture was not the way to go.)
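To make that concrete, here's a rough sketch of the two pieces (table, field, and source names are illustrative, not from my actual file):

Script:
Set Variable [ $skip_validation ; Value: True ]
Go to Record/Request/Page [ First ]
Loop
    Set Field [ Contacts::Phone ; /* calculated clean-up of the source value */ ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
# $skip_validation goes out of scope when the script exits

Field validation (Options > Validation > Validated by calculation):
$skip_validation or ( /* original validation calc */ )

While the script runs, $skip_validation is true, so the Or short-circuits the original test and every record is accepted; once the script ends, the variable evaluates as empty/false and the original validation applies again for manual edits.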