So, when performing a search in a script, there are three possible outcomes for the matching found set: multiple records, one record, or no records. It's the last case I'm looking for guidance on. When the found set is a subset (FoundCount < TotalRecordCount), that's fine. A single record (FoundCount = 1) is easy enough to handle. But when no records match, I end up with one of two results: no records or all records. When performing functions against a found set, it makes no sense to forge ahead when there are no records, and it can be disastrous when all records are returned.
So, in cases where no records match and either all or none are returned, what's the best way to trap that? FoundCount = 0 works only if no records are returned, as does LastError = 401. Is one better than the other (apologies if the pseudocode is bothersome)? If all records are returned, do I resort to FoundCount < TotalRecordCount, or is there a feature I'm missing that would be a better option? Currently, I'm using a combination of these checks, but I have to wonder if there isn't a better option out there.
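For context, here is roughly what I'm doing now, paraphrased as FileMaker script steps (the $err variable name and the exit message are just for illustration):

```
Perform Find [ Restore ]
# Capture the error immediately — later steps overwrite Get ( LastError )
Set Variable [ $err ; Value: Get ( LastError ) ]
If [ $err = 401 or Get ( FoundCount ) = 0 ]
    # No records matched — bail out before any per-record processing
    Exit Script [ Result: "No records found" ]
End If
# Only reach this point with a non-empty found set
```

The key detail I've learned is to read Get ( LastError ) on the very next step after Perform Find, since almost any intervening step resets it.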
Thanks in advance!