When one of my users performs a find, they have the option to save the found set in order to export those records later. To save the found set I use a variable-based technique: a script loops through all the records, appends the id of each record to a variable, and after the loop finishes the collected record ids are written to a global field, separated by ¶ characters.
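To make the data format concrete outside FileMaker, here is a minimal Python sketch of how such a return-delimited saved set is built (the function name and the use of Python are illustrative, not part of the actual FileMaker script):

```python
# Build a saved set: a return-delimited (¶) string of record ids
# with leading and trailing delimiters, e.g. "¶15¶3¶1¶11¶".
def save_found_set(record_ids):
    """Mimics looping over the found set and collecting each id."""
    ids = ""
    for rid in record_ids:
        ids += "¶" + str(rid)   # ¶ separates the ids
    return ids + "¶"            # trailing ¶, as in the global field

saved = save_found_set([15, 3, 1, 11])
# saved == "¶15¶3¶1¶11¶"
```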
My aim is to let a user perform multiple finds, select the desired records, and add those records to a saved set of records. So this is not about extending a "normal" found set; it is about extending a saved set of records.
Appending one found set to a saved set is actually not a problem and quite fast; however, if I simply append it, I easily end up with duplicate record ids in my global field. For example, saving a set consisting of "¶15¶3¶1¶11¶" and then adding "¶3¶1¶28¶" results in "¶15¶3¶1¶11¶¶3¶1¶28¶". I've written a script that eliminates these duplicate record ids and the double returns, and it works fine on smaller sets, but as soon as I add 2000 ids to a set of 4000 the script becomes far too slow.
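The duplicate problem can be reproduced in a couple of lines of Python (again purely illustrative, using the same example ids):

```python
def add_set(saved, new_ids):
    """Naively append one ¶-delimited id list to another --
    duplicates and the double ¶ at the join survive."""
    return saved + new_ids

combined = add_set("¶15¶3¶1¶11¶", "¶3¶1¶28¶")
# combined == "¶15¶3¶1¶11¶¶3¶1¶28¶"
# ids 3 and 1 now occur twice, and a ¶¶ sits at the join.
```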
So, one question is: Is there a simpler way to add record ids to a list of record ids than the one I use?
(If not, I guess I have to try to make the way I add one set to another faster. The script step I use contains
"¶" & Substitute ( UniqueValues ( adresses::gXId & $Ids ) ; "¶¶" ; "¶" ) & "¶"
to remove all the duplicate ids and double returns. UniqueValues is a custom function, which might also be slowing the process down.)
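One likely reason recursive custom functions like UniqueValues get slow on large lists is that checking each value against all preceding values is quadratic in the number of ids. For comparison, a single linear pass with a lookup set handles the same merge in O(n); a hedged Python sketch of that approach (function name is mine, not a FileMaker feature):

```python
def merge_saved_sets(saved, new_ids, sep="¶"):
    """Merge two ¶-delimited id lists in one linear pass,
    dropping duplicates and the empty entries produced by ¶¶.
    Order of first occurrence is preserved."""
    seen = set()
    result = []
    for rid in (saved + new_ids).split(sep):
        if rid and rid not in seen:   # skip "" (from ¶¶) and repeats
            seen.add(rid)
            result.append(rid)
    return sep + sep.join(result) + sep

merged = merge_saved_sets("¶15¶3¶1¶11¶", "¶3¶1¶28¶")
# merged == "¶15¶3¶1¶11¶28¶"
```

In FileMaker itself, the analogous trick is to avoid re-scanning the whole list for every id, e.g. by only appending ids that a single FilterValues/PatternCount check does not already find in the saved set.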
Many thanks for your suggestions,
Editor, Documentation Department
FOCUS Magazin Verlag GmbH
Tel.: 089 / 9250 2605
Fax: 089 / 9250 2832