Assuming the tables have identical fields:
Export the single record as XML or an FMP file and immediately import it into the new table?
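A rough sketch of that idea as a script (the path and layout names here are placeholders, and this assumes the single record is the only one in the found set):

# on a layout of the source table, with only the one record found
Set Variable [ $path ; Get ( TemporaryPath ) & "record.fmp12" ]
Export Records [ With dialog: Off ; "$path" ]
Go to Layout [ "History" (History) ]
Import Records [ With dialog: Off ; "$path" ; Add ]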
Off the top of my head:
# on a layout/TO of the source table (DataTable)
Set Variable [ $ID ; DataTable::primaryKey ]
Find Matching Records [ DataTable::primaryKey ]
Go to Layout [ "History" (targetTO) ]
Import Records [ from: DataTable ]
Set Field [ History::id_dataTable ; $ID ]
Increment a field each time an update is made, then transfer the update to the external file. The external file then holds a copy of all your updates, or at least the fields in the record that you want to view. Keep the history records you want (e.g. the most recent, or the last 3) and delete the rest. You'll need to play around with this to make it work.
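A rough sketch of that flow (table, field, and layout names are placeholders; the History layout is assumed to be based on a table in the external file):

# run on commit or from a button, on a layout of the source table
Set Field [ Source::UpdateCount ; Source::UpdateCount + 1 ]
Commit Records/Requests [ With dialog: Off ]
Go to Layout [ "History" (History) ]
Import Records [ from: Source ; current found set ]
# prune: find this record's history, sort by UpdateCount descending, delete all but the newest few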
1. Create a calculation field called Field Values in your Records table which concatenates all your field values together using an obscure text separator. Example using 3 pipe symbols: Field1 & "|||" & Field2 & "|||" & Field3 ... etc.
2. Create a text field in your History table called Record Dump
3. Create a calculation field with a number result in your Records table called Modification ID = Get ( RecordID ) & "." & Get ( RecordModificationCount ), and create a Modification ID number field in your History table
4. Create a 1-to-1 relationship between your Records table and your History table on Modification ID = Modification ID, with the ability to create new records enabled
5. On your record editing layout, place a portal off the visible edge of the layout (way to the right) based on the 1-to-1 relationship you created in step 4.
6. When your trigger or button script runs to transfer the latest history, have it go to the portal on the layout, go to the last (new) portal row, and set Record Dump = Field Values.
7. The fields in your History table can be calculation fields that pull their corresponding values from the Record Dump.
Some methods are heavier on scripting, or calculations, or relationship schema. This method is a little heavier on the calculation side. Hope it helps.
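For example, step 6 could be scripted roughly like this (the portal's object name is a placeholder):

Go to Object [ Object Name: "historyPortal" ]
Go to Portal Row [ Select: On ; Last ]
Set Field [ History::Record Dump ; Records::Field Values ]
Commit Records/Requests [ With dialog: Off ]

And for step 7, a History calculation can recover, say, the second field by turning the separators into returns — assuming no field value itself contains "|||" or a carriage return:

GetValue ( Substitute ( Record Dump ; "|||" ; ¶ ) ; 2 )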
[…] Some methods are heavier on scripting, or calculations, or relationship schema. […]
…and some were written by Rube Goldberg …
Is the solution running on a server?
I think wimdecorte provided a method using Perform Script on Server.
Basically, it works like this:
The client holds the record it is editing in its cache, while the server still has the record as it was before editing started.
An OnRecordCommit script trigger fires a Perform Script on Server routine, passing any relevant IDs of the current record so it can be identified/found. The server-side script finds the record(s) and makes a duplicate. If all is fine, the record is committed with the new changes.
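A minimal sketch of that trigger pair (script, table, and field names here are assumptions):

# OnRecordCommit trigger, client side
Perform Script on Server [ "Archive Record" ; Parameter: Source::primaryKey ; Wait for completion: On ]
Exit Script [ Text Result: True ]   # allow the commit to go ahead

# "Archive Record", server side — this session still sees the pre-edit values
Go to Layout [ "Source" (Source) ]
Enter Find Mode [ Pause: Off ]
Set Field [ Source::primaryKey ; Get ( ScriptParameter ) ]
Perform Find [ ]
Duplicate Record/Request   # the duplicate preserves the old values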
Thank you all for your input,
I just tried Lance's solution, it works great, and erolst's solution, which works fine, too. And Lance, Thanks for reminding me of Get ( RecordModificationCount ). I had something like your suggestion in mind, but didn't think of this Get function.
I've come across another solution that works differently:
- In Source table create a text field called Uuid.
- In History table create a text field called Uuid.
- Create Relationship Source::Uuid = History::Uuid. Allow creation of records via this relationship (on the right side): yes.
- In the History table, all fields contain Looked-up values that get their values from the Source table through the relationship created above.
- Create and run a script:
step 1: Set Variable [ $UUID ; Get ( UUID ) ]
step 2: Set Field [ Source::Uuid ; $UUID ]
step 3: Commit Records/Requests
step 4: Set Field [ History::Uuid ; $UUID ]   (via the relationship; this creates the History record and fires the lookups)
step 5: Commit Records/Requests
step 6: Set Field [ Source::Uuid ; "" ]
step 7: Commit Records/Requests
This solution seems to have the advantage that when you add or delete a field you do not need to take care of any import scripts. Sure, it can happen that you forget to add a field in the History table, but you will not accidentally put a value into the wrong field.
Don't forget to also create a new record in history every time you create a new record in source...
Good point :-)
This method of Perform Script on Server works well. You could send the relevant field-name/value pairs as script parameters and identify the fields that have changed via Get ( ModifiedFields )... that's if you use a field-level audit trail approach rather than a record-level one (where all fields in the record are duplicated regardless of whether every field has changed).
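A sketch of collecting those pairs before the server call (the script name and delimiter are assumptions; Get ( ModifiedFields ) returns a return-delimited list of the fields changed in the current record since the last commit):

Set Variable [ $fields ; Get ( ModifiedFields ) ]
Set Variable [ $i ; 1 ]
Loop
   Exit Loop If [ $i > ValueCount ( $fields ) ]
   Set Variable [ $pairs ; $pairs & GetValue ( $fields ; $i ) & "=" & GetField ( GetValue ( $fields ; $i ) ) & ¶ ]
   Set Variable [ $i ; $i + 1 ]
End Loop
Perform Script on Server [ "Log Changes" ; Parameter: Get ( RecordID ) & ¶ & $pairs ]

This assumes the field values themselves contain no carriage returns; otherwise, encode them before concatenating.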