Audit Log

Discussion created by onefish on Sep 5, 2018
Latest reply on Dec 5, 2018 by jormond

I know there are various solutions available for tracking record changes, including "ultralog", one Wim posted a couple of years back, and a few plugins (I'm sure there are more). Not that there's anything wrong with any of these per se, but I wanted one that is abstracted, easily integrated into other solutions, and native (and I love a challenge!). So I set out to build a system myself.


The attached AuditLog file is what I've come up with. The "Database" file is an example of how it can be integrated into an existing, totally separate database file without having to create any new tables or fields. The two files are related, so they need to be in the same directory or you will have to change the paths. The Audit table and fields can also be created in the database file itself; there are only a few fields, so it's not much extra work.


Now, as for being "abstracted": unfortunately it is not 100% abstracted. You have to list the table::field name of the ID field for each table that appears on a layout being logged. This is NOT a list of all the fields to be monitored (the algorithms take care of that); it is only one field per table, the one containing the unique record identifier. This is necessary so that the records modified and recorded in log entries can be identified. This data lives in a single global field in the Audit table (click the popover button).
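As a rough illustration of what that global field boils down to (a minimal Python sketch rather than FileMaker, and the table and field names are made up, not taken from the attached files), it is essentially a lookup from table name to ID field name:

```python
# Hypothetical sketch: the global field is assumed to hold a
# return-delimited list of "Table::IDField" names. Parsing it yields a
# lookup from table name to the field holding the unique record ID.

def parse_id_fields(value_list: str) -> dict[str, str]:
    """Map each table name to its unique-ID field name."""
    mapping = {}
    for line in value_list.strip().splitlines():
        table, _, field = line.partition("::")
        mapping[table.strip()] = field.strip()
    return mapping

config = "Invoices::InvoiceID\nLineItems::LineItemID"
print(parse_id_fields(config))
# {'Invoices': 'InvoiceID', 'LineItems': 'LineItemID'}
```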


There is a "Read Me" comment script that contains more information, but in short, the Audit script needs to be imported into the file being monitored and run with the following script triggers:

  • OnRecordLoad (parameter = 1)
  • OnRecordCommit (no parameter)

You also have to create a relationship between the Audit table and the tables being monitored. This is so that log records can be created using a modified "magic key" technique without leaving the current layout.


I originally set out to do this without script loops, except for generating the individual log records. I managed it for setting several variables, including building a "Local_Table::Local_ID_field ~ value | Local_Table::Local_Field ~ value" list of the data fields on the layout (non-data fields are excluded) in a single calculation. But it gets really complex to do the same for related records, as there can be multiple related tables, records and fields on the layout. I know how to do it in a single calculation in theory, but didn't want to spend the time on it (debugging Evaluate() is a pain). So the script uses 3 cascading loops to build the list for related tables/records/fields. However, it only loops through variables, not records, so it is very fast. That said, speed could be improved further if necessary by building a single algorithm to do it.
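To show the idea behind those variable lists (again a hedged Python sketch with hypothetical table and field names, not the actual script), the load-time and commit-time snapshots are parsed into field ~ value pairs and diffed to find what changed:

```python
# Hypothetical sketch of the snapshot-and-diff idea: each snapshot is a
# "Table::Field ~ value | Table::Field ~ value" string, captured once on
# OnRecordLoad and once on OnRecordCommit. Values containing "~" or "|"
# would need escaping; that detail is glossed over here.

def parse_snapshot(snapshot: str) -> dict[str, str]:
    """Parse 'Table::Field ~ value | ...' into {field name: value}."""
    fields = {}
    for pair in snapshot.split("|"):
        name, _, value = pair.partition("~")
        fields[name.strip()] = value.strip()
    return fields

def diff_snapshots(loaded: str, committed: str) -> list[tuple[str, str, str]]:
    """Return (field, old value, new value) for every changed field."""
    before, after = parse_snapshot(loaded), parse_snapshot(committed)
    return [(f, v, after.get(f, "")) for f, v in before.items()
            if after.get(f, "") != v]

loaded = "Invoices::Status ~ Draft | Invoices::Total ~ 100"
committed = "Invoices::Status ~ Sent | Invoices::Total ~ 100"
print(diff_snapshots(loaded, committed))
# [('Invoices::Status', 'Draft', 'Sent')]
```

Each tuple produced this way is what ends up in one log record.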


Some things you need to know

  • I have not spent much time at all testing this. Please test fully before implementing in any production environment.
  • It uses the functions SortValues and UniqueValues, which are not available in FileMaker versions earlier than 16. They are only used in a couple of places, and there are ways to achieve the same effect without them, but you'll have to implement that yourself if you want to use this in earlier versions.
  • For the same reason as above, it is not suitable for runtime environments.
  • It does not use ExecuteSQL. That would have been too easy, and it isn't suitable when the client is querying its own currently open record.
  • It uses a global variable, $$NoAudit, as a switch: a value of 1 exits the script. You can set it at the beginning and clear it at the end of other scripts when you don't want the Audit script to run. The script itself sets this switch to terminate the iteration of itself triggered by an early Commit Records script step, then clears it at the end. If the script is terminated before completing (e.g. Esc out of an endless loop during testing), you will have to run the included "NoAudit" script to clear the variable before Audit will run again.
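That $$NoAudit switch is just a re-entrancy guard. Here's a minimal Python sketch of the same pattern (the names and the in-memory log are illustrative, not the actual script):

```python
# Hypothetical sketch of the $$NoAudit guard: a global flag that makes the
# audit routine exit immediately while a run is already in progress, so the
# commit performed inside it cannot trigger a second full run.

no_audit = False           # stands in for the $$NoAudit global variable
audit_log: list[str] = []  # stands in for the Audit table

def audit(record_id: str) -> None:
    global no_audit
    if no_audit:                  # $$NoAudit = 1: exit the script
        return
    no_audit = True               # suppress the run fired by our own commit
    audit_log.append(record_id)   # stand-in for creating the log record
    audit(record_id)              # re-entry from the early commit: a no-op
    no_audit = False              # cleared at the end; an aborted run leaves
                                  # it set, hence the separate reset script

def reset_no_audit() -> None:
    """Equivalent of the included "NoAudit" script."""
    global no_audit
    no_audit = False

audit("REC-42")
print(audit_log)   # ['REC-42']
```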


OK, if you've got to this point in the post, you are very patient! Have a play with it and let me know what you think.