At first glance, it looks like the syntax you have for the "Let" statement in field "id_Job" is incorrectly formatted.
I am assuming that you are trying to combine fields "id_Line" and "id" together. If so, then the correct format should be something like this:
Let ( [
//var1 = expression1
~trigger = id_Line & " - " & Detection_Line_JOB_active::id ] ;
~trigger
) //end Let
The syntax is correct. What it does is recalculate the field id_Job whenever id_Line is modified; that is why I call the variable ~trigger. The Let statement I'm using is a proven technique. The problem here is that, for whatever reason, even when the id_Line field has been modified, it does not trigger the calculation.
You are right, but the fact is that I am using id_Job as a key for establishing a relationship "on the left side" of the graph, so it cannot be unstored. The auto-enter calc (replacing the existing value) lets me convert a calculation that involves fields from other tables, which would otherwise be unstored, into a stored field that can be indexed and used in a relationship. This works because auto-enter calcs are always stored...
The Let statement is just a way to force recalc of the field when the id_Line changes.
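For reference, a minimal form of that trigger pattern looks like this (a sketch using the field names from the discussion above); the ~trigger variable exists only to create a dependency on id_Line so the auto-enter calc re-evaluates when it changes:

```
Let (
  ~trigger = id_Line ;   // dependency: forces recalc whenever id_Line changes
  id_Line & " - " & Detection_Line_JOB_active::id   // the value actually stored
)
```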
In that case, your problem may be further back in the dependencies of that auto-entered calc.
If the field "id_Line" is being populated by a script external to FM, executed by a PC connected to a sensor on a production line, then you may need some sort of "commit record" action in order to trigger your auto-entered calc.
If the PC running that external script cannot perform a commit, then perhaps the field "id_Line" could benefit from a script trigger that commits the record and thereby fires your auto-entered calc.
Commit, that's what I said from the very beginning of this discussion. What confuses me is that when I look at the list layout of records in the Detection table, I see all records with id_Line correctly populated, so I assume they are all committed. But could it be that those records are not yet committed, even though they show up in a list view?
It may be that the records are being created correctly, but not by FileMaker itself; they come from a "script (external to FM)".
Therefore, FileMaker could be treating them more like an "import" than like a "Set Field", and so your PCs may not be triggering a "record commit" in FileMaker.
So, while the records are there, they have not triggered your auto-entered calc. You may need to create a layout trigger that executes "OnRecordLoad" or something, to make sure you "commit" each record.
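As a sketch (the layout and script names here are placeholders, not from your file), that trigger script can be a single step attached to the layout via Layout Setup > Script Triggers > OnRecordLoad:

```
# Script: "Commit on Load"  (attached to the Detection list layout, OnRecordLoad)
Commit Records/Requests [ With dialog: Off ]
```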
Only after a commit can you be sure that your auto-entered calc will fire. During an import, for example, you can choose whether or not to perform auto-enter options. I think a similar situation is happening here with your ODBC function.
Thanks Julio, I understand your comparison with the import issue. The fact is that I do not use any layout based on the Detection table. This table works behind the scenes, and I am controlling record creation from another context. So, how can I implement a layout trigger in this scenario?
There are several ways to "skin this cat":
- The ideal way would be if you could somehow trigger a Perform Script on Server command that would commit the record each time your ODBC connection creates a new record, but that puts the responsibility back on these PCs "external to FM" to somehow send that PSoS call to FM Server. This would require some research. I will look around for external methods of triggering Perform Script on Server and let you know what I find.
- Another way is with a server-side script that executes every minute (or however frequently you choose) and loops through newly created records in the Detection table, and either commits those records or simply sets that "id_Job" field with the value it requires.
- A third way is to employ a "FileMaker Robot", which is a dedicated FM client with an active session of your database and which uses a list view layout of "Detection table" to execute an "OnRecordLoad" script. This may be necessary in the beginning just for troubleshooting purposes. For example, I am sure that if you do a replace field on the "Id_Line" field right now, your "Id_Job" field would be populated instantly.
If I can't find a PSoS solution for you, let me know which other way you decide to go and how it works out for you.
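For the second option, a minimal server-side script might look like this sketch in FileMaker script steps (the layout name and the empty-id_Job find are assumptions about your schema):

```
# Script: "Commit New Detections"  (run by a FileMaker Server schedule every minute)
Go to Layout [ "Detection" ]
Enter Find Mode [ Pause: Off ]
Set Field [ Detection::id_Job ; "=" ]   # "=" finds records where id_Job is still empty
Perform Find []
If [ Get ( FoundCount ) > 0 ]
  Loop
    # touching id_Line re-fires the auto-enter calc on id_Job
    Set Field [ Detection::id_Line ; Detection::id_Line ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
  End Loop
  Commit Records/Requests [ With dialog: Off ]
End If
```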
Thanks a lot for your comments; this is very interesting. The Detection table is used to gather records from all the lines. In other words, all the sensors write to the same table, so multiple records can be created every second. The dynamics of a single line are, at most, one detection per two seconds (speed < 30 units/s). The Detection table is used for several purposes. One of them is from the Job context: from that context I look (OnTimer, every second) at the Detection table to count records and use that information to display a variable showing the real speed of the line. So I need records committed every second...
You are totally right: if I do a Replace Field Contents on id_Line, the id_Job field populates instantly... But how to accomplish that under the hood is the challenge.
I am thinking of avoiding the id_Job field altogether. Right now, to get the records in Detection for a certain Job, I am using the relationship JOB_Detection. However, I am considering replacing that relationship with JOB_Line_Detection. In that case I only need id_Line in Detection, which is already there...
What is the relationship used by the calculation? Is it based only on id_Line?
A commit is not needed to evaluate a calculation; it evaluates based on its dependencies.
Anyway, it looks like the records are committed; otherwise you wouldn't see any data.
If it only has to get a value from another table, maybe try changing it to a lookup and see if that makes a difference.
It looks like the calculation engine is failing to do its job when data flows in.
Is FileMaker set up as an ODBC source for the other machines? Could it be an ODBC driver issue?
OK, the naming in the initial description is different from what the graph shows.
Could you explain, in terms of the graph picture, which field is actually looked up, and from which table?
Middle or far right?
How do you determine the active job?
I assume first a record is created in JOBS, then machines run and collect data for this particular JOB.
And lastly, are all those records committed?
Yes, the naming in the graph is different from the initial description. Let's change to the naming shown in the graph from now on.
The field Layout » DETECCION::id_Parte is looked up, taking the value of Layout » Deteccion_Linea_iniciado::id.
The active job is determined by matching a key in the middle table (Layout » Detection:_LINEA::key_iniciado) with Layout » Detection_Linea_PARTE_iniciado::estado (they must be equal).
Yes, first a PARTE is created; only one active Parte per Linea is allowed. Then the sensors on each line write to the Deteccion table, setting id_Line.