"Export Field Contents" is not a server-compatible script step... so I think something else is going on.
The reason for the open storage vs secure is that I need to be able to manipulate the files afterwards.
That is not a good idea. Only FileMaker should touch those files; no external manipulation should be done on them directly, not even from inside FM with any kind of script step.
I realize that Export Field Contents isn't a server-compatible step. Using the external storage was a trick demonstrated at DevCon as a way to get a file up onto the Server machine. That's the issue right now - not being able to write to the folder.
The reason for the open storage is to be able to import the contents of the file into FMS. The web service that the data is coming from only offers the file for download. No way to hook into their service to just import the data.
Currently, the import runs at the client level: the user executes a script that does the Insert From URL, exports the field contents to the local machine, and then imports the data from the exported file into the hosted file. Each CSV file is around 4 MB, with around 45,000 records per day. The Insert From URL, export, and import take too long. I want to push it to a server-based event, but there are too many non-server steps in the process right now.
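Roughly, the client-side flow described above looks like this as script steps (a sketch only; the URL, field, and variable names are my own placeholders, not from the actual solution):

```
# Client-side import: slow because every byte round-trips through the client
Insert from URL [ Select ; Data::CSV_Container ; "https://example.com/daily.csv" ]
Set Variable [ $path ; Value: Get ( TemporaryPath ) & "daily.csv" ]
Export Field Contents [ Data::CSV_Container ; "$path" ]    # not server-compatible
Import Records [ With dialog: Off ; "$path" ; Add ; CSV ]
```

The Export Field Contents step is what blocks moving this wholesale to a server schedule.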
If I can get the CSV file on to the Server machine I can cut down the time it takes to run.
Using the external storage was a trick demonstrated at DevCon as a way to get a file up onto the Server machine.
Can you expand on that? It goes against everything that I've learned and know about RC. Would it require the RC folder to be shared through the OS?
No. No sharing.
Greg Lane (I think it was Greg) showed this as a bit of a throwaway at the end of his session. He did a drag and drop into a container, it put the file up on the server and did an import of the data. It was pretty slick.
It sounds like the example at DevCon was exactly this. The trick is to configure the external storage location to be inside the FM Server 'documents' folder, which is accessible from an Import script performed on the server. In the article example, they have set the external storage to be directly within the 'documents' folder. I would likely modify this to have the target be one or more folders inside of 'documents'. I am 99% sure FM server scripts can target sub-folders within 'documents'.
I am inserting here the discussion that I also started yesterday, perhaps it will better serve everybody to include it in this thread. See below.
After writing this yesterday, I discovered something that did not entirely solve my problem, but it is definitely related to what was discussed at DevCon.
You must close the database in FMS and then open it as a local file. Then you can use Manage Containers, from the File > Manage menu, to modify the paths of existing container base locations or add others as you wish. I did this and it was partially helpful.
In Manage Containers, I added Get(DocumentsPath) but that did not work. Then I tried again, editing it to insert the absolute path for both a Windows or a Mac server, but that did not work either. When I insert something into the container field, I get the message you see in the screen shot below.
I checked the Folder permissions and they are identical to the permissions for the RC-Data_FMS folders.
Here is my original post. (I will close that discussion now)
I have a solution in which the users are placing a CSV text file into a container field, with the intention of a triggered script executing a subsequent Perform Script on Server script that will import the contents of the CSV file into one of the database's tables as new records.
Perform Script on Server does not allow Export Field Contents. Therefore, I am relying on the container field's external storage to place the CSV file into either the server's TemporaryPath or DocumentsPath. While debugging the script, I'm working with the latter (DocumentsPath) because it's visible to me.
The problem I'm having is how to define the external storage. It begins with [hosted location]/DatabaseFileName/ which would be OK if I could figure out how to navigate backward (upward) in the hierarchy to get to the Documents folder located at FileMaker Server/Data/Documents
Because this is a cross-platform solution that will run on both Mac and Windows servers, I suppose I could use two different paths, one that begins with filewin:/C:/Program Files/FileMaker/FileMaker Server and the other beginning with filemac:/Macintosh HD/Library/FileMaker Server, but I prefer to use relative paths that I can specify in my script with variables.
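For what it's worth, one way to pick between those two paths in a script rather than hard-coding one (a sketch, assuming the default FMS install locations):

```
# Build the server Documents path per platform
# Get ( SystemPlatform ) returns -2 on Windows, 1 on macOS
If [ Abs ( Get ( SystemPlatform ) ) = 2 ]
    Set Variable [ $docs ; Value: "filewin:/C:/Program Files/FileMaker/FileMaker Server/Data/Documents/" ]
Else
    Set Variable [ $docs ; Value: "filemac:/Macintosh HD/Library/FileMaker Server/Data/Documents/" ]
End If
```

And when the script is actually running on the server, Get ( DocumentsPath ) should return the FMS Documents folder directly, which would avoid the hard-coded paths altogether.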
Right now, I have the CSV file being placed into the default location that I am given with open storage. FileMaker Pro is creating the correct file in the location within the usual RC_Data_FMS directory; however, when the script runs, it doesn't import anything. The server log is reporting error 100, which means it isn't able to locate the CSV file, but I can't seem to control where it's looking.
What I would prefer to do is rely on the Server's ability to grab things from the FileMaker Server/Data/Documents directory, but I can't figure out how to compose the external storage to cause the CSV to be put in that location.
I'd be content with the Documents directory and forget about using the TemporaryPath if I could get it to work at all.
The thing about defining those paths in the Storage pane of the field definition is that the path somehow inherits the path that exists on the user's local computer instead of the Server's path to its own Documents folder. In other programming systems, you can use ../ to indicate a hierarchy above the current one, but I can't seem to make that work in FileMaker.
thanks in advance.
Looks to me like you are trying to manually move the contents of a container field through the operating system to another container field. Sorry if that's not what you mean.
Do you realize you can use Set Field to move the contents from one container to another? (Me slapping forehead; of course you do. But did you think of using it here?) I would not export and import the container contents.
I think you have already set the Destination field to use (secure?) external storage.
I suggest you use Set Field ["destination_field"; "source_field"].
FMS will copy the file without you having to specify anything on the hard drive. You can then clear the original container field to get rid of the original copy of the file.
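In script-step form, the suggestion amounts to something like this (table and field names are placeholders):

```
# FMS moves the externally stored file itself; no OS-level file handling
Set Field [ Files::destination_field ; Files::source_field ]
Set Field [ Files::source_field ; "" ]    # clear out the original copy
```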
I've done this so I could virus scan uploaded files, stored in a quarantined location. Then moved the 'clean files' to the final storage container where it can be downloaded.
I understand what you mean, but that isn't what I was trying to accomplish. I need to get the file OUT of the container field altogether, onto the hard drive of the server so that FM can import it as records.
Thanks to Vince Menanno, I have determined that the solution he presented at DevCon will do the trick nicely. At first, I was trying to do it without creating an alternate database location, but I now see that it is necessary because the FMS Admin Console allows you to specify the precise "Documents" location for your alternate base locations, which it doesn't allow for the standard base location.
Let me recommend Vince's excellent blog post that explains it all clearly:
We seem to be working on the same thing. Vince's posting has been a great help.
I've been able to configure everything to get the file up onto the FileMaker Server machine. The only part left is to get the file to import. I've been getting a script error on that part. The server log says the script completes, but when I look at the table there are no new records. I'm close, but not quite there yet.
Of course I can't see your server-side script, David, but here are some things to check:
Make sure that in your OnOpen script, if there is one, you detect the server as a user and exit that script before it goes somewhere the server user wouldn't go.
Check your server script with the "Server" compatibility option selected in your script editor to make sure there are no steps that are not "legal" for the server.
Double check the import options, particularly the file to be imported. I always define it as a variable rather than expect the absolute path to work.
I had a similar problem and found that one stray underscore character in the path was defeating it. Fixed that and now it's fine.
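For the first and third checks, a sketch of what I mean (the script structure, file name, and field names are placeholders):

```
# In the OnOpen script: exit early when the "user" is the server
If [ PatternCount ( Get ( ApplicationVersion ) ; "Server" ) > 0 ]
    Exit Script [ Text Result: "" ]
End If

# In the import script: define the source file as a variable, not a literal path
Set Variable [ $path ; Value: Get ( DocumentsPath ) & "daily.csv" ]
Import Records [ With dialog: Off ; "$path" ; Add ; CSV ]
```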
Detecting server-based - works fine
All legal steps
File path is a variable
What's happening for me now is that if I run the script as a server schedule, it fails. If I invoke it as a Perform Script on Server event, it works.
The server log will tell you the error number.
An update on my Server-based importing routine...
Finally got the server-based import to work properly. I was using the Beezwax demo file as a guideline. The only thing I had done differently was to use one less level of folders for the external container files. The placement of the file always worked properly, but I could never import the CSV file despite the path looking correct; I always got an error in the server logs saying the file could not be opened. This morning I added the extra layer back in and it worked.
The next step was to see what the performance difference was going to be with the system running as a Perform Script on Server event (PSoS) vs. the old system where the CSV file was inserted from the web service to the container field, exported to the Temporary folder, and then imported into the hosted file. There is also a Replace Field Contents step after the import to set a flag field.
The CSV file contains about 41,000 records and 5 fields.
The old method took about 260 seconds. I ran it several times and the results were pretty close to that each time.
The PSoS method brought the import down to 65 seconds. That's pretty significant.
Before the client got their FMS machine set up the database was run off of a single laptop - no Server involved. The import there usually ran around 30-45 seconds depending on the number of records being imported. The Replace Field Contents is slowest part of the process.
I've been doing a lot of imports and then resetting the system for the next run. Using PSoS to delete 41,000 records is a LOT faster than doing it locally.
I wish FMS logs were a little more detailed for debugging Server-based scripts.