mmmm, I guess I am not missing something obvious - or it is so obvious it's not worth pointing out!
What is it that the scripts do (how many temp files are created, how big are they), and are any plugins involved?
Perhaps you have hit on a genuine "bug". Does stopping the server make any difference?
Just curious ... what happens if you don't clear the temp folder? Does it lock up?
> My routines use the temporary folder extensively and I am having to go in and manually delete the multiple folders that get created in the temp folder.
The scripts manage stock levels across different Amazon marketplaces and eBay. The files created are text files for uploading to Amazon/eBay, and text files downloaded from eBay using plugins generated from ScriptMaster (4.134).
In 24 hours around 1340 scripts run, creating 205 folders in the temp folder with a total of 208 files and 28.2 MB. Files individually range from about 60 bytes to about 800 KB.
The number and size of files will vary a little but will be around this level.
Thanks for your interest.
Stopping and restarting the server service makes no difference; the computer has even been restarted and the temp folder was not cleared.
I suspect that when the number of folders in the temp folder got to around 10,000 the server performance suffered - but I cannot prove this, as there are many variables outwith my control that could impact performance. Performance certainly seemed to improve when the temp folder was cleared out.
FileMaker also says "the FileMaker Temporary files get stored on the startup volume even if the files are hosted from a different drive"
See if Windows allows you to redirect the Temp folder to a larger drive.
On a Mac, I have used AppleScript to clear the Temp folder. Perhaps a batch file could help.
You can post the problem here:
> stopping and restarting the server service makes no difference, the computer has even been restarted and the temp folder was not cleared.
Given the sheer number of files I would take explicit control over the cleanup instead of relying on FMS and the OS to do it. A little more work for you but then you own the whole process.
Why not use the path name to export an empty record, overwriting and thereby deleting the original?
Set Variable [ $ExportPath ; Value:
    Case (
        Get ( SystemPlatform ) = 1 ; "filemac:" ;
        Get ( SystemPlatform ) = -2 ; "filewin:"
    ) & Get ( TemporaryPath ) & <file name - truncated in the original post> ]
Export Field Contents [ DeleteTemp::picture ; "$ExportPath" ]
Show All Records
Show Omitted Only
I wonder if this is something that you have considered: you mention that you have to manually empty the temp folder - is this something that could not be scripted?
Have a look to confirm: Get ( TemporaryPath ) returns a very specific temp path location, on both Windows and Mac. Could you not create a step that runs a batch file (incrementally if preferred) to purge every folder beginning with S<n> within the <path>\Temp\ location?
I'm not 100% sure on the command, but I think it would be something like rmdir /s /q "FolderPath"
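That command removes one folder at a time; wrapped in a loop it could clear every session folder in a single pass. As a rough sketch - in Unix shell rather than a Windows batch file, with the S<n> folder naming and the temp location taken as assumptions from this thread - it might look like:

```shell
# Remove every folder whose name starts with "S" directly under the given
# temp path -- a loop equivalent of running rmdir /s /q on each one.
purge_session_folders() {
    local temp_root="$1"            # the folder Get ( TemporaryPath ) reports
    local dir
    for dir in "$temp_root"/S*/; do
        [ -d "$dir" ] || continue   # skip the literal pattern when nothing matches
        rm -rf "$dir"               # same effect as: rmdir /s /q "FolderPath"
    done
}
```

On Windows the same loop could presumably be done in a batch file with FOR /D, but check the exact syntax before pointing anything at a live server.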
I had expected (hoped) that by using the temporary folder this would be cleared automatically - otherwise I would use the Documents folder!
Disk space is not the problem - but I suspect the number of folders that get created slows things down when it gets into the 10,000s!
May have to look at that - but I would also have to filter to only delete the folders created up to the day before running - potentially a new temporary 'S' folder can be created every minute!
I know nothing about OS level commands on Windows, but if anyone can help......
My knowledge of Windows commands is not the best it could be, but there is a website that I hold a lot of faith in: http://ss64.com/nt/
Having a quick look through, the main area I would start from is FORFILES (http://ss64.com/nt/forfiles.html); there is an example that gives "delete file if over 5 days old". If you want a bit more control, ROBOCOPY has a /MOV switch (http://ss64.com/nt/robocopy.html) and also /MINLAD (Minimum Last Access Date). With ROBOCOPY you can also specify folders starting with S<n>.
Though these are not perfect, and none of them are specific enough for time, you could quite easily pass a variable date through to the command. With the ROBOCOPY method you would have to purge another folder - wherever you move the older items to - but you could be happy in the knowledge that these are old files and no longer needed.
Could this be run daily and still keep the Temporary folder managed? Or are too many files being created on a daily basis, needing management at a finer time level?
Will have a look at those pages; I need to find a base that will delete all folders, and the files enclosed within them, in the temp folder that were created more than x days ago.
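As a sketch of that "older than x days" filter - again in Unix shell rather than a Windows batch file, with the S<n> naming assumed from this thread - find can do the age test and the recursive delete in one pass:

```shell
# Delete S<n> folders (and everything inside them) under temp_root whose
# modification time is more than the given number of days old, leaving
# newer folders alone in case their scripts are still using them.
purge_old_session_folders() {
    local temp_root="$1"       # the folder Get ( TemporaryPath ) reports
    local max_age_days="$2"
    find "$temp_root" -maxdepth 1 -type d -name 'S*' \
        -mtime +"$max_age_days" -exec rm -rf {} +
}
```

The Windows equivalent would be the FORFILES /D approach from the ss64 page above, with the day count passed in as a variable.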
Could certainly be run on a daily basis - the files created are not required within 10 mins of being created!