That's a common no-context issue. It has to do with FMP not having a current window with the found set: once you switch to the other layout, you've lost the context needed to use that found set.
Two ways to handle this.
If you don't need to post-process the imported found set, then you don't need to be on the destination table's layout.
1. Don't switch layouts. The Import Records script step does NOT require you to be on a destination TO's layout to work. You are actually adding records to a table, not to the current layout's TO.
If you do need to post-process the imported found set:
2. Create a new window before switching layouts to do the import. This leaves an open window with the current found set that FMP should use as the context for the import. Note: if the import is from a different file, you may need to open a window in that file, on a layout showing that table's found set, in order to establish the correct context for the import.
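A minimal sketch of option 2, with the script steps written informally and the layout names as placeholders; it assumes the script starts on a Table A layout with the found set already established:

```
# On a Table A layout, with the found set you want to import
New Window [ Name: "Import Context" ]
Go to Layout [ "Table B Layout" (TableB) ]
Import Records [ With dialog: Off ; Source: current file ]
Close Window [ Name: "Import Context" ]
# The original window, still on Table A, holds the found-set context
```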
Another approach which is more complex in some ways, but assured of having the correct context every time, is to export the found set to a FileMaker file using the export records script step, opening the file on completion, and importing from that new file.
Caveat: you didn't indicate whether this is a user/client-run process or a server-side routine. Server-side scripts are less friendly to the above process, but any client-based routine should allow it if the user's account has permission to export from the file. If this is a matter of you personally cleaning up data, the export-then-import process avoids the window-context issue completely.
Sorry, got a phone call and hit save and that sent the unfinished message.
To repeat and finish the unfinished problem statement: one script, one window, one file, two tables.
Generate a found set in table A and import into table B.
Not all that difficult or complex, and it can be done in 4 script steps. No problems with windows, etc.
1. Go to a Table A layout
2. Do a find
3. Go to a Table B layout
4. Import that table from this file
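In script-step form, those four steps look roughly like this (layout names are placeholders; the import source is set to the current file, mapping Table A into Table B):

```
Go to Layout [ "Table A Layout" (TableA) ]
Perform Find [ Restore ]
Go to Layout [ "Table B Layout" (TableB) ]
Import Records [ With dialog: Off ; Source: current file ; Add new records ]
```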
Hope this explains it better. It works fine in a clean new file, but not in this old broken-down 8 GB monster.
One interesting gotcha is using a development copy of the original file: the import file name will be different.
I wanted to find out if anyone else is having import problems with FileMaker 12 after upgrading from 10 to 11 to 12. I inherited a 2009 database, and it shows up as damaged using FileMaker Pro Advanced.
The import routine, even after I've simplified and rewritten it as above, wants to import 50,000 records rather than the 6 that were found.
I think there is damage in one or both of the tables causing the error.
I am going to rewrite this as a manual intake of the data rather than use import.
Any commercial job should avoid import as much as possible when working with tables in the same file. And manual transfer means you can control and parse the data as it moves, which is far superior.
This must be an FM Pro/Advanced hosted (or local) file in order to import into a table in the same file. The import will fail when the file is put up on FileMaker Server and hosted.
This is usually where I would use FileMaker's record creation via a relationship, setting the fields through a related record. Since I have the relationship set up so there are no existing related records in the target table (and record creation is allowed through the relationship), FileMaker creates the new record for me. Set Field through all the fields I need to set, and then commit. That saves all the fields in the record in one transaction, which, as Todd Geist has shown, can be very useful when you need to roll back updates.
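A rough sketch of that pattern; the table and field names are placeholders, and it assumes a relationship from the source TO to a target TO with "Allow creation of records in this table via this relationship" enabled on the target side:

```
# Relationship: Source::gNewID = TargetNew::ID, record creation allowed on TargetNew
Set Variable [ $newID ; Value: Get ( UUID ) ]
Set Field [ Source::gNewID ; $newID ]
Set Field [ TargetNew::Name ; Source::Name ]      # first Set Field creates the related record
Set Field [ TargetNew::Amount ; Source::Amount ]
Commit Records/Requests [ With dialog: Off ]      # all fields saved in one transaction
```

Until the commit, a Revert Record/Request step rolls the whole thing back, which is what makes the pattern transactional.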
Often I will also combine this with a Virtual List. Compile the records I want moved into a Temporary Virtual List, and then generate the records from there. That way I ensure that there is only data for those records available to move.
Hmm, I have inherited an old file converted to 12 which imports records from one table into another table in the same file. One does need an account with the correct privileges to do so.
However, in one instance it randomly imports every record in that table rather than just the 12 in the found set. Perhaps this is being caused by what you suggest.
Do you have a pointer to a FAQ, etc., that documents this, so I can show it to a client as proof that it is not my fault (which I know it isn't)?
I resolved this by looping through the found set, putting the relevant data into variables, creating a new record, and setting the appropriate fields from the variables. Kind of old-fashioned, and much like how I used to work in FoxBase (yuck!).
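The loop looks roughly like this (field and layout names are placeholders; within one window, each table keeps its own found set and current record when you switch layouts, which is what lets the loop hop back and forth):

```
Go to Layout [ "Table A Layout" (TableA) ]
Go to Record/Request/Page [ First ]
Loop
    # Capture the relevant data from the current Table A record
    Set Variable [ $name ; Value: TableA::Name ]
    Set Variable [ $amount ; Value: TableA::Amount ]
    # Create the matching Table B record
    Go to Layout [ "Table B Layout" (TableB) ]
    New Record/Request
    Set Field [ TableB::Name ; $name ]
    Set Field [ TableB::Amount ; $amount ]
    Commit Records/Requests [ With dialog: Off ]
    # Back to the found set for the next record
    Go to Layout [ "Table A Layout" (TableA) ]
    Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
```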
Using export and import in a script step is not always reliable and in some cases might be considered illegal.
Answer: I decided that the import script step was unreliable due to corruption within the script. I could write a test script mimicking the steps and it would import OK. But I don't trust import that much, so, not being a lazy developer, I scripted the transfer of data from one table to another using a loop, variables, and New Record with Set Field. The old-fashioned way.
One benefit is that the actual values, fields, etc. being transferred are visible in the script. Using the import step can be a disaster, or at least troublesome. And in this case I found that the import setup assigned a user ID to the wrong field, which is probably why other areas of the database are experiencing problems.
As an aside, there are many beginner-level functions that experienced users should avoid, such as Relookup, Replace (all values in a field), and others. Then there are transactions, which few FileMaker scripters handle correctly.
So, I just tested importing from one table to another on FM Server, and it worked fine. With that, I'm not sure if that is something that works in 12 but not in previous versions, or if there was some specific aspect that didn't work on server before.
That aside, I did find this reference to importing a found set of records and the file reference itself. I don't know the particulars of your setup, but maybe this is one cause of it. http://help.filemaker.com/app/answers/detail/a_id/325/kw/records%20from%20another%20table
On the larger scale, moving away from the import probably isn't a bad thing, if for no other reason than the control and the ability to error-trap the actual problem, which we know Import Records doesn't provide much info about. There are so many variables that could make an import fail that I also find it unreliable. We have had scripts in the past that worked fine for years and all of a sudden started exhibiting problems and errors, even in files that I KNOW were not touched or altered, short of simple data entry.
That refers to FileMaker 7, but the details sound like what I am working with: an old file first created in 2009 and migrated through the various upgrades, and I am certain proper procedures were NOT followed.
On top of that, it is a live development database, and it has been moved to a server farm with a T1, then to another website host using Comcast (who was an FMP beginner and had all kinds of problems), then from there to another, and now in house.
Not bad for an 8 GB file... Oops...
I can see how live development might run into the issues in the FAQ, although the file pointers all say something like file:filename since it is on the server.
But it certainly is one good reason not to be lazy when transferring records from one table to another. But then, why are we transferring data?
Exporting a found set of records to a file in FileMaker format has never been unreliable in any instance I have seen, nor has importing from that exported file, unless you don't give the script or the users permission to do so. (Is that what you mean by it being illegal?)
I inherited a hosted file that began life on a T1 and was then moved to a server farm that used Comcast for the connection. I had it moved in house and use FileMaker Server 12. While on the T1, the users described it as stable but slow. When transferred to the server farm using Comcast cable, the file began to act up.
The existing script finds a set of records and then imports those records into another table in the file. Sometimes the script would import all records in the first table even though only a small selection had been found.
Per your reference, I traced it down to the import step, and sure enough it was acting just as the FAQ stated. I believe this was caused by moving the file from one host to another, as described in the FAQ.
So, we should assume that the Import Records script step can become unreliable when used between two tables internal to the same file.
The best method is a manual transfer of data using a loop.
The next heartbreak is that the scripter rewrote the scripts and the import routine half a dozen or more times. The import should have been a Perform Script call, needing only one debug or repair.
The file I am having trouble with is on FileMaker Server and was originally a 9 or 10 file that was upgraded through each version until 12.
The problem, it appears, is that the file was moved from a server using a T1 to a remote server run by a website host with no FileMaker experience, and then to a server in house.
The import from FileA:TableB into FileA:TableC fails: instead of importing the found set of TableB, it imports all records in TableB, much like the problem reported in the FAQ.
It is possible that inexperienced people did the updates and proper procedures were not followed. Users report that the file functioned admirably on the T1 (where much of the development occurred), but when it was moved to the website host using Comcast cable, many problems began to occur.
This AM I was looking at a script I had written a year ago to import Safari bookmarks and then import that table into another table after cleanup. I realised I had solved this issue back then, before I knew it was an issue. At the time I was only concerned with making sure the script worked if the file was moved or transferred to Go, etc. Hmm, that is the current issue.
So, here is a script, not bug-tested, but one that I think totally solves this issue and covers anyone's backside.
Set Variable [ $$_filepath ; Value: Get ( FilePath ) ]  <-- isolates the current location of the file
Import Records [ ... ]  <-- replace the manually entered file name with $$_filepath
Adding the first step to your startup script eliminates having to insert it in every script that does a table-to-table import inside the file.
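Put together, that looks roughly like this (a sketch in the same not-bug-tested spirit; layout names are placeholders):

```
# In the file's startup (OnFirstWindowOpen) script:
Set Variable [ $$_filepath ; Value: Get ( FilePath ) ]

# In any script importing table to table within the same file:
Go to Layout [ "Table B Layout" (TableB) ]
Import Records [ With dialog: Off ; Source: $$_filepath ]
```

Because the source is resolved at runtime from Get ( FilePath ), the import keeps pointing at the file itself no matter where it is hosted or what it has been renamed to.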