We actually don't want a live "connection" between the two programs. One issue is that the QB database is dog slow (it's an old version… we've had that discussion 27 times). For now, we want to keep the two separate. I'm just pulling historical data so that we have a starting point in the new solution.
I have the data I need. I just need to make sure primary and foreign key fields get assigned when I bring it into FMP.
Understood. However, you don't need a 'live' connection for all users. You can set up just one workstation to do the transfers, and they can be script-driven just like the Export/Import option.
Another issue with FMBooks Connector is that we are on QB 2004, which isn't listed as supported. See previous post about old, slow version of QB.
I think this may be something we look into later on (after we upgrade QB) for pulling items from FMP to create a QB invoice, but for now, this is not an option.
For now, I just need to work with the exported data that I have. I'm looking for any tips people have for adding the ID key fields and linking them. I'm wondering if I could temporarily change my relationships to link on the fields that I do have, like Company Name? Or make my foreign key field a lookup for the second import?
I'm sure it must be a somewhat common practice to take exports from unknown sources and try to link them together.
I think you are on the right track when you mention relationships to link on the fields you already have, though I probably would not want to "temporarily change" any existing relationships, for fear of breaking something. Instead, I would create several (clearly labeled, perhaps temporary) relationships based on text or other QB ID fields you may have. You may want a separate TOG (table occurrence group) just for this purpose. Use those relationships to look up and/or replace data into the key fields you really need.
A few caveats:
1. Go slowly.
2. Check the number of related records you may have prior to bringing in the key fields. For instance, find duplicates and/or Count ( Child::ParentName ) so that you can be more confident that any text-based keys aren't duplicated.
3. Check after bringing in new keys that the data looks correct. For instance: compare subtotals of child records for each company; create calcs to check for problems such as child records without a parent; spot-check records for critical companies, or those that showed as dupes earlier; etc.
4. Backup frequently while you are working.
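For what it's worth, the check-then-link steps above can be sketched outside of FMP too. Here's a rough Python sketch (the record layouts and field names are invented for illustration, not taken from your exports): verify the text key is unique, then use it to fill the real foreign key, then flag orphans.

```python
from collections import Counter

# Hypothetical exports: parents already have an assigned ID plus the text key
# (company name); children only have the text key.
parents = [
    {"CompanyID": "1", "CompanyName": "ABC Company"},
    {"CompanyID": "2", "CompanyName": "Widget Works"},
]
children = [
    {"InvoiceNo": "1001", "CompanyName": "ABC Company"},
    {"InvoiceNo": "1002", "CompanyName": "Widget Works"},
]

# Caveat 2: make sure the text-based key is unique before trusting it.
counts = Counter(p["CompanyName"] for p in parents)
dupes = [name for name, n in counts.items() if n > 1]
if dupes:
    raise ValueError(f"Duplicate text keys, resolve by hand first: {dupes}")

# Use the verified text key to look up and fill the real foreign key.
by_name = {p["CompanyName"]: p["CompanyID"] for p in parents}
for child in children:
    child["CompanyID"] = by_name.get(child["CompanyName"])  # None = no parent

# Caveat 3: flag child records that found no parent.
orphans = [c for c in children if c["CompanyID"] is None]
```

Same idea as the temporary-relationship lookup, just written out so the checks are explicit.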
Thanks for the tips. Right now, I have no data in my FMP solution, so there is nothing to mess up. I've just been playing with some sample data I entered, but I'm eager to get "real" data in… you know, to see how quick I can break things!
Data consistency is a mess. I'm not sure if that is the fault of QB or the human factor (probably both).
For instance, there is a "CUSTOMER" field and a "COMPANY" field. The CUSTOMER field is always populated, but the COMPANY field quite often has better formatting (most entries in CUSTOMER are all caps, for instance). And no, I don't just want to convert it all to title case; sometimes we want what is in caps to stay that way, like the "ABC" in "ABC Company" or something like that.
Just working with .csv or .txt files and examining them in Excel for now.
What I want to do is take the COMPANY field unless it's empty, in which case fall back to the CUSTOMER field, for my CompanyName field. Similar story with CONTACT (it usually has a first and last name), but there are also sparsely populated FIRST and LAST name fields. I'll use those when I can; otherwise I'll parse the CONTACT field.
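That fall-back logic is simple enough to write down. A rough Python sketch (field names match the exports I described; the sample row is made up), in case it helps to see it spelled out:

```python
def pick_company(row):
    # Prefer COMPANY (better formatting); fall back to CUSTOMER if empty.
    return row.get("COMPANY", "").strip() or row.get("CUSTOMER", "").strip()

def pick_name(row):
    # Prefer FIRST/LAST when both are populated; otherwise split CONTACT
    # on the first space as "First Last".
    first = row.get("FIRST", "").strip()
    last = row.get("LAST", "").strip()
    if first and last:
        return first, last
    parts = row.get("CONTACT", "").strip().split(None, 1)
    if len(parts) == 2:
        return parts[0], parts[1]
    return (parts[0], "") if parts else ("", "")

# A made-up sample row for illustration:
row = {"CUSTOMER": "ABC COMPANY", "COMPANY": "ABC Company", "CONTACT": "Jane Doe"}
```

In FMP terms this is just `If ( IsEmpty ( COMPANY ) ; CUSTOMER ; COMPANY )` as an auto-enter or replace calc, but the sketch is handy for testing against the raw .csv first.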
Would you recommend creating a temp FMP file to do this sort of data clean-up? A looping script, or auto-enter calcs and then import the data? Six of one, half a dozen of the other, for a one-time data clean-up & dump?
Ahhh, "nothing to mess up" is a GREAT stage to be at!
If looking at your data in Excel is helpful but you really want to get rolling, why not use table view in FMP? In some ways, it's even more flexible than Excel: you can show related fields, add special fields to the header (if it's showing) for globals or just to highlight key data as you move from record to record, set up some simple scripts and buttons for finds and sorts, etc.
I understand how messy data can be. One thing I do is add temp fields and use them for Replace Field Contents calcs. That way, I can test the results of what I'm doing, then push those to "real" fields when I'm satisfied with the results. For instance, you might want to find all the contacts that have first and last names, and do a replace on a "tempname" field, then use a different calculated replace for a found set of the contacts that only have a first name. When you've resolved all your different conditions and the data looks right, it's time to push the data and move on to the other pieces of the puzzle.
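The temp-field idea above translates to pretty much any tool: stage results in a scratch field one found set at a time, eyeball them, then commit to the real field. A rough Python sketch with invented records, just to show the staging pattern:

```python
# Each pass mimics one "found set + Replace Field Contents" in FMP.
records = [
    {"FIRST": "Jane", "LAST": "Doe", "CONTACT": "", "tempname": "", "Name": ""},
    {"FIRST": "", "LAST": "", "CONTACT": "Bob Smith", "tempname": "", "Name": ""},
]

# Pass 1: records that have both FIRST and LAST.
for r in records:
    if r["FIRST"] and r["LAST"]:
        r["tempname"] = f'{r["FIRST"]} {r["LAST"]}'

# Pass 2: a different calc for the remaining records that only have CONTACT.
for r in records:
    if not r["tempname"] and r["CONTACT"]:
        r["tempname"] = r["CONTACT"]

# After reviewing tempname looks right, push it into the real field.
for r in records:
    r["Name"] = r["tempname"]
```

The point is that nothing touches the "real" field until every condition has been handled and checked in the scratch field.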
Sorry, I don't think I can give a clear-cut list of procedures for something like this. I usually find scrubbing and linking data a pretty tedious project. The prospect of doing it a small number of times, in some ways, makes it more cumbersome. If you expected to do it frequently, it would make sense to try to automate it. But otherwise, it seems you just get the pattern down when you're all done anyway.
Some simple scripts will probably be useful, but if I'm not automating the cleanup for future use, I end up just using the Replace Field Contents step a lot. One problem there is that, done outside of a script, the calc is volatile (it isn't saved anywhere). So sometimes, right before performing the step, I copy the calc and paste it on a layout or someplace where I can reuse it or examine it for issues. Do you have FMP Advanced? If so, don't overlook the Data Viewer and custom functions.
HTH and best of luck!
Yes, of course… Table view. As for Excel… old habits die hard.
I do a lot of mailing/shipping list work, so I'm used to popping .csv files open in Excel when they don't need any real manipulation that requires FMP. That way, I don't have to save out another file right away, and I don't end up with a bunch of essentially temporary files in my recent files list.
Sounds like we work in a similar way when it comes to data clean-up that is only happening once. Obviously, if this were going to be a common practice, I'd create a more automated workflow.
Thanks for your time!