1 Reply Latest reply on Aug 16, 2009 3:04 AM by disabled_menno

    Scraping Data from the internet





      What I am wondering is: is it possible to have FileMaker Pro 10 automatically log into a site (with a username and password) and then extract certain data that I want?


      I am trying to have my database go through a number of records automatically. Each person is a record, and for each person I have permission to use their login name and password. The database would log into their account at one particular web site, get certain activity, and populate my database fields with that information in the person's record. Then, if certain conditions are met, it would automatically send out one of a number of emails (depending on which conditions were met), log the email transaction in the person's record, and move on to the next record, repeating until every name in the database has gone through this updating procedure.
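      For what it's worth, the per-record loop described above could also be sketched outside FileMaker, for example in a small Python script that FileMaker calls. Everything below is a hypothetical sketch: the login URL, the form field names, and the condition rules are invented placeholders that would have to match the real web site and database.

```python
# Hypothetical sketch of the per-record workflow described above.
# LOGIN_URL, the form field names, and the condition rules are
# placeholders -- they must be adapted to the actual site.
import urllib.parse
import urllib.request

LOGIN_URL = "https://example.com/login"  # placeholder URL


def fetch_activity(username, password):
    """Log in with a POST request and return the account page HTML."""
    data = urllib.parse.urlencode(
        {"username": username, "password": password}).encode()
    with urllib.request.urlopen(LOGIN_URL, data=data) as resp:
        return resp.read().decode()


def choose_template(activity_html):
    """Pick an email template based on (made-up) conditions."""
    if "overdue" in activity_html:
        return "reminder_email"
    if "new activity" in activity_html:
        return "update_email"
    return None  # no email needed for this record


def process_records(records, fetch=fetch_activity):
    """Loop over all records; return a log of (name, template) decisions."""
    log = []
    for rec in records:
        html = fetch(rec["username"], rec["password"])
        template = choose_template(html)
        if template:
            # here you would send the email and write the
            # transaction back into the person's record
            log.append((rec["name"], template))
    return log
```

      The `fetch` argument makes the loop testable without touching the network; a real version would also need session-cookie handling (`http.cookiejar`) if the site keeps you logged in across pages.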


      Just curious if this is possible... it's the logging-in part I am confused about. Does anyone know if this can be done, and how to do it?


      Thanks in advance- 

        • 1. Re: Scraping Data from the internet

          If the web viewer is not an option because the task must run unattended, you could use plugins such as Troi URL and Troi Text, or 360Works ScriptMaster. With these plugins installed, FileMaker can visit websites and/or FTP servers to download or upload data.

          Another possible solution is to use cURL, which is included in Mac OS X 10.5.x and must be installed separately on Windows. You use the command line of your OS to send a command to cURL, which then uploads data to, or downloads data from, a website.
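          As a rough illustration of that command-line approach (the URL, form field names, and cookie file below are invented placeholders), a script could assemble the cURL commands for login and fetch like this:

```python
# Sketch of driving cURL from a script; the site URL and the
# "username"/"password" form fields are placeholders.
import subprocess


def build_curl_login_cmd(url, username, password, cookie_jar="cookies.txt"):
    """Assemble a cURL command that POSTs credentials and saves the
    session cookie, so a follow-up request can reuse the login."""
    return [
        "curl", "-s",
        "-c", cookie_jar,              # write session cookies here
        "-d", f"username={username}",  # placeholder form field names
        "-d", f"password={password}",
        url,
    ]


def build_curl_fetch_cmd(url, cookie_jar="cookies.txt"):
    """Assemble a cURL command that reuses the saved cookies."""
    return ["curl", "-s", "-b", cookie_jar, url]


def run(cmd):
    """Run a command and return its stdout as text."""
    return subprocess.run(cmd, capture_output=True, text=True).stdout
```

          From FileMaker such a script could be triggered with the Perform AppleScript script step (`do shell script ...`) on the Mac, or the Send Event script step on Windows.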

          You have a bit of reading to do before you get this running, though. I have not found any ready-to-use solutions, only a lot of tips. Trial versions of the plugins can be downloaded from their websites: Troi: http://www.troi.com/, 360Works: http://www.360works.com/products/; for cURL, see http://curl.haxx.se/


          regards, Menno