If you want all field values written out, clear the “Group by” option.
Thank you, erolst!
I am using Summary Fields to calculate total number of "species collected" and "species tested" by Year, State, County, Week Number, and Arbovirus. The Summary field, "Collected_Summary", adds up the number of Collected. Similarly, the Summary field, "Tested_Summary", adds up the number of Tested.
If I remove the "Group By" option, I do not have the option of listing the two Summary fields. So, removing the "Group By" option and just using the non-Summary fields, Collected and Tested, the following result is obtained (See Screenshot below). In the highlighted areas, you'll notice that, for example, species 1239 is reported on two separate lines. What I need to see is species 1239 reported on just one line, showing that 100 individuals were collected in that year, state, county, week, and arbovirus.
Do you have any ideas on how to accomplish that?
These aren't technically "blank" so much as that the spreadsheet doesn't automatically merge the rows for any given value in the column.
1. You can merge the rows manually in Excel (after export and import), or create the summary totals afterward.
2. You can generate SpreadsheetML (XML), but creating merged rows (or columns) that way is pretty complex. Creating summaries is less so, but still complex.
3. You might consider a virtual list, though I don't have details on how that would help here.
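For option 1, the post-export summarizing step can be sketched outside FileMaker. Here is a minimal Python sketch, assuming hypothetical column names (`Year`, `State`, `County`, `Week`, `Arbovirus`, `Species`, `Collected`, `Tested`) and invented sample values; the real export columns may differ:

```python
from collections import defaultdict

# Hypothetical column names; adjust to match the actual export.
GROUP_KEYS = ("Year", "State", "County", "Week", "Arbovirus", "Species")

def summarize(rows):
    """Collapse rows that share the same group key, summing the
    Collected and Tested counts, so each species appears on one line."""
    totals = defaultdict(lambda: [0, 0])
    for row in rows:
        key = tuple(row[k] for k in GROUP_KEYS)
        totals[key][0] += row["Collected"]
        totals[key][1] += row["Tested"]
    return [
        dict(zip(GROUP_KEYS, key), Collected=c, Tested=t)
        for key, (c, t) in totals.items()
    ]

# Two exported lines for species 1239 (sample values invented):
rows = [
    {"Year": 2014, "State": "NJ", "County": "Bergen", "Week": 23,
     "Arbovirus": "WNV", "Species": 1239, "Collected": 60, "Tested": 1},
    {"Year": 2014, "State": "NJ", "County": "Bergen", "Week": 23,
     "Arbovirus": "WNV", "Species": 1239, "Collected": 40, "Tested": 1},
]
result = summarize(rows)  # one record, Collected == 100
```

The same logic would apply to rows read from the exported file with `csv.DictReader` (with the counts converted to numbers first).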
In the highlighted areas, you'll notice that, for example, species 1239 is reported on two separate lines. What I need to see is species 1239 reported on just one line, showing that 100 individuals were collected in that year, state, county, week, and arbovirus.
Correct me if I'm wrong, but doesn't line 51 have that same combination of attributes?
Anyway, adding to Beverly's suggestions: condense each group and its aggregates into one record and export these.
Here's how that works:
• create two new number fields to hold the desired summary values
• create a summary field named, say, sCountAll, defined as the count of a field that is never empty (preferably the primary key)
• create a calculation field, result type text, named, say, cSorter, as
Substitute ( List ( year ; state ; county ; week ; arbovirus ) ; ¶ ; "|" )
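In plain terms, cSorter packs the five group fields into one pipe-delimited string: FileMaker's List() joins the values with ¶, which Substitute() then swaps for "|". A rough Python equivalent, with sample values invented:

```python
def c_sorter(year, state, county, week, arbovirus):
    """Pipe-delimited group key, like Substitute ( List ( ... ) ; ¶ ; "|" )."""
    return "|".join(str(v) for v in (year, state, county, week, arbovirus))

key = c_sorter(2014, "NJ", "Bergen", 23, "WNV")  # "2014|NJ|Bergen|23|WNV"
```

One caveat: FileMaker's List() omits empty values, which this sketch does not replicate.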
Write a script:
# [ find the records to summarize and export ]
Sort Records [ YourTable::cSorter ]
Go to Record [ first ]
Loop
  Set Variable [ $groupSize ; GetSummary ( YourTable::sCountAll ; YourTable::cSorter ) ]
  Set Variable [ $isLastGroup ; Get ( RecordNumber ) + $groupSize - 1 = Get ( FoundCount ) ]
  Set Field [ YourTable::countCollectedForExport ; GetSummary ( YourTable::Collected_Summary ; YourTable::cSorter ) ]
  Set Field [ YourTable::countTestedForExport ; GetSummary ( YourTable::Tested_Summary ; YourTable::cSorter ) ]
  Go to Record [ next ]
  Omit Multiple Records [ $groupSize - 1 ]
  Exit Loop if [ $isLastGroup ]
End Loop
Export Records [ including the fields countCollectedForExport and countTestedForExport ]
This exports a found set that has one record per group, where a group is defined by the combination of year, state, county, week, and arbovirus.
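The script's effect (stamp the first record of each group with the group totals, omit the rest) can be mimicked outside FileMaker for testing. A hedged Python sketch, with all field names invented for illustration:

```python
from itertools import groupby

def one_record_per_group(records):
    """Sort by a cSorter-style key, then keep only the first record of
    each group, stamped with the group totals (like Set Field + Omit)."""
    def key(r):
        return "|".join(
            str(r[f]) for f in ("year", "state", "county", "week", "arbovirus"))
    result = []
    for _, members in groupby(sorted(records, key=key), key=key):
        members = list(members)
        first = dict(members[0])  # copy; Set Field writes into the record
        first["countCollectedForExport"] = sum(m["collected"] for m in members)
        first["countTestedForExport"] = sum(m["tested"] for m in members)
        result.append(first)
    return result

# Two records in the same group (sample values invented):
records = [
    {"year": 2014, "state": "NJ", "county": "Bergen", "week": 23,
     "arbovirus": "WNV", "collected": 60, "tested": 1},
    {"year": 2014, "state": "NJ", "county": "Bergen", "week": 23,
     "arbovirus": "WNV", "collected": 40, "tested": 1},
]
export = one_record_per_group(records)  # one record, totals 100 and 2
```

This is only a model of the logic, not a substitute for the script; in FileMaker the totals come from GetSummary over the sorted found set.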
Thank you. I'll give your script a try, and let you know how it went.
This may be an easier way: not a virtual list, but something similar.
Create unstored calculation fields for each field that would otherwise export blank, and use those for the export instead:
• one per group field, except the innermost group (you need 5 in this case)
• one per "summarize by" field (you need 1 in this case), e.g. GetSummary ( Collected_Summary ; arbovirus )
<ugh> NOT in favor of creating any more calculation fields than absolutely necessary. Create additional fields and Set Field in a script.
I like the idea, since there is no need to change the records just for exporting (and it's easy to test the result, since it works without any script).
Also, in this case the exported data is a summary (imagine a layout that doesn't have a body part), so there is no single record to set fields on; you would need to set the whole found set.
If you don't want to create many calculation fields in the data table, a real virtual list or a temporary table for export may be a good option.
Thank you, erolst. Your script worked perfectly. This new method of summarizing the data will save me a lot of time. I really really appreciate it.
Thank you, Beverly
Thank you, @user19752