...
- get the updated Word document for the country from the SPM team
- in OpenOffice Writer, open the document, choose "Save As...", select the "Text Encoded" file type with "Western Europe (ASCII/US)" encoding, and save the file as $PPI_BASE/data/<CountryName>.txt, replacing the existing file
- update the line for this country in nicknames.csv, increasing the questionsVersion by 1
- run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files
- include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all the scoring SQL):
cat generated/scoringEtl/*PPIScore.sql > ../JohnWoodlockWorkInProgress/MifosDataWarehouseETL/load_dw_ppi_survey.sql
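The questionsVersion bump in nicknames.csv can be scripted rather than edited by hand. This is a sketch only: it assumes nicknames.csv has a header row containing a column literally named questionsVersion and that the first field holds the country name — check the real file layout before relying on it.

```shell
# bump_questions_version COUNTRY [FILE]
# Prints FILE (default nicknames.csv) with the questionsVersion field
# incremented by 1 on the row whose first field matches COUNTRY.
# Assumes a header row naming a "questionsVersion" column; if the real
# nicknames.csv has no header, hard-code the column index instead.
bump_questions_version() {
    awk -F, -v OFS=, -v country="$1" '
    NR == 1 {
        for (i = 1; i <= NF; i++)        # locate the column by header name
            if ($i == "questionsVersion") col = i
        print; next
    }
    $1 == country { $col = $col + 1 }    # bump only the matching country row
    { print }
    ' "${2:-nicknames.csv}"
}
```

Usage would be along the lines of `bump_questions_version India nicknames.csv > nicknames.csv.new && mv nicknames.csv.new nicknames.csv`.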
When a new major version of the PPI is released:
...
- in bi/ppiparser run parseall.sh (this will regenerate the generated/testData/*.properties files)
- initialize a clean Mifos database with base ETL test data
echo "drop database mifos_ppi_test" | mysql -u root
echo "create database mifos_ppi_test" | mysql -u root
mysql -u root mifos_ppi_test < mifos_testetl_db.sql
- run PPITestDataGenerator from inside Eclipse with an arg pointing to the test data dir (e.g. -a /home/van/reportingWorkspace/bi/ppiparser/generated/testData)
- save the resulting Mifos database (with completed ppi surveys)
mysqldump -u root mifos_ppi_test > load_testppi_db.sql
- run the ETL (bi/ppi_build.sh) to populate the DW with PPI survey scores
ppi_build.sh mifos_ppi_test mifos_ppi_test_dw ~/pentaho/data-integration/ '-u root'
- now PPITest.groovy can be run
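The steps above can be collected into a single sketch script. Set DRY_RUN=1 to print each step instead of executing it; the relative paths, the database names, and especially the command-line PPITestDataGenerator invocation (the document runs it from Eclipse) are assumptions that will need adjusting locally.

```shell
# Dry-runnable sketch of the major-version test-data rebuild.
run() {
    if [ -n "$DRY_RUN" ]; then
        echo "+ $*"          # dry run: show the command instead of running it
    else
        "$@"
    fi
}

rebuild_ppi_test_data() {
    run sh parseall.sh       # regenerate generated/testData/*.properties
    run sh -c 'echo "drop database mifos_ppi_test" | mysql -u root'
    run sh -c 'echo "create database mifos_ppi_test" | mysql -u root'
    run sh -c 'mysql -u root mifos_ppi_test < mifos_testetl_db.sql'
    # The doc runs PPITestDataGenerator from Eclipse; a direct java
    # invocation like this one is an assumption:
    run java PPITestDataGenerator -a generated/testData
    run sh -c 'mysqldump -u root mifos_ppi_test > load_testppi_db.sql'
    run sh ppi_build.sh mifos_ppi_test mifos_ppi_test_dw ~/pentaho/data-integration/ '-u root'
}
```

Running `DRY_RUN=1 rebuild_ppi_test_data` prints the seven commands with a `+ ` prefix so the sequence can be reviewed before a real run.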