...
- get the updated Word document for the country from the SPM team
- open the document in OpenOffice Writer, do a "Save As..." choosing the "Text Encoded" format and the "Western Europe (ASCII/US)" character set, and save it as $PPI_BASE/data/<CountryName>.txt, replacing the existing file
- update the line for this country in nicknames.csv, increasing the pointsVersion by 1 (an illustrative example follows this list)
- run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files
- include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all scoring sql)
cat generated/scoringEtl/*PPIScore.sql > ../JohnWoodlockWorkInProgressETL/MifosDataWarehouseETL/load_dw_ppi_survey.sql
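A purely hypothetical illustration of the nicknames.csv edit (the real column names and order may differ, so check the file's header row before editing): bumping pointsVersion for a scoring change, or questionsVersion for a question change, is a one-field change on the country's row, e.g.
countryName,nickname,questionsVersion,pointsVersion   (assumed header)
India,india,2,3   (before)
India,india,2,4   (after the pointsVersion bump)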
...
- Get the updated "Lookup Tables v15.xls" document
- Open "Lookup Tables v15.xls" in OpenOffice Calc and navigage to the "<country_name> Mifos" tab for the country who's data has changed.
- Do a "Save As..." in CSV format to the $PPI_BASE/data/percents/<CountryName>.csv file
- run $PPI_BASE/parseall_percents.sh to regenerate the poverty line files (generated/povertyLines/*PovertyLines.sql)
- put all the poverty line sql into a single file that is loaded by the data warehouse
mv ../JohnWoodlockWorkInProgressETL/MifosDataWarehouseETL/load_ppi_poverty_lines.sql /tmp
cat generated/povertyLines/*.sql > ../JohnWoodlockWorkInProgressETL/MifosDataWarehouseETL/load_ppi_poverty_lines.sql
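As an optional sanity check (a sketch, not part of the original procedure), confirm that each country produced a poverty line file and that the combined file is non-empty:
ls generated/povertyLines/*PovertyLines.sql
wc -l ../JohnWoodlockWorkInProgressETL/MifosDataWarehouseETL/load_ppi_poverty_lines.sql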
...
- get the updated Word document for the country from the SPM team
- open the document in OpenOffice Writer, do a "Save As..." choosing the "Text Encoded" format and the "Western Europe (ASCII/US)" character set, and save it as $PPI_BASE/data/<CountryName>.txt, replacing the existing file
- update the line for this country in nicknames.csv, increasing the questionsVersion by 1 (see the illustrative nicknames.csv example above)
- run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files
- include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all scoring sql)
cat generated/scoringEtl/*PPIScore.sql > ../JohnWoodlockWorkInProgressETL/MifosDataWarehouseETL/load_dw_ppi_survey.sql
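Putting the non-GUI steps of this procedure together, as a rough sketch (assuming the commands are run from the ppiparser directory, i.e. the directory containing parseall.sh, data/ and generated/, and that the running Mifos instance picks up question groups from ~/.mifos/uploads/questionGroups as in the test procedure below):
./parseall.sh
cat generated/scoringEtl/*PPIScore.sql > ../JohnWoodlockWorkInProgressETL/MifosDataWarehouseETL/load_dw_ppi_survey.sql
cp generated/questionGroups/* ~/.mifos/uploads/questionGroups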
...
Procedure for updating and running PPI tests
- if any PPI data has changed, then in bi/ppiparser run parseall.sh (this will regenerate the generated/testData/*.properties files)
- initialize a clean Mifos database with base etl test data
echo "drop database mifos_ppi_test" | mysql -u root
echo "create database mifos_ppi_test" | mysql -u root
mysql -u root mifos_ppi_test < bi/JohnWoodlockWorkInProgressETL/MifosDataWarehouseETLTest/mifos_testetl_db.sql
- copy all question group XML files to MIFOS_CONF/uploads/questionGroups
cp $PPI_BASE/ppiparser/generated/questionGroups/* ~/.mifos/uploads/questionGroups
- run PPITestDataGenerator from inside Eclipse with an arg pointing to the test data dir (e.g. -a /home/van/reportingWorkspace/bi/ppiparser/generated/testData) and an arg for the client to use (e.g. -i 0003-000000006)
- save the resulting Mifos database (with completed PPI surveys). NOTE: check the output file carefully; it is a different sql file than the mifos_testetl_db.sql loaded above
mysqldump --default-character-set=utf8 -u root mifos_ppi_test > bi/JohnWoodlockWorkInProgressETL/MifosDataWarehouseETLTest/load_testppi_db.sql
- run the etl (bi/ppi_build.sh) to populate the DW with PPI survey scores. NOTE: the jndi file for Pentaho Data Integration must be configured with the same OLTP and DW databases and must support UTF-8 (see the jdbc.properties sketch below)
ppi_build.sh mifos_ppi_test mifos_ppi_test_dw ~/pentaho/data-integration/ '-u root'
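A minimal sketch of matching simple-jndi entries in the Pentaho data-integration simple-jndi/jdbc.properties file; the JNDI names "mifos" and "mifos_dw" are assumptions here, so use whatever names the ETL transformations actually reference:
mifos/type=javax.sql.DataSource
mifos/driver=com.mysql.jdbc.Driver
mifos/url=jdbc:mysql://localhost/mifos_ppi_test?useUnicode=true&characterEncoding=UTF-8
mifos/user=root
mifos/password=
mifos_dw/type=javax.sql.DataSource
mifos_dw/driver=com.mysql.jdbc.Driver
mifos_dw/url=jdbc:mysql://localhost/mifos_ppi_test_dw?useUnicode=true&characterEncoding=UTF-8
mifos_dw/user=root
mifos_dw/password=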
- now PPITest.groovy can be run
- under Eclipse run configurations, add a new JUnit test
- in the "VM arguments", define variables for biTestDbUrl and other system properties used by PPITest.groovy, for example:
(note that biTestDbUrl points to a test data warehouse schema)
-DbiTestDbUrl=jdbc:mysql://localhost/mifos_ppi_test_dw -DbiTestDbUser=root -DbiTestDbPassword= -DbiTestDbDriver=com.mysql.jdbc.Driver