...

  1. get the updated Word document for the country from the SPM team
  2. open the document in OpenOffice Writer, do a "Save As..." choosing "Text Encoded" with the "Western Europe (ASCII/US)" encoding, and save the file to $PPI_BASE/data/<CountryName><CountryName>.txt, replacing the existing file (a quick encoding check is sketched after this list)
  3. update the line for this country in nicknames.csv, increasing the pointsVersion by 1
  4. run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files
  5. include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all scoring sql)
    • cat generated/scoringEtl/*PPIScore.sql > ../JohnWoodlockWorkInProgress/MifosDataWarehouseETL/load_dw_ppi_survey.sql
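
As a quick sanity check after the "Text Encoded" export in step 2, the command below should print nothing if the saved file really is plain ASCII (a sketch assuming GNU grep and that $PPI_BASE is set):

  • grep -nP '[^\x00-\x7F]' $PPI_BASE/data/<CountryName><CountryName>.txt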

When changes are made to PPI category likelihood (poverty line) values.

  1. Get the updated "Lookup Tables v15.xls" document
  2. Open "Lookup Tables v15.xls" in OpenOffice Calc and navigage to the "<country_name> Mifos" tab for the country who's data has changed.
  3. Do a "Save As..." in CSV format to the $PPI_BASE/data/percents/<CountryName><CountryName>.csv file
  4. run $PPI_BASE/parseall_percents.sh to regenerate the poverty line files (generated/povertylines/*PovertyLines.sql)
  5. put all the poverty line sql into a single file that is loaded by the data warehouse (a line-count sanity check is sketched after this list)
    • mv ../JohnWoodlockWorkInProgress/MifosDataWarehouseETL/load_ppi_poverty_lines.sql /tmp
    • cat generated/povertyLines/*.sql > ../JohnWoodlockWorkInProgress/MifosDataWarehouseETL/load_ppi_poverty_lines.sql
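
A quick way to confirm the concatenation in step 5 picked up every generated file is to compare line counts (run from the same working directory as the commands above); the "total" reported for the individual files should match the count for the combined file:

  • wc -l generated/povertyLines/*.sql
  • wc -l ../JohnWoodlockWorkInProgress/MifosDataWarehouseETL/load_ppi_poverty_lines.sql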

...

  1. get the updated Word document for the country from the SPM team
  2. open the document in OpenOffice Writer, do a "Save As..." choosing "Text Encoded" with the "Western Europe (ASCII/US)" encoding, and save the file to $PPI_BASE/data/<CountryName><CountryName>.txt, replacing the existing file
  3. update the line for this country in nicknames.csv, increasing the questionsVersion by 1
  4. run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files (a quick check on the regenerated files is sketched after this list)
  5. include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all scoring sql)
    • cat generated/scoringEtl/*PPIScore.sql > ../JohnWoodlockWorkInProgress/MifosDataWarehouseETL/load_dw_ppi_survey.sql
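
After step 4, listing the scoring directory by modification time is a quick way to confirm the parser actually regenerated the updated country's file; the relevant <country><year>PPIScore.sql should appear at the top:

  • ls -lt $PPI_BASE/generated/scoringEtl/ | head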

When a new major version of PPI is released.

...

  1. in bi/ppiparser run parseall.sh (this will regenerate the generated/testData/*.properties files)
  2. initialize a clean Mifos database with base etl test data (see the note on the initial drop after this list)
    • echo "drop database mifos_ppi_test" | mysql -u root
    • echo "create database mifos_ppi_test" | mysql -u root
    • mysql -u root mifos_ppi_test < mifos_testetl_db.sql
  3. run PPITestDataGenerator from inside Eclipse with an arg pointing to the test data dir (e.g. -a /home/van/reportingWorkspace/bi/ppiparser/generated/testData)
  4. save the resulting Mifos database (with completed ppi surveys)
    • mysqldump -u root mifos_ppi_test > load_testppi_db.sql
  5. run the etl (bi/ppi_build.sh) to populate the DW with PPI survey scores
    • ppi_build.sh mifos_ppi_test mifos_ppi_test_dw ~/pentaho/data-integration/ '-u root'
  6. now PPITest.groovy can be run
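
One caveat on the database initialization in step 2: the initial drop will fail on a machine where mifos_ppi_test has never existed. MySQL's "if exists" form avoids that, e.g.:

  • echo "drop database if exists mifos_ppi_test" | mysql -u root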