
...

  1. Get the updated Word document for the country from the SPM team.
  2. In OpenOffice Writer, open the document, choose "Save As...", select file type "Text Encoded" and character set "Western Europe (ASCII/US)", and save the file as $PPI_BASE/data/<CountryName><CountryName>.txt, replacing the existing file.
  3. Update the line for this country in nicknames.csv, increasing pointsVersion by 1.
  4. Run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files.
  5. Include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all scoring SQL).
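Step 3's version bump can be scripted. This is a sketch under assumptions: the real column layout of nicknames.csv is not shown in this document, so the three-column file below (country, questionsVersion, pointsVersion) and the country name are hypothetical; check the actual header and adjust the column index before adapting it.

```shell
# Hypothetical nicknames.csv with columns: country,questionsVersion,pointsVersion
cat > nicknames.csv <<'EOF'
country,questionsVersion,pointsVersion
Peru,2,5
India,1,3
EOF

# Increment pointsVersion (assumed to be column 3) for one country only.
awk -F, -v OFS=, '$1 == "Peru" { $3 += 1 } { print }' nicknames.csv > nicknames.csv.new
mv nicknames.csv.new nicknames.csv
cat nicknames.csv
```

Assigning to `$3` makes awk rebuild the record with the comma output separator, so untouched rows (including the header) pass through unchanged.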

When changes are made to PPI category likelihood (poverty line) values.

  1. Get the updated "Lookup Tables v15.xls" document.
  2. Open "Lookup Tables v15.xls" in OpenOffice Calc and navigate to the "<country_name> Mifos" tab for the country whose data has changed.
  3. Choose "Save As..." in CSV format and save to $PPI_BASE/data/percents/<CountryName><CountryName>.csv.
  4. Run $PPI_BASE/parseall_percents.sh to regenerate the poverty line files (generated/povertylines/*PovertyLines.sql).
  5. Put all the poverty line SQL into a single file that is loaded by the data warehouse.
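Step 5 is a simple concatenation of the files generated in step 4. A minimal sketch, assuming the generated/povertylines layout named above; the output filename load_dw_povertylines.sql and the sample INSERT statements are placeholders standing in for real parseall_percents.sh output.

```shell
# Placeholder files standing in for parseall_percents.sh output.
mkdir -p generated/povertylines
echo 'INSERT INTO ppi_poverty_line VALUES (1);' > generated/povertylines/PeruPovertyLines.sql
echo 'INSERT INTO ppi_poverty_line VALUES (2);' > generated/povertylines/IndiaPovertyLines.sql

# Combine every generated poverty line file into one SQL script for the DW load.
cat generated/povertylines/*PovertyLines.sql > load_dw_povertylines.sql
```

Because the glob sorts alphabetically, the combined file is deterministic, which keeps warehouse loads reproducible.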

When changes are made to PPI survey question/answer text.

  1. Get the updated Word document for the country from the SPM team.
  2. In OpenOffice Writer, open the document, choose "Save As...", select file type "Text Encoded" and character set "Western Europe (ASCII/US)", and save the file as $PPI_BASE/data/<CountryName><CountryName>.txt, replacing the existing file.
  3. Update the line for this country in nicknames.csv, increasing questionsVersion by 1.
  4. Run $PPI_BASE/parseall.sh to regenerate the Mifos PPI files.
  5. Include the updated $PPI_BASE/generated/scoringEtl/<country><year>PPIScore.sql in load_dw_ppi_survey.sql (which loads all scoring SQL).
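Because step 2 is a manual export, it is easy to end up with stray non-ASCII bytes (e.g. Word smart quotes) in the saved .txt file. The check below is a suggested addition rather than part of the documented procedure, and the filename is a hypothetical example of the <CountryName><CountryName>.txt pattern.

```shell
# Sample exported survey file (placeholder content).
printf 'Q1. How many household members are there?\n' > PeruPeru.txt

# iconv fails if the file contains any byte outside plain ASCII.
if iconv -f ASCII -t ASCII PeruPeru.txt > /dev/null 2>&1; then
    echo "file is plain ASCII"
else
    echo "non-ASCII bytes found - re-export with Western Europe (ASCII/US)"
fi
```

Running this before parseall.sh catches a bad export early, instead of letting mis-encoded question text flow into the generated SQL.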

When a new major version of PPI is released.

...

Procedure for updating and running PPI tests

  1. If any PPI data has changed, run parseall.sh in bi/ppiparser (this regenerates the generated/testData/*.properties files).
  2. Initialize a clean Mifos database with base ETL test data:
    • echo "drop database mifos_ppi_test" | mysql -u root
    • echo "create database mifos_ppi_test" | mysql -u root
    • mysql -u root mifos_ppi_test < bi/ETL/MifosDataWarehouseETLTest/mifos_testetl_db.sql
  3. Copy all question group XML files to MIFOS_CONF/uploads/questionGroups:
    • cp $PPI_BASE/ppiparser/generated/questionGroups/* ~/.mifos/uploads/questionGroups
  4. Run PPITestDataGenerator from inside Eclipse with an argument pointing to the test data directory (e.g. -a /home/van/reportingWorkspace/bi/ppiparser/generated/testData) and an argument identifying the client to use (e.g. -i 0003-000000006).
  5. Save the resulting Mifos database (with completed PPI surveys). NOTE: this is a different output SQL file than the one above.
    • mysqldump --default-character-set=utf8 -u root mifos_ppi_test > bi/ETL/MifosDataWarehouseETLTest/load_testppi_db.sql
  6. Run the ETL (bi/ppi_build.sh) to populate the DW with PPI survey scores. NOTE: the JNDI file for Pentaho Data Integration must be configured with the same OLTP and DW databases, and must support UTF-8.
    • ppi_build.sh mifos_ppi_test mifos_ppi_test_dw ~/pentaho/data-integration/ '-u root'
  7. Now PPITest.groovy can be run:
    • Under Eclipse run configurations, add a new JUnit test.
    • In the "VM arguments" field, define biTestDbUrl and the other system properties used by PPITest.groovy, for example:
      -DbiTestDbUrl=jdbc:mysql://localhost/mifos_ppi_test_dw
      -DbiTestDbUser=root
      -DbiTestDbPassword=
      -DbiTestDbDriver=com.mysql.jdbc.Driver
      (note that biTestDbUrl points to a test data warehouse schema)
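The UTF-8 requirement in step 6 is easy to miss. Assuming the Pentaho Data Integration connections are defined in a simple-jndi jdbc.properties file (the fragment below is made up, and the connection names are placeholders), a quick grep can flag MySQL URLs that do not set an explicit UTF-8 characterEncoding parameter:

```shell
# Made-up jdbc.properties fragment; connection names and databases are placeholders.
cat > jdbc.properties <<'EOF'
mifos/type=javax.sql.DataSource
mifos/driver=com.mysql.jdbc.Driver
mifos/url=jdbc:mysql://localhost/mifos_ppi_test?characterEncoding=UTF-8
mifos_dw/type=javax.sql.DataSource
mifos_dw/driver=com.mysql.jdbc.Driver
mifos_dw/url=jdbc:mysql://localhost/mifos_ppi_test_dw
EOF

# Print any MySQL connection URL missing the UTF-8 parameter.
grep 'url=jdbc:mysql' jdbc.properties | grep -v 'characterEncoding=UTF-8' \
  || echo "all MySQL URLs set UTF-8"
```

Here the second connection would be flagged; fixing it before running ppi_build.sh avoids silently corrupting non-ASCII survey text during the ETL.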