Installing Mifos BI 1.3.1
Software Requirements
To run Mifos BI, you need a server running Ubuntu 10.04 LTS with the following software installed:
- Sun Microsystems Java - JDK/JRE 1.6
- MySQL - Community Edition 5.1.45
- Pentaho BI Server 3.6.0
- Pentaho CDA 1.0.0
- Pentaho Data Integration Community Edition 4.0.0
Note: these tools can also be run on Windows, but instructions and scripts are currently provided only for Ubuntu.
Mifos BI 1.3 works with Mifos 2.0 and above.
Overview
This document first shows how to set up and configure Pentaho Data Integration to run the ETL scripts needed for some BI and reporting features (the PPI data export, and reports that display a message such as "Data for this report last updated on 2011-05-17 at 18:58:04" when entering parameters). It then shows how to set up the Pentaho business intelligence J2EE web application in an existing Tomcat instance.
The existing Mifos application database (mifos) is the SourceDB. The Mifos data warehouse database (mifos_dw) is the DestinationDB; it is created when the ETL script is run later on.
Upgrade Process
- Before following the Install Process below, remove the following existing files/directories (an example removal sketch follows this list):
- reports/CDFReportingTemplate/template-dashboard-mifosreports.html
- /etc/pentaho/system/pentaho-cdf/resources/style/images/*.jpg
- $PDI_HOME/ETL/*
- /etc/pentaho/reports/standardReports
- /etc/pentaho/reports/commonActions
- /etc/pentaho/reports/dataAccess
- /etc/pentaho/reports/i18n/org/mifos
- /etc/pentaho/reports/CDFReportingTemplate
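A minimal removal sketch, assuming the paths above and that $PDI_HOME points at your Pentaho Data Integration directory; adjust the first, relative template path to wherever the old reporting template was deployed in your installation:
sudo rm -f /etc/pentaho/system/pentaho-cdf/resources/style/images/*.jpg
sudo rm -rf $PDI_HOME/ETL/*
sudo rm -rf /etc/pentaho/reports/standardReports /etc/pentaho/reports/commonActions /etc/pentaho/reports/dataAccess /etc/pentaho/reports/i18n/org/mifos /etc/pentaho/reports/CDFReportingTemplate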
Install Process
Install base OS dependencies
sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"
sudo apt-get update
sudo apt-get install sun-java6-jdk libmysql-java
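These packages provide the Sun JDK and the MySQL JDBC connector; the MySQL server itself (5.1, per the requirements above) is assumed to already be installed for Mifos. A quick sanity check (the /usr/share/java/mysql.jar path is where libmysql-java puts the driver, and it is copied into the Pentaho webapp later on):
java -version
mysql --version
ls -l /usr/share/java/mysql.jar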
Install Data Integration/ETL
Download pdi-ce-4.0.0-stable.tar.gz
Extract it to a location we will call $PDI_HOME (the root of the Pentaho Data Integration installation):
tar xzf pdi-ce-4.0.0-stable.tar.gz -C /usr/local/
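The archive normally unpacks into a directory named data-integration; assuming that layout, set $PDI_HOME so the commands below can refer to it (adjust if you extracted elsewhere):
export PDI_HOME=/usr/local/data-integration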
Get Mifos BI 1.3
Download the latest Mifos BI release (mifos_bi-1.3.1.zip) from SourceForge.
Extract the ETL folder to $PDI_HOME:
unzip -d $PDI_HOME mifos_bi-1.3.1.zip ETL/*
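You can confirm the extraction by checking for the ETL job runner used later in the Run ETL job step:
ls $PDI_HOME/ETL/MifosDataWarehouseETL/full_reload.sh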
Configure Pentaho database settings and install ETL
You will need to configure two databases to get Pentaho up and running: a "Hibernate" database and a "Quartz" database.
echo "create database hibernate default character set latin1;" | mysql <root mysql credentials>
echo "create database quartz default character set latin1;" | mysql <root mysql credentials>
echo "grant all on hibernate.* to '<hibernate user>'@'localhost';" | mysql <root mysql credentials>
echo "grant all on quartz.* to '<quartz user>'@'localhost';" | mysql <root mysql credentials>
echo "set password for '<hibernate user id>'@'localhost' = password('<hibernate user password>');" | mysql <root mysql credentials>
echo "set password for '<quartz user id>'@'localhost' = password('<quartz user password>');" | mysql <root mysql credentials>
Note: the hibernate user and the quartz user can be the same.
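As a concrete illustration, using the hibuser and pentaho_user accounts that appear in the configuration files later in this document (substitute your own user names and passwords):
echo "create database hibernate default character set latin1;" | mysql -u root -p
echo "create database quartz default character set latin1;" | mysql -u root -p
echo "grant all on hibernate.* to 'hibuser'@'localhost';" | mysql -u root -p
echo "grant all on quartz.* to 'pentaho_user'@'localhost';" | mysql -u root -p
echo "set password for 'hibuser'@'localhost' = password('password');" | mysql -u root -p
echo "set password for 'pentaho_user'@'localhost' = password('password');" | mysql -u root -p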
You will also need to configure the two ETL databases, SourceDB and DestinationDB. Edit the following file with the SourceDB and DestinationDB login credentials:
$PDI_HOME/simple-jndi/jdbc.properties (SourceDB and DestinationDB entries)
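These entries use the same simple-jndi format as the BI server jdbc.properties shown later in this document; an illustrative sketch with local databases named mifos and mifos_dw (substitute your own hosts and credentials):
SourceDB/type=javax.sql.DataSource
SourceDB/driver=com.mysql.jdbc.Driver
SourceDB/url=jdbc:mysql://localhost/mifos?useUnicode=true&characterEncoding=UTF-8
SourceDB/user=mifos
SourceDB/password=mifos
DestinationDB/type=javax.sql.DataSource
DestinationDB/driver=com.mysql.jdbc.Driver
DestinationDB/url=jdbc:mysql://localhost/mifos_dw?useUnicode=true&characterEncoding=UTF-8
DestinationDB/user=mifos
DestinationDB/password=mifos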
You will also need to add these data source credentials to the "hibernate" database:
echo "CREATE TABLE DATASOURCE(NAME VARCHAR(50) NOT NULL PRIMARY KEY,MAXACTCONN INTEGER NOT NULL,DRIVERCLASS VARCHAR(50) NOT NULL,IDLECONN INTEGER NOT NULL,USERNAME VARCHAR(50) NULL,PASSWORD VARCHAR(150) NULL,URL VARCHAR(512) NOT NULL,QUERY VARCHAR(100) NULL,WAIT INTEGER NOT NULL);" | mysql <hibernate login creds>
echo "INSERT INTO DATASOURCE VALUES('SourceDB',20,'com.mysql.jdbc.Driver',5,'<SourceDB user>','<base64 encoded SourceDB password>','jdbc:mysql://<SourceDB host/port>/<SourceDB name>?useUnicode=true&characterEncoding=UTF-8','select 1',1000);" | mysql <hibernate login creds>
echo "INSERT INTO DATASOURCE VALUES('DestinationDB',20,'com.mysql.jdbc.Driver',5,'<DestinationDB user>','<base64 encoded DestinationDB password>','jdbc:mysql://<DestinationDB host/port>/<DestinationDB name>_dw?useUnicode=true&characterEncoding=UTF-8','select 1',1000);" | mysql <hibernate login creds>
Initialize Quartz DB
mysql <quartz db credentials> < create_quartz_mysql.sql
Run ETL job
Set up a cron job to run the ETL job (an example crontab entry follows the command below). You may also run it manually:
$PDI_HOME/ETL/MifosDataWarehouseETL/full_reload.sh <destination db name> $PDI_HOME "<destination db mysql args>" > /tmp/data_integration.sh.out 2> /tmp/data_integration.sh.err
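For example, to refresh the warehouse nightly at 01:00 you could add a crontab entry like the following (a sketch: it assumes $PDI_HOME is /usr/local/data-integration and the warehouse database is mifos_dw; cron does not expand variables from your shell session, so spell out the paths and fill in MySQL arguments appropriate for your setup):
0 1 * * * /usr/local/data-integration/ETL/MifosDataWarehouseETL/full_reload.sh mifos_dw /usr/local/data-integration "<destination db mysql args>" > /tmp/data_integration.sh.out 2> /tmp/data_integration.sh.err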
Install Pentaho BI Server
Download Pentaho BI Server biserver-ce-3.6.0-stable.tar.gz
mkdir -p /tmp/bi
tar xzf biserver-ce-3.6.0-stable.tar.gz -C /tmp/bi
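$TOMCAT_HOME below refers to the existing Tomcat instance (the one already running Mifos). On an Ubuntu 10.04 package install this is commonly /var/lib/tomcat6; assuming that layout, and adjusting to your environment:
export TOMCAT_HOME=/var/lib/tomcat6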
cp -r /tmp/bi/biserver-ce/tomcat/webapps/pentaho $TOMCAT_HOME/webapps
cp -r /tmp/bi/biserver-ce/tomcat/webapps/pentaho-style $TOMCAT_HOME/webapps
cp -r /tmp/bi/biserver-ce/tomcat/webapps/sw-style $TOMCAT_HOME/webapps
mkdir -p /etc/pentaho
cp -r /tmp/bi/biserver-ce/pentaho-solutions/system /etc/pentaho
cp /usr/share/java/mysql.jar $TOMCAT_HOME/webapps/pentaho/WEB-INF/lib/mysql.jar
Install CDA in BI Server
Download cda-installer-v1.0.jar and run it:
java -jar cda-installer-v1.0.jar
Select the 3.6 version of Pentaho.
Point the installer to /etc/pentaho/ (created earlier) when asked for the Pentaho solutions directory.
To verify CDA is working, use the sample CDA content, which needs to be extracted into the Pentaho solutions directory (/etc/pentaho in this setup).
Deploy Mifos BI files
unzip -j -d /etc/pentaho/system/pentaho-cdf mifos_bi-1.3.1.zip mifos_bi-1.3.1/reports/CDFReportingTemplate/template-dashboard-mifosreports.html
unzip -j -d /etc/pentaho/system/pentaho-cdf mifos_bi-1.3.1.zip mifos_bi-1.3.1/DashboardsTemplate/template-dashboard-dashboards.html
unzip -j -d /etc/pentaho/system/pentaho-cdf/resources/style/images mifos_bi-1.3.1.zip mifos_bi-1.3.1/reports/CDFReportingTemplate/*.jpg
unzip -d /etc/pentaho mifos_bi-1.3.1.zip mifos_bi-1.3.1/reports/*.*
mv /etc/pentaho/mifos_bi-1.3.1/reports/Dashboards /etc/pentaho
mv /etc/pentaho/mifos_bi-1.3.1/reports/DashboardsMenu /etc/pentaho
mv /etc/pentaho/mifos_bi-1.3.1/reports/standardReports /etc/pentaho
mv /etc/pentaho/mifos_bi-1.3.1/reports/commonActions /etc/pentaho
mv /etc/pentaho/mifos_bi-1.3.1/reports/mpesa /etc/pentaho
mv /etc/pentaho/mifos_bi-1.3.1/reports/dataAccess /etc/pentaho
mv /etc/pentaho/mifos_bi-1.3.1/reports/i18n/org $TOMCAT_HOME/webapps/pentaho/WEB-INF/classes
rm -r /etc/pentaho/mifos_bi-1.3.1
Configure Pentaho BI database parameters
Edit:
1. /etc/pentaho/system/applicationContext-spring-security-hibernate.properties
The edited content of the file should look like this (change the Hibernate login credentials to match yours):
jdbc.driver=com.mysql.jdbc.Driver
jdbc.url=jdbc:mysql://localhost:3306/hibernate
jdbc.username=hibuser
jdbc.password=password
hibernate.dialect=org.hibernate.dialect.MySQLDialect
2. /etc/pentaho/system/hibernate/mysql5.hibernate.cfg.xml: edit the following lines with the Hibernate login credentials:
<property name="connection.username">hibuser</property>
<property name="connection.password">password</property>
3. $TOMCAT_HOME/webapps/pentaho/META-INF/context.xml (jdbc/Hibernate entry)
with the Hibernate login credentials. Edited content should look like this:
<Resource name="jdbc/Hibernate" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="hibuser" password="password"
driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/hibernate"
validationQuery="select 1" />
4. $TOMCAT_HOME/webapps/pentaho/META-INF/context.xml (jdbc/Quartz entry)
with the Quartz login credentials. Edited content should look like this:
<Resource name="jdbc/Quartz" auth="Container" type="javax.sql.DataSource"
factory="org.apache.commons.dbcp.BasicDataSourceFactory" maxActive="20" maxIdle="5"
maxWait="10000" username="pentaho_user" password="password"
driverClassName="com.mysql.jdbc.Driver" url="jdbc:mysql://localhost:3306/quartz"
validationQuery="select 1"/>
5. Edit /etc/pentaho/system/hibernate/hibernate-settings.xml, so that the config-file line reads
<config-file>system/hibernate/mysql5.hibernate.cfg.xml</config-file>
6. Edit the following file with the SourceDB and DestinationDB entries (and adjust the Hibernate and Quartz entries to match your credentials):
/etc/pentaho/system/simple-jndi/jdbc.properties
The content of the file should be similar to this (adjust the entries to your settings):
SampleData/type=javax.sql.DataSource
SampleData/driver=com.mysql.jdbc.Driver
SampleData/url=jdbc:mysql://localhost/sampledata
SampleData/user=pentaho_user
SampleData/password=password
Hibernate/type=javax.sql.DataSource
Hibernate/driver=com.mysql.jdbc.Driver
Hibernate/url=jdbc:mysql://localhost/hibernate
Hibernate/user=hibernate
Hibernate/password=hibernate
Quartz/type=javax.sql.DataSource
Quartz/driver=com.mysql.jdbc.Driver
Quartz/url=jdbc:mysql://localhost/quartz
Quartz/user=quartz
Quartz/password=quartz
SourceDB/type=javax.sql.DataSource
SourceDB/driver=com.mysql.jdbc.Driver
SourceDB/url=jdbc:mysql://localhost/mifos?useUnicode=true&characterEncoding=UTF-8
SourceDB/user=mifos
SourceDB/password=mifos
DestinationDB/type=javax.sql.DataSource
DestinationDB/driver=com.mysql.jdbc.Driver
DestinationDB/url=jdbc:mysql://localhost/mifos_dw?useUnicode=true&characterEncoding=UTF-8
DestinationDB/user=mifos
DestinationDB/password=mifos
Install the CDA "hack":
Edit line 24 in /etc/pentaho/system/pentaho-cdf/js/CoreComponents.js so that it reads:
default: myArray.push([this.resultp0,this.resultp1]);break;
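If you prefer to script the change, a hedged one-liner that overwrites line 24 with the value above (check first that line 24 in your copy of the file is the default: case this fix targets):
sudo sed -i '24s/.*/default: myArray.push([this.resultp0,this.resultp1]);break;/' /etc/pentaho/system/pentaho-cdf/js/CoreComponents.js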
See http://thread.gmane.org/gmane.comp.finance.mifos.devel/10466
Troubleshooting
- If you're having specific issues with ETL jobs, see Mifos and Data Warehouse Data Health Checks for scripts and advice on cleaning up the quality of your data warehouse data.
- For other troubleshooting tips, see Troubleshooting Pentaho.