Project author: kr900910

Project description:
ETL process which downloads, transforms, and loads Freddie Mac/Fannie Mae mortgage data

Primary language: Python
Project URL: git://github.com/kr900910/mortgage_data_analysis.git
Created: 2017-11-22T22:37:40Z
Project community: https://github.com/kr900910/mortgage_data_analysis

License:

Mortgage Data Analysis

Initial Setup

  1. Register for Fannie Mae: https://loanperformancedata.fanniemae.com/lppub/index.html#.
  2. Register for Freddie Mac: https://freddiemac.embs.com/FLoan/Bin/loginrequest.php.
  3. Clone the mortgage-data-analysis repository on an EC2 instance (git clone https://github.com/kr900910/mortgage-data-analysis.git).
  4. Create a temp_download directory inside mortgage-data-analysis (mkdir temp_download).

Download the data

  1. Go to mortgage-data-analysis/loading_and_modeling and run pip install requests==2.5.3.
  2. Type python download_freddie_mac.py. Enter your credentials and the quarters to download when prompted. This downloads a zip file into the current folder for each quarter.
  3. Type python download_fannie_mae.py. Enter your credentials and the quarters to download when prompted. This downloads a zip file into the current folder for each quarter (a sketch of this download flow follows the list).
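
The prompts in those scripts drive a session-based download. The sketch below shows the general shape of such a flow, assuming a login form that sets a session cookie and a per-quarter zip URL; the URLs and form field names are placeholders, not the ones the repository's scripts actually use.

  # Hypothetical sketch of a prompted, session-based quarterly download.
  # The real logic lives in download_freddie_mac.py / download_fannie_mae.py;
  # LOGIN_URL, DATA_URL, and the form field names are placeholders.
  import getpass
  import requests

  LOGIN_URL = "https://example.com/login"               # placeholder
  DATA_URL = "https://example.com/data/{quarter}.zip"   # placeholder

  def download_quarters():
      username = input("Username: ")
      password = getpass.getpass("Password: ")
      quarters = input("Quarters (e.g. 2016Q1 2016Q2): ").split()
      with requests.Session() as session:
          # Log in once; the session object keeps the resulting cookie.
          session.post(LOGIN_URL, data={"username": username, "password": password})
          for quarter in quarters:
              resp = session.get(DATA_URL.format(quarter=quarter), stream=True)
              resp.raise_for_status()
              with open(quarter + ".zip", "wb") as fh:
                  for chunk in resp.iter_content(chunk_size=1 << 20):
                      fh.write(chunk)

  if __name__ == "__main__":
      download_quarters()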

Move the data into HDFS directory

  1. Start Hadoop, Postgres, and Hive on the EC2 instance.
  2. If this is your first time, type . create_hdfs_dir.sh. This creates the necessary HDFS folders.
  3. Type . unzip_to_HDFS.sh. This unzips the downloaded files into mortgage-data-analysis/temp_download, removes the zip archives, loads the unzipped files into HDFS, and then removes the local unzipped copies (see the sketch after this list). Note that this step can take 15-30 minutes depending on the number of quarters being loaded.
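
For reference, the sequence unzip_to_HDFS.sh runs through looks roughly like the Python sketch below; the directory names, the *.txt pattern, and the HDFS target path are placeholders rather than the values the script actually uses.

  # Sketch of the unzip -> load-into-HDFS -> clean-up flow performed by unzip_to_HDFS.sh.
  # DOWNLOAD_DIR, the *.txt pattern, and HDFS_TARGET are placeholders.
  import glob
  import os
  import subprocess
  import zipfile

  DOWNLOAD_DIR = "temp_download"
  HDFS_TARGET = "/user/hadoop/mortgage_raw"   # placeholder HDFS directory

  # Extract every downloaded archive, then drop the archive itself.
  for zip_path in glob.glob(os.path.join(DOWNLOAD_DIR, "*.zip")):
      with zipfile.ZipFile(zip_path) as zf:
          zf.extractall(DOWNLOAD_DIR)
      os.remove(zip_path)

  # Copy each unzipped file into HDFS, then remove the local copy.
  for txt_path in glob.glob(os.path.join(DOWNLOAD_DIR, "*.txt")):
      subprocess.check_call(["hdfs", "dfs", "-put", "-f", txt_path, HDFS_TARGET])
      os.remove(txt_path)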

Create Hive tables

  1. Go to mortgage-data-analysis/transforming and type . create_hive_tables.sh. This creates Hive metadata for the base Fannie and Freddie data in HDFS and for the combined data sets (a sketch of the kind of table definition involved follows this list). Note that this script can take several hours to run, depending on how many quarters of data are present (for 15 quarters, the acquisition data took about 10 minutes and the performance data took roughly 2 hours).
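
The actual DDL lives in create_hive_tables.sh. As an illustration of what "Hive metadata over files already in HDFS" means, each base data set gets an external table declaration along the lines of the sketch below; the table name, columns, field delimiter, and HDFS location are placeholders, not the real schema.

  # Illustration only: the real table definitions live in create_hive_tables.sh.
  # Table name, columns, field delimiter, and LOCATION are placeholders.
  import subprocess

  DDL = """
  CREATE EXTERNAL TABLE IF NOT EXISTS freddie_acquisition_raw (
      loan_id        STRING,
      orig_upb       DOUBLE,
      orig_int_rate  DOUBLE
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
  STORED AS TEXTFILE
  LOCATION '/user/hadoop/mortgage_raw/freddie_acquisition'
  """

  # hive -e executes a quoted statement without opening an interactive shell.
  subprocess.check_call(["hive", "-e", DDL])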

Use Tableau to visualize data

  1. Once the Hive tables are created, start HiveServer2 by typing hive --service hiveserver2 &.
  2. Set up an ODBC connection to the server in Tableau and visualize the data as needed (a quick connectivity check is sketched after this list). A sample Tableau workbook, along with a CSV file extracted from one of the Hive tables, is available in the mortgage-data-analysis/serving folder.
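
Before pointing Tableau at the server, it can be worth confirming that HiveServer2 is reachable. One way, sketched below, uses the third-party PyHive package (not part of this repository); the host, port, and username are placeholders for whatever the EC2 instance actually uses.

  # Quick connectivity check against HiveServer2 before connecting Tableau.
  # Requires the third-party PyHive package; host, port, and username are placeholders.
  from pyhive import hive

  conn = hive.Connection(host="localhost", port=10000, username="hadoop")
  cursor = conn.cursor()
  cursor.execute("SHOW TABLES")
  # The base and combined mortgage tables created earlier should appear here.
  print(cursor.fetchall())
  conn.close()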