
© 2015 DataDirect Networks, Inc. * Other names and brands may be claimed as the property of others. Any statements or representations around future events are subject to change.

Accelerating Data Intensive Financial Analytics over 450-500%
Accelerating SAS, Informatica, R, Ab Initio, Matlab and in-house codes

John Eubanks Systems Engineer V, Fannie Mae


Summary ▶
•  Engineered a comprehensive solution
•  Exploited IB-RDMA to deliver higher throughput
•  Eliminated TCP encapsulation overhead, yielding higher throughput per core
•  Fannie Mae accelerated key elements of its data-intensive risk analysis for mortgage-backed securities issuance by over 450%
•  Minimized TCO and maximized ROI

Fannie Mae ▶
The leading source of residential mortgage credit in the U.S. secondary market:
•  Establish and implement industry standards
•  Develop better tools to price and manage credit risk
•  Build new infrastructure to ensure a liquid and efficient market
•  Facilitate the collection and reporting of data for accurate financial reporting and improved risk management

Fannie Mae is supporting today's economic recovery and helping to build a sustainable housing finance system

Fannie Mae Risk Analytics Modeling ▶
•  Risk Analytics Modeling Framework based on several data-intensive applications: a combination of SAS, Informatica, R, Ab Initio, Matlab, and in-house codes, primarily executed in a UNIX environment
•  Daily, weekly, and monthly ETL processes use the modeling framework, which also serves reporting and modeling processes
•  Common workflows involve extracting and manipulating billions of records of loan-level data to construct panel data sets and build competing-risk models using various analytical methods, forming a robust loan-risk analysis system
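The panel-construction step in the workflow above can be sketched in Python (pandas, the column names, and the toy data are my assumptions; the deck names no specific libraries or schemas):

```python
import pandas as pd

# Hypothetical loan-level observations: one row per loan per month.
loans = pd.DataFrame({
    "loan_id": [1, 1, 1, 2, 2],
    "month":   [1, 2, 3, 1, 2],
    "balance": [200_000, 199_500, 0, 150_000, 149_600],
    "status":  ["current", "current", "prepaid", "current", "default"],
})

# Flag the competing terminal events (prepayment vs. default)
# that a competing-risk model predicts.
loans["prepaid"] = (loans["status"] == "prepaid").astype(int)
loans["default"] = (loans["status"] == "default").astype(int)

# Collapse to one record per loan: time-to-event plus which risk fired.
panel = (loans.groupby("loan_id")
              .agg(duration=("month", "max"),
                   prepaid=("prepaid", "max"),
                   default=("default", "max"))
              .reset_index())
print(panel)
```

At production scale this transformation runs over billions of rows, which is what makes the workflow IO-bound rather than compute-bound.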


Risk Analytics and IO Bottleneck Challenges ▶
•  Time-sensitive workloads: prepare optimized algorithms for risk analytics within a limited window to deliver accurate risk predictions, and deliver higher performance for mixed-I/O SAS, Informatica, R, Ab Initio, Matlab, and in-house workloads
•  Data-intensive risk analytics challenges: workloads are extremely compute- and IO-intensive; the number of users was scaling up along with the data, and IO bottlenecks in the prior infrastructure were crippling performance for current and emerging workflow growth trends


Broader Impact to Stakeholders ▶
•  We were able to run only select risk-modeling scenarios across select paths
•  Data storage infrastructure was siloed on a use-case basis, limiting performance and increasing costs
•  Turnaround time for mission-critical data-intensive workloads such as SAS, R, Informatica, Matlab, and home-grown applications was long
•  Lower throughput forced us to deploy more compute resources to accommodate longer runtimes, increasing OPEX
•  Limited our capability to address the broad set of stakeholders within the company
•  Due to the growing demand for HPC-powered risk modeling, the data center footprint had grown considerably
•  Total cost of ownership, including licensing, services, and support costs, had grown, resulting in very high price-performance premiums

Urgent Need for High-Performance and High-Throughput Solutions ▶
Needed infrastructure that:
•  Delivered higher performance for data-intensive workflows (such as SAS, Informatica, R, Ab Initio, Matlab, and in-house codes)
•  Eliminated IO bottlenecks and data silos
•  Minimized runtime, enabling a wider range of modeling and more accurate risk predictions
•  Enabled scalable growth without scaling complexity or costs
•  Integrated seamlessly into our broader infrastructure
•  Minimized CAPEX and OPEX


POCs and Design Space Exploration ▶
•  Fannie Mae engineers led an intensive POC exploration process, testing multiple configurations and comparing performance metrics
•  SAN technologies delivered on average 20-30 MB/s per core of throughput; performance degraded at scale, with very poor utilization of the compute grid infrastructure
•  For all alternative solutions, cost and complexity increased with scale, producing longer runtimes and poor CAPEX/OPEX
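The POC numbers above translate directly into aggregate-bandwidth requirements for the compute grid. A back-of-the-envelope sizing sketch (the 1,000-core grid size is a hypothetical, not a figure from the deck; "400% higher bandwidth per core" is taken from the results slide):

```python
# Per-core throughput observed in the SAN POC (midpoint of 20-30 MB/s).
san_per_core = 25.0                      # MB/s per core
ddn_per_core = san_per_core * 5.0        # "400% higher" = 5x per core

cores = 1000                             # hypothetical compute-grid size
san_aggregate = san_per_core * cores / 1000   # GB/s needed to feed the grid
ddn_aggregate = ddn_per_core * cores / 1000   # GB/s

print(f"SAN: {san_aggregate:.0f} GB/s  DDN: {ddn_aggregate:.0f} GB/s")
```

The point of the arithmetic: at 20-30 MB/s per core, adding cores only widens the gap between what the grid can consume and what the storage can deliver, which is why performance degraded at scale.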


Architected High-Performance Solution for SAS Grid and Informatica ▶
•  SAS Grid solution: 452% faster workflows
•  Informatica solution: 500% faster workflows

DDN Delivered Unmatched Acceleration for Key Risk Management Workflows
•  Up to 500% faster financial risk analysis
•  452% faster application performance for risk management and analytics
•  Lowered data center footprint, HVAC, maintenance, and support costs
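As a worked example of what these percentages mean for runtimes, assuming "X% faster" denotes a (1 + X/100)x speedup (the deck's convention appears to be this reading) and taking a hypothetical 500-minute baseline:

```python
# Convert "X% faster" claims into speedup ratios (assumed convention).
speedup_sas = 1 + 452 / 100        # 5.52x for the SAS Grid workflow
speedup_inf = 1 + 500 / 100        # 6.00x for Informatica workflows

baseline_min = 500                 # hypothetical pre-upgrade runtime (minutes)
new_runtime = baseline_min / speedup_sas
print(f"{new_runtime:.0f} min")    # a ~500-minute job drops to ~91 minutes
```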


Unmatched SAS Grid Performance for Mortgage Risk Analytics ▶
•  400% higher bandwidth per core than traditional storage
•  452% faster performance for the most complex SAS Grid workflow
[Chart: per-core throughput in MB/s/core (higher is better) and average SAS Grid runtime in minutes (lower is better)]

Unprecedented Informatica Performance for Mortgage Risk Analytics ▶
•  500% faster performance for Informatica-based risk-management workflows
[Chart: average Informatica runtime in minutes (lower is better), DDN vs. prior infrastructure]



Results of Deploying Parallel File System Solutions ▶
•  1 risk-management workflow on SAN vs. 5 risk-management workflows on DDN
•  Fannie Mae is now able to deliver 5X more workflows in the same amount of time
•  Reduced data center footprint by more than 50%
•  Minimized OPEX, support, and services costs
•  NEXT: consolidate data management along similar lines

Key Outcomes ▶
•  While designing high-performance infrastructure, analyze end-to-end runtimes, identify IO bottlenecks, and eliminate them as part of the strategy to deliver higher throughput
•  Discover opportunistic ways of improving performance: eliminate wasted CPU cycles by removing IO overhead, and exploit RDMA access where possible
•  Architect a comprehensive solution that addresses data-movement challenges; parallel storage delivers higher throughput and enables consolidation of infrastructure
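The first step named above, analyzing end-to-end runtimes to find IO bottlenecks, can be sketched as a crude IO-vs-CPU phase profile (the file size and compute workload are illustrative assumptions, not Fannie Mae's actual jobs):

```python
import os
import tempfile
import time
from pathlib import Path

def profile_stage(fn):
    """Time one stage of a workflow and return elapsed seconds."""
    t0 = time.perf_counter()
    fn()
    return time.perf_counter() - t0

# Hypothetical two-stage job: write a scratch file (IO) then crunch (CPU).
path = Path(tempfile.gettempdir()) / "scratch.bin"

io_time = profile_stage(lambda: path.write_bytes(os.urandom(10_000_000)))
cpu_time = profile_stage(lambda: sum(i * i for i in range(1_000_000)))

# If the IO phase dominates end-to-end runtime, faster storage (or RDMA
# paths that avoid TCP encapsulation) pays off more than adding compute.
print(f"IO: {io_time:.3f}s  CPU: {cpu_time:.3f}s")
path.unlink()
```

In practice this profiling would wrap real ETL and modeling stages; the design choice it supports is the one the deck describes, namely fixing the IO path before scaling out compute.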