Denodo DataFest 2017: The Need for Speed and Agility in Business
TRANSCRIPT
How CIT Met the Need for Speed and Agility
in Business Through Data Virtualization
October 19th, 2017
Nagaraj Vijapurkar, Data Virtualization Lead
CIT Group, Inc.
Abstract
How CIT Met the Need for Speed and Agility in Business
Through Data Virtualization
Enterprises face a variety of data management challenges that
influence their ability to leverage accurate, meaningful
information, quickly and efficiently. Data virtualization is an
enabling technology which can address many of these
challenges.
This session will explore how data virtualization is being used to
dramatically reduce data proliferation and ensure that all
consumers are working from a single source of the truth. It will
also look at how data virtualization can drive standardization,
measure & improve data quality, abstract data consumers from
data providers, expose data lineage, enable cross-company data
integration, and serve as a common provisioning point from which to
access all authoritative sources of data.
Data – Key Principles
Realize Value from All Data
• Universal data access across internal, external, web, structured, or unstructured sources

Minimize Replication
• Flexible integration options with fine control – virtual real-time, cached, or scheduled batch

Abstracted and Unified Data Services
• Abstracted data delivered as reusable data services
• Managed access control and service levels

Enterprise Class – Powerful and Agile
• Performance and scalability
• Data governance, lineage, management
• Easily integrated into existing IT infrastructure
• Shared metadata services and metadata-driven data integration approach
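The "minimize replication" and "abstracted data services" principles above can be sketched as a virtual view that federates sources at query time instead of copying data, with an optional cache for the "cached" delivery mode. This is a minimal illustration, not Denodo's or CIT's actual implementation; all source and field names are invented.

```python
import time

# Illustrative source adapters: in a real deployment these would wrap
# JDBC, REST, or file connectors; here they just return in-memory rows.
def loans_source():
    return [{"party_id": 1, "loan_amt": 5000}]

def crm_source():
    return [{"party_id": 1, "name": "Acme Corp"}]

class VirtualView:
    """A federated view: joins sources on demand (no replication),
    with an optional TTL cache for 'cached' delivery mode."""
    def __init__(self, sources, key, cache_ttl=None):
        self.sources, self.key, self.cache_ttl = sources, key, cache_ttl
        self._cache, self._cached_at = None, 0.0

    def query(self):
        # Cached mode: serve the last result while it is still fresh.
        if self.cache_ttl and self._cache is not None \
           and time.time() - self._cached_at < self.cache_ttl:
            return self._cache
        # Virtual real-time mode: merge rows from all sources by key.
        merged = {}
        for src in self.sources:
            for row in src():
                merged.setdefault(row[self.key], {}).update(row)
        result = list(merged.values())
        if self.cache_ttl:
            self._cache, self._cached_at = result, time.time()
        return result

party_view = VirtualView([loans_source, crm_source], key="party_id",
                         cache_ttl=60)
print(party_view.query())
# → [{'party_id': 1, 'loan_amt': 5000, 'name': 'Acme Corp'}]
```

Consumers see one reusable view regardless of which delivery mode is configured, which is the abstraction the slide describes.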
Party Master – Real time integration using Denodo
[Architecture diagram: AR systems – Risk Origination, Salesforce CRM, Compliance On-boarding, and Reference Data (CW) – feed the Party Master (MDM) hub via REST APIs and batch loads. Within the hub, records pass through Data Acquisition, Standardization & Enrichment, the Party Identification process, and Unique Customer Consolidation. A Data Service Layer then publishes the mastered party data over REST and SOAP APIs to strategic data hubs and to Regulatory, Credit Risk, and Business Reporting consumers.]
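The diagram's pipeline stages – acquisition, standardization & enrichment, party identification, and unique customer consolidation – can be sketched as below. The matching rule (name plus country) and all record fields are invented for illustration; a real MDM hub would use far richer match logic.

```python
# Hypothetical sketch of the Party Master pipeline stages in the diagram.
def standardize(record):
    # Standardization & Enrichment: normalize the name, default the country.
    return {**record,
            "name": record["name"].strip().upper(),
            "country": record.get("country", "US")}

def party_key(record):
    # Party identification: derive a match key (simplified to name+country).
    return (record["name"], record["country"])

def consolidate(records):
    # Unique Customer Consolidation: merge records sharing a match key
    # into one golden record per party.
    golden = {}
    for rec in map(standardize, records):
        golden.setdefault(party_key(rec), {}).update(rec)
    return list(golden.values())

sources = [  # rows acquired from AR systems over REST/batch feeds
    {"name": " acme corp", "country": "US", "crm_id": "SF-1"},
    {"name": "ACME CORP", "risk_id": "RK-9"},
]
print(consolidate(sources))
# → [{'name': 'ACME CORP', 'country': 'US', 'crm_id': 'SF-1', 'risk_id': 'RK-9'}]
```

The two source records resolve to a single consolidated party, which the data service layer would then expose to consumers.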
CIT — Internal Use

Architecture – Current and In Progress
(Diagram annotations: Augment Dataset / Data stores)
[Architecture diagram: Source systems from CIT (loans, leases, commercial services, reverse repos, deposits, derivatives/securities & debts, CF), OWB (loans, leases, mortgages, deposits, derivatives/securities & debts, CF), and DCC (loans, leases), plus ARC, PS, and reference data (standard values & crosswalks: PS BU, PS OU, currency rates, TDR & cross-references), flow through the DSL (Data Inbound Views) into strategic data hubs: Lease Hub, Loan Hub, Trade Hub (CSDH), Treasury Hub, Deposit Hub (BDM), Mortgage Hub (SDM), Tax Hub, Reverse Repos, CCRR, Party Master (MDM), Credit Risk, PS Repository, and Sub Ledger (CSL). These feed the Canonical Enterprise Views – Party, Loan, Lease, Deposits, Trade, Mortgage, Securities & Derivatives, Cash @ Bank, and Debts – which the DSL (Data Outbound Views) serves to GL Summary, FinArch (Regulatory: FR Y-9C, 14 Q/A/M, CECL, Liquidity), Actimize, and other downstream applications, data marts, and reporting (ARC, QRM, SFR, GES, CFDI Reporting, Treasury, Comm. Loan Origination and Services). DQ Monitoring and Operational Metadata span the layer; new strategic hubs are highlighted in the diagram.]
Key Benefits
Application Development Benefits
• Application development can start immediately against data “mock-ups” before the actual data source is
accessible. This approach was used when the OWB integration started.
• The underlying data source(s) can be radically changed without impacting the applications using the data.
For example, when the company changed the vendors that supply data, the data virtualization layer made the
new vendor look like the old vendor to the applications, greatly reducing the need to recode them.
• Regulation and policy change rules can be applied quickly and reviewed to refine compliance
parameters throughout the development cycle
• Business process and regulation changes
• Early identification of gaps, from both the source-system and requirements perspectives, saving cost and time
• Well-timed business insight, data security, audit, and compliance
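The vendor-swap benefit above can be illustrated with a small sketch: applications query a stable published view name, and only the adapter behind that name changes when the vendor does. All view, vendor, and field names here are hypothetical.

```python
# Hypothetical sketch: the virtualization layer maps a stable view name
# to an interchangeable source adapter, so swapping vendors requires no
# application changes.
def old_vendor_rates():
    # Old vendor returns rates keyed by "ccy".
    return [{"ccy": "EUR", "rate": 1.08}]

def new_vendor_rates():
    # New vendor uses different field names; the adapter re-maps them
    # so the published view keeps the old vendor's shape.
    raw = [{"currency_code": "EUR", "fx": 1.08}]
    return [{"ccy": r["currency_code"], "rate": r["fx"]} for r in raw]

views = {"currency_rates": old_vendor_rates}   # published view registry

def query(view_name):
    # Applications only ever call query("currency_rates").
    return views[view_name]()

before = query("currency_rates")
views["currency_rates"] = new_vendor_rates     # vendor swap, layer-side only
after = query("currency_rates")
assert before == after                          # applications unaffected
```

The re-mapping lives entirely in the virtualization layer, which is what let CIT change data vendors without recoding consuming applications.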
Data Governance Benefits
• Easier to build one enterprise view of corporate data
• Easier to monitor data quality: the monitoring tools can be independent of the underlying data
storage locations and technologies
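The storage-independent data-quality monitoring described above can be sketched as rules applied to rows returned by any virtual view, regardless of the backing technology. The rules and sample rows are illustrative only.

```python
# Illustrative data-quality monitor: rules run against rows returned by
# any virtual view, so they are independent of the underlying storage.
dq_rules = {
    "party_id present": lambda row: row.get("party_id") is not None,
    "loan_amt non-negative": lambda row: row.get("loan_amt", 0) >= 0,
}

def dq_report(rows):
    # Count rule failures across a result set, per rule.
    return {name: sum(0 if rule(r) else 1 for r in rows)
            for name, rule in dq_rules.items()}

rows = [{"party_id": 1, "loan_amt": 5000},
        {"party_id": None, "loan_amt": -10}]
print(dq_report(rows))
# → {'party_id present': 1, 'loan_amt non-negative': 1}
```

Because the checks consume view output rather than raw tables, the same report works whether the rows come from a hub, a REST feed, or a cached view.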