Big Data Case Study: CaixaBank

Copyright © 2014 Oracle and/or its affiliates. All rights reserved. | Big Data in Banking – How CaixaBank Uses Big Data in Order to Anticipate the Needs of its Customers Seoul 17 Sep 2015 Chungsik Yun Oracle Consulting Technical Manager [email protected]


TRANSCRIPT

Slide 1

Big Data in Banking How CaixaBank Uses Big Data in Order to Anticipate the Needs of its Customers

Seoul 17 Sep 2015

Chungsik Yun, Oracle Consulting Technical Manager, [email protected]


Document Owner: [email protected] (Senior Director, EMEA BI & Big Data Consulting Sales & Services Portfolio)

Singapore Seminar Speaker: [email protected] (APAC Consulting Solutions Director). Support: [email protected], [email protected]


CaixaBank is one of the leading banks in the Spanish market, with a customer base of 13.7 million. CaixaBank is implementing Oracle Big Data infrastructure to create a powerful and secure data repository.

CaixaBank achieved four Big Data goals by teaming with OPN Diamond partner Accenture, valuing its high-level solution knowledge and strategy definition, and by selecting Oracle Exalytics, Oracle Big Data Appliance, and Oracle Exadata.

Program Agenda
1. Financial industry in major transition
2. European leader
3. How can I launch my journey

Agenda for this session

Note (Petr): I do not think we need the agenda slides, as we have only 3 sections! That's why I have hidden them.



Banks are fundamentally changing the way they serve customers. Historically, banking systems have been product- and account-centric ("Product Out").

But the demand on them is to be truly customer-centric ("Customer In").

[Slide diagram: a Digital Engagement / Digital Experience layer sits above three product silos (Checking, Mortgage, Credit Card), each with its own Product Definition, Accounting, Eligibility, Channels, and Master data.]


Historically, in banking and insurance, communications were all product-centric. An account would be activated for a product, and a customer name would be attached to the account. The customer was third in the hierarchy, behind products and accounts. Credit cards are a classic example of a "Product Out" business: families and households get a few dozen card offers from the same bank, which is very inefficient and leads to the customer asking, "Do you know me as a customer?"

Banks and Insurers need to build a common set of customer-centric services that are used across all products, with the customer at the top of the hierarchy. The layered approach allows you to start addressing the customer experience while protecting customers from seeing the complexity of changes happening throughout the rest of the organisation.

For Customer In you need a business and technology architecture that delivers Digital Experience, Digital Engagement, and a Componentized Core, to take the prospect off the market and get to a "yes" in the shortest possible time for existing customers.
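The product-out versus customer-in hierarchy described above can be sketched in a few lines of code. This is an illustrative model only (the class and field names are invented, not CaixaBank's actual data model): the customer becomes the root entity and product accounts hang off it, which is what makes a single 360-degree view possible.

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    # In a product-out model, the account is the top of the hierarchy and
    # the customer name is just an attribute attached to it.
    product: str          # e.g. "checking", "mortgage", "credit_card"
    account_id: str
    balance: float

@dataclass
class Customer:
    # In a customer-in model, the customer is the root entity and product
    # accounts hang off it, enabling a single 360-degree view.
    customer_id: str
    name: str
    accounts: list = field(default_factory=list)

    def open_account(self, product, account_id, balance=0.0):
        self.accounts.append(Account(product, account_id, balance))

    def view_360(self):
        # One consolidated view across all product silos.
        return {a.product: a.balance for a in self.accounts}

c = Customer("C001", "Ana")
c.open_account("checking", "CHK-1", 1200.0)
c.open_account("credit_card", "CC-9", -300.0)
print(c.view_360())   # {'checking': 1200.0, 'credit_card': -300.0}
```

The point of the sketch is the inversion of ownership: the layered services query the customer, not the product silo.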

--

Data is at the heart of Customer In - Leaders convert Data into Value

One of the strategic objectives is that CaixaBank becomes a European leader in the use of Big Data and generates value from analyzing its customer data. In order to do that, CaixaBank has partnered with Oracle to develop a new technology platform that can help improve the business and enable the bank to anticipate the needs of customers with a 360 view of the customer

Juan Maria Nin, CEO CaixaBank

Expansión (Spain), 26 Mar 2014, translated from Spanish

[Slide diagram: capability matrix spanning Foundational to Transformational (horizontal) and Lower Cost to Uplift Revenue (vertical): App Store, Adaptive STP, App Capture, Document Submission, e-Signature, Customer 360, Fine-Grained Segmentation, Mobile Payments, Contextual Selling, Real-Time Bundles, Dynamic Pricing, PFM Tools, Product Comparison, Omni-Channel Self-Service, Automated Workflows.]



[animated version of the slide transitions to second topic of the pitch, Caixa] In order for banks to become real digital banks, new capabilities are required. These can be either foundational or transformational, leading to lower costs or uplifted revenue. (Big) Data is the cornerstone of the capabilities that create substantial revenue-uplift potential.

A 360-degree view of the customer, aided by fine-grained segmentation information, is the foundational capability in support of the transformational use cases. Revenue uplift will come from the ability to deliver customer-specific pricing in real time. It is here, at this stage, that you have control of the context of the customer transaction to deliver a price or make an offer in real time, and can up-sell a customer-specific bundle presented at the right time, every time.

Selling by product silos can completely disappear, and with deep insight about the customer and context you can be proactive in the engagement. For example: enabling a switch to withdraw or pay in USD using the debit card, with clear communication on what it costs to use, when you see a message or can locate the customer away from home in Washington DC. This is how the telcos do it today: the ability to use data from outside the four walls of your enterprise, process it inline, and deliver the right results.
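The travel example above is, at its core, a context-driven decision rule evaluated in real time. A minimal sketch follows, with made-up rule logic and dictionary shapes; it is not Oracle Real-Time Decisions' actual API, just an illustration of matching customer data against interaction context:

```python
def decide_offer(customer, context):
    """Pick real-time messages from customer data plus interaction context.

    Both arguments are plain dicts in this sketch; a real system would
    pull them from a 360-degree customer view and an event stream.
    """
    offers = []
    # Rule: customer is abroad and holds a debit card -> proactively
    # enable foreign-currency payments and explain the costs.
    if context.get("country") != customer.get("home_country") and \
            "debit" in customer.get("products", []):
        offers.append("Enable USD payments; fee schedule: ...")
    # Rule: salary just landed and no savings product -> savings bundle.
    if context.get("event") == "salary_credit" and \
            "savings" not in customer.get("products", []):
        offers.append("Personalized savings bundle")
    return offers

cust = {"home_country": "ES", "products": ["debit", "checking"]}
ctx = {"country": "US", "event": "card_payment"}
print(decide_offer(cust, ctx))  # ['Enable USD payments; fee schedule: ...']
```

In production the rule set would be learned and scored rather than hand-written, but the shape of the decision (customer state plus live context in, ranked offers out) is the same.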

CaixaBank started its data pool initiative to do just that. As you can read in the quote from Mr Nin, CEO of CaixaBank, the bank will create, in partnership with Oracle, a new data platform to enable the bank to anticipate the needs of customers with a 360-degree view of those customers.

Uplift Revenue: increase revenue. Personal Financial Management (PFM) refers to software that helps users manage their money. PFM often lets users categorize transactions and add accounts from multiple institutions into a single view. PFM also typically includes data visualizations such as spending trends, budgets, and net worth. STP: Segmentation, Targeting, Positioning.
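The PFM features just described (categorizing transactions and summarizing spending trends) boil down to a classification step plus a grouping step. A toy sketch, with an invented keyword-to-category map:

```python
from collections import defaultdict

# Hypothetical keyword-to-category map; real PFM tools use far richer
# merchant-code and machine-learned classification.
CATEGORIES = {"supermarket": "groceries", "metro": "transport", "rent": "housing"}

def categorize(description):
    # Classify a transaction by the first matching keyword.
    for keyword, category in CATEGORIES.items():
        if keyword in description.lower():
            return category
    return "other"

def spending_by_category(transactions):
    # transactions: list of (description, amount) pairs.
    totals = defaultdict(float)
    for description, amount in transactions:
        totals[categorize(description)] += amount
    return dict(totals)

txns = [("SUPERMARKET 42", 55.0), ("Metro card", 20.0), ("Cafe", 3.5)]
print(spending_by_category(txns))  # {'groceries': 55.0, 'transport': 20.0, 'other': 3.5}
```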


Customer background: one of the leading banks in the Spanish market


CaixaBank, S.A., formerly Criteria CaixaCorp, is a Spanish financial services company owned by the Catalan savings bank La Caixa with a 72.76% stake. Headquartered in Barcelona, the company consists of the universal banking and insurance activities of the La Caixa group, along with the group's stakes in the oil and gas firm Repsol YPF, the telecommunications company Telefónica, and its holdings in several other financial institutions. Isidre Fainé is the Chairman of the company, having replaced Ricard Fornesa Ribó in May 2009, and since June 2014 its CEO is Gonzalo Gortázar. It is Spain's third-largest lender by market value, and with 5,695 branches to serve its 13.2 million customers, CaixaBank has the most extensive branch network in the Spanish market.


Current Situation: As-Is Architecture & current limitations

Assessment for Continuous Innovation Capabilities. In the 21st century, things cannot be done the way they were designed in the previous century. Current limitations sensed through observed business needs:
- Agility, flexibility, and capability for transformation
- Business users acquiring emerging roles/skills and able to take advantage of information analysis
- Information discovery and data democratization on/through internal data and external data
- Leveraging the latest technologies available (Big Data, Advanced Analytics)
Business and competitiveness are at risk if the information architecture is not flexible enough to embrace the change of paradigm. Agility is affected by complexity in ELT; there is a lack of agility due to complexity.

Data hijacked by OLTP, silos, and complexity. Over decades, the informational systems architecture has evolved so that data goes from transactional and operational systems to the informational and analytical data marts through complex and thus expensive processes, resulting in limitations and dependencies on the current IT infrastructure, difficult access to unstructured formats, limited scalability, and complexity in providing SLAs.

http://medianetwork.oracle.com/video/player/3843337229001


Four main goals:
1. Consolidate 17 data marts into one.
2. Improve relationships with customers by offering better products.
3. Improve employee efficiency.
4. Centralize regulatory information.

Luis Esteban, Chief Data Officer, CaixaBank

Motivation: 20+ years of DWH on the mainframe


Business needs:
- Provide an agile, timely response to growing regulatory pressure (e.g. European stress tests)
- Enable a full 360 view of the customer
- Democratization of information: from a siloed organization and siloed information to a data model pool that promotes creativity and productivity

IT needs:
- Get a holistic and unified vision of the internal and external data used by CaixaBank business processes along its lifecycle: ingestion, production, storage, transformation, and consumption
- Increase agility, transparency, and security in the use of data, improving the capacity to adapt to changes and business requirements
- Incorporate advanced analytics and data discovery tools to identify correlations, new data to ingest, and new attributes and patterns that add business value
- Improve the quality of service in informational systems, assuring high availability, contingency, and data protection

Major Focus 1) Big Data & Real Time Bidding 2) Extreme Personalization 3) Social Network Analysis 4) Analysis Factory

Phased approach:
1. Build the Data Pool + Data Factory Engine for all data
2. Apps/cases built on the Data Pool


CaixaBank Data Pool Strategic Initiative: maximizing business value from informational assets.

- Complete and unified view of internal and external data meaningful for the business across all data lifecycle stages (online/production, staging, enterprise, consume)
- Increased agility, transparency, and security in using data; much more flexibility to address emerging business needs and meet new requirements coming from the lines of business (time-to-market)
- Capable of data discovery and advanced analytics: able to find patterns and correlations, new uses and transformations, ingest new data regardless of format, and flexibly introduce new attributes to create new added value
- Increased quality of service, providing data protection, high availability, recovery, and contingency for every kind of data without affecting operations

CaixaBank started by building the foundation for future business-driven use cases: the Data Pool. Business use case examples (source: CaixaBank): Deposits Pricing, ATMs Customized Menus, Online Risk Scoring, Online Marketing Automation, Sentiment Analysis.

CaixaBank has created a strategic initiative by the name of Data Pool, which can be summarized as the extraction of the maximum business value from any kind of data, regardless of its type, its origin, and its consumption model. The Big Data project is aimed at ingesting and making available across the bank any piece of information demanded by the business: smart banking, sentiment analysis, customer behaviour patterns, artificial intelligence, and more.

The Data Pool initiative is not driven by a single concrete business case, or a set of them, to be addressed in the short or medium term. It is instead the strategic approach to the bank's new Information Management Architecture for the coming years, on which the various business initiatives will be implemented. Some examples are:

- Deposits Pricing: creating a framework for pricing liabilities and controlling heading pricing, which promotes the customer relationship
- ATMs Customized Menus: customizable buttons and operations, e.g. voice guidance for blind people
- Online Risk Scoring: "immediate" granting of a credit card to non-customers based on their card and/or account number at another entity
- Online Marketing Automation: offers at the right time and right location via the preferred channel
- Sentiment Analysis


These use cases depend on multiple data sources that will feed into the Data Pool (source: CaixaBank).


CaixaBank Logical Architecture


Big Data Appliance, Exadata, Exalytics

Data Pool - Platform Architecture

[Slide diagram: Data Pool platform architecture across two data centers (DC1 and DC2), each hosting a Data Pool connected over InfiniBand and 10GbE. Oracle Data Guard, ZFS replication, and BDR replication run between the sites; backup is via Oracle RMAN to ZS-3 storage with snapshots, and to TSM virtual tape libraries over FC/SAN.]


Big Data Appliance; Oracle Database Exadata for DW and OLTP applications; Exalytics: in-memory software and hardware for visualization and the BI platform.

End-to-end fast implementation with reduced risk; data volume of 1.7 petabytes.

The Data Pool approach could be reused with other customers/verticals:
- A strategic approach providing vision and architecture is a differentiator
- Executive sponsorship is important
- The Oracle-on-Oracle strategy is a key enabler
- Intensive use of the extended team (OCS, Enterprise Architects, ISG, ...) is fundamental
- Oracle Services leadership is key: support the initial project(s) and help fill gaps on the customer side; many emerging technologies require specialized skills

IM Architecture Products Mapping

Oracle Information Management Reference Architecture

[Slide diagram: events & data from structured enterprise data and other data sources flow as data streams into the Data Factory, the Data Reservoir, and the Enterprise Information Store (Execution layer). An Event Engine produces actionable events, Reporting produces actionable information, and a Discovery Lab (Innovation layer) produces actionable insights and discovery output. Products mapped onto the architecture: Oracle Event Processing, Oracle GoldenGate, Apache Flume, Oracle Data Integrator / Oracle Enterprise Metadata Manager, Oracle Real-Time Decisions, Cloudera Hadoop, Oracle NoSQL Database, Oracle R Distribution, Oracle Database, Oracle Advanced Analytics, Oracle R Enterprise, Oracle Big Data Connectors, Oracle Business Intelligence Enterprise Edition, Big Data Discovery, Oracle R, and the Data Factory Engine.]

The reference architecture separates an Execution layer from an Innovation layer: the Execution layer turns input events and data, via the Data Factory, into the information platform that applications consume, while data mining and Big Data Discovery sit in the Innovation layer. The Data Factory Engine provides code generation, ETL procedures for new data sources, scheduling of job dependencies, protect & audit, and monitoring & reporting.

Main requirements and solutions:
- Initially, 900 different file structures to be ingested; nowadays 2,000, and 3,000 in the future, and they are not known at the beginning -> an ODI code generator based on descriptions and common patterns
- Deploying new sources has to follow a procedure -> files are first ingested in a test environment and checked, then the automatic ingestion is promoted to production
- Loading dependencies based on the data loaded and the finishing of the previous load -> a custom scheduler for controlling loadings and dependencies
- Data scientists need an area to play with the data -> the Discovery Lab has been created, with tools for managing data & metadata between areas
- Access to data has to be protected & audited -> a custom solution based on Oracle products for granting access & auditing
- Monitoring and reporting on the loadings is needed -> all actions generate traces that can be reported; monitoring modules are implemented
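The custom scheduler mentioned above, which controls loadings by their dependencies, can be sketched as a topological ordering over load jobs. This is illustrative only, with invented job names; the deck does not describe CaixaBank's actual scheduler internals:

```python
from graphlib import TopologicalSorter

# Each load depends on the loads whose data it consumes; a load may
# only start when all of its predecessors have finished.
dependencies = {
    "customer_staging": set(),
    "accounts_staging": set(),
    "customer_enterprise": {"customer_staging"},
    "customer_360_mart": {"customer_enterprise", "accounts_staging"},
}

def run_order(deps):
    # graphlib (Python 3.9+) raises CycleError on circular dependencies.
    return list(TopologicalSorter(deps).static_order())

order = run_order(dependencies)
print(order)
# The mart is always last: every predecessor load finishes first.
assert order.index("customer_360_mart") == len(order) - 1
```

A production scheduler would additionally gate each job on available resources, as the slide notes, but the dependency ordering is the core of it.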

Why DFE?

Data Factory Engine: Methodology and Governance

Security (BDA!)

Oracle Big Data SQL

- Metadata Ingestion
- Execution and Scheduling: scheduling the loadings by dependencies and the resources available
- Re-Use: guidelines & best practices (rules and guidelines for developing projects); modelling support (tools for supporting the modeling of structured information and its associated metadata); application governance (metadata management & project configuration data maintenance)
- Data Validation and Quality: code generation, speeding up the development of projects by providing code generators & knowledge modules
- Audit and Design: audit (recording the operations executed by users); access control (managing access to the stored information, with object, row & column filters based on metadata)
- Data Management Promotion: data lineage & impact analysis of changes

Module description

IM Architecture - Data Factory Engine: How It Works

[Slide diagram: the user, the Ingestion Specialist, DFE Metadata Management, and ODI exchange metadata across the five numbered steps detailed in the notes that follow.]

1. User requests a copy of content to the Staging Layer: a user requests access to content of the Staging Layer for a list of people, or for a role within the organization, through the application, providing: the name of the original content, the name to give the content in staging, the persistency policy, the usage type, and the expiration time.

2. The Ingestion Specialist checks the request, completes the data, and grants permissions (also based on resources), applying a charge-back function if needed: a) DFE proposes a name for the columns to include; b) for names greater than 30 characters it raises a warning and asks for the name; c) the Ingestion Specialist and the user enter the names of the columns; d) the user/group is assigned to the files defined as consumers of the information.

3. DFE acquires the data format based on the entered metadata: a) DFE captures the format data definition from the metadata defined by the Ingestion Specialist; b) DFE captures the format data definition from the sources using native ODI functionality and connectors; c) DFE loads the metadata sources into ODI metadata; d) DFE generates ODI metadata describing the process and the execution steps to run.

4. Code execution by DFE/ODI: a) DFE, through ODI, creates a copy inside the Data Reservoir and then the Discovery Lab; b) DFE, through ODI, registers metadata with the assigned column names; c) DFE assigns privileges and publishes information at all required levels; d) DFE allows copy management: refresh the content, manage the life cycle of raw data (age-out), and warn about expiration.

5. User discovers data & reworks: a) the user analyzes & discovers data structure changes using Big Data Discovery features; b) the user and the Ingestion Specialist update the data format; c) DFE regenerates all related metadata without copying the data again.
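The five steps can be condensed into a sketch of the request lifecycle. All names and states here are invented for illustration; the actual DFE implementation is not shown in the deck:

```python
# Minimal state machine for a staging-copy request, following the steps:
# request -> review/permissions -> format capture -> execution.
class StagingRequest:
    def __init__(self, original_name, staging_name, usage, expiration_days):
        self.original_name = original_name
        self.staging_name = staging_name
        self.usage = usage
        self.expiration_days = expiration_days
        self.columns = []
        self.state = "requested"          # step 1: user submits the request

    def review(self, approved_columns):
        # Step 2: the Ingestion Specialist completes column names
        # (warning on names longer than 30 characters) and grants access.
        for name in approved_columns:
            if len(name) > 30:
                raise ValueError(f"column name too long: {name}")
        self.columns = approved_columns
        self.state = "approved"

    def capture_format(self):
        # Step 3: DFE derives the data format and generates load metadata.
        self.state = "metadata_ready"

    def execute(self):
        # Step 4: DFE/ODI copy the data into the Data Reservoir and
        # Discovery Lab; step 5 (rework) would loop back to review().
        self.state = "loaded"

req = StagingRequest("CRM_CUSTOMERS", "stg_customers", "discovery", 90)
req.review(["customer_id", "name"])
req.capture_format()
req.execute()
print(req.state)  # loaded
```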

Data Factory Engine - Ingestion: overview

[Slide diagram: data flows through Stage, Enterprise, and Consumer layers of the Data Pool (HDFS / NoSQL and Oracle DB). Weakly typed data is mapped onto HDFS; strongly typed data undergoes strongly typed format conversion (GBs/TBs). Oracle Data Integrator performs the movement, driven by Data Factory Engine metadata.]


Data Factory Engine - Ingestion: automatic code generation

[Slide diagram: automatic code-generation pipeline. An INCLUDE file passes through a PARSER and then PREPROCESSING (cleaning, reserved words, expansion of nested includes, commenting LONGREC fields, and descriptor files for the generator). A MERGER combines multi-record includes into one XML contract for the scanner, and a GENERATOR creates the ODI objects used to load the file (data stores, mapping(s), and scenario(s)), all driven by metadata.]
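The generation pipeline can be illustrated with a toy generator that turns a record description into an XML contract, loosely mirroring the parser and generator stages. The input format below is invented; the real pipeline consumes mainframe include files and emits ODI objects rather than raw XML:

```python
import xml.etree.ElementTree as ET

# Invented miniature record description: "name:type:length" per field.
RECORD_SPEC = """\
customer_id:char:10
balance:decimal:12
open_date:date:8
"""

def parse_spec(spec):
    # PARSER step: one (name, type, length) tuple per non-empty line.
    fields = []
    for line in spec.strip().splitlines():
        name, ftype, length = line.split(":")
        fields.append((name, ftype, int(length)))
    return fields

def generate_contract(record_name, fields):
    # GENERATOR step: emit one XML contract for the scanner/loader,
    # standing in for the ODI data stores and mappings of the real DFE.
    root = ET.Element("contract", name=record_name)
    for name, ftype, length in fields:
        ET.SubElement(root, "field", name=name, type=ftype, length=str(length))
    return ET.tostring(root, encoding="unicode")

xml_contract = generate_contract("CUSTOMER_REC", parse_spec(RECORD_SPEC))
print(xml_contract)
```

The value of this pattern, at CaixaBank's scale of thousands of file structures, is that a new source only requires a new description, not hand-written load code.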


Data Factory Engine: benefits and summary

Data Pool:
- Cost reduction with controlled TCO
- Improvement in time to market & time to value
- Flexibility and agility
- Advanced analytics (interactive, discovery, etc.)
- Any-type data management
- High performance with a homogeneous & scalable platform
- End-to-end support for the Oracle solution

Announced Data Pool benefits:
- Data mart consolidation, reducing ETL jobs by 30%; evolution to near-real-time for informational systems; reduced time-to-market, increased time-to-value, and alignment to business requirements (estimated 70%); OPEX reduced by 20%
- New data marts and consume structures: simplifying and controlling data access, allowing relations without data duplication; increased agility through a unified and consistent vision of business concepts from the data; better time-to-market and response to business requirements
- Advanced analytics against very large volumes of data, enabling fast decisions and increasing the detection of data patterns and relations over existing conventional methodologies; reduced TCO by embracing in-memory capabilities
- Any type of data: cost reduction through incremental and progressive deployment of the Data Pool, including new data sources and unstructured data, while reducing operational complexity; data augmentation: enriching information to increase knowledge and business value

Caixa's Use Cases - Roadmap

- Data and systems consolidation, credit risk calculation, resource management at branches, churn detection, regulatory compliance
- Best offering at the branch desk, analysis of trading chats, web abandonment detection, fraud detection, analytical processing, RT processing, data governance
- Sandboxing and rapid development, discovery, data aging, location-based offering, mainframe offloading

Data aging: the process of removing old data from secondary storage to allow the associated media to be reused for future backups.

Benefits

- Agility and flexibility (embrace data-first versus schema/model-first)
- Reduced time-to-market, increased time-to-value, and alignment to business requirements
- Better time to market by feeding data marts through the Data Pool (estimated 20-40% cost reduction)
- Advanced analytics against very large volumes of data, enabling fast decisions and increasing the detection of data patterns and relations over existing conventional methodologies
- Data mart consolidation, increasing performance while reducing TCO
- Mainframe downsizing through batch-time reduction and moving from ETL to ELT (estimated 30% mainframe MIPS reduction)


Benefits - Línea Abierta (main web site):
- 3M logins/day to online banking
- Real-time messages (commercial & non-commercial)
- Data Pool & Oracle RTD peak capacity: 1,600 req/s
- Business impact: 39% click-through increase for new campaigns

Premia-T:
- Real-time proactive SMS triggered by credit-card payments
- Geolocation
- 1.5M payments a day
- Oracle RTD & OEP



How To Get Started - with Oracle Consulting

[Slide diagram: three paths, Transform the business, Lay the foundation, and Pilot, toward creating value from data, spanning Big Data Analytics, Big Data Applications, Big Data Management, and Big Data Integration. Supporting offerings: Innovation Workshops, Information Management Deep Dive, Big Data & Analytics Rapid Start Packs, Data Factory Engine, Discovery Lab, Data Reservoir, DW Offload, Fast Data.]


Here are next steps for three different big data appetites. These aren't sequential choices; they are options, based on where you are. Some organizations see the threat, or the opportunity, around big data and feel that the correct response is a comprehensive transformation of the business. This is the kind of approach that CaixaBank is taking, as you heard earlier. Doing this requires that you touch all the different aspects of that big data wheel, from upgrading your data integration and management to creating new analytics and applications. The potential payoff is huge, but it does require significant work, both on the technical side and, more importantly, on the business side, because getting everybody in alignment to make all of this happen across a large organization is complex. We can help with a big data innovation workshop series, advising on and guiding the formation of your business strategy, architecture, and implementation.

Not everybody is ready for that scale of transformation, and that's perfectly OK. For some companies we work with or talk to, the next step is to build something of a foundation. That means working to upgrade their data integration capabilities. And it's critical to look at data management, perhaps expanding or modernizing a data warehouse and adding a Hadoop-based data reservoir, with the goal of getting those two environments seamlessly integrated so that new data is easily available to the rest of the organization. Again, we can help start that process with an architecture workshop to identify what you can do that will deliver the most value to your company.

One of the best pieces of advice on getting started with big data is to pick something that's smaller and delivers some worthwhile value, but does it in a short time frame. It gives you an opportunity to take a lower-cost, lower-risk first step that can lead to bigger things in future iterations. And here we would recommend looking at a discovery project on Hadoop.

If you have an existing Hadoop cluster you can work with that, or remember that using the Big Data Appliance will get that cluster up and running quicker and cheaper than if you build it yourself. And then use Big Data Discovery to take a look at that new data and see what it can do for you.

Oracle Big Data Consulting Framework 2.0

[Slide diagram: the framework groups Technology offerings (Rapid Start Packs across Acquire / Organize / Analyse & Decide: NoSQL, Real-Time Decisions, Big Data SQL, Big Data Connectors, Advanced Analytics, Endeca Information Discovery), Architecture offerings (Innovation Workshops, Big Data & Analytics, Information Management Deep Dive, Roadmap & Blueprint), and Solutions (Discovery Lab, Data Factory Engine, Apps Store for Oracle BDA, DW Offload, Data Reservoir, Fast Data, Big Data Competency Centers, Big Data Workshops).]

{Technology Services}
Oracle Consulting Technology Services for Oracle Big Data Solutions are principally aimed at customers and partners who want a product-oriented accelerator to quickly ramp up Big Data technology skills and gain an initial understanding of concrete use cases for a specific Oracle product.

Rapid Start for Oracle NoSQL. A pre-packaged service based on Oracle NoSQL technology. A fast ingestion process, high scalability and availability, high-performance concurrency, and low-latency response are key features of this technology. The Oracle Consulting Rapid Start includes tangible use cases (based on real delivery projects) accompanied by leading practices of Oracle NoSQL implementations. Duration: from a few days to 3 weeks.

Rapid Start for Oracle Real-Time Decisions. Fast data solutions and machine-learning models are at the core of the main Big Data solutions. Oracle Real-Time Decisions offers self-adaptive learning that prescribes optimized recommendations. With the Oracle Consulting Rapid Start for Oracle Real-Time Decisions, the customer can easily fill the knowledge gap on the product and immediately increase its ROI. Duration: from a few days to 3 weeks.

Rapid Start for Oracle Big Data SQL.
A new addition to the Oracle Big Data enterprise solution that leverages SQL queries to seamlessly and efficiently access data stored in Hadoop, relational databases, and NoSQL stores. Rapid Start for Oracle Big Data SQL unveils the potential of this new technology for your business intelligence strategy. Duration: from a few days to 2 weeks.

Rapid Start for Oracle Big Data Connectors.
This Rapid Start guides you through a step-by-step use case on how to integrate a Hadoop HDFS cluster with an Oracle Database. Oracle experts provide tips and tricks on moving from a large, unstructured dataset to a structured dataset ready for consumption. The design, configuration, and running of Oracle Big Data Connectors (OBDC), ODI, and other integration tools between Hadoop and Oracle Database are the activities listed in this service's catalogue (which are delivered depends on the specific customer use case). Duration: from a few days to 2 weeks.

Rapid Start for Oracle Advanced Analytics.
Empower your data analysts and data scientists with Oracle Data Mining and statistical algorithms (for example, linear and logistic regression, neural networks, time series analysis). By leveraging prior use cases in your industry, this service provides a step-by-step implementation of a statistical model with Oracle Data Mining (ODM) and Oracle R Enterprise (ORE). Duration: from a few days to 2 weeks.

Rapid Start for Oracle Endeca Information Discovery.
This service proves the unbeatable value of having a complete Big Data solution in a single product, Oracle Endeca Information Discovery (EID). From the Acquire phase for unstructured data sourced from different means (e.g. social media, sensor data) to graphical visualization on dynamic dashboards (the Decide phase), Oracle Consulting delivers a use case that guides and supports you through the Big Data challenge. Duration: from a few days to 3 weeks.
------------------------------------------------------------------------------------------------------------------------------------
{Architectural Services}
Oracle Consulting Architectural Services are designed to help customers in the early stages of their Big Data project, or at any later stage where they want to dive deep into business requirements and understand how to translate them into a Big Data design.

Innovation Workshops.
A business-led, innovative approach to optimize your Big Data transformation journey, from qualification to go-live. Key Big Data concepts are instilled in business and technical users, then collected and harmonized in the Divergent Thinking phase. Ideas with recognized business value are promoted into requirements and subsequently into design decisions for the Big Data solution; this is the Convergent Thinking phase. Finally, Implementation Iterations allow you to iteratively reach the optimal solution. Duration: from 5 to 10 days (not consecutive), including back-office work and final close with the customer.
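The statistical models named under the Advanced Analytics Rapid Start above (e.g. logistic regression) can be sketched in a few lines. This is a generic gradient-descent fit in plain Python, not Oracle Data Mining or Oracle R Enterprise, and the toy churn data is invented:

```python
# A generic logistic-regression fit by gradient descent, illustrating
# the kind of statistical model named in the Advanced Analytics Rapid
# Start. Plain Python, not ODM or ORE; the toy churn data is invented.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(xs, ys, lr=0.1, epochs=2000):
    """Fit w, b for P(y=1|x) = sigmoid(w*x + b) on 1-D inputs."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(w * x + b) - y   # prediction error
            gw += err * x
            gb += err
        w -= lr * gw / n                   # gradient step on the weight
        b -= lr * gb / n                   # gradient step on the bias
    return w, b

# Toy data: more product holdings -> less likely to churn (y=1 means churn).
holdings = [1, 1, 2, 2, 3, 4, 5, 5, 6, 7]
churned  = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
w, b = fit(holdings, churned)
p_low  = sigmoid(w * 1 + b)   # churn probability with 1 product
p_high = sigmoid(w * 7 + b)   # churn probability with 7 products
print(round(p_low, 2), round(p_high, 2))
```

The fitted model assigns a high churn probability to single-product customers and a low one to multi-product customers; production tools apply the same principle at scale, with many features and regularization.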

Information Management and Big Data MasterClass Workshops.
The MasterClass provides an adaptable platform aimed at highlighting Oracle's thought leadership on Information Management and Big Data, exploring aspects of the customer's current-state architecture and capabilities, and developing a shared understanding among delegates in order to make progress. The workshop can vary in length and focus depending on the situation. It is typically run as a whiteboard session (no slides) and is product-agnostic, so it can be offered to customers who are not yet Oracle-oriented. Duration: from 1 to 3 days, including back-office work and final close with the customer.

Analytical Capability Workshops.
The workshop covers three main elements; the emphasis placed on each varies with the customer's situation and current skills: (a) the data, process flow, and analytical techniques required to drive business value from specific use cases, e.g. how you might increase product up-sell through customer segmentation; (b) how analytical capabilities might be enabled through a Discovery Lab, and what this entails; (c) other people, process, and technology elements that must be considered in order to realise analytical capabilities and business value (e.g. current IT architecture issues and the current roadmap of the customer's IT architecture). Duration: from 1 to 3 days, including back-office work and final close with the customer.

Roadmap & Blueprint (Workshops).
The Blueprint and Roadmap service delivers a series of detailed workshops to review the customer's use cases and Big Data requirements and map them to industry use cases. This pack analyses and supports the discussion with the customer around different scenarios for the future-state architecture at different levels, from conceptual down to technical and infrastructure. One key aspect is the definition of Data Governance and the end-to-end Big Data process flow (i.e. Acquire, Organize, Analyse, and Decide). Finally, Oracle Consulting delivers the recommended Architecture Blueprint and Roadmap document to the customer, to assist their Big Data transformation journey. Duration: from 2-3 days to 5 weeks, depending on the level of detail the customer requires in the Blueprint definition.
------------------------------------------------------------------------------------------------------------------------------------
{Solution Services}
Oracle Consulting Solution Services are based on the expertise and leading practices accumulated by Oracle Consulting across several Big Data customer success stories. They provide solutions and advisory services based on specific design patterns of a Big Data modernisation project.

Applications Store (Rapid Start Pack) for Oracle Big Data Appliance.
If the customer is looking at the Oracle Big Data Appliance as a platform for different pre-packaged solutions from different partners, this advisory service sets out Oracle guidelines and leading practices to maximise the Big Data Appliance's value. It looks at the optimal deployment of different third-party Big Data solutions, advising on compliance and adherence to Oracle Big Data Appliance leading practices. Duration: from a few days to 2 weeks.

Data Reservoir Rapid Start Pack.
Have customers ever wondered how to deal with the massive proliferation of new sources of digital information and the volume and velocity at which they are generated? Do they know a cost-effective way to minimize the risk and maximize the value this data provides? The Oracle Consulting Pack for Data Reservoir walks customers through the design, build, and run of a solution that harmonizes different stores for different data types (e.g. Hadoop HDFS, Oracle NoSQL, Oracle Database, and other databases), facilitates the interaction of data provisioning and transformation tools (e.g. ETL), and sets a structured Data Governance approach for daily execution. The Data Reservoir empowers your business with an innovative platform that fosters new insights and new value from your data. Duration: 4 weeks.

Data Factory Engine Rapid Start Pack.
The Rapid Start Pack for Data Factory Engine comes from long Oracle Consulting experience with integration platforms for Big Data solutions. By using a flexible metadata definition, the Data Factory Engine deals with any type of data, from any source, at any volume, and at any frequency. Data orchestration between the different components of your solution (e.g. Data Discovery, Data Reservoir, Data Staging, and Data Warehouse) is therefore simplified and controlled, to maximize the ROI on your asset. Duration: 3 weeks.

Data Warehouse Offload Rapid Start Pack.
The Rapid Start Pack for Data Warehouse Offload uses an innovative approach to optimize your data warehouse, from profiling to production. Profiling workshops first help to understand your key pain points before the offloading process is carried out as a series of repeatable packages that optimize each workload. At the end of the Data Warehouse Offload implementation, the customer will see a substantial gain in execution performance and maintenance efficiency, combined with a cost-effective platform that is future-proof for any extension of the company's information management strategy. Duration: 4 weeks.

Discovery Lab Rapid Start Pack.
Many key stakeholders have not yet understood the business value of an enterprise Big Data solution. The Rapid Start Pack for Discovery Lab quickly empowers the customer's organization (e.g. analysts, data scientists, planners) with a comprehensive and agile Big Data solution that deals with structured, poly-structured, and unstructured data. Oracle Consulting advises not just on the appropriate enabling technologies (chosen from a portfolio of Oracle and non-Oracle Big Data products) but also on the discovery approach: Prototyping, Visualization, Bridging, Replication, and Transformation. Duration: from 2 to 4 weeks.

Fast Data Rapid Start Pack.
On-the-fly fast analytics is a key element of any Big Data solution; it opens new opportunities for monetizing streaming information, more proactive monitoring of customer behavior, and real-time analysis of any of the company's core business processes. The Rapid Start Pack for Fast Data advises on the solution that best fits the customer's needs, spanning Oracle and non-Oracle technologies and leveraging some of the most relevant industry use cases. Duration: from 2 to 4 weeks.
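The on-the-fly analytics described above can be illustrated with a minimal streaming sketch: a fixed-size sliding window over a transaction stream that flags outliers as each event arrives, rather than after a batch load. Plain Python; the threshold rule and amounts are invented for the example, not a product implementation:

```python
# Minimal sliding-window stream analytics, illustrating the "fast data"
# idea of analysing events as they arrive rather than after batch load.
# Plain Python sketch; the threshold rule and amounts are invented.
from collections import deque

class SlidingWindowMonitor:
    def __init__(self, size=5, factor=3.0):
        self.window = deque(maxlen=size)  # keeps only the last `size` amounts
        self.factor = factor              # flag if amount > factor * window mean

    def observe(self, amount):
        """Process one event; return True if it looks anomalous."""
        is_anomaly = False
        if len(self.window) == self.window.maxlen:   # wait until window is full
            mean = sum(self.window) / len(self.window)
            is_anomaly = amount > self.factor * mean
        self.window.append(amount)
        return is_anomaly

monitor = SlidingWindowMonitor()
stream = [20, 25, 22, 18, 24, 21, 500, 23]
flags = [monitor.observe(a) for a in stream]
print(flags)  # the 500 transaction is flagged as it arrives
```

The point of the sketch is the shape of the computation: constant memory per stream, a decision per event, and no waiting for a batch window to close.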


Business-led Innovation Workshops
Divergent and Convergent Thinking. Iterative process. Finally, Business Value through an innovative approach.

Big Data & Information Management MasterClass
Big Data Architecture Solutions & Leading Practices. Vendor agnostic. Set the foundations of your Big Data Architecture.

Analytical Capability
Your Business Use Cases. Swiftly Discovered. Empower Data Scientists and Analysts in your Discovery Lab.

Roadmap & Blueprint
Design your Big Data Solution. Deep dive into Big Data Engineered Systems and Technologies.

Big Data Workshop
