
Page 1: CSE 232A Database System Implementation

Topic 8: Data Systems for ML Workloads

Book: “Data Management in ML Systems” by Morgan & Claypool Publishing

Arun Kumar


Page 2: “Big Data” Systems

❖ Parallel RDBMSs and Cloud-Native RDBMSs
❖ Beyond RDBMSs: A Brief History
❖ “Big Data” Systems
   ❖ The MapReduce/Hadoop Craze
   ❖ Spark and Other Dataflow Systems
   ❖ Key-Value NoSQL Systems
   ❖ Graph Processing Systems
   ❖ Advanced Analytics/ML Systems

Page 3: Lifecycle/Tasks of ML-based Analytics

Data acquisition → Data preparation → Feature Engineering → Training → Model Selection → Inference → Monitoring

Page 4: ML 101: Popular Forms of ML

Generalized Linear Models (GLMs); from statistics

Bayesian Networks; inspired by causal reasoning

Decision Tree-based: CART, Random Forest, Gradient-Boosted Trees (GBT), etc.; inspired by symbolic logic

Support Vector Machines (SVMs); inspired by psychology

Artificial Neural Networks (ANNs): Multi-Layer Perceptrons (MLPs), Convolutional NNs (CNNs), Recurrent NNs (RNNs), Transformers, etc.; inspired by brain neuroscience

Page 5: Advanced Analytics/ML Systems

Q: What is a Machine Learning (ML) System?

❖ A data processing system (aka data system) for mathematically advanced data analysis ops (inferential or predictive), i.e., beyond just SQL aggregates
❖ Statistical analysis; ML, deep learning (DL); data mining (domain-specific applied ML + feature eng.)
❖ High-level APIs for expressing statistical/ML/DL computations over large datasets

Page 6: Data Management Concerns in ML

Q: How do “ML Systems” relate to ML?
ML Systems : ML :: Computer Systems : TCS

Key concerns in ML: Accuracy; Runtime efficiency (sometimes)

Additional key practical concerns in ML Systems:
❖ Scalability (and efficiency at scale): What if the dataset is larger than single-node RAM?
❖ Usability: How are the features and models configured?
❖ Manageability: How does it fit within production systems and workflows?
❖ Developability: How to simplify the implementation of such systems?

These are long-standing concerns in the DB systems world! Can often trade off accuracy a bit to gain on the rest!

Page 7: Conceptual System Stack Analogy

                        Relational DB Systems                      ML Systems
Theory:                 First-Order Logic;                         Learning Theory;
                        Complexity Theory                          Optimization Theory
Program Formalism:      Relational Algebra                         Matrix Algebra; Gradient Descent
Program Specification:  Declarative Query Language                 TensorFlow? R? Scikit-learn?
Program Modification:   Query Optimization                         ???
Execution Primitives:   Parallel Relational Operator Dataflows     Depends on ML Algorithm
Hardware:               CPU, GPU, FPGA, NVM, RDMA, etc.            (same)

Page 8: Categorizing ML Systems

❖ Orthogonal Dimensions of Categorization:
1. Scalability: In-memory libraries vs Scalable ML systems (work on larger-than-memory datasets)
2. Target Workloads: General ML library vs Decision tree-oriented vs Deep learning, etc.
3. Implementation Reuse: Layered on top of a scalable data system vs Custom from-scratch framework

Page 9: Major Existing ML Systems

General ML libraries: disk-based files; in-memory; layered on RDBMS/Spark; cloud-native; “AutoML” platforms
Decision tree-oriented
Deep learning-oriented
(The specific systems in each category were shown as logos on the original slide.)

Page 10: ML as Numeric Optimization

❖ Recall that an ML model is a parametric function: f : D_W \times D_X \to D_Y
❖ Training: Process of fitting model parameters W from data
❖ Training can be expressed in this form for many ML models, aka “empirical risk minimization” (ERM); L is the “loss” function:

L(W) = \sum_{i=1}^{n} l(y_i, f(W, x_i)),   where (x_i, y_i) is a training example

❖ l() is a differentiable function; can be compositions
❖ GLMs, linear SVMs, and ANNs fit the above template
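As a concrete instance of the template (logistic regression, a common GLM; this standard instantiation is an illustration, not from the slides):

```latex
% ERM template instantiated for logistic regression:
%   f(W, x) = W^{\top} x  (a GLM),   l(y, f) = \log(1 + e^{-y f})
L(W) \;=\; \sum_{i=1}^{n} \log\!\left(1 + e^{-y_i \, W^{\top} x_i}\right)
```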

Page 11: Key Algo. for ML: Gradient Descent

❖ Goal of training is to find the minimizer: W^* = \arg\min_W L(W)
❖ Usually not possible to solve for the optimal W^* in closed form
❖ Gradient Descent (GD) is an iterative procedure to get to an optimal solution: starting from an initial W^{(0)}, it repeatedly steps against the gradient through W^{(1)}, W^{(2)}, … down the surface of L(W)

\nabla L(W) = \sum_{i=1}^{n} \nabla l(y_i, f(W, x_i))

Update rule at iteration t, with learning rate hyper-parameter \eta:

W^{(t+1)} \leftarrow W^{(t)} - \eta \, \nabla L(W^{(t)})

❖ Each iteration/pass, aka epoch, is akin to a SQL aggregate!
❖ Typically, multiple epochs are needed for convergence
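As a concrete illustration of the update rule, here is a minimal pure-Python sketch of full-batch GD for least-squares linear regression (the dataset, learning rate, and epoch count are illustrative choices, not from the slides):

```python
# Full-batch Gradient Descent (GD) on the simplest ERM instance:
# f(w, x) = w*x with squared loss l(y, f) = (y - f)^2.

def gradient(w, data):
    # One pass over the WHOLE dataset per update, mirroring
    # grad L(W) = sum_i grad l(y_i, f(W, x_i)).
    return sum(2.0 * (w * x - y) * x for x, y in data)

def gd(data, lr=0.01, epochs=100):
    w = 0.0                                 # W^(0)
    for _ in range(epochs):                 # each epoch = one full scan
        w = w - lr * gradient(w, data)      # W^(t+1) <- W^(t) - eta * grad
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # y = 2x exactly
print(round(gd(data), 3))                    # -> 2.0
```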

Page 12: GD vs Stochastic GD (SGD)

❖ Disadvantages of GD:
   ❖ An update needs a pass over the whole dataset; inefficient for large datasets (> millions of examples)
   ❖ Gets us only to a local optimum; ANNs are “non-convex”
❖ Stochastic GD (SGD) resolves the above issues:
   ❖ Popular for large-scale ML, especially ANNs/deep learning
   ❖ Updates based on samples, aka mini-batches; batch sizes: 10s to 1000s; OOM more updates per epoch than GD!

W^{(t+1)} \leftarrow W^{(t)} - \eta \, \nabla \tilde{L}(W^{(t)})

\nabla \tilde{L}(W) = \sum_{i \in B} \nabla l(y_i, f(W, x_i)),   where B is a mini-batch sampled from the dataset without replacement
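A minimal sketch of the same kind of model trained with mini-batch SGD, to contrast with GD: one parameter update per mini-batch and a fresh shuffle before each epoch (the batch size, learning rate, epoch count, and data are illustrative assumptions):

```python
import random

def minibatch_gradient(w, batch):
    # Approximate gradient computed from a mini-batch B only.
    return sum(2.0 * (w * x - y) * x for x, y in batch)

def sgd(data, lr=0.01, epochs=300, batch_size=2, seed=0):
    rng = random.Random(seed)
    w = 0.0
    data = list(data)
    for _ in range(epochs):
        rng.shuffle(data)                    # random shuffle before the epoch
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]   # sequential scan over mini-batches
            w -= lr * minibatch_gradient(w, batch)  # many updates per epoch
    return w

data = [(x / 10.0, 2.0 * x / 10.0) for x in range(1, 11)]  # y = 2x
w = sgd(data)   # converges close to the true parameter 2.0
```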

Page 13: Data Access Pattern of SGD

The epoch update rule and mini-batch gradient \nabla \tilde{L} are as on the previous slide; each mini-batch B is sampled from the dataset without replacement.

Access pattern (shown as a diagram on the slide): the original dataset is first randomly “shuffled” (e.g., via ORDER BY RAND()) into a randomized dataset. Epoch 1 is then a sequential scan over mini-batch 1, mini-batch 2, mini-batch 3, …, taking the model from W^{(0)} through W^{(1)}, W^{(2)} to W^{(3)}. An (optional) new random shuffle precedes Epoch 2, which performs another sequential scan continuing from W^{(3)} to W^{(4)}, …

Page 14: Data Access Pattern of SGD

❖ An SGD epoch is similar to SQL aggs but also different:
   ❖ More complex agg. state (running info): model param. W^{(t)}
   ❖ Multiple mini-batch updates to model param. within a pass
   ❖ Sequential dependency across mini-batches in a pass!
   ❖ Need to keep track of model param. across epochs
   ❖ (Optional) New random shuffling before each epoch
❖ Not an algebraic aggregate; hard to parallelize!
❖ Not even commutative: different random shuffle orders give different results (very unlike relational ops)!

Q: How to implement scalable SGD in an ML system?
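The non-commutativity point can be checked directly. In this toy sketch (illustrative data and learning rate), one batch-size-1 SGD pass over the same three examples in two different orders yields different final parameters, unlike an algebraic aggregate such as SUM:

```python
def sgd_one_epoch(examples, lr=0.1):
    # One sequential pass with batch size 1: each update depends on the
    # previous one, so the scan order matters.
    w = 0.0
    for x, y in examples:
        w -= lr * 2.0 * (w * x - y) * x   # least-squares example gradient
    return w

data = [(1.0, 1.0), (2.0, 1.0), (3.0, 1.0)]
w_fwd = sgd_one_epoch(data)
w_rev = sgd_one_epoch(list(reversed(data)))
print(w_fwd == w_rev)   # -> False: the scan order changes the result
```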

Page 15: Bismarck: Single-Node Scalable SGD

❖ An SGD epoch runs in RDBMS process space using its User-Defined Aggregate (UDA) abstraction with 4 functions:
   ❖ Initialize: Run once; set up info; alloc. memory for W^{(t)}
   ❖ Transition: Run per-tuple; compute gradient as running info; track mini-batches and update model param.
   ❖ Merge: Run per-worker at end; “combine” model param.
   ❖ Finalize: Run once at end; return final model param.
❖ Commands for shuffling, running multiple epochs, checking convergence, and validation/test error measurements are issued from an external controller written in Python

https://adalabucsd.github.io/papers/2012_Bismarck_SIGMOD.pdf
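The four-function contract can be sketched in plain Python (a toy stand-in: Bismarck implements these as UDAs inside the RDBMS; the least-squares model, batch size of 1, and learning rate here are illustrative assumptions):

```python
def initialize():
    # Run once: allocate the aggregation state (the model parameters).
    return {"w": 0.0, "lr": 0.1}

def transition(state, tuple_):
    # Run per-tuple: fold one example into the running model (SGD update).
    x, y = tuple_
    state["w"] -= state["lr"] * 2.0 * (state["w"] * x - y) * x
    return state

def merge(state_a, state_b):
    # Run per-worker at the end: "combine" partial models (model averaging).
    return {"w": (state_a["w"] + state_b["w"]) / 2.0, "lr": state_a["lr"]}

def finalize(state):
    # Run once at the end: return the final model parameters.
    return state["w"]

# Single-worker epoch over a tiny two-tuple table:
state = initialize()
for t in [(1.0, 2.0), (2.0, 4.0)]:
    state = transition(state, t)
final_w = finalize(state)
```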

Page 16: Combining SGD Model Parameters

❖ RDBMS takes care of scaling UDA code to larger-than-RAM data and distributed execution across workers
❖ Tricky part: How to “combine” model params. from Workers 1 … n, given that an SGD epoch is not an algebraic agg.?
   ❖ Master typically performs “Model Averaging”:

W^{(t)} = \frac{1}{n} \sum_{i=1}^{n} W_i^{(t)}

❖ A bizarre heuristic!
❖ Works OK for GLMs
❖ Terrible for ANNs!
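Model averaging itself is a one-liner. The sketch below (illustrative shards, learning rate, and a single local epoch per worker) shows why it is only a heuristic: each worker's local model reflects just its own shard, and the master simply averages the results:

```python
def local_epoch(shard, lr=0.05):
    # Each worker runs SGD (batch size 1) over its own partition only.
    w = 0.0
    for x, y in shard:
        w -= lr * 2.0 * (w * x - y) * x
    return w

shards = [
    [(1.0, 2.0), (2.0, 4.0)],   # worker 1's partition (y = 2x)
    [(3.0, 6.0), (4.0, 8.0)],   # worker 2's partition (y = 2x)
]
local_models = [local_epoch(s) for s in shards]   # per-worker partial models
w_avg = sum(local_models) / len(local_models)     # master's merge: averaging
```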

Page 17: ParameterServer for Distributed SGD

❖ Disadvantages of Model Averaging for distributed SGD:
   ❖ Poor convergence for non-convex/ANN models; leads to too many epochs and typically poor ML accuracy
   ❖ UDA merge step is a choke point at scale (n in 100s); model param. size can be huge (even GBs); wastes resources
❖ ParameterServer is a more flexible from-scratch design of an ML system specifically for distributed SGD:
   ❖ Break the synchronization barrier for merging: allow asynchronous updates from workers to master
   ❖ Flexible communication frequency: at mini-batch level too

https://www.cs.cmu.edu/~muli/file/parameter_server_osdi14.pdf

Page 18: ParameterServer for Distributed SGD

Architecture (shown as a diagram on the slide): Workers 1 … n send gradients \nabla \tilde{L}(W^{(t)}) to a multi-server “master” (PS 1, PS 2, …), where each server manages a part of W^{(t)}. Workers push gradients and pull params when ready/needed, at each mini-batch (or a lower frequency), with no synchronization across workers or servers.

❖ Model params may get out-of-sync or stale; but SGD turns out to be remarkably robust, and the multiple updates/epoch really help
❖ Communication cost per epoch is higher (per mini-batch)
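A toy sketch of the asynchronous pattern using Python threads (a single-process stand-in; the sharded servers, RPC, and key ranges of the real system are elided, and the data and hyper-parameters are illustrative). Workers pull a possibly stale copy of the params and push gradients with no barrier between them:

```python
import threading

class ParameterServer:
    def __init__(self, lr=0.05):
        self.w = 0.0
        self.lr = lr
        self._lock = threading.Lock()   # guards w itself, not the workers

    def pull(self):
        return self.w                   # may be stale by the time it is used

    def push(self, grad):
        with self._lock:                # apply each update whenever it arrives
            self.w -= self.lr * grad

def worker(ps, shard, epochs=100):
    for _ in range(epochs):
        for x, y in shard:
            w = ps.pull()                      # read (possibly stale) params
            ps.push(2.0 * (w * x - y) * x)     # push one example's gradient

ps = ParameterServer()
shards = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]   # data y = 2x, two workers
threads = [threading.Thread(target=worker, args=(ps, s)) for s in shards]
for t in threads: t.start()
for t in threads: t.join()
# ps.w ends up near the true parameter 2.0 despite stale reads
```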

Page 19: Deep Learning Systems

❖ Offer 3 crucial new capabilities for DL users:
   ❖ High-level APIs to easily construct complex neural architectures; aka differentiable programming
   ❖ Automatic differentiation (“autodiff”) to compute \nabla \tilde{L}(\cdot) from the specification of L(\cdot)
   ❖ Automatic backpropagation: chain rule to propagate gradients and update model params across ANN layers
❖ “Neural computational graphs” compiled down to low-level hardware-optimized physical matrix ops (CPU, GPU, etc.)
❖ In-built support for many variants of GD and SGD
❖ Wider tooling infra. for saving models, plotting loss, etc.
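The essence of autodiff plus backpropagation fits in a few lines. This scalar-only sketch (a toy, nothing like a real DL system's tensor graphs) records each op's inputs and local derivatives, then applies the chain rule over the recorded graph in reverse:

```python
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (input Var, local derivative) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Propagate d(output)/d(self) to all ancestors via the chain rule.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

w, x = Var(3.0), Var(2.0)
loss = w * x + w        # L(w) = w*x + w, so dL/dw = x + 1 = 3
loss.backward()
print(w.grad)           # -> 3.0
```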

Page 20:

PDF: https://www.morganclaypool.com/doi/10.2200/S00895ED1V01Y201901DTM057

Hardcopy: https://www.morganclaypoolpublishers.com/catalog_Orig/product_info.php?products_id=1366

Takeaway: Scalable, efficient, and usable systems for ML have become critical for unlocking the value of “Big Data”

Advertisement: I will offer CSE 291/234 titled “Data Systems for Machine Learning” in Fall 2020

If you are interested in learning more about this topic, read my recent book “Data Management in ML Systems”

Page 21: Switch to Krypton Slides