WebSphere eXtreme Scale - Customer Experiences · 2018. 5. 18.
TRANSCRIPT
WebSphere eXtreme Scale - Customer Experiences
Jonathan Marshall
WebSphere Technical Professional
marshalj@uk.ibm.com
2 WebSphere eXtreme Scale - what's a cache?
[Slide graphic: (far away) / (near) / (happy)]
A cache allows you to get stuff faster and helps you avoid doing something over and over again (which may be redundant and may not make sense)
3 What is WebSphere eXtreme Scale?
[Positioning chart. Axis: Customer Goal, from "Save money" to "Grow Business"; data oriented vs application oriented, with an "Application change required" boundary. Use cases plotted: Complex Event Processing, In-memory OLTP, Internal clouds, Worldwide cache, Petabyte analytics, Shared cache, Data buffer, In-memory SOA, Session management, Elastic DynaCache, Web side cache]
Scenario 1 - Dynacache replacement
9 What is WebSphere eXtreme Scale? [positioning chart repeated, highlighting eXtreme Scale]
10 Dynacache - What's the challenge?
Dynacache brings a number of challenges to a large or growing deployment:
- Each JVM has its own cache store
- Each JVM has its own disk offload
- Cache is private to a JVM, costing CPU, memory, network and disk
- Each cache entry is duplicated n times for n JVMs, wasting memory
- A cache change requires n invalidations, wasting CPU and network bandwidth
- n copies of the cache entry mean there is potential for a stale cache hit
- Disk offload for each JVM can get expensive and require high-powered hardware
- Cache needs "warming up" on JVM (re)start
  - Upon restart, the JVM will experience slow responses
  - Restart causes heavy disk I/O and heavy CPU for invalidating stale data on disk
11
[Diagram: six WCS JVM boxes, each with a 1.5Gb heap split as WCS App 60% (0.9Gb), WAS internals 10% (0.15Gb), Dynacache data 30% (0.45Gb)]
Dynacache memory use example
- Assume dynacache takes 30% of a 1.5Gb heap = 450Mb cached data - this is the same data in each JVM
- For 6 JVMs, this is 2.7Gb of data needed just to represent 450Mb of cache
- Disk offload is 6 x 18Gb = 108Gb (every instance has its own disk offload)
- Costs in performance:
  - Garbage collection
  - Disk I/O
  - Dynacache invalidation
Example - WebSphere Commerce with dynacache
[Diagram labels: 18Gb disk offload per JVM (x6); Invalidation Chatter; Local Disk offloads]
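The memory-use arithmetic above is easy to sanity-check with a small sketch (plain Java, not a WXS API; the class and method names are illustrative, the numbers are the slide's own):

```java
// Illustrative arithmetic for per-JVM dynacache duplication.
public class CacheMath {
    // Each JVM holds its own private copy of the cached data.
    static int duplicatedCacheMb(int jvms, int cachePerJvmMb) {
        return jvms * cachePerJvmMb;
    }

    // Each JVM also maintains its own disk offload.
    static int totalDiskOffloadGb(int jvms, int offloadPerJvmGb) {
        return jvms * offloadPerJvmGb;
    }

    public static void main(String[] args) {
        // 30% of a 1.5Gb heap = 450Mb of dynacache data per JVM
        System.out.println(duplicatedCacheMb(6, 450) + " Mb held in memory");   // 2700 Mb = 2.7Gb
        System.out.println(totalDiskOffloadGb(6, 18) + " Gb held on disk");     // 108 Gb
    }
}
```

The point of the sketch is that both costs scale linearly with the number of JVMs, while the useful data stays fixed at 450Mb.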
12 Dynacache replaced with WebSphere eXtreme Scale
- It is a shared cache stored independently from the application server
  - The application can run more efficiently: faster response times and greater throughput
  - No duplicate of the cache in memory, reducing memory waste
  - No invalidation between caches, reducing the CPU and network of JVM "chatter"
  - When a cache entry is invalidated, it only needs recreating once and all JVMs can see it. This is a significant CPU saving on a large estate
  - No stale cache hits
  - No "warm up" needed, or performance hit on JVM restart
- The cache can be very large: 100s of GBs without requiring disk
  - Saving money on disk hardware
  - Scale the cache more easily (just add more JVMs)
  - Improve performance through storing data in memory
No code change needed - just plug eXtreme Scale in as the dynacache provider
14 What's the catch? Some FAQs
But isn't a disk offload more resilient?
WebSphere eXtreme Scale can efficiently copy cache data to a configured number of replicas to provide in-memory availability.
Can a WebSphere eXtreme Scale cache be big enough?
WebSphere eXtreme Scale would partition the cache data set, which is proven to scale to 1000s of JVMs with consistent and predictable response times. To increase cache size, we simply need to add JVMs. No extra configuration is needed.
What's the catch? Very little - the primary difference will be the network bandwidth from the Commerce tier to the WXS tier. But this is mitigated by WXS compression of cache data over the wire (between 2.5:1 and 3:1).
15 Dynacache Replacement - Summary
WebSphere eXtreme Scale brings elastic data grid technology to dynacache, primarily bringing benefits in:
- Application performance and scalability
- Disk hardware cost
- Reduction in stack-product server estate (e.g. Commerce, Portal)
Further information
- These highlights are drawn from Billy Newport's blog:
http://www.devwebsphere.com/devwebsphere/2010/05/websphere-commerce-server-now-supports-using-websphere-extreme-scale-for-page-fragment-caching.html
http://www.devwebsphere.com/devwebsphere/2010/05/some-reasons-why-websphere-extreme-scale-lowers-commerce-server-cpu-requirements.html
- A load test comparison of dynacache with and without WebSphere eXtreme Scale:
http://www.devwebsphere.com/websphere_extreme_scale/2009/09/replacing-dynacache-disk-offload-with-dynacache-using-ibm-websphere-extreme-scale.html
Scenario 2 - High Performance Data Cache
17 What is WebSphere eXtreme Scale? [positioning chart repeated, highlighting eXtreme Scale]
18 Project Architecture Overview
[Diagram labels: DataPower; WebSphere Application Server; WebSphere eXtreme Scale; data sources: File, Web Services, Trading data feed; x8; 250ms response; High Performance Access (<5ms); Large amounts of data (13m objects)]
- Data is highly available - performance cost of failure must be minimal
- Makes different types of data sources available in memory
19 Why WebSphere eXtreme Scale?
- File loading - it's 200Mb, couldn't it be loaded everywhere? (With WXS it takes approx 3Gb!)
  - Only loaded once
  - Requirement to manage reloads
  - No "warm-up" time for new JVMs
  - Maintains very fast access (no txn mgmt for read-only)
- Availability critical - cache outage = site outage
- Web Services - just normal cache operation?
  - Stateless WS, so no affinity to WAS => cache duplication (similar scenario to dynacache)
  - Fine-grained invalidation control essential for permissions
20 1. Data partitioning is abstracted but not hidden
Principle: WXS does not support 2-phase commit (it is 7x slower). No transaction can interact with multiple partitions. Design accordingly.
1 - Think carefully about queries
- WXS is highly optimised for direct access - map.get(key)
- Generalised queries need to scan the entire grid - they can perform well as all are parallelised, but they are a scalability limit
- Grid agents can run queries on relevant partitions
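The difference between direct access and a generalised query can be sketched with a toy partitioned map (plain Java, not the WXS API; class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy model of a partitioned grid: a direct get touches exactly one
// partition, while a generalised query must visit every partition.
public class PartitionedMap {
    private final List<Map<String, String>> partitions = new ArrayList<>();

    PartitionedMap(int numPartitions) {
        for (int i = 0; i < numPartitions; i++) partitions.add(new HashMap<>());
    }

    // Keys are routed by hash of the partition key.
    int partitionFor(String key) {
        return Math.floorMod(key.hashCode(), partitions.size());
    }

    void put(String key, String value) {
        partitions.get(partitionFor(key)).put(key, value);
    }

    // Direct access: one partition touched, regardless of grid size.
    String get(String key) {
        return partitions.get(partitionFor(key)).get(key);
    }

    // Generalised query: every partition is scanned. This parallelises,
    // but total work grows with the grid - the scalability limit above.
    List<String> scanForValue(String value) {
        List<String> hits = new ArrayList<>();
        for (Map<String, String> p : partitions)
            for (Map.Entry<String, String> e : p.entrySet())
                if (e.getValue().equals(value)) hits.add(e.getKey());
        return hits;
    }
}
```

A grid agent corresponds to running the scan body only on the partitions known to be relevant, rather than shipping all entries back to the client.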
2 - Loading complexities
- Run on 1 partition or every partition? No choice with a file
- Batching updates within a transaction
- There are now great samples available (see notes)
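Batching updates within a transaction usually reduces to a chunking loop like the sketch below (plain Java; `commitBatch` is an illustrative placeholder for a session begin/put/commit cycle, not a WXS call):

```java
import java.util.List;

// Sketch of batched grid loading: commit every batchSize records rather
// than one transaction per record, or one giant transaction for the file.
public class BatchLoader {
    static int loadInBatches(List<String> records, int batchSize) {
        int commits = 0;
        for (int i = 0; i < records.size(); i += batchSize) {
            List<String> batch =
                records.subList(i, Math.min(i + batchSize, records.size()));
            commitBatch(batch);
            commits++;
        }
        return commits;
    }

    // Placeholder for: begin a transaction, put each entry, commit.
    static void commitBatch(List<String> batch) { }
}
```

With 13m objects to load, the batch size controls the trade-off between transaction overhead (too small) and transaction memory footprint (too large).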
21 2. Object map design
Principle: Store data "as close to the edge" as possible
Guidance
- Optimise for application use - how is it going to be used?
- Duplicate, don't normalise - use large entities with optional fields
- Optimise for queries - the fastest queries are cached queries
  - E.g. full-text search on GitHub
- Store sub-entities in the same partition as the primary
- Optimise for remote access
- Don't want to have to recycle the grid, so schema design is important
- Generic "helper" agent interface can add flexibility
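"Duplicate, don't normalise" can be illustrated with a toy entity (illustrative names, not a schema from the project): one large record with optional fields comes back in a single remote fetch, where a normalised design needs one extra access per sub-entity map.

```java
import java.util.Collections;
import java.util.List;

// One large, denormalised entity: sub-entities are stored inline with
// the primary, so a single partition/key lookup returns everything.
public class CustomerRecord {
    final String customerId;          // grid key
    final String name;
    final List<String> permissions;   // optional, duplicated sub-entity

    CustomerRecord(String customerId, String name, List<String> permissions) {
        this.customerId = customerId;
        this.name = name;
        // Optional field: absent permissions become an empty list.
        this.permissions = permissions == null
            ? Collections.<String>emptyList() : permissions;
    }

    // Denormalised: one remote fetch for the whole record.
    static int remoteFetchesDenormalised() { return 1; }

    // Normalised alternative: the record plus one fetch per sub-entity map.
    static int remoteFetchesNormalised(int subEntityMaps) {
        return 1 + subEntityMaps;
    }
}
```

The memory cost of duplication is traded for fewer remote round trips, which matches the "optimise for remote access" guidance above.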
22 3. To near cache or not near cache
Principle: WXS allows a subset of the grid to be stored in the client in a near cache for absolute maximum performance.
Surely a near cache being super fast = good? Super fast is µs instead of ms. Are ms good enough?
- The downside is that you will have stale data.
Guidance
1) Use an eviction policy on the near cache. The trade-off is better performance in exchange for data being potentially stale.
2) Rely on optimistic locking. This stops writes of stale data, but you can still have stale reads. An eviction policy can reduce the frequency of optimistic rollbacks.
3) Use JMS to push stale-event notifications from the remote grid to clients. You still need to do (2), but the stale window is now very short.
4) Don't use a near cache if 1, 2 or 3 are not acceptable.
Or if you have to...
- No near cache, and evaluate the NO_COPY option to remove txn semantics (risky - only for read-only!)
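Guidance (1) and (2) can be sketched together: an LRU-evicting near cache over a versioned backing map, where a write based on a stale version is rejected (plain Java, not the WXS near-cache implementation; all names are illustrative):

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

public class NearCacheSketch {
    // Versioned entry in the "remote grid".
    static class Entry {
        final String value; final long version;
        Entry(String value, long version) { this.value = value; this.version = version; }
    }

    final Map<String, Entry> grid = new HashMap<>();

    // Guidance (1): bounded near cache with LRU eviction - fast, possibly stale.
    final int maxNearEntries = 2;
    final LinkedHashMap<String, Entry> near =
        new LinkedHashMap<String, Entry>(16, 0.75f, true) {
            protected boolean removeEldestEntry(Map.Entry<String, Entry> eldest) {
                return size() > maxNearEntries;
            }
        };

    Entry get(String key) {
        Entry e = near.get(key);
        if (e == null) {
            e = grid.get(key);              // miss: go to the remote grid
            if (e != null) near.put(key, e); // populate the near cache
        }
        return e;
    }

    // Guidance (2): optimistic locking - the write only succeeds if the
    // caller's version still matches the grid's version.
    boolean update(String key, String newValue, long expectedVersion) {
        Entry current = grid.get(key);
        if (current == null || current.version != expectedVersion) return false; // rollback
        grid.put(key, new Entry(newValue, expectedVersion + 1));
        near.remove(key); // drop the now-stale near-cache copy
        return true;
    }
}
```

A JMS notification (guidance 3) would correspond to calling `near.remove(key)` on every client when the grid entry changes, shrinking the stale window.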
23 4. Sizing the grid
Principle: Don't assume that the amount of data to store = the required Java heap
- Use real data to size the grid
  - Based on how many objects would fill 60% of the heap of a JVM
- Be aware of object overhead size
  - 13m x 200 bytes = really quite a lot!
- Replication - how many synchronous replicas and asynchronous replicas?
- Number of partitions
  - Aim for 10 primaries per JVM - even distribution and low impact of failure
  - Number of partitions recommended to be a prime number ("don't know why")
- Set NumInitialContainers to be the total number of JVMs in the initial grid - it can destroy start-up performance otherwise
- Remember to factor in the Loader, as it can take up significant space.
Methodology outlined in Billy's video:
http://www.devwebsphere.com/devwebsphere/2009/02/websphere-extreme-scale-sizingconfiguration-presentation.html
Memory overhead:
http://www.devwebsphere.com/devwebsphere/2009/10/memory-usage-in-ibm-websphere-extreme-scale.html
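The sizing points above are easy to sanity-check in code (illustrative only: the 200-byte figure is the per-object overhead quoted on the slide, and rounding the partition count up to a prime follows the recommendation whose rationale the speaker didn't know):

```java
public class GridSizing {
    // Per-object overhead alone: 13m objects x 200 bytes ~ 2.6Gb,
    // before any of the actual data is counted.
    static long overheadBytes(long objects, long bytesPerObject) {
        return objects * bytesPerObject;
    }

    // Aim for ~10 primaries per JVM, rounded up to a prime partition count.
    static int partitionCount(int jvms, int primariesPerJvm) {
        int n = jvms * primariesPerJvm;
        while (!isPrime(n)) n++;
        return n;
    }

    static boolean isPrime(int n) {
        if (n < 2) return false;
        for (int i = 2; (long) i * i <= n; i++)
            if (n % i == 0) return false;
        return true;
    }
}
```

For example, a 5-JVM initial grid at 10 primaries per JVM would round 50 up to 53 partitions. The partition count is fixed at deployment, so it is worth sizing for the grid you expect to grow into.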
24 5. Managing the grid
Principle: WebSphere eXtreme Scale can run in WAS or as a standalone JVM
Standalone
- Cheaper license!
- The implication is manual management: sizing, operational control, manual administration
- The xsadmin tool is provided, but is described as a sample
Within WebSphere Application Server
- WXS administration is trivial - catalog servers and routing are automatically provided
- Common administration infrastructure
Should the grid collocate with the application or not?
- Often start with collocation for reasons of "performance". This probably won't materialise
- The performance hit of the network is low and can be optimised
- A separate grid gives flexibility of management:
  - Application updates are separate from the grid
  - Application and grid tiers can be sized very differently
Summary
26 WebSphere eXtreme Scale - Experiences Summary
[Positioning chart repeated, annotated: "low complexity, configuration-only / cost-saving focus" vs "higher complexity, dev requirement / enabling technology"]
27 Summary
WebSphere eXtreme Scale is immensely powerful and supports a diverse range of deployment scenarios
- Dynacache - low complexity, cost-saving focus
- Application caching - higher complexity, enabling technology
28 Next steps: How do I get started?
Download WebSphere eXtreme Scale for free and build a trial application
http://www.ibm.com/developerworks/downloads/ws/wsdg/learn.html
Check out WebSphere eXtreme Scale in Amazon EC2
http://developer.amazonwebservices.com/connect/entry.jspa?externalID=2721
Join the WXS community forum
http://www.ibm.com/developerworks/forums/forum.jspa?forumID=778&start=0
Getting started tutorials on developerWorks
http://www.ibm.com/developerworks/websphere/techjournal/0711_chambers/0711_chambers.html
http://www.ibm.com/developerworks/websphere/techjournal/0712_marshall/0712_marshall.html
29 Additional resources
Engage with the XTP Community: forums, samples, videos
http://www.ibm.com/developerworks/spaces/xtp
Read Billy Newport's blog on XTP
http://www.devwebsphere.com/
Watch Billy Newport answer your questions on YouTube
http://www.youtube.com/ibmextremescale
Redbook: WXS Users Guide
http://www.redbooks.ibm.com/redpieces/abstracts/sg247683.html?Open
n
WX
S V
ers
ion 7
Info
Cen
ter
htt
p:/
/pu
blib
.bo
uld
er.
ibm
.com
/in
focen
ter/
wxsin
fo/v
7r0
/ind
ex.jsp