TRANSCRIPT
Building Distributed System with Celery on Docker Swarm
Wei Lin
2016/06/03
About Me
• Wei Lin
• @Wei_1144
• [email protected]
• Mostly worked in the fields of:
  – marketing
  – strategy planning
  – project management
Parallel computing
https://resin.io/blog/what-would-you-do-with-a-120-raspberry-pi-cluster/
Monitor and Control
Distributed system
Distributed system (II)
Neural network
[figure: a neural network with hidden-layer neurons h1, h2, h3]
Communicate via Celery
Celery
Celery flowchart
Send message to a specific queue
Send message
http://docs.celeryproject.org/en/latest/userguide/calling.html

kick.apply_async(['neuron_x'])
kick.apply_async(['neuron_x'], routing_key='neuron_h1')
Configuration file
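The configuration file itself is not shown in the transcript; a hypothetical `celeryconfig.py` for this setup might declare one queue per neuron (the setting names follow the modern lowercase style, and the routing keys come from the slides):

```python
# Hypothetical celeryconfig.py: one queue per neuron, so a message
# sent with a given routing_key reaches exactly that neuron's worker.
from kombu import Queue

task_queues = (
    Queue('neuron_h1', routing_key='neuron_h1'),
    Queue('neuron_h2', routing_key='neuron_h2'),
    Queue('neuron_h3', routing_key='neuron_h3'),
)
task_default_queue = 'neuron_h1'
```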
Task Message Example
http://docs.celeryproject.org/en/latest/internals/protocol.html
Routing Tasks
http://docs.celeryproject.org/en/latest/userguide/routing.html

Assign a unique worker to a specific queue
Starting the Worker
http://docs.celeryproject.org/en/latest/userguide/workers.html
https://github.com/celery/celery/issues/3065

$ celery -A IoT worker -n neuron_h1 \
    -Q neuron_h1 --concurrency=1
Docker
Host and containers
https://denibertovic.com/talks/supercharge-development-env-using-docker/#/10
Run worker container
Deploy containers
Celery worker containers as neurons
[figure: worker containers as neurons h1, h2, h3]

kick.apply_async(routing_key='neuron_h1')
getData.apply_async(routing_key='neuron_h1').get()
"Celery + Docker Swarm"
So?
I think…
Docker file/image/container
https://denibertovic.com/talks/supercharge-development-env-using-docker/#/11
Think OO
• "FROM" to Inherit
• Dockerfile to Encapsulate
• Docker Image as Class
• Docker Container as Object
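The slide's analogy, sketched in Python terms (the class names are made up for illustration):

```python
# The OO analogy in Python: an image is like a class, a container
# like an instance, and FROM like inheritance (class names made up).
class BaseImage:                      # FROM debian -> inherit
    packages = ['bash']

class CeleryImage(BaseImage):         # Dockerfile encapsulates build steps
    packages = BaseImage.packages + ['celery']

container_a = CeleryImage()           # docker run -> instantiate
container_b = CeleryImage()           # many containers, one image
```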
Docker Swarm & Celery
[figure: Swarm nodes in Taipei, Tokyo, and New York]

doSomething.apply_async(routing_key='host_NY')
getData.apply_async(routing_key='neuron_h1').get()
Monitor and Control
OOAD
[figure: Station A, Station B, Station C]

wait.apply_async([xxx], routing_key='station_B')
wait.apply_async([xxx], routing_key='station_A')
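Conceptually, each station owns one queue and consumes only its own messages. A simplified in-process model, with a plain `queue.Queue` standing in for the broker:

```python
import queue

# In-process stand-in for the broker: one queue per station, so a
# message sent with routing_key='station_B' reaches only station B.
queues = {'station_A': queue.Queue(), 'station_B': queue.Queue()}

def apply_async(args, routing_key):
    queues[routing_key].put(args)

def run_worker(station):
    # Drain this station's queue, like a Celery worker bound with -Q.
    handled = []
    q = queues[station]
    while not q.empty():
        handled.append(q.get())
    return handled

apply_async(['job-1'], routing_key='station_B')
apply_async(['job-2'], routing_key='station_A')
print(run_worker('station_B'))  # -> [['job-1']]
```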
Bound services
Effects
• Easier to convert a system into its distributed version
  – Decoupling, DI/IoC (dependency injection / inversion of control)
  – Distributed = Shared
  – Load-Balancing

doSomething.apply_async(routing_key='host@remote')
getData.apply_async(routing_key='neuron_h1').get()
Load-Balancing
[figure: controller, model, database]

addItem.apply_async([order], routing_key='model')
addItem.apply_async(['order'], routing_key='database')
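Load balancing here is the competing-consumers pattern: several workers bound to the same queue split the messages between them. A simplified in-process model, with two threads standing in for two workers:

```python
import queue
import threading

# Competing consumers: two workers drain one shared 'model' queue,
# so the ten addItem messages are split between them.
task_q = queue.Queue()
for i in range(10):
    task_q.put(('addItem', 'order-%d' % i))

counts = {}

def worker(name):
    while True:
        try:
            task_q.get_nowait()
        except queue.Empty:
            return
        counts[name] = counts.get(name, 0) + 1

threads = [threading.Thread(target=worker, args=('w%d' % n,)) for n in (1, 2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sum(counts.values()))  # -> 10: every message handled exactly once
```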
Parallel computing
https://resin.io/blog/what-would-you-do-with-a-120-raspberry-pi-cluster/
Deploy parallel computing
• Deploy containers over Docker Swarm
  – Scale up to 480 containers
  $ docker-compose up -d
  $ docker-compose scale celery=480
• Send task messages to worker containers
  results = [ doSomething.apply_async([data]) for data in dataset ]
• But needs root password-less SSH
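A local stand-in for the fan-out above, using a thread pool instead of Celery workers (with Celery, the gather step would be `.get()` on each `AsyncResult`); the body of `doSomething` is made up:

```python
from concurrent.futures import ThreadPoolExecutor

# Local stand-in for the fan-out above: scatter one task per datum,
# then gather. With Celery the gather is .get() on each AsyncResult.
def doSomething(data):  # hypothetical task body
    return data * 2

dataset = [1, 2, 3, 4]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(doSomething, dataset))

print(results)  # -> [2, 4, 6, 8]
```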
SETI@home
One for all, All for one
https://github.com/Wei1234c/OneForAll_AllForOne/blob/master/celery_projects/OneForAll_AllForOne.md

• Run a Docker image to join the distributed parallel computing cluster; that's all.
  $ docker run -d wei1234c/musketeer (for example)
• Works on RPi and amd64 PC
Summary
• Caution:
  – Distributed systems theory for the distributed systems engineer
• Easy to use:
  doSomething.apply_async([xxx], routing_key='host_NY')
  getData.apply_async(routing_key='neuron_h1').get()
• Canvas
  – Chain
  – Group
Q & A
References
• Distributed systems theory for the distributed systems engineer
• Celery user guide
• Docker documentation
• MQTT Message Type and Flows
• Celery on Docker Swarm
• IoT as Brain
• One for all, All for one