programming humans
TRANSCRIPT
FROM DNA TO MANKIND AND FROM ASSEMBLER TO CLOUD COMPUTING
Or how to program human beings
Georgios Kasselakis // KCorax
DRAFT VERSION. DO NOT HUMILIATE ME BY REPUBLISHING.
Abstract

The idea that one could program human beings using processes and methodologies commonly found in the software world is an enticing one. If we look closely at the way natural processes have structured entities as small as cells into cognizant beings, it quickly becomes evident that there is a parallel with the way humans have chosen to structure their intelligent machines.
Programming in the software world begins with very basic languages that describe how electrons are allowed to pass through circuitry, and rises to the point of defining how software services work, much the way a bureaucratic organization does.
In this document I suggest that the way human beings structure their software closely mirrors the way natural processes work, both by design and as a consequence of the way humans are structured themselves.
Introduction

I started programming in medium-abstraction languages such as Basic and Logo, then moved down to assembly and then back up again. Today I mostly work in Service Oriented Architectures and Smart Clients. Moving to higher-abstraction languages allowed my designs to become more complicated: instead of a command listing, they became helper libraries, then frameworks, then Web Services automatically discovering each other, and lately an AI-inspired code generation engine. As I evolved and chose to get further away from code, I found myself hiring two university classmates to complete a project.
I then introduced a wiki, a source repository, a file-syncing tool for documents, communication protocols, a list of steps they had to follow before making a code contribution, and so on. And then it struck me: I was trying to program humans, much like I would program a computer. The wiki, code repository and folder share were data structures (and, as I realized later, I had spent time evaluating the complexity of operations on them), and the processes were simple algorithms for manipulating my humans into processing and then storing the useful data in them.
The people I had hired also bore resemblances to vector and general-purpose processing units. For fear that they might read this post, I will not analyze their attributes.
Journey to life

Programming, when abstracted from its strict technical underpinnings, really is the act of predefining an entity's behavior by documenting it in some mutually understood language. As such, the notion of programming should be applicable to any entity capable of reproducing actions in steps.
Before explaining how a system as complicated as a human can be programmed, let's take some time to examine the ways that the finer parts of a living being are programmed by nature itself. By scaling up these examples we'll get a firmer grip on the programming-humans analogy.
The story generally goes like this: let's examine the levels of integration and autonomy in pairs of life's building blocks and software's building blocks.
Assembler and DNA: Here we have very straightforward, simplistic commands. On the computer side (the iron thing) we have a Turing-complete machine that is made by design, while on the other side we have a runtime engine made of chemicals: primarily slabs of mRNA that float in a nutritious ooze, stick to the sides of split DNA chains, and are then moved outside the nucleus for execution. On both sides there is very little ambiguity and, as long as the program is small enough to be entirely in the purview of its creator, no question about the end result.
Another interesting similarity is that (excluding fancy modern tools such as assembler IDEs) the code is executed in the same form it's written.
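To make the "little ambiguity" point concrete, here is a toy sketch of such a machine, with Python standing in and an instruction set invented purely for illustration; each command is executed exactly as written, one step at a time:

```python
# A toy "molecular assembler": each instruction executes exactly as
# written, with no ambiguity about the end result.
# (Illustrative sketch only; the instruction set is invented here.)

def run(program, registers=None):
    regs = dict(registers or {})
    for op, *args in program:
        if op == "SET":      # SET reg, value
            regs[args[0]] = args[1]
        elif op == "ADD":    # ADD dst, src  ->  dst = dst + src
            regs[args[0]] += regs[args[1]]
        elif op == "MOV":    # MOV dst, src  ->  dst = src
            regs[args[0]] = regs[args[1]]
    return regs

# A small program whose outcome is fully in the purview of its creator:
program = [
    ("SET", "a", 2),
    ("SET", "b", 3),
    ("ADD", "a", "b"),
    ("MOV", "c", "a"),
]
print(run(program))  # {'a': 5, 'b': 3, 'c': 5}
```

There is no hidden state and no interpretation step beyond the listing itself, which is the property the analogy leans on.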
C and Proteins: Simple reusable processes appear here. There is still a lot of manual work, and there is no insulation from the underlying subsystem. Very often the effort required to reuse a certain piece of lower-level code rivals rewriting it completely. For the first time, irrationality and the possibility of a mistake are introduced. This is because lower-level routines are mapped to notions. These notions mean something to their maker: they are expected to cause something predefined, and they fail when they don't match it very closely, or because they have side effects that in certain mixes create ghost behaviors. The complexity of the programs created is such that complete purview is undesirable, as it would, much like gazing into the abyss, lead to madness. The runtime is still fed with nucleus liquids, or Jolt Cola.
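A minimal sketch of such a ghost behavior, with invented routine names and Python standing in for C: each routine is sensible on its own, but one quietly mutates shared state, so a certain mix of calls betrays its maker's notion:

```python
# Sketch of a "ghost behavior": each routine is correct in isolation,
# but one has a side effect on shared state, so certain mixes of calls
# produce results their maker never intended.

ph = 7.0  # shared "cellular" state, like a global in C

def acidify():
    """Nudges the environment; the side effect lives here."""
    global ph
    ph -= 2.0

def fold_protein():
    """Correct only under the conditions its maker assumed."""
    return "folded" if 6.5 <= ph <= 7.5 else "denatured"

first = fold_protein()   # 'folded': matches the maker's notion
acidify()                # the environment silently changed
second = fold_protein()  # 'denatured': the ghost behavior appears
print(first, second)
```

Neither routine is buggy by itself; the failure only exists in the mix, which is exactly why complete purview of such programs becomes impractical.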
Software  Life
--------  -------
Assembly  DNA
C         Protein
C++       Cell
Java      Tissue
OS        Body
C++ and Cells: At this level we try to introduce semi-autonomous constructs. This is done by simulating a kind of self-containment commonly called encapsulation. While this autonomy is really an illusion, it simplifies our perception a lot by allowing us to forget how the constructs work. The analogue of cells here is C++ objects.
Again the constructs live in a nutritious serum of sugar or solid-state memory, and of course need a system for the collection of natural excrements such as feces and de-allocated memory.
For the first time we introduce the concept of an exception: a special state of the program that, unless caught and handled appropriately, might lead to the death of the cell. Never before have we been conscious of life and the ramifications a mistake could have.
By devising procedures that operate on objects, we actually sacrifice a lot of performance,
but get much more understandable and reusable (and as such maintainable) code in
exchange. Since RAM modules cost less than a developer’s wage, this is usually a desirable
tradeoff.
History-savvy readers might know that one of the earliest object-oriented systems was Sketchpad, which called its self-contained drawing entities cells, at least in the reference booklet.
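The encapsulation-and-exception idea can be sketched in a few lines; Python stands in for C++ here, and the Cell class with its nutrient counter is invented for illustration:

```python
# Sketch: encapsulation hides the internals, and an exception models a
# potentially lethal state that must be caught for the cell to survive.

class Cell:
    def __init__(self):
        self._nutrients = 3   # internal state, hidden behind methods
        self.alive = True

    def metabolize(self):
        # the exceptional state: a condition the cell cannot ignore
        if self._nutrients <= 0:
            raise RuntimeError("starvation")
        self._nutrients -= 1

cell = Cell()
for _ in range(3):
    cell.metabolize()      # fine: nutrients go 3 -> 2 -> 1 -> 0

recovered = False
try:
    cell.metabolize()      # one tick too many
except RuntimeError:
    recovered = True       # caught and handled: the cell lives on

print(recovered, cell.alive)  # True True
```

Callers never touch `_nutrients` directly; they can forget how metabolism works, which is the whole point of the illusion.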
Tissues and Java: At this level, we simplify programming by completing the illusion of simplicity. Much like the Java runtime takes care of memory management and tidying up, the tissue's blood and lymph irrigation takes care of the cells' nutrition, protection and excrement-collection problems. For the first time we can really forget how the underlying code works and, at the same time, call the construct alive and self-sustaining.
Notice also that we can transplant tissue into the context of a greater construct. Where a single cell might not make it, a collection of cells plus the support systems in place has a far greater likelihood of survival.
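The runtime-as-lymph idea can be sketched with Python's garbage-collected runtime standing in for Java's; the Debris class is invented for illustration:

```python
# Sketch: the runtime, like a tissue's lymph system, collects what the
# "cells" no longer need. No manual free() anywhere in sight.
import gc
import weakref

class Debris:
    pass

d = Debris()
tracker = weakref.ref(d)   # watch the object without keeping it alive

del d                      # the program simply forgets the object...
gc.collect()               # ...and the runtime's collector tidies up

print(tracker() is None)   # True: the debris has been reclaimed
```

The program never says how or when reclamation happens; it only stops caring about the object, and the support system does the rest.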
Skipping organs and animals: We could argue here that the analogue in the software world would be frameworks, but the distinction between programming libraries and frameworks is, relatively speaking, as trivial as the distinction between a tissue and a more complete organ. There also exist systems such as lymph, blood and TCP/IP name resolution that can't be classified in this category: they are too pervasive in the system to be called organs. They resemble more the services provided by the complete construct of the next step. We'll just skip this sticky categorization altogether.
The analogy, after making the step to humans, is easy, and the fact that computers tend to be personal helps the investigation a lot. It implies a leap, however: the computer is the operating system of the previous step as much as it is hardware. Similarly, a human is far more than an extensive network of tissues. The secret to this jump purportedly lies in certain tissues found inside the brain. Others choose to believe that the difference is more 'special' and involves terms such as a soul and external entities such as a god.
Humans and Computers: At this level of integration the construct is autonomous, independent of its surroundings save for its lowest-level parts still requiring electricity, carbohydrates and air with a certain percentage of oxygen in it. The construct is resilient against natural anomalies such as attacks, accidents and the possible scarcity of resources such as drinkable water or stable-voltage electricity. The same goes for access to network drives, or access to a society (which in all likelihood increases the life expectancy of human beings).
Programming (manipulating, really) computers takes shape in an operating system's shell, much like programming humans is done using natural language in speech and writing, but also gestures, physical contact, and environment manipulation.
The reader (parents especially) might argue that telling a person what to do might not have the desirable result. If a parent yells at a child, and the child does not respond, then the code that the parent issues might not be in the right domain. It might require physical contact, for example, and so on. In light of this definition, a punch to the face is also code for a human being. Yelling, singing, reading poetry and so on are all code.
I should point out that I consider slapping a child a poor programming practice.
Pushing the analogy, the non-material components of the human (electric signals in the brain or a soul, depending on your beliefs) match the non-material parts of the computer. This also brings along another realization: programming, and the way its human creators have structured it, closely mirrors the inner workings of the human mind, which in turn closely mirrors its teacher: the natural surroundings as perceived by observation.
Notes on programming humans
Natural language's programming commands generally fall into three categories: appeals to emotion, appeals to logic and appeals to conformity (or vagueness, as we will discuss).
Appeals to emotion tend to be the most effective. They generally create a psychic spasm of
some magnitude that causes the subject to plan and perform the necessary actions that will
relieve the suffering, or improve its wellbeing. This is also compatible with (primarily Asian)
religions where physical being in any form is considered an undesirable state.
[Figure: the appropriate metric at each level: average size of program, resemblance to natural language, size of a simple program, integration, reliability]
Note that unlike stupid robots, humans and animals in general will do the planning (partitioning a complicated command into simpler steps) and the execution (of the steps) for you. For best results™, the complexity of the command must match the capacity of the processing unit (the brain's intelligence).
For example, take two commands that a housewife might issue to her husband:
Low-IQ command: Take the kids to the park for at least an hour, or else I will flay you alive.
On a simpleton husband this will have him execute the desired steps to fulfill the command, in order to alleviate the stress. A more intelligent husband might foresee that he will have a miserable life with his spouse, and call a divorce lawyer instead.
High-IQ command: Take the kids to the park because, baby, I'm oh so tired and I need to take a bath. On the low-IQ husband this will fail, because he'd rather sit on the couch, put his hand under his pants and watch TV like Al Bundy. On the high-IQ husband it will succeed, because he sees the potential for a wife that doesn't smell like carrot purée. In this case it's also an appeal to logic.
So, all in all, emotional commands are about redefining the subject's perception of its wellbeing and the immediately executable actions it can perform to improve it. Most often, when the programmer's agenda is manipulative, the cake is a lie!
Appeals to logic, when seen from afar, are appeals to emotion that stay resident in the subject's memory and rely on the subject's ability to simulate its feelings over a longer term. The high-IQ command mentioned above, issued to the high-IQ husband, partly employs such an appeal.
The subject will generally pick up the commands and carry them until a) they can no longer stand logical validation (and the subject has had time to ponder that), b) the subject forgets them, or c) they conflict with emotional commands.
So, for example, the statement 'You should brush your teeth' is armed when a) the subject is aware that not brushing = cavities = drilling = pain, b) the subject isn't forgetful due to being drunk, in love or otherwise intoxicated, and c) the burning sensation of the toothpaste doesn't rival the pain of drilling as the subject remembers or simulates it in mind.
Unresponsiveness to a command can be fixed by repetitively inflicting pain, until the learning process is complete. Learning here is the bridging of high-level commands with low-level ones. In computer land, this is done by programmers, OS shells, Prolog interpreters, compilers and so on.
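That bridging can be sketched as a lookup table from high-level commands to low-level steps, the job an OS shell or compiler does in computer land (the commands and steps here are invented for illustration):

```python
# Sketch: "learning" as a table that bridges a high-level command to
# the low-level steps it stands for. An unlearned command simply gets
# no response, which is the unresponsiveness described above.

learned = {
    "brush your teeth": ["pick up brush", "apply paste", "brush", "rinse"],
    "take out trash":   ["tie bag", "carry to bin"],
}

def execute(command):
    steps = learned.get(command)
    if steps is None:
        return []          # nothing bridged yet: no response
    return steps

print(execute("brush your teeth"))
print(execute("do your taxes"))  # not yet learned -> []
```

"Teaching" in this model is just adding rows to the table; repetition (painful or otherwise) is how the rows get written.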
Again, the effectiveness of the command varies depending on the intelligence but also, here, on the perception of the subject. Telling a child that unless it studies it will become a bum might not have the desirable effect, because children are generally isolated from the miserable experience that a bum's life is; the closest thing they have is depictions of bums in films, where bums are usually cool and easygoing. 'Loser', on the other hand, is something a kid might have heard from school bullies, and is therefore more suitable for programming its behavior.
Notice that the count of commands a human can process greatly increases with proper conditioning, such as education. On computers, the equivalent is installing an operating system, a runtime, and so on.
Closing this section, Appeals to Conformity are macroscopic appeals to logic and emotion.
These include statements such as “you should wear clothes even in the summer because
everyone else does it” or “this is true because that web celebrity wrote it in his blog”.
Their efficiency depends on how vague they are and on how comfortable the subject feels about reusing wisdom that others produced. They tend to be effective even on intelligent subjects, because their vagueness quickly depletes the subject's ability to detect and validate the reasoning behind them. The subject therefore reverts to the safe state, which is 'trust the wisdom of the crowds'.
Most people will obey a suggestion if they can't prove the opposite. The few who are not inclined to obey are called rebels, punks, thinkers or weirdos, depending on their wealth, visual comeliness and social status.
An aside about wisdom
There is a pyramid that generally represents the integration of information. Those with a background in computer science should recognize it.
1. At the bottom we have raw data, such as the letters in this document, the X-Y distances
from the sides of the paper etc.
2. The next step is information. It's the step where we say 'oh look, those drawings all have the same X value'. The letters get names, they form words and sentences, and the X-Y pairs of values become coordinates.
3. Next up we have knowledge, which comes from systematic observation. The words are assigned meanings and we can debate whether one word is more suitable than another. G.W. Bush, for example, has the information of what the Internet is, but his lack of knowledge prevented him from properly adopting the word. Hence 'the Internets'.
4. Finally, on top we have wisdom. This comes from systematic observation of knowledge. Here we are enabled to notice things such as 'I really don't get poetry'. Words of wisdom, though, are tokens that represent a process and the knowledge attached to it. Without its dependencies, wisdom is useless, as we can't properly decide whether it's applicable to a new instance of a problem or not.
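The lower three levels of the pyramid can be sketched in a few lines (the temperature readings and the outlier threshold are invented for illustration):

```python
# A toy walk up the pyramid: raw data -> information -> knowledge.

raw_data = "21,23,22,38,22"     # 1. raw data: just characters on a page

# 2. information: the characters get names and types, becoming readings
readings = [int(x) for x in raw_data.split(",")]

# 3. knowledge: systematic observation of the information
average = sum(readings) / len(readings)
outliers = [r for r in readings if abs(r - average) > 10]

print(readings, average, outliers)
```

Wisdom, by the pyramid's own logic, would be noticing when this whole procedure is worth applying at all, which is why it resists being written down as code.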
Really wise people are usually desperate and sad, though, as they understand that wisdom can't be transmitted. The new bearer will misuse it, since he doesn't fully understand the meaning behind the token.
That said, there exists a level of power called meta-wisdom, or wisdom on wisdom. At that
level, a sentient creature should be able to handle wisdom from other creatures. But then
again, that level of manipulation is mythical.
Dealing with the memory effect

A possible problem when debugging human behavior is the memory effect. In this case the system is using input received at a previous time, or digested results of this input that are imprinted on its OS (experience), which distorts the results that we would otherwise expect.
Under the model we introduced, we have essentially extended the constructs we were examining with Signals and Systems theory. A human or a computer is a system: subjected to input, providing output, having a memory effect and so on. So slapping the child might have immediate results, and we as human neural networks might 'learn' that slapping was the right kind of code, but the long-term memory effect of the child system might reduce its reliability in the future. The same applies to slaves.
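The memory effect can be sketched as a stateful system whose response to identical inputs drifts as earlier input is imprinted on it (the trust variable and its numbers are invented for illustration):

```python
# Sketch of a system with a memory effect: identical inputs yield
# different outputs, because earlier input is imprinted on the state.

class Subject:
    def __init__(self):
        self.trust = 1.0            # imprinted state ("experience")

    def command(self, harsh):
        complied = self.trust > 0.5 # output depends on the past, not
                                    # just on the present input
        if harsh:
            self.trust -= 0.4       # each harsh command erodes trust
        return complied

child = Subject()
first = child.command(harsh=True)   # complies: immediate results
second = child.command(harsh=True)  # still complies, barely
third = child.command(harsh=True)   # the memory effect bites
print(first, second, third)
```

A memoryless system would return the same output all three times; the drift is exactly the reduced future reliability described above.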
Beyond the One

Looking at self-organizing constructs in nature, it's clear that animals aren't the top of the chain in the integration of features.
Societies and Networks: It’s fairly easy to notice that animals tend to get organized in
groups. In this context, while their sense of individuality is not lost, it is rendered less
important. Their behavior as a greater system becomes predictable through the use of
statistical models, and as such decreases the need for understanding the individual.
On the side of computers, we have networks. These were created primarily out of the need to have personal workstations and their handlers collaborate with each other. Much like in human societies, we also have specialization in the skills that each unit provides. Hence we have printing, file storage and backup servers, and also routers, in analogy to hunters, farmers and politicians in a society. These parallels were not made by a sentient decision, but rather evolved to be that way.
It turned out that it’s rather sensible to have individuals that focus and excel in a very
specific discipline.
Human     Computer
--------  --------
Society   Network
Nation    Web
Mankind   Cloud

Programming in networks, especially Service Oriented Architectures, is done using web service languages such as BPEL. In this language, we program using flowcharts in an XML language. This might seem like a blast from the past, but in this case the commands that the flowchart invokes at every step hide immensely complicated computer code.
Remember, though, that flowcharts were originally invented to program the behavior of humans: defining, for example, and then teaching school children how to do division, or office workers how to evacuate their building properly.
Another language that contributes to this concept is BPEL4People. This is an extension to BPEL that allows business process architects to involve humans in the interactions of enterprise systems. So, much like a process designer (a human, natural selection, or a god for some) would drag decision-making components into a diagram, they can drag in humans that execute verifications and so on.
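The BPEL4People idea can be sketched as a process made of steps where a human task is just another component dropped into the flow; Python stands in for BPEL's XML here, and the order fields and stand-in approval function are invented:

```python
# Sketch: a process is a flowchart of steps; a "human task" is simply
# one more step wired into it, indistinguishable to the process engine.

def automated_total(order):
    """Machine step: sum up the order."""
    return sum(order["items"].values())

def human_approval(order):
    """Stand-in for a real person verifying the order."""
    return order["total"] < 100

def run_process(order):
    order["total"] = automated_total(order)    # automated component
    order["approved"] = human_approval(order)  # human component
    return order

result = run_process({"items": {"widget": 30, "gizmo": 50}})
print(result["total"], result["approved"])  # 80 True
```

In a real BPEL4People deployment the human step would block until a person acts on a work item; the point of the sketch is only that the process treats both kinds of step uniformly.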
Nations on the Web: Humans are differentiated from animals in many ways. One is that humans have a greater sense of belonging that identifies them with a nation. They are aware of and care about other societies, and recognize the larger conglomerates as something friendly to them. Programming a nation is done using politics, a very advanced science which I will not attempt to analyze here.
It's not by chance that humans have sought to bring their habits to the software world. The web is explicitly modeled after the image and form of human nations, and these nations transcend the traditional boundaries imposed by the physical world. Such modern nations include the dominant social networking sites and MMORPGs. Notice also that there is the notion of citizens who are devoted to their nation, but also of others who are cosmopolitans, more flexible about traveling and assuming the ways of whatever nation they're in.
In a social superconstruct such as the web, where traveling is a few clicks and some typing away, the new country borders are defined by new notions such as data portability, the existing 'dominant nationality' of one's physical friends, and vendor lock-in.
Mankind and the Cloud: The notions being paralleled here are quite complicated and fairly new. I won't analyze them too much either, as there is no consensus on what mankind is, how it behaves and how it should, much as there is great dispute on the web about what cloud computing really is.
In the context of the entire mankind we again have specialization: parts of the whole that provide the rest of mankind with a certain benefit. Such benefits could be products such as wood from the Amazon, manual labor in the developing world, and knowledge from universities and other research institutions.
On the web, on the other hand, in the past five years we have had the cloud phenomenon. Here we have a greater specialization of the services being offered, and we observe them being consolidated according to the maximum efficiency of their underlying platforms. So, for example, we have middleware services to which people can outsource their storage and processing en masse.
Epilogue

Overall, the type of analogy-based analysis in this document could be applied to most natural constructs that are the product of evolution, and man-made ones that are the product of iterative design. These include building architecture, computer hardware design, natural ecosystems on Earth, and even planetary systems combined with geology, and so on.