Saturday, December 12, 2015

Towards non-deterministic computing

We have always linked computers to logic and determinism.
A computer or a software application is mostly considered a black box where we expect that a specific input always results in the same exact output. When this is not the case, we're annoyed and frustrated.
Non-deterministic results can have dramatic consequences. Take the example of nuclear plants or airplanes. It is a funny thing, because we sometimes forget that the ones who have the final say over the behavior of nuclear plants and airplanes are humans, with totally non-deterministic and unreliable behavior. We rely on them solely on the basis of their sense of "responsibility" and their will to survive.

For decades, software and hardware have been designed to achieve exactly that: reliability and determinism, built on Boolean logic and related formalisms.


In several fields of software and hardware design, we are discovering that in order to progress further we need to let go of determinism. Some examples are:
- Concurrent programming
- Distributed programming
- Artificial Intelligence and Machine Learning
- Quantum computing

For example, in concurrent and distributed systems, we used to obtain increased performance by throwing more CPUs or threads at a problem.
But this has limits, as stated by Amdahl's law: beyond a certain point, adding CPUs means they spend more time synchronizing their tasks than doing actual work, and the fraction of the program that must remain serial caps the achievable speedup.
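To make that ceiling concrete, here is a minimal sketch of Amdahl's formula in C++ (assuming, purely for illustration, that 95% of the work can be parallelized):

// Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N), where p is the fraction
// of the work that can run in parallel and N is the number of CPUs.
#include <cstdio>

double amdahl_speedup(double parallel_fraction, int cpus) {
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cpus);
}

int main() {
    const int cpu_counts[] = {2, 16, 256, 1024};
    for (int cpus : cpu_counts) {
        // Even with 95% of the work parallelizable, 1024 CPUs give barely 20x.
        std::printf("%4d CPUs -> %.1fx speedup\n",
                    cpus, amdahl_speedup(0.95, cpus));
    }
    return 0;
}

However many CPUs you add, the remaining 5% of serial work limits the speedup to roughly 20x.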

Those who deal with project management know the problems that come with having several parallel tasks going on. At some point, the people involved in the different tasks need to synchronize and exchange information. The larger the number of people or parallel tasks, the more time you spend synchronizing afterwards. I am sure you have all been through that, when you spend more time in meetings than actually doing work.

A lot of work has been done in hardware and software to relax the synchronization requirements between CPUs and memory, or between servers in distributed systems.
People working on system software and chasing maximum performance know the strange phenomena you can observe if you do not explicitly synchronize memory accesses, using so-called memory barriers or memory fences, on CPUs or systems with non-sequentially-consistent memory models (which are the majority nowadays).
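As a rough illustration (a minimal C++11 sketch, not drawn from any particular system), the classic publish pattern below only works because the release/acquire pair tells the hardware and the compiler how the two memory accesses must be ordered; drop the ordering and the reader may see ready == true while still observing a stale value of data:

#include <atomic>
#include <cassert>
#include <thread>

int data = 0;
std::atomic<bool> ready{false};

void writer() {
    data = 42;                                    // plain, unsynchronized store
    ready.store(true, std::memory_order_release); // release: publish data
}

void reader() {
    while (!ready.load(std::memory_order_acquire)) { /* spin */ }
    assert(data == 42); // holds only thanks to the release/acquire pairing
}

int main() {
    std::thread t1(writer), t2(reader);
    t1.join();
    t2.join();
    return 0;
}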

In distributed systems we have the same issues, but we give them different names, and sometimes different solutions in the form of conflict resolution algorithms that reconcile concurrent updates.
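One well-known family of such solutions (used here purely as an illustration, since no specific algorithm is named above) is CRDTs. A minimal sketch of a grow-only counter: each replica increments only its own slot, and replicas merge by taking the element-wise maximum, so concurrent updates never conflict no matter the order in which they are exchanged.

#include <algorithm>
#include <cstdio>
#include <map>
#include <string>

struct GCounter {
    std::map<std::string, long> counts; // one slot per replica

    void increment(const std::string& replica, long n = 1) {
        counts[replica] += n; // a replica only ever touches its own slot
    }

    // Merge is commutative, associative and idempotent, so replicas converge
    // regardless of how often or in what order they synchronize.
    void merge(const GCounter& other) {
        for (const auto& kv : other.counts) {
            counts[kv.first] = std::max(counts[kv.first], kv.second);
        }
    }

    long value() const {
        long total = 0;
        for (const auto& kv : counts) total += kv.second;
        return total;
    }
};

int main() {
    GCounter a, b;              // two replicas updated concurrently
    a.increment("server-a", 3);
    b.increment("server-b", 2);
    a.merge(b);
    b.merge(a);
    std::printf("a=%ld b=%ld\n", a.value(), b.value()); // both print 5
    return 0;
}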

Pushing this to the extreme, we get research projects like IBM's Renaissance Virtual Machine, where non-determinism is accepted as a premise and the goal is to find ways of dealing with it.

I could go on and on. I guess what I am trying to say is that we are witnessing a paradigm shift: we are starting to accept uncertainty and non-determinism, either out of necessity or out of maturity, and for the most part we are learning to deal with it.
