This is the first post in the Science of Memory Dump Analysis category, where I apply ideas from philosophy, systems theory, mathematics, physics, and computer science. It was inspired by reading the book Life Itself by Robert Rosen, where computers are depicted as direct sums of states. As shown in that book, in the case of machines their synthetic models (direct sums) are equivalent to their analytic models (direct products of observables). Taking every bit as an observable with values in the set Z_2 = {0, 1}, we can define an ideal memory dump as a direct product, or a direct sum, of bits saved instantaneously at a given time:
∏_i s_i = ∑_i s_i
Of course, we can also consider 8-bit bytes as observables with values from the set Z_256, and so on.
In our case we can simply rewrite the direct sum or product as a list of bits, bytes, words, double words, etc.:
(…, s_{i-1}, s_i, s_{i+1}, …, s_{j-1}, s_j, s_{j+1}, …)
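The idea of an ideal dump can be sketched in code. This is a minimal illustration (the names `memory` and `ideal_dump` are hypothetical, not from the original post): each byte is an observable with a value in Z_256, and an ideal dump reads the whole direct product of observables in a single atomic step.

```python
# Hypothetical sketch: an "ideal" memory dump as an atomic snapshot of
# all observables at once. Each byte is an observable with a value in
# Z_256 = {0, ..., 255}.
memory = bytearray([0x00, 0x41, 0x7F, 0xFF])  # machine state s_0, ..., s_3

def ideal_dump(state: bytearray) -> tuple:
    # Instantaneous capture: every observable s_i is read at the same
    # moment, so the tuple (s_0, s_1, ..., s_n) is consistent by
    # construction -- it is a direct product of the observables.
    return tuple(state)

snapshot = ideal_dump(memory)
assert snapshot == (0x00, 0x41, 0x7F, 0xFF)
```

Because the read is modeled as atomic, no observable can change between the capture of s_i and s_j, which is exactly the property a real dump process lacks.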
Following Rosen, we include hardware states (registers, for example) and partition memory into input and output states for a particular computation, plus other states.
Saving a memory dump takes a certain amount of time. Suppose it takes 3 discrete time events (ticks). During the first tick we save memory up to (…, s_{i-1}, s_i), and that memory has some relationship to the state s_j. During the second tick the state s_j changes its value to s′_j, and during the third tick we copy the rest of the memory (s_{i+1}, …, s_{j-1}, s′_j, s_{j+1}, …). Now we see that the final memory dump is inconsistent:

(…, s_{i-1}, s_i, s_{i+1}, …, s_{j-1}, s′_j, s_{j+1}, …)

The already saved states (…, s_{i-1}, s_i) are consistent with the old value s_j but not with the saved value s′_j, so the dump as a whole describes a state the machine never had.
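The 3-tick scenario can be simulated directly. The following sketch (names and the invariant are hypothetical illustrations) maintains the relationship "s_i equals s_j" in live memory at every instant, yet the two-pass copy produces a dump that violates it:

```python
# Hypothetical simulation of the 3-tick inconsistent dump scenario.
# Live memory holds two related observables with the invariant
# s_i == s_j at every instant.
memory = [0, 0]  # [s_i, s_j], consistent

def tick_update(mem):
    # The running program advances both observables together,
    # preserving the invariant s_i == s_j.
    mem[0] += 1
    mem[1] += 1

dump = []
dump.append(memory[0])  # tick 1: save s_i (old value 0)
tick_update(memory)     # tick 2: s_j (and s_i) change value
dump.append(memory[1])  # tick 3: save s_j (new value 1)

assert dump == [0, 1]    # inconsistent: violates s_i == s_j
assert memory == [1, 1]  # the live state was consistent the whole time
```

The dump [0, 1] matches no instantaneous state of the machine, which went from [0, 0] to [1, 1]: this is the Inconsistent Dump pattern in miniature.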
I explained this earlier in plain words in the Inconsistent Dump pattern. Therefore we might consider a real memory dump as a direct sum of disjoint memory areas M_t taken during some time interval (t_0, …, t_n):
M = ∑_t M_t where M_t = ∑_k s_{t,k}, or simply

M = ∑_t ∑_k s_{t,k}
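This decomposition can be sketched as follows. The representation is a hypothetical illustration (not from the original post): each memory area M_t is a disjoint address range tagged with the tick t at which it was saved, and their direct sum reassembles the full address space.

```python
# Hypothetical sketch: a real dump M as a direct sum of disjoint memory
# areas M_t, each saved at a different tick t.
live_memory = [10, 20, 30, 40, 50, 60]

def capture(mem, regions):
    # regions: list of (tick, start, end) with pairwise disjoint address
    # ranges whose union covers all of memory; each area is copied at a
    # different tick, so consistency holds only within an area.
    return {t: mem[start:end] for t, start, end in regions}

dump = capture(live_memory, [(0, 0, 2), (1, 2, 4), (2, 4, 6)])

# M = M_0 + M_1 + M_2 over the time interval (t_0, ..., t_2):
assert dump == {0: [10, 20], 1: [30, 40], 2: [50, 60]}
# The disjoint areas reassemble the whole address space:
assert dump[0] + dump[1] + dump[2] == live_memory
```

Within each area M_t all the saved states s_{t,k} share the same capture time, so any inconsistency in a real dump can only appear across area boundaries.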
- Dmitry Vostokov @ DumpAnalysis.org -