What is Memory Dump Analysis?

From a computer system we get a memory dump composed of fixed-size observable values: bits or bytes. We then impose some structure on it in order to extract various derived objects like threads, processes, etc., build an organization, and understand what happened. This activity is called modeling, and memory, crash, or core dump analysis is all about modeling a dynamic computer system based on a slice of its memory. We can then make predictions and test them via controlled experiments called troubleshooting advice. Tools like WinDbg or GDB can be considered abstract computers whose job is to model another computer when we feed memory dumps to them.
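The idea of imposing structure on raw bytes to derive objects can be sketched in a few lines. This is a toy illustration, not the layout any real debugger uses: the record format, field names, and values below are all invented for the example.

```python
import struct

# Hypothetical fixed-size record layout, for illustration only: suppose each
# "thread" entry in a dump slice stores (thread id, state, stack pointer) as
# little-endian <unsigned int, unsigned short, 2 pad bytes, unsigned long long>.
THREAD_RECORD = struct.Struct("<IHxxQ")  # 16 bytes per record

def extract_threads(dump: bytes):
    """Impose the record structure on a raw byte slice to derive thread objects."""
    return [
        {"tid": tid, "state": state, "sp": sp}
        for tid, state, sp in THREAD_RECORD.iter_unpack(dump)
    ]

# Build a tiny synthetic "dump" of two records, then recover the objects -
# the same raw bytes, now viewed through an imposed structure.
raw = (THREAD_RECORD.pack(0x1A4, 1, 0x7FFE0000)
       + THREAD_RECORD.pack(0x1B0, 5, 0x7FFD8000))
threads = extract_threads(raw)
```

The point is the modeling step itself: the byte slice carries no threads until a structure is imposed on it, after which derived objects (and predictions about them) become available.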

Modeling computers on computers is an inherently reductionist approach, and most of mainstream science is just plain reductionism. Just compare point-like particles as building blocks in physics with bits, and the recursiveness of physical states with computer algorithms.

If you want to understand reductionist modeling in physics, I recommend the book The Comprehensible Cosmos: Where Do the Laws of Physics Come From? by Victor J. Stenger, which I have just finished reading.


A nice feature of this book is its clear separation between textual description and mathematics: the first 190 pages have no mathematical formulas, and the next 130 pages repeat the same discussion using undergraduate-level mathematics.

- Dmitry Vostokov @ DumpAnalysis.org -

One Response to “What is Memory Dump Analysis?”

  1. Dmitry Vostokov Says:

    One sentence definition:
    http://www.dumpanalysis.org/blog/index.php/2011/12/12/what-is-software-trace-and-memory-dump-analysis-a-one-sentence-definition/
