Software discussion


 
thread Objconv - disassembly performance & P.I. - dos386 - 2011-03-05
last reply Objconv - disassembly performance & P.I. - Agner - 2011-03-05
 
Objconv - disassembly performance & P.I.
Author: dos386 Date: 2011-03-05 12:55
Hi

agner.org/optimize/

Thank you for this tool. Tiny problem:

I noticed that disassembly of large files takes suspiciously long (all PE EXE):

CRC32.EXE 77'824 bytes 0.9 s (measured)
OCC.EXE 446'976 bytes 47 s (measured)
MPWRC4.EXE 17'455'616 bytes 230 ks, i.e. about 2.7 days (predicted)

The time grows with file size, but not linearly, and not even quadratically: it scales roughly as size^2.3. So on bigger files the tool simply appears to hang.
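The exponent can be checked by fitting a power law through the two measured timings and extrapolating to the big file (a quick sketch; sizes and times are the ones reported above):

```python
import math

# Measured disassembly times from the post: (file size in bytes, time in seconds)
crc32 = (77_824, 0.9)
occ = (446_976, 47.0)
mpwrc4_size = 17_455_616

# Fit time ~ size**k through the two measurements
k = math.log(occ[1] / crc32[1]) / math.log(occ[0] / crc32[0])

# Extrapolate to the 17 MB file
predicted_s = crc32[1] * (mpwrc4_size / crc32[0]) ** k
print(f"exponent = {k:.2f}, predicted = {predicted_s / 86400:.1f} days")
```

The fit gives an exponent of about 2.26 and a prediction on the order of a couple of days, consistent with the figures quoted above.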

There is a FAQ on this subject, but it doesn't reveal too much.

2 suggestions:

- Enhance the FAQ to explain the exponent of 2.3, and also the memory requirements (do they grow the same way?)

- Would it be possible to add a progress indicator? Something that reports how much has been done at suitable intervals (for example 3 s, 12 s, 1 m, 6 m, 1 h, 4 h, 24 h, 4 days)?

Disassembling MPLAYER would take almost 3 days, so it would be good to know that the tool is actually making progress rather than hanging for good.
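The suggested indicator could be as simple as printing status at geometrically spaced intervals, so short runs stay quiet while multi-day runs still report regularly. A minimal sketch (hypothetical, not objconv's actual code; the class name and parameters are made up for illustration):

```python
import time


class ProgressReporter:
    """Print a status line at geometrically spaced intervals (3 s, 12 s, 48 s, ...)."""

    def __init__(self, first_interval=3.0, growth=4.0, clock=time.monotonic):
        self.clock = clock          # injectable clock, makes the class testable
        self.start = clock()
        self.next_report = first_interval
        self.growth = growth

    def update(self, items_done, items_total):
        """Call from the main work loop; prints only when a time threshold is passed."""
        elapsed = self.clock() - self.start
        if elapsed >= self.next_report:
            pct = 100.0 * items_done / items_total
            print(f"{pct:5.1f}% done, {elapsed:.0f} s elapsed")
            self.next_report *= self.growth
```

Calling update() on every iteration costs almost nothing (one clock read and a comparison), so it would not slow the disassembly down noticeably.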

   
Objconv - disassembly performance & P.I.
Author: Agner Date: 2011-03-05 14:53
I know the problem, and it bothers me too. The reason is that I am using ordered lists for labels and cross references. These lists are used in a random-access way, i.e. the list is written to and read from in random order. The longer the list becomes, the more time it takes to keep it sorted every time a new item is added. A hash map won't work, because the entries must stay ordered. The best solution would be a binary tree, but I haven't had the time to find an optimal way of implementing one for this purpose. Most implementations of binary trees do a lot of dynamic memory allocation, which is bad for performance.
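This diagnosis matches the measured scaling: inserting into a sorted list in random order shifts O(n) elements per insert, O(n^2) in total, whereas a balanced tree would do each insert in O(log n). A minimal sketch (a hypothetical stand-in for the label list, not objconv's actual code) that counts the element shifts:

```python
import bisect
import random


def sorted_list_cost(n):
    """Count element shifts needed to keep a list sorted under n random-order inserts."""
    lst, shifts = [], 0
    for x in random.sample(range(n * 10), n):
        i = bisect.bisect_left(lst, x)
        shifts += len(lst) - i      # elements moved to make room for the new item
        bisect.insort(lst, x)
    return shifts


random.seed(1)
for n in (1_000, 2_000, 4_000):
    # total shifts roughly quadruple each time n doubles -> O(n^2) overall
    print(n, sorted_list_cost(n))
```

With a balanced binary tree the same workload is O(n log n) total, which is why the tree would scale so much better on multi-megabyte executables.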