Author: Rainer Weikusat
Date:
To: dng
Subject: Re: [DNG] Memory management strategies.
Edward Bartolo <edbarx@???> writes:
> Lately, I have been pondering how memory managers deal with a
> situation where memory is fragmented with scattered allocated blocks
> and an allocation request is made for a memory chunk whose size is
> bigger than the biggest unallocated contiguous memory chunk.
>
> In a situation like the one mentioned, my temptation is to opt for
> linked lists, so as to avoid requiring large unallocated contiguous
> memory blocks. However, this increases the overall processing load,
> which tends to slow whatever program uses such a model.


I could make an argument about that, but the following is probably
better: the kernel uses linked lists extensively.
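
For illustration, a minimal sketch (mine, not taken from any kernel or
libc source) of the usual trick: the list pointers live inside the
free blocks themselves, so keeping blocks on a linked list costs no
memory beyond the blocks being tracked.

/* a block that is currently unused doubles as a list node */
struct free_block {
    struct free_block *next;
};

static struct free_block *free_list;

/* put a no-longer-used block onto the free list */
static void free_block_push(void *p)
{
    struct free_block *b = p;

    b->next = free_list;
    free_list = b;
}

/* take a block off the free list, or NULL if it is empty */
static void *free_block_pop(void)
{
    struct free_block *b = free_list;

    if (b)
        free_list = b->next;
    return b;
}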

> The question is how do memory managers succeed to remain efficient and
> yet cope with memory allocation of so many different sizes?


By and large, by firmly sticking two fingers into their ears and
chanting "Moore's law! Moore's law! Moore's law! I can't hear you!".
IOW, userspace memory managers are usually anything but efficient, but
this stopped troubling people without special needs a while ago.

Provided it's necessary to offer the malloc interface, where the
allocator has no information beyond the block size, the usual strategy
is to split a fairly 'large' block of memory into many smaller chunks
of the same size and to use a number of block-size-segregated free
lists, with some coalescing of small chunks back into large ones and
various kinds of 'special treatment' for certain kinds of allocations.
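
Roughly like this toy sketch of size-segregated free lists (names such
as toy_alloc are made up and this is not any real malloc
implementation; coalescing and chunk headers are left out):

#include <stdlib.h>

#define NCLASSES   8            /* size classes: 16, 32, ..., 2048 */
#define SLAB_SIZE  (64 * 1024)  /* how much we carve up at a time */

struct chunk { struct chunk *next; };

static struct chunk *freelist[NCLASSES];

static size_t class_size(int c) { return (size_t)16 << c; }

/* smallest class whose chunks fit n bytes, -1 if none does */
static int size_class(size_t n)
{
    int c;

    for (c = 0; c < NCLASSES; c++)
        if (n <= class_size(c))
            return c;
    return -1;
}

/* carve a fresh slab into equal-sized chunks and thread them onto
   the matching free list */
static int refill(int c)
{
    char *slab = malloc(SLAB_SIZE);
    size_t sz = class_size(c), i;

    if (!slab)
        return -1;
    for (i = 0; i + sz <= SLAB_SIZE; i += sz) {
        struct chunk *ch = (struct chunk *)(slab + i);

        ch->next = freelist[c];
        freelist[c] = ch;
    }
    return 0;
}

void *toy_alloc(size_t n)
{
    int c = size_class(n);
    struct chunk *ch;

    if (c < 0)
        return malloc(n);       /* 'large' request: punt */
    if (!freelist[c] && refill(c) < 0)
        return NULL;
    ch = freelist[c];
    freelist[c] = ch->next;
    return ch;
}

void toy_free(void *p, size_t n)
{
    int c = size_class(n);
    struct chunk *ch = p;

    if (c < 0) {
        free(p);
        return;
    }
    ch->next = freelist[c];
    freelist[c] = ch;
}

A real malloc() has to recover the chunk size on free() by itself,
usually from a small header in front of each chunk; passing the size
explicitly here just keeps the sketch short.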

Programs required to manage memory efficiently, IOW OS kernels,
usually use the same basic model, but with type-segregated free lists.
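
In the spirit of a kernel slab cache, a type-segregated free list
looks roughly like this (hypothetical names, not the Linux slab API):
one cache per object type, so every chunk on the list already has the
right size and objects never need to be carved up or coalesced.

/*
 * One cache per object type. Objects must be at least pointer-sized
 * so a free object can hold the list link.
 */
#include <stdlib.h>

struct obj_cache {
    size_t obj_size;
    struct free_obj { struct free_obj *next; } *free;
};

void *cache_alloc(struct obj_cache *c)
{
    struct free_obj *o = c->free;

    if (o) {
        c->free = o->next;      /* reuse a previously freed object */
        return o;
    }
    return malloc(c->obj_size); /* cache empty: get fresh memory */
}

void cache_free(struct obj_cache *c, void *p)
{
    struct free_obj *o = p;     /* type (and hence size) is known here */

    o->next = c->free;
    c->free = o;
}

/* usage: a dedicated cache per type, e.g. */
/* static struct obj_cache foo_cache = { sizeof(struct foo), NULL }; */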