Quality of Data

There was some discussion about why we need forgetting at all; isn't it progress if machines can keep everything in memory? In discussing search in memory (see below), it became clear that the size of the memory space can become a limiting factor for a finite system. All data which are not really 'important' can be qualified as 'trash'. If the amount of trash grows much larger than the amount of 'valid' information, then it becomes increasingly difficult - if not eventually impossible - to find those data which are 'helpful'. Thus it can be a good strategy to 'filter out' all the 'unimportant' data. One main criterion for being unimportant is that an item is used rarely or not at all. This is the frequency count mentioned above, combined with the duration measure. Additional 'fine tuning' could perhaps be realized by some kind of 'meta level' (see below).
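To make the filtering criterion concrete, here is a minimal sketch in Python of how a frequency count combined with a duration measure could drive forgetting. The class name, the scoring formula (count divided by time since last access), and the threshold are illustrative assumptions, not part of the discussion itself.

    import time

    class ForgettingMemory:
        """Sketch of a memory store that 'forgets' rarely used items.

        Importance combines a usage frequency count with the duration
        since last access; items scoring below a threshold are filtered
        out as 'trash'. The formula and names are illustrative only.
        """

        def __init__(self, threshold=0.1):
            # key -> (value, access_count, time_of_last_access)
            self.items = {}
            self.threshold = threshold

        def store(self, key, value):
            self.items[key] = (value, 0, time.time())

        def recall(self, key):
            # Each recall raises the frequency count and resets the clock.
            value, count, _ = self.items[key]
            self.items[key] = (value, count + 1, time.time())
            return value

        def importance(self, key, now=None):
            if now is None:
                now = time.time()
            _, count, last = self.items[key]
            age = now - last + 1.0  # seconds since last use; +1 avoids division by zero
            return count / age      # frequently and recently used items score high

        def forget(self):
            """Filter out all items whose importance falls below the threshold."""
            now = time.time()
            trash = [k for k in self.items
                     if self.importance(k, now) < self.threshold]
            for k in trash:
                del self.items[k]
            return trash

Calling forget() periodically would keep the searchable memory space small, which is exactly the motivation given above: the fewer 'trash' entries remain, the easier it is to find the 'helpful' data.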



Gerd Doeben-Henisch 2012-03-31