Abstract

The generic function of a feedforward neural net is to map patterns from one space to another. This mapping function, determined by the set of examples used to train the network, may be viewed as a hashing function -- the Neuro-Hasher. This paper reports experiments on using a backpropagation network with a dynamic hidden layer to find an appropriate hashing function for a given population of hashing keys. Comparative studies show that the Neuro-Hasher performs robustly over various populations of sparse hashing keys that would produce uneven distributions under some traditional hashing functions.

1. Introduction

Hashing is a technique for randomising the locations in a linear space where records are stored and indexed directly on the basis of a given hashing key. Traditional hashing functions are discontinuous and predetermined, aiming to randomise some continuous sequence of hashing keys, e.g. integers [KNU73]. However, such hashing functions may lead to uneven d...
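
To illustrate the problem the introduction alludes to, here is a minimal sketch (not from the paper) of the classic division-method hash, h(k) = k mod m, applied to a hypothetical sparse key population. The keys and table size are invented for illustration; when the keys share a common stride that divides the table size, they all collide in a few buckets, giving exactly the uneven distribution a predetermined hashing function cannot anticipate.

```python
def div_hash(key, table_size):
    """Division-method hash: map an integer key to a bucket index."""
    return key % table_size

table_size = 10
# Hypothetical sparse key population sharing a stride of 10.
keys = [100, 110, 120, 130, 140, 150]

buckets = {}
for k in keys:
    buckets.setdefault(div_hash(k, table_size), []).append(k)

# Every key maps to bucket 0: one bucket overflows, nine stay empty.
print(buckets)  # → {0: [100, 110, 120, 130, 140, 150]}
```

A hashing function adapted to the actual key population, as the Neuro-Hasher aims to be, would spread these keys more evenly.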