On the Impact of Forgetting on Learning Machines
People tend not to have perfect memories when it comes to learning, or to anything else for that matter. Most formal studies of learning, however, assume a perfect memory. Some approaches have restricted the number of items that could be retained. We introduce a complexity-theoretic accounting of memory utilization by learning machines. In our new model, memory is measured in bits as a function of the size of the input. There is a hierarchy of learnability based on increasing memory allotment. The lower-bound results are proved using an unusual combination of pumping and mutual recursion theorem arguments. For technical reasons, it was necessary to consider two types of memory: long-term and short-term.
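To make the distinction concrete, here is a toy sketch (not the paper's construction) of a learner whose persistent "long-term" memory is capped at a fixed number of bits between examples, while each example may be examined freely in a transient "short-term" workspace. The task, the bit budget `B`, and the function `learn` are all hypothetical illustrations of the bit-counted memory model.

```python
B = 8  # hypothetical long-term memory budget: B bits persist between examples

def learn(stream, bits=B):
    """Track the largest value seen so far, keeping only `bits` bits of state.

    `memory` is the long-term store: an integer in [0, 2**bits), carried
    from one example to the next. Each incoming example `x` lives only in
    short-term memory: it is used to update `memory` and then discarded.
    """
    memory = 0
    for x in stream:
        # Short-term workspace: x is available here in full, but only
        # the updated `memory` (at most `bits` bits) survives this step.
        memory = max(memory, x) % (2 ** bits)
        assert memory < 2 ** bits  # long-term state never exceeds the budget
    return memory

print(learn([3, 200, 17, 42]))  # prints 200
```

The sketch illustrates the accounting only: the state carried across examples is counted in bits, whereas per-example scratch work is not, mirroring the long-term/short-term split the abstract describes.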
Freivalds, Rūsiņš, Efim Kinber, and Carl H. Smith. "On the Impact of Forgetting on Learning Machines." Journal of the Association for Computing Machinery 42.6 (1995): 1146-1168.