Scalable knowledge acquisition through cumulative learning and memory organization
The field of machine learning is dedicated to acquiring new knowledge automatically. A majority of research in the field rests on the assumption that knowledge of a domain can always be learned and stored in a flat, unstructured representation.
In this work we assert the importance of learning in a structured environment, and of using a structured representation. We take the view that intermediate knowledge representations are as important as the high-level target knowledge: their development is critical to subsequent learning, and requires at least as much learning effort as the high-level learning goals. Most importantly, we assert that intermediate knowledge must be organized in order for an agent to achieve its full learning potential.
The goals of this research are to investigate the benefits of learning in a structured environment, and to demonstrate the mechanisms by which an agent can accumulate and organize knowledge. We focus on understanding how structured learning can produce highly compact representations, and how intermediate learning problems can remain tractable regardless of the size or complexity of the high-level target learning problem.
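The compactness claim above can be illustrated with a toy sketch. This is a hypothetical illustration, not the SCALE algorithm itself: it assumes concepts are boolean functions over named binary features, stored in a shared memory so that later concepts can reference earlier ones rather than re-deriving them from raw features.

```python
# A minimal sketch (not the paper's SCALE algorithm) of why organizing
# intermediate concepts can keep high-level definitions compact.
# Each learned concept is stored in a shared memory, and later concepts
# may reference earlier ones instead of rebuilding them from raw features.

memory = {}  # concept name -> boolean function over an example (dict of features)

def define(name, fn):
    """Store a concept in memory so later concepts can build on it."""
    memory[name] = fn
    return fn

# Intermediate concepts defined over raw binary features.
define("pair_a", lambda x: x["f0"] and x["f1"])
define("pair_b", lambda x: x["f2"] and x["f3"])

# The high-level concept is expressed over the intermediates, so its own
# definition stays small regardless of how complex the intermediates are.
define("target", lambda x: memory["pair_a"](x) or memory["pair_b"](x))

example = {"f0": True, "f1": True, "f2": False, "f3": True}
print(memory["target"](example))  # True: pair_a holds on this example
```

Because `target` references stored intermediates by name, its description length stays constant even as the intermediate concepts grow, which is one way structured memory can make high-level learning problems tractable.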
Toward this end, we propose and evaluate a cumulative learning algorithm, SCALE, which acquires and organizes knowledge from a structured learning environment. We compare the performance of SCALE with several algorithms from the machine learning literature, and demonstrate the importance and effects of memory organization on the learning process and on scalability. We conclude by highlighting several lessons learned about the nature of the structured learning problem, and key areas for future exploration.