Leveraging Language to Learn Program Abstractions and Search Heuristics

Program synthesis (the automatic inference of symbolic programs) can help to create robust, interpretable, and verifiable machine learning approaches. A recent paper on arXiv.org proposes a framework for improving the efficiency and generalizability of learned program synthesis using natural language supervision.


Language can communicate both structure in the search space (an instruction to draw a large hexagon next to a small pentagon decomposes a complex task into high-level parts) and a lexicon that names important reusable concepts in a given domain (such as the polygons in the previous example).

The researchers therefore propose jointly learning libraries of reusable program abstractions and heuristics for searching the space of programs. The approach significantly improves performance on tasks such as string editing, structured image generation, and scene understanding.
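To give a flavor of what "learning a library of abstractions" means, here is a minimal sketch (not the actual LAPS or DreamCoder implementation; all names and the toy DSL are hypothetical): library learning compresses a corpus of programs by extracting a frequently repeated subprogram and naming it as a new primitive.

```python
# Toy sketch of library abstraction: find the most common repeated
# subprogram in a corpus and replace it with a new named primitive.
# Programs are nested tuples like ("draw", ("polygon", 6), ("polygon", 5)).
from collections import Counter

def subtrees(expr):
    """Yield every subexpression of a nested-tuple program."""
    yield expr
    if isinstance(expr, tuple):
        for child in expr[1:]:
            yield from subtrees(child)

def best_abstraction(programs):
    """Return the most frequent compound subprogram, or None if nothing repeats."""
    counts = Counter(
        t for p in programs for t in subtrees(p) if isinstance(t, tuple)
    )
    sub, n = counts.most_common(1)[0]
    return sub if n > 1 else None

def rewrite(expr, sub, name):
    """Replace every occurrence of `sub` with the new primitive `name`."""
    if expr == sub:
        return name
    if isinstance(expr, tuple):
        return (expr[0],) + tuple(rewrite(c, sub, name) for c in expr[1:])
    return expr

programs = [
    ("draw", ("polygon", 6), ("polygon", 5)),
    ("repeat", ("polygon", 6), 3),
]
sub = best_abstraction(programs)  # ("polygon", 6) occurs in both programs
compressed = [rewrite(p, "hexagon") if False else rewrite(p, sub, "hexagon")
              for p in programs]
```

After compression, both programs refer to the shorter named concept (`"hexagon"`), which makes subsequent programs in the domain shorter and easier to search for. Real systems score abstractions by how much they compress the whole corpus, not just by raw frequency.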

Inductive program synthesis, or inferring programs from examples of desired behavior, offers a general paradigm for building interpretable, robust, and generalizable machine learning systems. Effective program synthesis depends on two key ingredients: a strong library of functions from which to build programs, and an efficient search strategy for finding programs that solve a given task. We introduce LAPS (Language for Abstraction and Program Search), a technique for using natural language annotations to guide joint learning of libraries and neurally-guided search models for synthesis. When integrated into a state-of-the-art library learning system (DreamCoder), LAPS produces higher-quality libraries and improves search efficiency and generalization on three domains — string editing, image composition, and abstract reasoning about scenes — even when no natural language hints are available at test time.
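The second ingredient, language-guided search, can be illustrated with a toy heuristic (this is a hypothetical sketch, not the neural model used in the paper): primitives whose names overlap with words in the task description are tried first during enumeration.

```python
# Toy sketch of a language-conditioned search heuristic: assign each
# library primitive a cost based on whether it is mentioned in the task's
# natural-language description, then enumerate cheapest-first.
# The LIBRARY and its primitive names are illustrative, not from the paper.
LIBRARY = {
    "hexagon": lambda: "H",
    "pentagon": lambda: "P",
    "small": lambda s: s.lower(),
    "large": lambda s: s.upper(),
}

def primitive_costs(description):
    """Lower cost for primitives whose names appear in the description."""
    words = set(description.lower().split())
    return {name: (0.0 if name in words else 1.0) for name in LIBRARY}

def guided_order(description):
    """Order primitives for enumeration under the language prior."""
    costs = primitive_costs(description)
    return [name for _, name in sorted((costs[n], n) for n in LIBRARY)]

order = guided_order("draw a large hexagon")
# Primitives named in the instruction ("hexagon", "large") come first.
```

In LAPS the guidance comes from a learned model relating annotations to program components rather than from literal word overlap, which is why the learned libraries and search models still help at test time even when no language hints are provided.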

Research paper: Wong, C., Ellis, K., Tenenbaum, J. B., and Andreas, J., “Leveraging Language to Learn Program Abstractions and Search Heuristics”, 2021. Link: https://arxiv.org/abs/2106.11053

