Authors:
Milo Knell¹; Sahil Rane¹; Forrest Bicker¹; Tiger Che¹; Alan Wu² and George Montañez¹
Affiliations:
¹ AMISTAD Lab, Department of Computer Science, Harvey Mudd College, Claremont, CA 91711, U.S.A.
² Department of Computer Science, California Institute of Technology, Pasadena, CA 91125, U.S.A.
Keyword(s):
Algorithmic Search Framework, Satisfaction, Fuzzy Membership.
Abstract:
Many machine learning tasks have a measure of success that is naturally continuous, such as error under a loss function. We generalize the Algorithmic Search Framework (ASF), used for modeling machine learning domains as discrete search problems, to continuous spaces. Moving from discrete target sets to a continuous measure of success extends the applicability of the ASF by allowing us to model fundamentally continuous notions such as fuzzy membership. We generalize many results from the discrete ASF to the continuous setting and prove novel results for a continuous measure of success. Additionally, we derive an upper bound on the expected performance of a search algorithm under arbitrary levels of quantization in the success measure, demonstrating a negative relationship between quantization and the performance upper bound. These results improve the fidelity of the ASF as a framework for modeling a range of machine learning and artificial intelligence tasks.