Entropy-Based Search Algorithm for Experimental Design
Nabin K. Malakar, Kevin H. Knuth
University at Albany, SUNY
Outline
• Autonomous Experimental Design
• Maximum Entropy for Inquiry
• Nested Entropy Sampling for Search
• Application to Robotic Arm
MaxEnt 2010
The Scientific Method
The Scientific Method
CAN WE AUTOMATE INQUIRY?
Mars Exploration
Mars Exploration Rovers: Spirit and Opportunity
128 kilobits per second / 10 megabytes per day
Mars Exploration Rover Mission Control
Event: MER Mission Activities
Date: Spirit Sol 4
Source: Kris Becker
Time Constraints and Human Intervention
6- to 44-minute round-trip communication delay
Missions to Jupiter’s Moons
60- to 100-minute round-trip communication delay
Data: not a problem!
Drinking from the Fire Hose
No one was hurt during this shot
Bayesian Decision Theory
Bayesian Decision Theory
• Utility of the predicted outcome d_e of experiment e:

U(d_e, e) = Σ_θ p(θ | d_e, D, e) log p(θ | d_e, D, e)
• Expected Utility:

EU(e) = Σ_{d_e} U(d_e, e) p(d_e)

• The best action maximizes the expected utility:

ê = arg max_e EU(e)
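The decision rule above can be sketched numerically. This is a minimal toy illustration, not the authors' implementation: the two hypothetical candidate experiments, their outcome probabilities, and the resulting posteriors over a two-valued θ are all invented for the example.

```python
import numpy as np

def utility(posterior):
    """Negative Shannon entropy of the posterior: U = sum_theta p log p."""
    p = posterior[posterior > 0]
    return float(np.sum(p * np.log(p)))

def expected_utility(outcome_probs, posteriors):
    """EU(e) = sum over outcomes d_e of U(d_e, e) * p(d_e)."""
    return sum(p_d * utility(post) for p_d, post in zip(outcome_probs, posteriors))

# Toy example: experiments[e] = (outcome probabilities p(d_e),
#                                posterior over theta for each outcome).
experiments = {
    "A": ([0.5, 0.5], [np.array([0.9, 0.1]), np.array([0.1, 0.9])]),  # informative
    "B": ([0.5, 0.5], [np.array([0.5, 0.5]), np.array([0.5, 0.5])]),  # uninformative
}

# e_hat = arg max_e EU(e): the informative experiment wins
best = max(experiments, key=lambda e: expected_utility(*experiments[e]))
print(best)  # → A
```

The informative experiment yields sharp posteriors (low entropy, high U), so it maximizes the expected utility.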
Maximization of Entropy…
• Optimal experiment: maximizes the information entropy of the predictive distribution

ê = arg max_e ( C − Σ_{d_e} p(d_e | D, e, M) log p(d_e | D, e, M) )
• This can also be shown using the Inquiry Calculus (Knuth)
Estimated value of a parameter “c”
Entropic Search vs. Random Search
Nested Entropy Sampling
INPUT: posterior samples from the inference phase
SET UP: Generate a set of s sample experiments at random and compute the entropy H of each.
WHILE the samples have different entropy values
    Select the sample s* from the set with the least entropy, denoted H*.
    Generate a trial experiment s_trial by copying another sample chosen at random from the set.
    EXPLORE LOOP
        Explore by varying the parameters of s_trial and computing the new value of H_trial.
        Accept the trial if H_trial > H*, otherwise reject it.
        Monitor the acceptance rate and adjust the exploration step size.
    Replace s* with s_trial.
END WHILE
OUTPUT: prescription of the optimal experiment.
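The procedure above can be sketched on a toy one-dimensional problem. Everything here is an assumption made for illustration: the single-peak entropy landscape H(x), the sample count, the convergence tolerance, and the simple step-size adaptation stand in for the real entropy computation over candidate experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def H(x):
    """Hypothetical entropy landscape with a single peak at x = 2."""
    return np.exp(-0.5 * (x - 2.0) ** 2)

# SET UP: s random sample experiments and their entropies
s = 10
samples = rng.uniform(-5, 5, size=s)
entropies = H(samples)
step = 1.0

# Iterate until all samples have (nearly) the same entropy value
for _ in range(5000):
    if entropies.max() - entropies.min() < 1e-3:
        break
    worst = int(entropies.argmin())             # sample s* with least entropy H*
    H_star = entropies[worst]
    trial = samples[rng.integers(s)]            # copy another sample at random
    accepted = 0
    for _ in range(20):                         # EXPLORE loop
        candidate = trial + step * rng.normal()
        if H(candidate) > H_star:               # accept only if H_trial > H*
            trial = candidate
            accepted += 1
    step *= 1.1 if accepted > 10 else 0.9       # adapt step to acceptance rate
    samples[worst] = trial                      # replace s* with s_trial
    entropies[worst] = H(trial)

print(samples.mean())  # samples cluster near the entropy peak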
Simulations with Parameterized Landscape

H(x′, y′) = Σ_k a_k exp{ −½ (x′ − u′_k, y′ − v′_k) ( A_k  C_k ; C_k  B_k ) (x′ − u′_k, y′ − v′_k)ᵀ }

a sum of Gaussians with amplitudes a_k, centres (u′_k, v′_k), and shape matrices [[A_k, C_k], [C_k, B_k]].
We investigated landscapes comprising 1 to 100 Gaussians
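A landscape of this form can be generated as follows. This is a sketch under assumed parameter ranges (amplitudes, centres, and shape-matrix entries are drawn from ranges chosen only to keep the quadratic form positive definite), not the ranges used in the actual simulations.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_landscape(K):
    """Sum of K Gaussians: H(x, y) = sum_k a_k exp(-1/2 q_k),
    where q_k is the quadratic form with matrix [[A_k, C_k], [C_k, B_k]]."""
    a = rng.uniform(0.5, 1.0, K)          # amplitudes a_k
    u, v = rng.uniform(0, 10, (2, K))     # centres (u_k, v_k)
    A, B = rng.uniform(1, 4, (2, K))      # diagonal terms A_k, B_k
    C = rng.uniform(-0.5, 0.5, K)         # off-diagonal coupling C_k (|C| small
                                          # so that A*B > C^2: positive definite)
    def H(x, y):
        dx, dy = x - u, y - v
        q = A * dx**2 + 2 * C * dx * dy + B * dy**2   # expanded quadratic form
        return float(np.sum(a * np.exp(-0.5 * q)))
    return H

H = make_landscape(K=25)
print(H(5.0, 5.0))  # a positive entropy value on the landscape
```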
Illustration of Nested Entropy Sampling
Compression Efficiency vs. Landscape Complexity
Compression Efficiency and Convergence Time
Experiments with Robotic Arm
• Infer the position (x, y) and radius of a circle
• Use a light sensor to measure intensity at a given position
• Automated inference and inquiry engines
• Programmed only with the model and likelihood: no search strategy!
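The inquiry step for the circle model can be sketched as follows: given posterior samples of the circle parameters, measure next where the predicted sensor reading is most uncertain. The binary inside/outside sensor model, the posterior sample distributions, and the candidate grid are all assumptions made for this illustration, not the robot's actual code.

```python
import numpy as np

rng = np.random.default_rng(2)

def inside(x, y, circles):
    """Predicted binary reading (1 = inside the circle) for each posterior sample."""
    xc, yc, r = circles.T
    return ((x - xc) ** 2 + (y - yc) ** 2 <= r ** 2).astype(float)

def predictive_entropy(x, y, circles):
    """Entropy of the predictive distribution of the reading at (x, y)."""
    p = inside(x, y, circles).mean()      # P(reading = 1) under the posterior
    if p in (0.0, 1.0):
        return 0.0                        # outcome certain: nothing to learn here
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

# Hypothetical posterior samples of (x_c, y_c, r) after a few measurements
circles = np.column_stack([rng.normal(5, 1, 200),
                           rng.normal(5, 1, 200),
                           rng.normal(2, 0.3, 200)])

# Candidate measurement locations on a grid; choose the maximum-entropy one
grid = [(x, y) for x in np.linspace(0, 10, 21) for y in np.linspace(0, 10, 21)]
best = max(grid, key=lambda g: predictive_entropy(*g, circles))
print(best)  # a location near the uncertain edge of the circle
```

Locations deep inside or far outside the circle have near-certain predictions (entropy near zero), so the selected measurement lands near the circle's uncertain edge.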
NES Simulations with Robotic Arm
• Brute Force versus NES method
Time for Convergence
(Plot legend: Brute Force Method vs. NES Method)
Conclusions
• Implemented Entropy-Based Search
  – Reduces the number of measurements
• Nested Entropy Sampling
  – Reduces the number of inquiry computations
  – Compression ratio independent of landscape complexity
• Can be adapted for other utility functions
ROBOT DEMO
Acknowledgements
• Prof. Ariel Caticha, Physics, University at Albany
• Dr. Tom Loredo, Cornell University
• Dr. Julian Center, Autonomous Explorations Inc.
• Physics Department, University at Albany
• The MaxEnt organizers for support
References
• Bernardo, J. M., Expected Information as Expected Utility, The Annals of Statistics, 7(3), 686-690, 1979.
• Earle, K. A., Knuth, K. H., Schneider, D. J., Budil, D. E., Information Theory and Spectral Information, Advanced Electron Spin Resonance Technology Research Center Workshop 2008 (ACERT 2008), Ithaca NY, May 2008.
• Knuth, K. H., Erner, P. M., Frasso, S., Designing intelligent instruments. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Saratoga Springs NY, USA, 2007, edited by K. H. Knuth, A. Caticha, J. L. Center, A. Giffin, C. C. Rodriguez, AIP Conference Proceedings 954, American Institute of Physics, Melville NY, pp. 203-211.
• Knuth, K. H. and Center, J. L., Autonomous Science Platforms and Question-Asking Machines, Cognitive Information Processing 2010 (CIP 2010), Italy, 2010.
• Lindley, D. V., Making Decisions, Wiley-Interscience, 1971.
• Lindley, D. V., On a Measure of the Information Provided by an Experiment, Ann. Math. Stat., 27(4), 986-1005, 1956.
• Loredo, T. J., Bayesian adaptive exploration. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Jackson Hole WY, USA, edited by G. J. Erickson and Y. Zhai, AIP Conference Proceedings 707, American Institute of Physics, New York, 2004, p. 330.
• Sebastiani, P. and Wynn, H. P., Maximum Entropy Sampling and Optimal Bayesian Experimental Design, J. Roy. Stat. Soc. B, 62, pp. 145-157, 2000.
• Shannon, C. E., A Mathematical Theory of Communication, Bell System Technical Journal, 27, 379-423, 623-656, 1948.
• Sivia, D. S. and Skilling, J., Data Analysis: A Bayesian Tutorial, 2nd Edition, Oxford University Press, 2006.
• Skilling, J., Nested Sampling for General Bayesian Computation, Bayesian Analysis, 4, pp. 833-860, 2006.
• Thrun, S., Burgard, W. and Fox, D., Probabilistic Robotics, MIT Press, Cambridge, 2005.
• Thompson, D. R., Smith, T. and Wettergreen, D., Information-Optimal Selective Data Return for Autonomous Rover Traverse Science and Survey, ICRA 2008.
• Fischer, R., Bayesian Experimental Design: Studies for Fusion Diagnostics. In Bayesian Inference and Maximum Entropy Methods in Science and Engineering, edited by R. Fischer, R. Preuss and U. von Toussaint, AIP Conference Proceedings 735, pp. 76-83, 2004.