The Robot Scientist Project has now moved to Manchester with Prof Ross King. These pages represent previous work that was conducted at Aberystwyth.
The Robot Scientist is perhaps the first physical implementation of the task of Scientific Discovery in a microbiology laboratory. It represents the merging of increasingly automated and remotely controllable laboratory equipment and knowledge discovery techniques from Artificial Intelligence.
Automation of laboratory equipment (the "Robot" of Robot Scientist) has revolutionised laboratory practice by removing the "drudgery" of constructing many wet lab experiments by hand, allowing an increase in both the scope and scale of potential experiments. Most lab robots require only a simple description of the chemical/biological entities to be used in the experiments, along with their required volumes and where they are stored. Automation has also significantly increased productivity, with a concomitant increase in the volume of results and data requiring interpretation. This has created an "interpretation bottleneck", where the process of understanding the results lags behind their production.
The research fields of Computational Scientific Discovery and Bioinformatics have emerged in part as a response to this bottleneck. Both disciplines use computational approaches from Statistics and Machine Learning to provide an "automated understanding" of the experimental results.
It has become typical practice in Bioinformatics to separate the data collection (experimentation) process from the understanding process: large numbers of experiments are conducted, and specially designed data mining tools are then used to identify correlations in the data that might represent hitherto undiscovered scientific knowledge.
This knowledge will initially correspond to the goals of the scientific task, but increasingly the internet repositories often constructed to store the data have become the focus of less directed scientific study, in which "hidden" knowledge not originally anticipated by the goals of the task may be found. However, this "scrapyard" approach is partly a result of over-experimentation, in which many unnecessary experiments are conducted alongside the potentially informative ones.
The Robot Scientist uses an iterative approach to experimentation, in which knowledge acquired from one iteration guides the next. This process is known as Active Learning: the learner plans its own agenda, deciding how best to improve its knowledge base and how to go about acquiring that information. The Robot Scientist uses the laboratory robot to execute the experiment(s) selected as most informative; uses a plate reader to analyse the experiments, generating data corresponding to the scientific observations; uses abductive logic programming to generate valid hypotheses that explain the observations; and uses these hypotheses to determine the next most informative experiment. At the beginning of an investigation, the Robot Scientist has not yet discovered any information, so all possible hypotheses are equally valid. As the directed discovery process continues, each new observation (each experiment/interpretation cycle) invalidates some of the hypotheses, thereby excluding incorrect discoveries. The experiment selection process aims to choose the experiment most likely to refute the most hypotheses. This iterative process avoids irrelevant experiments, potentially saving both laboratory time and the cost of unnecessary reagents and biological materials.
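The closed loop described above can be sketched in a toy form. This is a minimal illustration only, not the project's actual software: it assumes each hypothesis simply predicts a binary outcome (e.g. growth / no growth) for every candidate experiment, and it uses a simple worst-case splitting heuristic to pick the experiment that is guaranteed to refute the most hypotheses whatever the outcome. All names here (choose_experiment, run_cycle, exp_A, etc.) are hypothetical.

```python
import itertools

def choose_experiment(hypotheses, experiments):
    """Pick the experiment whose worst-case outcome still refutes the
    most hypotheses (a simple splitting heuristic)."""
    def worst_case_refuted(exp):
        # Partition hypotheses by their prediction for this experiment;
        # whatever is observed, the hypotheses predicting otherwise die.
        grow = sum(1 for h in hypotheses if h[exp])
        no_grow = len(hypotheses) - grow
        return min(grow, no_grow)
    return max(experiments, key=worst_case_refuted)

def run_cycle(hypotheses, experiments, truth):
    """One experiment/interpretation cycle: select, 'execute', eliminate."""
    exp = choose_experiment(hypotheses, experiments)
    observation = truth[exp]  # stands in for the plate-reader result
    survivors = [h for h in hypotheses if h[exp] == observation]
    remaining = [e for e in experiments if e != exp]
    return survivors, remaining

# Toy problem: 3 possible experiments; the candidate hypotheses are all
# 8 possible prediction patterns, one of which is the hidden truth.
experiments = ["exp_A", "exp_B", "exp_C"]
hypotheses = [dict(zip(experiments, bits))
              for bits in itertools.product([False, True], repeat=3)]
truth = {"exp_A": True, "exp_B": False, "exp_C": True}

while len(hypotheses) > 1 and experiments:
    hypotheses, experiments = run_cycle(hypotheses, experiments, truth)

print(len(hypotheses))           # 1
print(hypotheses[0] == truth)    # True
```

In this toy setting each cycle halves the hypothesis set (8, 4, 2, 1), so three experiments identify the truth; the real system's abductive hypothesis generation and experiment costs make the selection problem far richer than this sketch.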