Professor Paul Krause

Professor of Software Engineering

Qualifications: BSc PhD FIMA CMath

Email:
Phone: Work: 01483 68 9861
Room no: 32 BB 02

Office hours

Monday: 14:00-16:00

Further information

Biography

1977       BSc Combined Honours (Class I) in Pure Mathematics and Physics (University of Exeter)

1977-1980    PhD in Geophysics (University of Exeter)

1980-1987    National Physical Laboratory - Research in low-temperature metrology, with a specific focus on the maintenance and development of the Josephson Voltage Standard.

1987-1989    Research Fellow in Computing at University of Surrey studying the animation of formal specifications

1989-1996    Research Fellow at Imperial Cancer Research Fund studying models of argumentation and their application to clinical diagnosis and patient management

1996-2003    Principal Scientist and then Senior Principal Scientist at Philips Research Laboratories. Focus on developing techniques to support the specification, automated testing and quality analysis of embedded software

2001-Present    Professor of Software Engineering, University of Surrey

Google Scholar is currently (February 2014) giving me an h-index of 24.

I am a Guest Mentor at Coding House in California.

I am also Editor (Computing and Software) of the IET’s Open Access Journal of Engineering, and Editor-in-Chief of the GSTF Journal on Artificial Intelligence.

Use this link to access my Getting Started with Ruby on Rails course at a 50% discount.

Research Interests

  • Digital and Industrial Ecosystems as Complex Adaptive Systems
  • Use of ICT to support sustainable living and social change
  • Social-constructivist approaches to continued professional development
  • Formal models of interactive computing
  • Practical applications of Machine Learning

Further details can be found on my Mendeley entry (Paul Krause's citations) and on my personal web page.

Publications

Click here to see my Google Scholar Citation Profile.

Journal articles

  • Krause PJ, Sabry N. (2013) 'Optimal Green Virtual Machine Migration Model'. International Journal of Business Data Communications and Networking, 9 (3), pp. 35-52.

    Abstract

    Cloud computing provides the opportunity to migrate virtual machines to “follow-the-green” data centres: that is, to migrate virtual machines between green data centres on the basis of clean energy availability, to mitigate the environmental impact of carbon footprint emissions and energy consumption. The virtual machine migration problem can be modelled to maximize the utility of computing resources or to minimize the cost of using computing resources. However, this would ignore the network energy consumption and its impact on the overall CO2 emissions. Unless this is taken into account, the extra data traffic due to migration of data could cause an increase in brown energy consumption and eventually lead to an unintended increase in carbon footprint emissions. Energy consumption is a key aspect in deploying distributed services in cloud networks within decentralized service delivery architectures. In this paper, the authors take an optimization view of the problem of locating a set of cloud services on a set of sites (green data centres) managed by a service provider or a hybrid cloud computing brokerage. The authors’ goal is to minimize the overall network energy consumption and carbon footprint emissions for accessing the cloud services for any pair of data centres i and j. The authors propose an optimization migration model based on the development of integer linear programming (ILP) models to identify the leverage of green energy sources with data centres and the energy consumption of migrating VMs.
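
    A purely illustrative sketch of an ILP formulation in this spirit (not the paper's actual model) is given below; it assumes the open-source PuLP library and made-up hosting and migration energy costs, and simply places each VM at the data centre that minimises combined hosting-plus-network energy subject to a capacity limit.

        import pulp

        vms = ["vm1", "vm2", "vm3"]
        dcs = ["dc_A", "dc_B"]

        # Hypothetical energy costs (kWh): hosting VM v at data centre d,
        # and migrating VM v's data to d over the network.
        host_cost = {("vm1", "dc_A"): 5, ("vm1", "dc_B"): 4,
                     ("vm2", "dc_A"): 6, ("vm2", "dc_B"): 7,
                     ("vm3", "dc_A"): 3, ("vm3", "dc_B"): 5}
        mig_cost = {("vm1", "dc_A"): 1, ("vm1", "dc_B"): 2,
                    ("vm2", "dc_A"): 2, ("vm2", "dc_B"): 1,
                    ("vm3", "dc_A"): 1, ("vm3", "dc_B"): 3}

        prob = pulp.LpProblem("green_vm_placement", pulp.LpMinimize)
        x = pulp.LpVariable.dicts("place", (vms, dcs), cat="Binary")

        # Objective: total hosting plus migration (network) energy.
        prob += pulp.lpSum(x[v][d] * (host_cost[v, d] + mig_cost[v, d])
                           for v in vms for d in dcs)

        # Each VM is placed at exactly one data centre.
        for v in vms:
            prob += pulp.lpSum(x[v][d] for d in dcs) == 1

        # Hypothetical green-capacity limit: at most two VMs per data centre.
        for d in dcs:
            prob += pulp.lpSum(x[v][d] for v in vms) <= 2

        prob.solve()
        for v in vms:
            for d in dcs:
                if x[v][d].value() == 1:
                    print(v, "->", d)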

  • de Lusignan S, Krause P, Michalakidis G, Vicente MT, Thompson S, McGilchrist M, Sullivan F, van Royen P, Agreus L, Desombre T, Taweel A, Delaney B. (2012) 'Business Process Modelling is an Essential Part of a Requirements Analysis. Contribution of EFMI Primary Care Working Group'. Yearbook of Medical Informatics, 7 (1), pp. 34-43.

    Abstract

    To perform a requirements analysis of the barriers to conducting research linking of primary care, genetic and cancer data.

  • de Lusignan S, Cashman J, Poh N, Michalakidis G, Mason A, Desombre T, Krause P. (2012) 'Conducting Requirements Analyses for Research using Routinely Collected Health Data: a Model Driven Approach'. Studies in Health Technology and Informatics, 180, pp. 1105-1107.

    Abstract

    Background: Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but is rarely reported in the biomedical literature; and no generic approaches have been published as to how to link heterogeneous health data. Methods: Literature review, followed by a consensus process to define how requirements for research using multiple data sources might be modeled. Results: We have developed a requirements analysis, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFD) to define data requirements; unified modeling language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models using business process modeling notation (BPMN). Discussion: These requirements and their associated models should become part of research study protocols.

  • Krause PJ, Perez-Minana E, Thornton J. (2012) 'Bayesian Networks for the management of Greenhouse Gas emissions in the British agricultural sector'. Environmental Modelling and Software, 35, pp. 132-148.

    Abstract

    Recent years have witnessed a rapid rise in the development of deterministic and non-deterministic models to estimate human impacts on the environment. An important failing of these models is the difficulty that most people have in understanding the results generated by them and the implications for their way of life and that of future generations. Within the field, the measurement of greenhouse gas (GHG) emissions is one such result. The research described in this paper evaluates the potential of Bayesian Network (BN) models for the task of managing GHG emissions in the British agricultural sector. Case study farms typifying the British agricultural sector were input into both the BN model and CALM, a carbon accounting tool used by the Country Land and Business Association (CLA) in the UK for the same purpose. Preliminary results show that the BN model provides a better understanding of how the tasks carried out on a farm impact the environment through the generation of GHG emissions. This understanding is achieved by translating the emissions information into its cost in monetary terms using the Shadow Price of Carbon (SPC), something that is not possible using the CALM tool. In this manner, the farming sector should be more inclined to deploy measures for reducing its impact. At the same time, the output of the analysis can be used to generate a business plan that will not have a negative effect on a farm's capital income.
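
    For illustration only, translating an emissions estimate into a monetary figure via a Shadow Price of Carbon is a simple multiplication; the per-activity figures and price in this sketch are invented rather than taken from the paper or from SPC guidance.

        # Hypothetical per-activity emission estimates for one farm (tonnes CO2e per year),
        # e.g. as output by a Bayesian Network model or a carbon accounting tool.
        emissions_tco2e = {"fertiliser": 120.0, "livestock": 310.0, "fuel": 45.0}

        SHADOW_PRICE_PER_TCO2E = 27.0  # illustrative price in GBP per tonne of CO2e

        total_emissions = sum(emissions_tco2e.values())
        carbon_cost = total_emissions * SHADOW_PRICE_PER_TCO2E

        print(f"Total emissions: {total_emissions:.1f} tCO2e/year")
        print(f"Shadow-priced cost: GBP {carbon_cost:,.2f} per year")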

  • Leppenwell E, de Lusignan S, Vicente MT, Michalakidis G, Krause P, Thompson S, McGilchrist M, Sullivan F, Desombre T, Taweel A, Delaney B. (2012) 'Developing a survey instrument to assess the readiness of primary care data, genetic and disease registries to conduct linked research: TRANSFoRm International Research Readiness (TIRRE) survey instrument'. Informatics in Primary Care, 20 (3), pp. 207-216.

    Abstract

    Clinical data are collected for routine care in family practice; there are also a growing number of genetic and cancer registry data repositories. The Translational Research and Patient Safety in Europe (TRANSFoRm) project seeks to facilitate research using linked data from more than one source. We performed a requirements analysis which identified a wide range of data and business process requirements that need to be met before linking primary care and either genetic or disease registry data.

  • de Lusignan S, Liaw ST, Krause P, Curcin V, Vicente MT, Michalakidis G, Agreus L, Leysen P, Shaw N, Mendis K. (2011) 'Key Concepts to Assess the Readiness of Data for International Research: Data Quality, Lineage and Provenance, Extraction and Processing Errors, Traceability, and Curation. Contribution of the IMIA Primary Health Care Informatics Working Group'. Yearbook of Medical Informatics, 6 (1), pp. 112-120.

    Abstract

    To define the key concepts which inform whether a system for collecting, aggregating and processing routine clinical data for research is fit for purpose.

  • Marinos A, Krause P. (2010) 'Towards the web of models: A rule-driven RESTful architecture for distributed systems'. Lecture Notes in Computer Science, 6403, pp. 251-258.
  • Hierons RM, Bogdanov K, Bowen JP, Cleaveland R, Derrick J, Dick J, Gheorghe M, Harman M, Kapoor K, Krause P, Luettgen G, Simons AJH, Vilkomir S, Woodward MR, Zedan H. (2009) 'Using Formal Specifications to Support Testing'. ACM Computing Surveys, 41 (2), Article 9.
  • Razavi A, Moschoyiannis S, Krause P. (2009) 'An open digital environment to support business ecosystems'. Peer-to-Peer Networking and Applications, 2 (4), pp. 367-397.

    Abstract

    We present a Peer-to-Peer network design which aims to support business activities conducted through a network of collaborations that generate value in different, mutually beneficial, ways for the participating organisations. The temporary virtual networks formed by long-term business transactions that involve the execution of multiple services from different providers are used as the building block of the underlying scale-free business network. We show how these local interactions, which are not governed by a single organisation, give rise to a fully distributed P2P architecture that reflects the dynamics of business activities. The design is based on dynamically formed permanent clusters of nodes, the so-called Virtual Super Peers (VSPs), and this results in a topology that is highly resilient to certain types of failure (and attacks). Furthermore, the proposed P2P architecture is capable of reconfiguring itself to adapt to the usage that is being made of it and respond to global failures of conceptual hubs. This fosters an environment where business communities can evolve to meet emerging business opportunities and achieve sustainable growth within a digital ecosystem.

  • Moschoyiannis S, Krause PJ, Shields MW. (2009) 'A True-Concurrent Interpretation of Behavioural Scenarios'. Electronic Notes in Theoretical Computer Science, 203 (7), pp. 3-22.

    Abstract

    We describe a translation of scenarios given in UML 2.0 sequence diagrams into a tuples-based behavioural model that considers multiple access points for a participating instance and exhibits true-concurrency. This is important in a component setting since different access points are connected to different instances, which have no knowledge of each other. Interactions specified in a scenario are modelled using tuples of sequences, one sequence for each access point. The proposed unfolding of the sequence diagram involves mapping each location (graphical position) onto the so-called component vectors. The various modes of interaction (sequential, alternative, concurrent) manifest themselves in the order structure of the resulting set of component vectors, which captures the dependencies between participating instances. In previous work, we have described how (sets of) vectors generate concurrent automata. The extension to our model with sequence diagrams in this paper provides a way to verify the diagram against the state-based model.

  • Bryant D, Krause P. (2008) 'A review of current defeasible reasoning implementations'. Knowledge Engineering Review, 23 (3), pp. 227-260.
  • Krause PJ, Fenton N, Neil M, Marsh W, Hearty P, Radlinski L. (2008) 'On the effectiveness of early life cycle defect prediction with Bayesian Nets'. Empirical Software Engineering, 13 (5), pp. 499-537.

    Abstract

    Standard practice in building models in software engineering normally involves three steps: (1) collecting domain knowledge (previous results, expert knowledge); (2) building a skeleton of the model based on step (1), including as yet unknown parameters; (3) estimating the model parameters using historical data. Our experience shows that it is extremely difficult to obtain reliable data of the required granularity, or of the required volume, with which we could later generalize our conclusions. Therefore, in searching for a method for building a model we cannot consider methods requiring large volumes of data. This paper discusses an experiment to develop a causal model (Bayesian net) for predicting the number of residual defects that are likely to be found during independent testing or operational usage. The approach supports (1) and (2), does not require (3), yet still makes accurate defect predictions (an R² of 0.93 between predicted and actual defects). Since our method does not require detailed domain knowledge it can be applied very early in the process life cycle. The model incorporates a set of quantitative and qualitative factors describing a project and its development process, which are inputs to the model. The model variables, as well as the relationships between them, were identified as part of a major collaborative project. A dataset, elicited from 31 completed software projects in the consumer electronics industry, was gathered using a questionnaire distributed to managers of recent projects. We used this dataset to validate the model by analyzing several popular evaluation measures (R², measures based on the relative error, and Pred). The validation results also confirm the need for using the qualitative factors in the model. The dataset may be of interest to other researchers evaluating models with similar aims. Based on some typical scenarios we demonstrate how the model can be used for better decision support in operational environments. We also performed sensitivity analysis in which we identified the variables most influential on the number of residual defects. This showed that the project size, scale of distributed communication and the project complexity cause most of the variation in the number of defects in our model. We make both the dataset and causal model available for research use.
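
    The R² figure quoted above measures how closely predicted defect counts track actual ones; a minimal sketch of that validation step, using invented numbers rather than the project dataset, might look like this:

        def r_squared(actual, predicted):
            """Coefficient of determination between actual and predicted values."""
            mean_actual = sum(actual) / len(actual)
            ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
            ss_tot = sum((a - mean_actual) ** 2 for a in actual)
            return 1 - ss_res / ss_tot

        # Hypothetical residual-defect counts for a handful of projects.
        actual = [12, 40, 7, 25, 60]
        predicted = [14, 36, 9, 27, 55]
        print(f"R^2 = {r_squared(actual, predicted):.2f}")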

  • Fenton N, Neil M, Marsh W, Hearty P, Marquez D, Krause P, Mishra R. (2007) 'Predicting software defects in varying development lifecycles using Bayesian nets'. Information and Software Technology, 49 (1), pp. 32-43.
  • Krause PJ, Ambler S, Elvang-Goransson M, Fox J. (1995) 'A Logic of Argumentation for Reasoning Under Uncertainty'. Computational Intelligence, 11, pp. 113-131.

    Abstract

    We present the syntax and proof theory of a logic of argumentation, LA. We also outline the development of a category theoretic semantics for LA. LA is the core of a proof theoretic model for reasoning under uncertainty. In this logic, propositions are labelled with a representation of the arguments which support their validity. Arguments may then be aggregated to collect more information about the potential validity of the propositions of interest. We make the notion of aggregation primitive to the logic, and then define strength mappings from sets of arguments to one of a number of possible dictionaries. This provides a uniform framework which incorporates a number of numerical and symbolic techniques for assigning subjective confidences to propositions on the basis of their supporting arguments. These aggregation techniques are also described, with examples.

Conference papers

  • Ryman-Tubb NF, Krause P, Iliadis L, Jayne C. (2011) 'Neural Network Rule Extraction to Detect Credit Card Fraud'. SPRINGER-VERLAG BERLIN ENGINEERING APPLICATIONS OF NEURAL NETWORKS, PT I, Corfu, GREECE: 12th INNS EANN-SIG International Conference (EANN 2011)/7th IFIP 12 5 International Conference (AIAI 2011) 363, pp. 101-110.
  • Michalakidis G, Kumarapeli P, Ring A, van Vlymen J, Krause P, de Lusignan S. (2010) 'A system for solution-orientated reporting of errors associated with the extraction of routinely collected clinical data for research and quality improvement.'. Studies in Health Technology and Informatics: Proceedings of the 13th World Congress on Medical Informatics, Cape Town, South Africa: MEDINFO 2010 160 (Pt 1), pp. 724-728.

    Abstract

    We have used routinely collected clinical data in epidemiological and quality improvement research for over 10 years. We extract, pseudonymise and link data from heterogeneous distributed databases; inevitably encountering errors and problems.

  • Krause P, de Lusignan S. (2010) 'Procuring interoperability at the expense of usability: a case study of UK National Programme for IT assurance process.'. Studies in Health Technology and Informatics: Seamless care, safe care: the challenges of interoperability and patient safety in health care: Proceedings of the EFMI Special Topic Conference, Reykjavik, Iceland: EFMI Special Topic Conference 155, pp. 143-149.

    Abstract

    The allure of interoperable systems is that they should improve patient safety and make health services more efficient. The UK's National Programme for IT has made great strides in achieving interoperability; through linkage to a national electronic spine. However, there has been criticism of the usability of the applications in the clinical environment.

  • Moschoyiannis S, Marinos A, Krause P. (2010) 'Generating SQL queries from SBVR rules'. SPRINGER-VERLAG BERLIN Lecture Notes in Computer Science: Semantic Web Rules, Washington, DC, USA: Rule ML 2010: The 4th International Web Rule Symposium: Research Based and Industry Focused 6403, pp. 128-143.

    Abstract

    Declarative technologies have made great strides in expressivity between SQL and SBVR. SBVR models are more expressive than SQL schemas, but not yet as immediately executable. In this paper, we complete the architecture of a system that can execute SBVR models. We do this by describing how SBVR rules can be transformed into SQL DML so that they can be automatically checked against the database using a standard SQL query. In particular, we describe a formalization of the basic structure of an SQL query which includes aggregate functions, arithmetic operations, grouping, and grouping on condition. We do this while staying within a predicate calculus semantics which can be related to the standard SBVR-LF specification and equip it with a concrete semantics for expressing business rules formally. Our approach to transforming SBVR rules into standard SQL queries is thus generic, and the resulting queries can be readily executed on a relational schema generated from the SBVR model.
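
    As a hypothetical illustration of the rule-to-query idea (the schema, rule and SQL below are invented, not taken from the paper), a rule such as "each order must have at least one order line" can be rendered as a query that returns exactly the rows violating it:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        cur = conn.cursor()
        cur.executescript("""
            CREATE TABLE orders (order_id INTEGER PRIMARY KEY);
            CREATE TABLE order_lines (line_id INTEGER PRIMARY KEY,
                                      order_id INTEGER REFERENCES orders(order_id));
            INSERT INTO orders VALUES (1), (2);
            INSERT INTO order_lines VALUES (10, 1);  -- order 2 has no lines: a violation
        """)

        # SQL rendering of "it is obligatory that each order has at least one order line";
        # any rows returned are violations of the rule.
        violations = cur.execute("""
            SELECT o.order_id
            FROM orders o
            WHERE NOT EXISTS (SELECT 1 FROM order_lines l WHERE l.order_id = o.order_id)
        """).fetchall()
        print("Orders violating the rule:", violations)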

  • Krause PJ, Marinos A. (2009) 'An SBVR framework for RESTful Web Applications'. Springer Lecture Notes in Computer Science: Rule Interchange and Applications, Las Vegas, Nevada: IRuleML 2009 International Symposium 5858, pp. 144-158.

    Abstract

    We propose a framework that can be used to produce functioning web applications from SBVR models. To achieve this, we begin by discussing the concept of declarative application generation and examining the commonalities between SBVR and the RESTful architectural style of the web. We then show how a relational database schema and RESTful interface can be generated from an SBVR model. In this context, we discuss how SBVR can be used to semantically describe hypermedia on the Web and enhance its evolvability and loose coupling properties. Finally, we show that this system is capable of exhibiting process-like behaviour without requiring explicitly defined processes.

  • Marinos A, Krause P. (2009) 'Using SBVR, REST and Relational Databases to develop Information Systems native to the Digital Ecosystem'. IEEE 2009 3RD IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Istanbul, TURKEY: 3rd IEEE International Conference on Digital Ecosystems and Technologies, pp. 424-429.
  • Marinos A, Krause P. (2009) 'What, not How: A generative approach to service composition'. IEEE 2009 3RD IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Istanbul, TURKEY: 3rd IEEE International Conference on Digital Ecosystems and Technologies, pp. 430-435.
  • Krause PJ, Razavi AR, Moschoyiannis S, Marinos A. (2009) 'Stability and Complexity in Digital Ecosystems'. IEEE 2009 3RD IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Istanbul, TURKEY: 3rd IEEE International Conference on Digital Ecosystems and Technologies, pp. 200-205.
  • Marinos A, Razavi A, Moschoyiannis S, Krause P, Damiani E, Zhang J, Chang R. (2009) 'RETRO: A Consistent and Recoverable RESTful Transaction Model'. IEEE 2009 IEEE INTERNATIONAL CONFERENCE ON WEB SERVICES, VOLS 1 AND 2, Los Angeles, CA: IEEE International Conference on Web Services (ICWS 2009), pp. 181-188.
  • Moschoyiannis S, Krause P, Bryant D, McBurney P. (2009) 'Verifiable Protocol Design for Agent Argumentation Dialogues'. IEEE 2009 3RD IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Istanbul, TURKEY: 3rd IEEE International Conference on Digital Ecosystems and Technologies, pp. 459-464.
  • Razavi A, Marinos A, Moschoyiannis S, Krause P, Gaedke M, Grossniklaus M, Diaz O. (2009) 'RESTful Transactions Supported by the Isolation Theorems'. SPRINGER-VERLAG BERLIN WEB ENGINEERING, PROCEEDINGS, San Sebastian, SPAIN: 9th International Conference on Web Engineering 5648, pp. 394-409.
  • Razavi A, Marinos A, Moschoyiannis S, Krause P. (2009) 'Recovery management in RESTful Interactions'. IEEE Proceedings of 3rd IEEE International Conference on Digital Ecosystems and Technologies, Istanbul, Turkey: DEST 2009, pp. 436-441.

    Abstract

    With REST becoming a dominant architectural paradigm for web services in distributed systems, more and more use cases are applied to it, including use cases that require transactional guarantees. We believe that the loose coupling supported by RESTful transactions makes this currently our preferred interaction style for digital ecosystems (DEs). To further expand its value to DEs, we propose a RESTful transaction model that satisfies both the constraints of recoverable transactions and those of the REST architectural style. We then show the correctness and applicability of the model.

  • Fenton N, Neil M, Marsh W, Hearty P, Radlinski L, Krause P. (2008) 'On the effectiveness of early life cycle defect prediction with Bayesian Nets'. SPRINGER EMPIRICAL SOFTWARE ENGINEERING, Minneapolis, MN: 3rd International Workshop on Predictor Models in Software Engineering (PROMISE 2007) 13 (5), pp. 499-537.
  • Moschoyiannis S, Razavi AR, Zheng YY, Krause P. (2008) 'Long-running Transactions: semantics, schemas, implementation'. IEEE Proceedings of 2nd IEEE International Conference on Digital Ecosystems and Technologies, Phitsanuloke, Thailand: IEEE DEST 2008, pp. 208-215.

    Abstract

    In this paper we describe a formal model for the distributed coordination of long-running transactions in a Digital Ecosystem for business, involving Small and Medium Enterprises (SMEs). The proposed non-interleaving model of interaction-based service composition allows for communication between internal activities of transactions. The formal semantics of the various modes of service composition are represented by standard XML schemas. The current implementation framework uses suitable asynchronous message passing techniques and reflects the design decisions of the proposed model for distributed transactions in digital ecosystems.

  • Razavi AR, Moschoyiannis SK, Krause PJ. (2008) 'A Scale-free Business Network for Digital Ecosystems'. IEEE 2008 2ND IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Phitsanuloke, THAILAND: 2nd IEEE International Conference on Digital Ecosystems and Technologies, pp. 196-201.
  • Zheng Y, Krause P. (2007) 'Automata semantics and analysis of BPEL'. IEEE 2007 INAUGURAL IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Cairns, AUSTRALIA: IEEE International Conference on Digital Ecosystems and Technologies, pp. 307-312.
  • Zheng Y, Zhou J, Krause P, Latifi S. (2007) 'A model checking based test case generation framework for web services'. IEEE COMPUTER SOC International Conference on Information Technology, Proceedings, Las Vegas, NV: 4th International Conference on Information Technology - New Generations, pp. 715-720.
  • Razavi AR, Moschoyiannis SK, Krause PJ, McEwan AA, Schneider S, Ifill W, Welch P. (2007) 'Concurrency Control and Recovery Management for Open e-Business Transactions'. IOS PRESS WOTUG-30: COMMUNICATING PROCESS ARCHITECTURES 2007, Univ Surrey, Guildford, ENGLAND: 30th WoTUG Technical Meeting 2007 65, pp. 267-285.
  • Razavi AR, Malone PJ, Moschoyiannis S, Jennings B, Krause PJ. (2007) 'A distributed transaction and accounting model for digital ecosystem composed services'. IEEE 2007 INAUGURAL IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Cairns, AUSTRALIA: IEEE International Conference on Digital Ecosystems and Technologies, pp. 215-218.
  • Razavi AR, Moschoyiannis SK, Krause PJ. (2007) 'A coordination model for distributed transactions in Digital Business EcoSystems'. IEEE 2007 INAUGURAL IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Cairns, AUSTRALIA: IEEE International Conference on Digital Ecosystems and Technologies, pp. 319-324.
  • Zhang F, Povey D, Krause P. (2007) 'Protein Attributes Microtuning System (PAMS): an effective tool to increase protein structure prediction by data purification'. IEEE 2007 INAUGURAL IEEE INTERNATIONAL CONFERENCE ON DIGITAL ECOSYSTEMS AND TECHNOLOGIES, Cairns, AUSTRALIA: IEEE International Conference on Digital Ecosystems and Technologies, pp. 53-58.
  • Zheng Y, Zhou J, Krause P, Muller P, Liggesmeyer P, Maehle E. (2007) 'Analysis of BPEL data dependencies'. IEEE COMPUTER SOC SEAA 2007: 33rd EUROMICRO Conference on Software Engineering and Advanced Applications, Proceedings, Lubeck, GERMANY: 33rd EUROMICRO Conference on Software Engineering and Advanced Applications, pp. 351-358.
  • Bryant D, Krause PJ, Vreeswijk GAW, Dunne PE, BenchCapon TJM. (2006) 'Argue tuProlog: A Lightweight Argumentation Engine for Agent Applications'. I O S PRESS COMPUTATIONAL MODELS OF ARGUMENT, Univ Liverpool, Dept Comp Sci, Liverpool, ENGLAND: 1st International Conference on Computational Models of Argument (COMMA) 144, pp. 27-32.
  • Bryant D, Krause P, Moschoyiannis S, Fisher M, VanDerHoek W, Konev B, Lisitsa A. (2006) 'A tool to facilitate agent deliberation'. SPRINGER-VERLAG BERLIN LOGICS IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, Liverpool, ENGLAND: 10th European Conference on Logics in Artificial Intelligence 4160, pp. 465-468.
  • Mak L-O, Krause P. (2006) 'Detection & management of concept drift'. IEEE Proceedings of 2006 International Conference on Machine Learning and Cybernetics, Vols 1-7, Dalian, PEOPLES R CHINA: 5th International Conference on Machine Learning and Cybernetics, pp. 3486-3491.
  • Bryant D, Krause P, Fisher M, VanDerHoek W, Konev B, Lisitsa A. (2006) 'An implementation of a lightweight argumentation engine for agent applications'. SPRINGER-VERLAG BERLIN LOGICS IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, Liverpool, ENGLAND: 10th European Conference on Logics in Artificial Intelligence 4160, pp. 469-472.
  • Zheng Y, Krause P, Mei H. (2006) 'Asynchronous semantics and anti-patterns for interacting web services'. IEEE COMPUTER SOC QSIC 2006: Sixth International Conference on Quality Software, Proceedings, Beijing, PEOPLES R CHINA: 6th International Conference on Quality Software, pp. 74-81.

Book chapters

  • Razavi A, Krause P, Moschoyiannis S. (2010) 'Digital Ecosystems: challenges and proposed solutions'. in Antonopoulos N, Exarchakos G, Li M, Liotta A (eds.) Handbook of Research on P2P and Grid Systems for Service-Oriented Computing. Hershey, PA: Information Science Reference (an imprint of IGI Publishing), pp. 1003-1031.

Performances

  • Sansom M, Salazar N, Krause P. (2012) MindBeat Quintet: Kinetifying thought through movement and sound. Studio 2, Ivy Arts Centre, University of Surrey, UK.

    Abstract

    MindBeat is software developed at the University of Surrey that facilitates collaborative thinking within a multi-disciplinary set-up. The project involved the development of an electronic space that enabled an academic ensemble to carry out an 'ideas improvisation'. Five academics from very different disciplines were invited to post short 'beats' (texts made up of no more than 3-4 sentences) around a predefined question: what makes multidisciplinary collaborations work? The beats developed in time into a progressive thread of ideas. The aim of the software was to track the emergence, development and decline of new ideas within a multidisciplinary environment, and also to understand the patterns that emerge in this process by representing the ideas visually as coloured squares. The MindBeat software was launched in June 2012 as part of an electronic theatre production of Peter Handke's 'Offending the Audience'. The five Surrey academics played the parts remotely by feeding Handke's text onto the MindBeat website as part of a three-day durational performance. An open audience was then invited to interact with the play's five voices by sitting at one of five iMac stations set up in the studio space. These five computer monitors showed the text broken down into coloured square patterns. The audience could open the text by clicking on the coloured squares, which would reveal the short text or beat. They could then add a comment or thought to the original text. The audience's participation produced almost 500 additional beats, creating an alternative version of the Handke script. The MindBeat software visualised this ideation as a complex pattern of coloured squares. The installation featured generative video and generative electronic music played live throughout the entire three days. Using the colour and shape patterns of the ideas exchange as their score, the musicians shared the software visualisation as a basis for their durational sonic improvisation.

Teaching

COM2025: Web Application Development

COMM013: Agile Web Development with Ruby and Rails

COMM035: Service Oriented Architecture

Departmental Duties

Director of Research

Chair of Academic Misconduct Panel

Chair of Research Management Committee
