Labelled Deductive Systems



Current investigations focus on two aspects: (i) how to exploit the additional expressiveness achieved in CLDS by the hybrid operators, and (ii) how the tableau formulation enables decidability results to be incorporated into CLDS.


That is, it is the verbal interaction between several agents that facilitates the information flow that enables the logical reasoning to be undertaken.


It is at this point that multi-agent epistemic logic raises new questions regarding the information in a group. Group knowledge is importantly different from common knowledge (Lewis; Fagin et al.). Common knowledge concerns the hard information that each agent in the group possesses about the hard information possessed by the other members of the group. With group knowledge, by contrast, each agent in the group may possess the same hard information, hence achieving group knowledge, without necessarily possessing hard information about the hard information possessed by the other agents in the group.
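In the usual notation, with K_i φ for "agent i knows that φ" and G a group of agents, the two notions come apart as follows (a standard sketch):

\[ E_G\varphi := \bigwedge_{i \in G} K_i\varphi, \qquad C_G\varphi := E_G\varphi \land E_G E_G\varphi \land E_G E_G E_G\varphi \land \cdots \]

Group knowledge E_G can hold while the iterated clauses of common knowledge C_G fail; axiomatically, C_G is standardly captured by the fixpoint C_G φ ↔ E_G(φ ∧ C_G φ).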

There is a group of competing spies at a formal dinner, all of them tasked with the mission of acquiring some secret information from inside the restaurant. Furthermore, it is common knowledge amongst them that they all want the information. Given this much, compare a scenario in which each spy merely knows where the secret information is kept (group knowledge without common knowledge) with one in which its location is common knowledge amongst the spies. Very obviously, the two scenarios will elicit very different types of behaviour from the spies.

The first would elicit relatively subtle behaviour, the second dramatically less so. See Vanderschraaf and Sillari for further details. A still more fine-grained use of S5-based epistemic logics is that of Zhou, who demonstrates that S5-based epistemic logic may be used to model the epistemic states of an agent from the perspective of the agent themselves. Hence Zhou refers to such an epistemic logic as internally epistemic. See the full entry on Dynamic Epistemic Logic. As noted above, the waiter example from the beginning of this section is as much about information-gain via announcements (epistemic actions) as it is about information structures.

In this section, we outline how the expressive power of multi-agent epistemic logic can be extended to capture epistemic actions. Hard information flow, that is, the flow of information between the knowledge states of two or more agents, can be facilitated by more than one epistemic action. Two canonical examples are announcements and observations. Dynamic epistemic logics extend the language of non-dynamic epistemic logics with dynamic operators.
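For concreteness, the language of public announcement logic (PAL), the simplest dynamic epistemic logic, extends the epistemic language with an announcement operator (a standard sketch):

\[ \varphi ::= p \mid \lnot\varphi \mid (\varphi \land \varphi) \mid K_i\varphi \mid [!\varphi]\varphi \]

where [!φ]ψ reads "after any truthful public announcement of φ, ψ holds". Semantically, announcing φ restricts the model to the states where φ is true, and everyone's knowledge is then evaluated in the restricted model.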



The key reduction axioms of PAL, RA1–RA5, capture the properties of the announcement operator by connecting what is true before the announcement with what is true after the announcement (for an in-depth discussion see Pacuit). RA1 states that announcements are truthful. RA5 specifies the epistemic-state-transforming properties of the announcement operator: the interaction between the dynamic announcement operator and the knowledge operator is described completely by RA5 (see van Benthem, van Eijck, and Kooi). The exact relationship between public announcements and common knowledge is captured by the announcement and common knowledge rule of the logic PAC. Both the axioms and the rule are sketched below.
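One standard presentation of the reduction axioms, using the announcement diamond ⟨!φ⟩ (the dual of [!φ]) and the knowledge operator K_i (the exact formulation and numbering vary across presentations, so this is a sketch):

\[ \begin{aligned}
\text{RA1: } & \langle !\varphi\rangle\top \leftrightarrow \varphi \\
\text{RA2: } & \langle !\varphi\rangle p \leftrightarrow (\varphi \land p) \\
\text{RA3: } & \langle !\varphi\rangle\lnot\psi \leftrightarrow (\varphi \land \lnot\langle !\varphi\rangle\psi) \\
\text{RA4: } & \langle !\varphi\rangle(\psi \land \chi) \leftrightarrow (\langle !\varphi\rangle\psi \land \langle !\varphi\rangle\chi) \\
\text{RA5: } & \langle !\varphi\rangle K_i\psi \leftrightarrow (\varphi \land K_i(\varphi \rightarrow \langle !\varphi\rangle\psi))
\end{aligned} \]

On the same hedged basis, the announcement and common knowledge rule of PAC is standardly given as:

\[ \frac{\chi \rightarrow [!\varphi]\psi \qquad (\chi \land \varphi) \rightarrow E_G\chi}{\chi \rightarrow [!\varphi]C_G\psi} \]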

Again, PAC is the dynamic logic of hard information. The epistemic logics dealing with soft information fall within the scope of belief revision theory (van Benthem; Segerberg). Recall that hard and soft information are not distinct types of information per se; rather, they are distinct types of information storage. Hard-stored information is unrevisable, whereas soft-stored information is revisable. Variants of PAL that model soft information augment their models with plausibility-orderings on information-states (Baltag and Smets). These orderings are known as preferential models in non-monotonic logic and belief-revision theory.
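In one common formulation of this setting, in the style of Baltag and Smets (a sketch; notation assumed here), each agent i orders the states she cannot epistemically rule out by a plausibility pre-order ≤_i, and belief is truth at the most plausible of those states:

\[ \mathcal{M}, w \models B_i\varphi \iff \mathcal{M}, v \models \varphi \text{ for every } v \in \min{}_{\leq_i}\{u : w \sim_i u\} \]

Belief so defined can change when the ordering changes, even though no state is eliminated.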

The logics can be made dynamic in virtue of the orderings changing in the face of new information, which is the mark of soft information as opposed to hard information. Such plausibility-orderings may be modelled qualitatively, via partial orders and the like, or quantitatively, via numerical grades of plausibility. Such quantitative measures provide a connection to a broader family of quantitative approaches to semantic information that we will examine below. Recent work by Allo ties the soft information of dynamic epistemic logic to non-monotonic logics.

This is an intuitive move.


Soft information is information that has been stored in a revisable way; hence the revisable nature of conclusions in non-monotonic arguments makes non-monotonic logics a natural fit.

Private information is an equally important aspect of our social interaction.

Consider scenarios where the announcing agent is aware of the private communication whilst other members of the group are not, such as emails sent via Bcc. Consider also scenarios where the sending agent is not aware of the private communication, such as a surveillance operation. For an excellent overview and integration of all of the issues above, see the recent work of van Benthem, where the author discusses multiple interrelated levels of logical dynamics: one level of update, and another of representation. For an extensive collection of papers extending this and related approaches, see Baltag and Smets. The modal information theory approach to multi-agent information flow is the subject of a great amount of research.

The semantics is not always carried out in relational terms, that is, in terms of accessibility relations between states.


For more details on algebraic as well as type-theoretic approaches, see the subsection on algebraic and other approaches to modal information theory in the supplementary document Abstract Approaches to Information Structure. Quantitative approaches to information as range also have their origins in the inverse relationship principle.

To restate: the motivation is that the less likely the truth of a proposition, as expressed in a logical language with respect to a particular domain, the greater the amount of information encoded by the relevant formula. Another important aspect of the classical theory of semantic information is that it is an entirely static theory: it is concerned with the informational content and measure of particular formulas, and not with information flow in any way at all. The formal details of classical information theory turn on the probability calculus. These details may be left largely aside here; the central conceptual point is that logical truths have a truth-likelihood of 1, and therefore an information measure of 0.
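Bar-Hillel and Carnap's two classical measures make this inverse relationship explicit. Where m(A) is the logical probability (truth-likelihood) of A:

\[ \mathrm{cont}(A) = 1 - m(A), \qquad \mathrm{inf}(A) = -\log_2 m(A) \]

A logical truth has m(A) = 1, and so cont(A) = inf(A) = 0; the less likely A is, the greater both measures.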

Bar-Hillel and Carnap did not take this to mean that logical truths, or deductions, were without information yield, only that their theory of semantic information was not designed to capture such a property. They coined the term psychological information for the property involved.

See Floridi for further details. A quantitative attempt at specifying the information yield of deductions was undertaken by Jaakko Hintikka with his theory of surface information and depth information Hintikka , This itself is a considerable achievement, but although technically astounding, a serious restriction of this approach is that it is only a fragment of the deductions carried out within full first-order logic that yield a non-zero information measure.

The rest of the deductions in the full polyadic predicate calculus, as well as all of those in the monadic predicate calculus and the propositional calculus, measure 0 (see Sequoiah-Grayson). The obvious inverse situation in the theory of classical semantic information is that logical contradictions, having a truth-likelihood of 0, deliver a maximal information measure of 1. This situation is referred to in the literature as the Bar-Hillel-Carnap Semantic Paradox; the most developed quantitative approach to addressing it is the theory of strongly semantic information (Floridi). The conceptual motivation behind strongly semantic information is that for a statement to yield information, it must help us to narrow down the set of possible worlds.

That is, it must assist us in the search for the actual world, so to speak (Sequoiah-Grayson). Such a contingency requirement on informativeness is violated by both logical truths and logical contradictions, both of which measure 0 on the theory of strongly semantic information. See also Brady for recent work on the relationship between quantitative accounts of information and analyticity. For a new approach to connecting quantitative and qualitative measures of information, see Harrison-Trainor et al.

The correlational take on information looks at how the existence of systematic connections between the parts of a structured information environment permits one part to carry information about another. For example: the pattern of pixels that appears on the screen of a computer gives information (not necessarily complete) about the sequence of keys that were pressed by the person typing a document, and even a partial snapshot of the clear starry sky your friend is looking at now will give you information about his possible locations on Earth at this moment.

The focus on structured environments and the aboutness of information goes hand in hand with a third main topic of the information-as-correlation approach, namely the situatedness of information, that is, its dependence on the particular setting in which an informational signal occurs. Take the starry sky as an example again: the same pattern of stars, at different moments in time and locations in space, will in general convey different information about the location of your friend. Shannon considered a communication system formed by two information sites, a source and a receiver, connected via a noisy channel.
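In Shannon's framework (standard definitions, sketched here for orientation), the information produced by a source X is measured by its entropy, and the best achievable transmission rate over a noisy channel by its capacity:

\[ H(X) = -\sum_{x} p(x)\log_2 p(x), \qquad C = \max_{p(x)} I(X;Y) \]

where I(X;Y) is the mutual information between what is sent and what is received. Both quantities are measured in bits.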


He gave conclusive and extremely useful answers to questions having to do with the construction of communication codes that maximise the effectiveness of communication, in terms of the bits of information that can be transmitted, while minimising the possibility of errors caused by channel noise. Situation theory (Barwise and Perry; Devlin) is the major logical framework so far to have made these ideas its starting point for an analysis of information. Its origin and some of its central insights can be found in the project of naturalising mind and the possibility of knowledge initiated by Fred Dretske, which soon influenced the inception of situation semantics in the context of natural language (see Kratzer). The next three subsections survey some of the basic notions from this tradition: the basic sites of information in situation theory, called situations; the basic notion of information flow based on correlations between situations; and the mathematical theory of classifications and channels mentioned in (b) above.

The ontologies in situation theory span a wide spectrum of entities. They are meant to reflect a particular way in which an agent may carve up a system. The list of basic entities includes individuals, relations (which come with roles attached to them), temporal and spatial locations, and various other things.

Distinctive among them are the situations and infons. Roughly speaking, situations are highly structured parts of a system, such as a class session, a scene as seen from a certain perspective, a war, etc. Situations are the basic supporters of information. Infons, on the other hand, are the informational issues that situations may or may not support. Such a basic infon is usually denoted as ⟨⟨R, a₁, …, aₙ; i⟩⟩, where R is an n-place relation, a₁, …, aₙ are objects appropriate for R, and the polarity i is 1 or 0 according to whether the issue is that the objects do, or do not, stand in the relation. Infons are not intrinsic bearers of truth, and they are not claims either.

They are simply informational issues that may or may not be supported by particular situations. As an example, a successful transaction whereby Mary bought a piece of cheese in the local market is a situation that supports the infon ⟨⟨buys, Mary, c; 1⟩⟩, where c is the piece of cheese in question. The discrimination or individuation of a situation by an agent does not entail that the agent has full information about it: when we wonder whether the local market is open, we have individuated a situation about which we actually lack some information. See Textor for a detailed discussion of the nature of situation-like entities and their relation to other ontological categories, such as the possible worlds used in modal logic.

Besides individuals, relations, locations, situations and basic infons, there are usually various kinds of parametric and abstract entities. For example, there is a mechanism of type abstraction: given a parameter and an infon involving it, one may form the type of all objects that, when substituted for the parameter, yield a supported infon (for instance, the type of situations in which Mary buys cheese). There will be some basic types in an ontology, and many other types obtained via abstraction, as just described. The collection of ontology entities also includes propositions and constraints. They are key in the formulation of the basic principles of information content in situation theory, to be introduced next. The idea is that it is concrete parts of the world that act as carriers of information (the concrete dot on the radar, or the footprints in Zhucheng), and that they do so by virtue of being of a certain type (the dot moving upward, or the footprints showing a certain pattern).

What each of these concrete instances indicates is a fact about another, correlated part of the world. A constraint, in the intended sense, links types of situations; in the running radar example ([E1]), situations in which the dot on the screen moves upward are connected to situations in which the monitored plane moves north. It is the existence of this constraint that allows a particular situation where the dot moves to indicate something about the connected plane situation. With this background, the verification principle for information signalling in situation theory can be formulated as below. The [IS Verification] principle deals with information that in principle could be acquired by an agent. The access to some of this information will be blocked, for example, if the agent is oblivious to the correlation that exists between two kinds of situations.
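A plausible reconstruction of the principle, in the vocabulary of types, connections, and constraints used above (the official formulation varies between presentations, so this is a sketch):

[IS Verification] A situation s, by being of type T, carries the information that a connected situation s′ is of type T′, relative to the constraint that situations of type T are connected to situations of type T′, provided that the constraint actually obtains and s really is of type T.

In the radar example: the screen situation's being of the dot-moving-upward type carries the information that the connected flight situation is of the plane-moving-north type, relative to the constraint linking the two kinds of situations.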

In addition, most correlations are not absolute: they admit exceptions. Thus, for the signalling described in [E1] to be really informational, the extra condition that the radar system is working properly must be met. Conditional versions of the [IS Verification] principle may be used to insist that the carrier situation meet certain background conditions.

The inability of an agent to keep track of changes in these background conditions may lead to errors. So, if the radar is broken, the dot on the screen may end up moving upward while the plane is moving south.

Unless the air controller is able to recognise the problem, that is, unless she realises that the background conditions have changed, she may end up giving absurd instructions to the pilot. Now, instructions are tied to actions. For a treatment of actions from the situation-theoretical point of view, we refer the reader to Israel and Perry. The basic notion of information flow sketched in the previous section can be lifted to a more abstract setting in which the supporters of information are not necessarily situations, as concrete parts of the world, but rather any entities which, as in the case of situations, can be classified as being of or not of certain types.


The mathematical theory of distributed systems (Barwise and Seligman), to be described next, takes this abstract approach by studying information transfer within distributed systems in general. A model of a distributed system in this framework will actually be a model of a kind of distributed system; hence the model of the radar-airplane system that we will use as a running example here will actually be a model of radar-airplane systems in the plural.

Setting up such a model requires describing the architecture of the system in terms of its parts and the way they are put together into a whole. Once that is done, one can proceed to see how that architecture enables the flow of information among its parts. A part of a system (again, really its kind) is modelled by saying how particular instances of it are classified according to a given set of types. In other words, for each part of a system one has a classification. In a simplistic analysis of the radar example, one could posit at least three classifications: one for the monitor screens (Screens), one for the flying planes (call it Planes), and one for the whole monitoring situations (MonitSit).

Consider the case of the monitoring situations. That each one of them has a screen as one of its parts means that there is a function that assigns to each instance of the classification MonitSit an instance of Screens. The core of the resulting channel determines the correlations that obtain between the two parts, thus enabling information flow of the kind discussed in section 2. This is achieved via two kinds of links. Thus, in the radar example a particular screen will be connected to a particular plane if they belong to the same monitoring situation.
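A minimal executable sketch of classifications and the maps between them, in the spirit of Barwise and Seligman (the token names and types below are illustrative assumptions, not drawn from the text):

    from itertools import product

    # A classification (Barwise & Seligman): a set of tokens (instances),
    # a set of types, and a relation saying which tokens are of which types.
    class Classification:
        def __init__(self, tokens, types, classifies):
            self.tokens = set(tokens)
            self.types = set(types)
            self.classifies = set(classifies)  # set of (token, type) pairs

        def is_of_type(self, token, typ):
            return (token, typ) in self.classifies

    # An infomorphism from A to B: a map f_up on A's types and a map f_down
    # on B's tokens, satisfying the fundamental condition:
    #   f_down(b) is of type t in A  iff  b is of type f_up(t) in B.
    def is_infomorphism(A, B, f_up, f_down):
        return all(
            A.is_of_type(f_down(b), t) == B.is_of_type(b, f_up(t))
            for b, t in product(B.tokens, A.types)
        )

    # Illustrative instances: two screens, and the two monitoring situations
    # they belong to. The token map sends each situation to its screen; the
    # type map sends each screen type to the corresponding situation type.
    Screens = Classification(
        tokens={"screen1", "screen2"},
        types={"dot-up", "dot-down"},
        classifies={("screen1", "dot-up"), ("screen2", "dot-down")},
    )
    MonitSit = Classification(
        tokens={"sit1", "sit2"},
        types={"screen-dot-up", "screen-dot-down"},
        classifies={("sit1", "screen-dot-up"), ("sit2", "screen-dot-down")},
    )
    screen_of = {"sit1": "screen1", "sit2": "screen2"}
    type_map = {"dot-up": "screen-dot-up", "dot-down": "screen-dot-down"}

    print(is_infomorphism(Screens, MonitSit, type_map.get, screen_of.get))  # True

The printed True confirms the fundamental property of an infomorphism: classifying a situation's screen in Screens always agrees with classifying the situation itself in MonitSit.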


Such a relation between the types of connected parts then captures a constraint on how the parts of the system are correlated. This regularity of monitoring situations, which act as connections between radar screen-shots and planes, reveals a way in which radar screens and monitored planes correlate with each other. All this affords a channel-theoretic version of information transfer, sketched below. A distributed system consisting of various classifications and infomorphisms will then have a logic of constraints attached to each part of it, [4] and more sophisticated questions about information flow within the system can be formulated.
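A reconstruction of that version of information transfer, in Barwise-Seligman terms (a sketch; the official formulation differs in detail across presentations): given a channel with core C and infomorphisms f : A → C and g : B → C, a token a of A being of type α carries the information that a connected token b of B is of type β just in case a and b are linked by a common token of the core, and the translated constraint f∧(α) ⊢ g∧(β) holds in the logic of the core.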

It is then desirable to identify extra conditions under which the reliability of the inverse translation can be guaranteed, or at least improved. In a sense, these questions are qualitatively close to the concerns Shannon originally had about noise and reliability. Another issue one may want to model is reasoning about a system from the perspective of an agent who has only partial knowledge of the parts of the system.

For a running example, think of a plane controller who has only worked with ACME monitors and knows nothing about electronics. Natural questions studied in channel theory concerning these notions include the preservation (or not), under translation, of some desirable properties of local logics, such as soundness. A recent development in channel theory (Seligman) uses a more general definition of local logic, in which not all instances in the logic need satisfy all its constraints. This version of channel theory is put to use in two important ways. Firstly, by using local logics to stand for situations, and with a natural interpretation of what an infon should then be, a reconstruction is produced of the core machinery of situation theory presented briefly in section 2.

Secondly, it is shown that this version of channel theory can deal with probabilistic constraints. The rough idea is that any pair of a classification plus a probability measure over its set of instances induces an extended classification with the same set of types, in which a constraint holds if and only if the set of counterexample instances has measure 0. Notice that this set of counterexamples might not be empty. For an extensive development of the theory of channels sketched here, plus several explorations towards applications, see Barwise and Seligman. See van Benthem for a study of conditions under which constraint satisfiability is preserved under infomorphisms, and Allo for an application of this framework to an analysis of the distinction between cognitive states and cognitive commodities.
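Formalising that rough idea (notation assumed here): given a classification A and a probability measure μ on its instances (tokens, in Barwise and Seligman's terminology), a sequent Γ ⊢ Δ holds in the induced classification just in case

\[ \mu\big(\{a : a \models \gamma \text{ for all } \gamma \in \Gamma \text{ and } a \not\models \delta \text{ for all } \delta \in \Delta\}\big) = 0 \]

so a constraint may hold even though its set of counterexample instances is non-empty, provided that set is μ-negligible.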

Finally, it must be mentioned that the notion of classification has been around for some years in the literature, having been independently introduced and studied under names such as Chu spaces (Pratt) or Formal Contexts (Ganter and Wille).

For information to be computed, it must be handled by the computational mechanism in question, and for such handling to take place, the information must be encoded. Information as code is a stance that takes this encoding-condition very seriously. The result is the development of fine-grained models of information flow that turn on the syntactic properties of the encoding itself.

To see how this is so, consider again cases involving information flow via observations. Such observations are informative because we are not omniscient in the normal, God-like sense of the term. We have to go and observe that the cat is on the mat, for example, precisely because we are not automatically aware of every fact in the universe. Inferences work in an analogous manner. Deductions are informative for us precisely because we are not logically omniscient.

We have to reason about matters, sometimes at great length, because we are not automatically aware of the logical consequences of the body of information with which we are reasoning. To come full circle: reasoning explicitly with information requires handling it, where in this case such handling is a cognitive act. Hence the information in question is encoded in some manner, and hence information as code underpins the development of fine-grained models of information flow that turn on the syntactic properties of the encoding itself, as well as on the properties of the actions that underpin the various information-processing contexts involved.

Such information-processing contexts are not restricted to explicit acts of inferential reasoning by human agents, but include automated reasoning and theorem proving, as well as machine-based computational procedures in general. Approaches to modelling the properties of these latter information-processing scenarios fall under algorithmic information theory (see section 3). Categorial information theory is a theory of fine-grained information flow whose models are based upon those specified by the categorial grammars underpinned by the Lambek calculi, due originally to Lambek. The motivation for categorial information theory is to provide a logical framework for modelling the properties of the very cognitive procedures that underpin deductive reasoning.

The conceptual origin of categorial information theory is found in van Benthem. Consider as an analogy the following example. You arrive home from IKEA with an unassembled table that is still flat-packed in its box.

Now the question here is this: do you have your table? Well, there is a sense in which you do, and a sense in which you do not. You have your table in the sense that you have all of the pieces required to construct, or generate, the table, but this is not to say that you have the table in the sense of being able to use it.

That is, you do not have the table in any useful form; you have merely the pieces of a table. Indeed, getting these table-pieces into their useful form, namely a table, may be a long and arduous process…

So, when you possess the information encoded by the premises of some instance of deductive reasoning, do you possess the information encoded by the conclusion? To be sure, when you possess the information-pieces encoded by the premises, you possess some of the information required for the construction or generation of the information encoded by the conclusion. As with the table-pieces, however, getting the information encoded by the conclusion from the information encoded by the premises may be a long and arduous process.

You also need the instructional information that tells you how to combine the information encoded by the premises in the right way.

This information-generation via deductive inference may also be thought of as the movement of information from implicit to explicit storage in the mind of the reasoning agent, and it is the cognitive procedures facilitating this storage transfer that motivate categorial information theory. The conceptual motivation is to understand the information in the mind of an agent, as the agent reasons deductively, as a database, in much the same way as a natural language lexicon is a database (see Sequoiah-Grayson). In this case, a grammar will be understood as a set of processing constraints imposed so as to guarantee information flow, or well-formed strings as outputs.

Recent research on proofs as events, from a very similar conceptual starting point, may be found in Stefaneas and Vandoulakis (forthcoming). Categorial information theory is strongly algebraic in flavour. The merge operation ∘ and the two directed function (implication) operations \ and / are related to each other via the familiar residuation conditions: a ∘ b ≤ c iff b ≤ a\c iff a ≤ c/b. In general, applications of directional function application will be restricted to algebraic analyses of grammatical structures, where commuted lexical items will result in non-well-formed strings; a toy illustration follows.
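A standard illustration from categorial grammar (the lexicon here is a hypothetical toy, not drawn from the text): assign John and Mary the type np, and loves the type (np\s)/np. Then:

\[ \textit{loves} \circ \textit{Mary} : ((np\backslash s)/np) \circ np \Rightarrow np\backslash s, \qquad \textit{John} \circ (\textit{loves} \circ \textit{Mary}) : np \circ (np\backslash s) \Rightarrow s \]

so the string John loves Mary reduces to the sentence type s, while the commuted string loves John Mary admits no such reduction, matching its ungrammaticality.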

In other words, we have a domain with a combination operation. The operation of information combination ∘ and the partial order of information inclusion ⊑ interrelate as follows: combination is standardly required to be monotone with respect to inclusion, so that if x ⊑ x′ and y ⊑ y′, then x ∘ y ⊑ x′ ∘ y′. Just what this processing consists of will depend on the processing constraints that we set up on our database. These processing constraints will be imposed in order to guarantee an output from the processing itself, or, to put this another way, in order to preserve information flow. Such processing constraints are fixed by the presence or absence of various structural rules, and structural rules are the business of substructural logics.

Categorial information theory is precipitated by giving the Lambek calculi an informational semantics. At a suitable level of abstraction, the Lambek calculi are seen to be highly expressive substructural logics. Unsurprisingly, by giving an informational semantics for substructural logics in general, we get a family of logics that exemplifies the information-as-code approach.

This logical family is organised by expressive power, with the expressive power of each logic being captured by the structural rules it admits.
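To illustrate how structural rules calibrate expressive power (a standard summary, sketched here): in sequent notation, the three classical structural rules are

\[ \frac{\Gamma, A, B, \Delta \vdash C}{\Gamma, B, A, \Delta \vdash C}\ (\text{Exchange}) \qquad \frac{\Gamma, A, A \vdash C}{\Gamma, A \vdash C}\ (\text{Contraction}) \qquad \frac{\Gamma \vdash C}{\Gamma, A \vdash C}\ (\text{Weakening}) \]

The Lambek calculus admits none of them, so premise-combination is sensitive to order and multiplicity, exactly as code-handling operations are; adding Exchange yields the commutative variant LP, and adding all three collapses the family back towards intuitionistic consequence, where information-pieces may be freely reordered, reused, and discarded.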