Subject: [humanmarkup-comment] Knowledge Base Development Projects


Kukahone,
 
Thank you for the comment:
 
The best way to share a paper might be to post the paper at a URL and share that URL.  I can do this and post it in the scholars' section of the BCNGroup:
 
http://www.bcngroup.org/area3.htm
 
The members of the categorical abstraction forum will post comments about any scholarly paper that is presented to the group.  Our focus, derived from the BCNGroup Charter, is on virtual collaboration and IP disclosure/protection.  BCNGroup membership is open as a means to support our work.
 
http://www.bcngroup.org/admin/Charter.html
 
As you have indicated, the concepts of emergence and of semiotics speak about knowledge in a non-trivial manner. 
 
But how is knowledge spoken of today in a way that is trivial?  Are the knowledge representation community and the community of computer scientists generally missing something important about the "ontological" difference between the computer processing space and the natural world?   This difference is similar to the difference between the concept of the number "three" and a pile of three stones.  The number three does not have quantum mechanical fluctuation as part of its essence, whereas the pile of stones does (for example).  The number is "non-complex", and the stones are complex. 
 
We do not want this to be a philosophical discussion, and so we define very clearly what the business objectives are for challenging the way that "knowledge" is spoken of today (by almost everyone - including John Sowa and most of the knowledge representation community - Topic Maps, XML, RDF, KIF, Cyc Ontologies). 
 
We are all, everyone, looking for a programming/business-engineering unification.  Why?  Because the social value of such a unification is larger than anyone can imagine.  
 
The OSI/BCNGroup proposal is that the unification can be made more easily if natural science controls computer science - and not the other way around.  We need to flip the current control situation in terms of how government and business regard the promise of computer science and its application within the current paradigms of Information Technology (IT).
 
In the near future, our society simply must reduce funding of the AI mythology and the AI dream, and start to fund, for the first time, a historically grounded paradigm in which the distinction between the ontology of the computer processing space and the nature of natural systems is fundamental (as it is in the Topic Maps conceptual model).   Why? 
 
The IT mechanisms currently available are wholly inadequate to address the complex time-critical problems we are likely to face in the 21st century. 
 
If we make this shift, then things like a Cyber Defense Knowledge Base, a Knowledge Base of COBOL conversions, etc., become feasible and will provide high value to society.  Shining a light into the dark alleys of the Internet is entirely possible - but not with any of the current business propositions that come from the Defense Industry. 
 
By understanding the natural science aspect OF computer science, the problems of computer science are bypassed in exactly the same way as Zeno's paradox or Russell's paradox is bypassed - we simply quit talking about rational constructs, like a static pre-existing "scope" in Topic Maps, that are not productive IN THE GENERAL CASE. 
 
But many of the thought leaders actually are trying to reduce cognitive experience to formal constructs - and in fact to formal constructs that are constrained by what Sowa calls "first order logic" - as if a certain interpretation of Frege, Peirce, Schroder, Peano, etc. has in fact been able to reduce the phenomenon of mental experience to a formalism.  This is NOT the closed issue that Sowa would have us believe. 
 
Whereas the computer science community may demand that such a reduction is possible, the natural science community simply points out that:
 
1) such a demand is unreasonable, based on specific principled arguments (related to stratified complexity), and
2) there is simply no evidence that the current failures and limitations of knowledge representation will be overcome - there is no example of an intelligence produced by a computer process.
 
Real and natural emergence is in fact what is missing.  What we are left with, in my opinion, is the potential for the development of semiotic control languages that treat the computer processing space as a simple (but highly complicated) machine to be described with a machine-based ontology.    But as long as the leading scholars in knowledge representation continue to push the AI Myth, then capitalization of this potential work will fail to have a grounding in the natural sciences, and the business community will continue to be misled by the promises of these standards. 
 
We must reach into the natural sciences to correct a misunderstanding that is generating considerable error within the computer science community.    A reformed computer science will be more productive and will have greater social value. 
 
In a recent communication to the topic maps community on the proposed Common Logic standard, Dr. Sowa said:
 

>One of the significant decisions was to choose a new name,
>Common Logic (CL), for the proposed standard. The intent is to
>reduce any bias toward the two starting notations, KIF and CGs,
>and to emphasize the common basis in first-order logic, as it
>was originally developed by Frege, Peirce, Schroder, Peano, and
>many others during the late 19th and early 20th centuries.
>
>In keeping with that decision, CL will be defined by an abstract
>syntax, which specifies the major categories, such as Quantifier,
>Negation, and Conjunction, without specifying any concrete symbols
>for writing them. At the abstract level, even the ordering is
>left undefined so that there is no bias toward a prefix notation
>such as KIF, an infix notation such as predicate calculus, or a
>graph notation such as CGs (or Peirce's original existential graphs).
>
>Since it is impossible to write a purely abstract syntax, the CL
>standard will also contain grammars for three concrete syntaxes:
>KIF, CGIF (the CG interchange format), and traditional predicate
>calculus (TPC) with a Unicode encoding of the commonly used symbols.
>Each of those grammars will specify how the abstract categories are
>mapped to the printable (or computer representable) symbols of their
>notations. Any of the three concrete notations can be mapped into
>any of the others by reversing the mapping from concrete to abstract
>in one notation and then mapping from abstract to concrete in the
>other notation.
>
>The standard will also contain a version of model theory defined
>in terms of the abstract syntax. The model theory will specify
>the truth conditions for any abstract statement, and any conforming
>concrete statement in any syntax that is mapped from that abstract
>statement would be required to have exactly the same truth conditions.
>This requirement will ensure identical semantics for statements
>represented in any concrete syntax that conforms to the standard.
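 
To make the abstract-to-concrete mapping that Dr. Sowa describes a little more concrete, here is a minimal sketch in Python.  It is my own illustration, not part of any CL draft, and every class and function name in it is hypothetical.  It builds one abstract first-order statement and renders it in a KIF-like prefix form and in a traditional predicate calculus form:

from dataclasses import dataclass
from typing import List

@dataclass
class Atom:               # a predicate applied to terms, e.g. Man(x)
    pred: str
    args: List[str]

@dataclass
class Implies:            # material implication between two statements
    left: object
    right: object

@dataclass
class ForAll:             # universal quantification over one variable
    var: str
    body: object

def to_kif(f):
    # Render the abstract statement in a KIF-like prefix notation.
    if isinstance(f, Atom):
        return "(" + " ".join([f.pred] + ["?" + a for a in f.args]) + ")"
    if isinstance(f, Implies):
        return "(=> " + to_kif(f.left) + " " + to_kif(f.right) + ")"
    if isinstance(f, ForAll):
        return "(forall (?" + f.var + ") " + to_kif(f.body) + ")"
    raise TypeError(f)

def to_tpc(f):
    # Render the same abstract statement in traditional predicate calculus.
    if isinstance(f, Atom):
        return f.pred + "(" + ", ".join(f.args) + ")"
    if isinstance(f, Implies):
        return "(" + to_tpc(f.left) + " -> " + to_tpc(f.right) + ")"
    if isinstance(f, ForAll):
        return "forall " + f.var + ". " + to_tpc(f.body)
    raise TypeError(f)

# One abstract statement, two concrete renderings:
stmt = ForAll("x", Implies(Atom("Man", ["x"]), Atom("Mortal", ["x"])))
print(to_kif(stmt))   # (forall (?x) (=> (Man ?x) (Mortal ?x)))
print(to_tpc(stmt))   # forall x. (Man(x) -> Mortal(x))

Reversing either rendering back to the shared abstract form is what lets any one concrete notation be translated into any other, and a model theory stated against the abstract form is what would guarantee that the two renderings above carry the same truth conditions.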
 
 
The fact may be that scholars like Roger Penrose and Robert Rosen have laid out formal reasons why a reduction of mental experience to a first-order logic will not be completely successful.  I will not repeat those arguments here, except to give a reference to my interpretation of this argument:
 
http://www.bcngroup.org/area3/pprueitt/kmbook/Chapter2.htm
 
What I am proposing is that the computer processing space (the Internet and all finite state system activity at any one time or over a period of time) is not a natural system, in a specific sense.  The processing space is built to admit influence from only one level of organization - and this organization is essentially representable as a first-order logic.  This is the present computer science that must be reformed.
 
So if one studies the "addressable space" (in the language of topic maps), one can perhaps do this with a standard like Common Logic.  The fact is that NO standard of logic is being successfully used in such a way as to describe the processing space AND be uniformly consistent with user expectations.  This failure cannot be considered a failure of the human species, but rather a failure of the reductionism in computer science.  Perhaps this was one of the points of the movie The Matrix?
 
But perhaps this description of the addressable space is only lacking a proper and universally accepted standard.  This is one issue. 
 
The other issue has been that we are all, everyone, looking for a programming/business-engineering unification, and that many of the thought leaders are reducing cognitive experience to formal constructs.
 
In doing this, we feel that a category error is made, whereby the "formal system", whichever one you might be most happy with, is falsely understood as if the formal system IS the natural system of business reality or of the tacit knowledge of individuals.
 
***
 
We have specific approaches to gaining near-term return on investment based on the principle of separating the artificial world of the computer processing space from our understanding of the natural world.
 
http://www.ontologystream.com/OSIConsulting/TheDeploymentTeam2.htm
 
 
 
 
 
 

