Subject: [humanmarkup-comment] Knowledge Base Development Projects
Dick (Ballard),

As always, your perspective is clear and illustrative of the fundamental issues in knowledge production.

http://groups.yahoo.com/group/categoricalAbstraction/message/9

From a semiotics perspective, knowledge CANNOT occur in the computer, simply because knowledge is an experience and the computer does not experience. (There is no example of a computer "experiencing" anything.) Social knowledge is seen, from a semiotics perspective, as an experience by a social unit. I will not defend the notion that a social unit experiences, simply because the vast literature on social experience is hopelessly burdened with confusing trends in Western philosophy. The style of conceptual deconstruction practiced by reductionists will not allow the notion that a social unit experiences to be properly considered. The past two years' experience in the KMCI e-forum should be proof of this warning.

Non-reductionist science is taking hold, slowly and in resilient opposition to methodological reductionism. Roger Penrose, for example, makes the principled argument that there simply cannot be a reductionist theory of experience. The burden actively imposed by reductionists, such as those of Joe Firestone's KMCI, does great harm to the development of a science of knowledge. Why? Because without the development of certain principles related to learning, memory, awareness, and anticipation, the development of knowledge systems simply fails. Getting certain preliminaries wrong is like building on shifting sand; it is better to spend one's wealth on something else.

The issue of experience is related to the physical phenomenon of emergence. The social emergence of a political movement is a physical phenomenon of emergence related to a social unit "experiencing" something. But in spite of work by some on autopoiesis and implicate order, the science on this is not yet settled.
But common sense might lead one to agree that social knowledge is in fact a reality, even if this reality is not reducible to classical notions of physical phenomena.

What CAN occur in the computer are the artifacts of sign systems where some correspondence has been identified between the signs and human experience of knowledge. The computer can then be evocative of specific mental events. A knowledge operating system is only possible if the cognitive load is not placed on the computer. That is to say, knowledge cannot be (totally) specified formally. In Peircean terms, there has to be an interpretant. A Synthetic Perceptual System produces a sign system that is informative about the invariance in artificial worlds, such as the Internet or a body of COBOL code. The development of a Synthetic Perceptual System depends on a clear delineation between what the computer system is asked to do and what is left for human cognitive load. One cannot preserve the AI myth and find the holy grail.

Robert Rosen, and others, have developed scholarship on what my school of thought calls the "category error", where a human makes the mistake of confusing the sign system itself for the reality that the sign system "signs". Applied mathematicians make this error all of the time, and it does not much matter in most cases, since those cases involve very mechanical aspects of the natural world. In information science, however, this mistake is the source of almost all of the confusion that occurs in business process re-engineering and other commercial activities that require the development of a computational model of social, economic, and other natural processes. These processes are too non-simple to be treated this way without also treating the limitations that the category error imposes. So the tools, including relational databases, have marginal use in many - but not all - circumstances.
You have said the following:

*** "Information" is that set of constraints imposed by particular situations or realities. "Structure" is the sum of all requirements (constraints) imposed by the theories we have learned directly from experience or from the teaching of others. ***

This statement is about natural systems, not artificial ones such as the computer process. The challenge that you have undertaken, and that I share, is to make a corresponding statement about a computer process, but to make this statement in such a way as to avoid the Rosen category error.

The machine changes state. Using data aggregation methods, at OntologyStream Inc we developed categorical definitions from the aggregation of invariance in the structure of machine state. Our notion of a Synthetic Perceptual System depends on the notion that there is invariance in the structure of machine state. This invariance can be identified by algorithms such as link analysis or pattern matching (statistical, neural networks, genetic algorithms, etc.). Similarity metrics can be used to allow variation in the exemplars of a category, so that "run" and "urn" might be defined to be in the same category (of invariance in the structure of machine state).

One can formally describe "use patterns" in the text of the code and in the micro-events that instrument the Internet or a Linux server. For example, a body of COBOL code can be parsed to find regular expressions and to output to a log file the pair (expression, unit the expression is found in).

http://www.ontologystream.com/cA/tutorials/download/cobolview3.zip

As shown by this software (703-981-2676), these pairs are processed with generalized latent semantic indexing by the SLIP browsers. We find the functional load on the categories of regular expressions. This "functional load on the categories of regular expressions" is the key to extending the SLIP analytic conjecture developed for cyber event detection and intelligence fusion.
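The parse-and-log step described above can be sketched in a few lines. This is a minimal illustration (in Python, since the post names no implementation language), and everything in it is an assumption for illustration: the SECTION-based notion of "unit" and the MOVE/PERFORM patterns stand in for whatever regular expressions the actual cobolview software uses.

```python
import re

# Hypothetical unit marker: take the enclosing SECTION name as the "unit".
UNIT_RE = re.compile(r"^\s*([A-Z0-9-]+)\s+SECTION\s*\.", re.IGNORECASE)

# Hypothetical expression patterns to scan for (stand-ins for the real ones).
EXPR_RES = [
    re.compile(r"\bMOVE\s+[\w-]+\s+TO\s+[\w-]+", re.IGNORECASE),
    re.compile(r"\bPERFORM\s+[\w-]+", re.IGNORECASE),
]

def extract_pairs(lines):
    """Yield (expression, unit) pairs: each matched expression
    together with the name of the SECTION it was found in."""
    unit = "UNKNOWN"
    for line in lines:
        m = UNIT_RE.match(line)
        if m:
            unit = m.group(1).upper()
            continue
        for rx in EXPR_RES:
            for expr in rx.findall(line):
                yield (expr.upper(), unit)

source = [
    "MAIN-LOGIC SECTION.",
    "    MOVE WS-A TO WS-B",
    "    PERFORM INIT-PARA",
]
pairs = list(extract_pairs(source))
# pairs == [('MOVE WS-A TO WS-B', 'MAIN-LOGIC'),
#           ('PERFORM INIT-PARA', 'MAIN-LOGIC')]
```

Each emitted pair would then be appended to the log file; the collection of pairs is the raw material for the indexing step that follows.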
The functional load on the categories of regular expressions is the simplest of all consequences of performing generalized latent semantic indexing. In this simple consequence, a theory of type is developed that is expressed as logical atoms, as logical relationships (n-aries), and as aggregated compounds derived from self-organizing feature algorithms (Kohonen, scatter/gather, stochastic limit distribution aggregation). The result is a formative machine ontology.

What Dean, Alan ( www.netobjectives.com ), I, and others are looking for are more complex "measurement" structures and the instrumentation of the machine state. For example, Dean ( www.cyber-security.com ) has several times expressed (verbally) a way of looking for data on computer intrusions that could clearly be instantiated as a measurement device (a convolution operator on in-memory structural holonomy) for the process of performing generalized latent semantic indexing - and event chemistry. This verbal expression comes from the personal tacit knowledge that the expert brings to the as yet underdeveloped theory of categorical abstraction. Robert ( www.cobolshop.com ) may be able to express tacit knowledge of COBOL use patterns so that the signature of the original COBOL programmers can be identified and used to extend or modify the body of code without upsetting the human communities that depend on the (well understood, but only in a tacit way) behavior of these large embedded information systems. Categorical Abstraction could automate this process, resulting in faster code updates and more enterprise productivity.

http://www.ontologystream.com/OS/workFlow.htm

Once the machine state is aggregated into a class of categories, we have a machine analog to the human perceptual system.

http://www.ontologystream.com/OS/G-PArch.htm

The development of the measurement of invariance in an artificial world by an artificial process is an open problem in the theory of categorical abstraction.
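Two of the ideas above - a similarity metric under which variant exemplars such as "run" and "urn" fall into one category, and the functional load of the resulting categories - can be sketched concretely. The bag-of-characters similarity and the greedy grouping below are assumptions made for illustration, not the SLIP algorithms themselves, and functional load is counted here simply as the number of distinct units a category's members touch.

```python
from collections import Counter, defaultdict

def bag_similarity(a, b):
    """Multiset-of-characters overlap in [0, 1]; 'run' and 'urn'
    have identical character multisets, so their similarity is 1."""
    ca, cb = Counter(a), Counter(b)
    union = sum((ca | cb).values())
    return sum((ca & cb).values()) / union if union else 1.0

def categorize(expressions, threshold=0.99):
    """Greedily group expressions whose similarity to a category's
    first member meets the threshold (a deliberately crude scheme)."""
    categories = []  # list of lists of expressions
    for expr in expressions:
        for cat in categories:
            if bag_similarity(expr, cat[0]) >= threshold:
                cat.append(expr)
                break
        else:
            categories.append([expr])
    return categories

def functional_load(pairs, categories):
    """Number of distinct units touched by each category."""
    member_of = {e: i for i, cat in enumerate(categories) for e in cat}
    units = defaultdict(set)
    for expr, unit in pairs:
        units[member_of[expr]].add(unit)
    return {i: len(u) for i, u in units.items()}

pairs = [("run", "UNIT-A"), ("urn", "UNIT-B"), ("walk", "UNIT-A")]
cats = categorize([expr for expr, _ in pairs])
load = functional_load(pairs, cats)
# cats == [['run', 'urn'], ['walk']] and load == {0: 2, 1: 1}
```

The point of the sketch is only the shape of the computation: exemplars are merged into categories of invariance, and the load on each category becomes the input to the more complex measurement structures discussed above.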
But it is also clear that human perception is understood only through a full and formal understanding of the phenomenon of categorical abstraction by a natural system (the experiencing brain), and of the various transformations of the phenomenon that occur in awareness, memory production, memory use, and anticipation.

The measurement problem is the problem of having the proper instrumentation and conjecture. This problem has been solved in the cyber defense domain in the simplest fashion with SLIP plus eventChemistry. We have to wait for a business process to sort out the intellectual property issues and the issue of Return on Investment. So the application to cyber defense is shelved due to the anticipated conflicts imposed by the "powers that be". But our strategy is to generalize the CONCEPT and then create new intellectual property that has a proper expression as new patent applications. Then the many verticals (cyber defense being only one small vertical) can be properly protected. Capitalization of OntologyStream is necessary to fulfill this strategy.

Beyond the measurement problem there is a second problem, which I call the judgment problem. The inputs to this problem are a finite set of event representations, such as the set of functional load compounds on the regular expressions. Before judgment can occur, a "knowledge base" of event types has to have been developed, such as the set of simple compounds in:

http://www.ontologystream.com/cA/tutorials/download/cobolview3.zip

but these compounds may or may not be meaningful. That is for the expert to decide. Experts must make judgments and annotate the visual rendering of categorical abstraction. If the experts within a community of experts start to find meaning, then the annotation and the remembrance of past experiences may evolve a language system that is known by the members of the community. Knowledge is shared, and knowledge artifacts are encoded into a knowledge system. What could be simpler?
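The judgment-problem workflow just described has a simple data shape: machine-derived compounds come in, the expert either annotates them as meaningful or rejects them, and the accepted annotations accumulate into the community's knowledge base of event types. A hedged sketch, with all names and example compounds purely illustrative:

```python
class KnowledgeBase:
    """A knowledge base of event types built from expert judgments.
    Compounds arrive from categorical abstraction; only those an
    expert judges meaningful are retained, with their annotation."""

    def __init__(self):
        self.event_types = {}  # compound -> expert annotation

    def judge(self, compound, annotation):
        """Record an expert judgment; annotation=None means the
        compound was judged not meaningful and is discarded."""
        if annotation is not None:
            self.event_types[compound] = annotation

kb = KnowledgeBase()
kb.judge(("MOVE", "PERFORM"), "initialization idiom")  # judged meaningful
kb.judge(("GO-TO",), None)                             # judged not meaningful
```

Shared annotations of this kind are the seed of the community language system the paragraph above describes.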