[Date Prev] | [Thread Prev] | [Thread Next] | [Date Next] -- [Date Index] | [Thread Index] | [Elist Home]
Subject: [topicmaps-comment] automatic patenting of intellectual property
The previous message was developed so that I might talk a bit about mapping intellectual property. Axel's group (www.nqcg.com) brings renewed possibility to the notion that one might select a domain of inquiry and develop a descriptive enumeration of the concepts being expressed in patents in specific areas of endeavor. I have more than a little to say about how such an enumeration might occur.

http://www.bcngroup.org/area3/pprueitt/private/KM_files/frame.htm

Perhaps it is clear, to a few, that the technical means exist to take (for example) the native text collection recently gathered by the government of China as a representation of the intellectual property that was, until now, collectively owned by the State. The government of China seems to wish to create a Patent and Trademark Office. A PTO in China would certainly be a positive thing, but other countries need similar systems. So should NQCG take this as one of its new business value propositions?

** Automating IP evaluation and deployment using an architecture based on human memory, awareness and anticipation

http://www.bcngroup.org/area3/pprueitt/kmbook/Chapter7.htm

is possible, quickly, by integrating high-value linguistic tools. This is what I suggested, in the context of FSU IP, to Dan Quayle in 1992 in the proposal Professor Ed Finn and I made from Georgetown University (via the office of E. Roy Chalk), which was briefly discussed at the highest levels of both the Russian and US governments. The issue was raised again in an invited presentation I made at GU to university IP lawyers in 1993. There the concept was that representations of (1) the intellectual property in a specific area of endeavor and (2) the needs in a market space might be associated using a simple neural network.

Then there was the development of the BCNGroup Charter and its publication in 1997.
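The "simple neural network" association of IP representations with market needs could be sketched, under my own assumptions, as a one-layer linear associator trained with the Widrow-Hoff (delta) rule: patent-concept feature vectors on the input side, market-need vectors on the output side. The vectors below are toy illustrations, not real patent or market data.

```python
# Hypothetical sketch: a linear associator mapping patent-concept
# vectors to market-need vectors via delta-rule training.
import numpy as np

# Toy patterns: three 4-dim concept vectors paired with 3-dim market vectors.
concepts = np.array([[1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [1, 1, 0, 0]], dtype=float)
markets  = np.array([[1, 0, 0],
                     [0, 1, 0],
                     [0, 0, 1]], dtype=float)

W = np.zeros((3, 4))           # weight matrix: market-dim x concept-dim
lr = 0.1                       # learning rate (stable for these input norms)
for _ in range(500):           # delta-rule (LMS) training epochs
    for x, t in zip(concepts, markets):
        y = W @ x              # current association
        W += lr * np.outer(t - y, x)   # correct toward the target pattern

# After training, each concept vector retrieves its market pattern.
pred = W @ concepts[0]
print(np.round(pred, 2))
```

Because the three toy concept vectors are linearly independent, the delta rule converges to an exact associative mapping; with real, overlapping concept vectors it would converge to a least-squares association instead.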
http://www.bcngroup.org/admin/Charter.html

Then there was the work and discussion I have had with the creators of DR-LINK at Manning & Napier Information Services (in conjunction with Liz Liddy's work on the representation of concepts in text and on concept affordance as Peircean graphs). Then there are the decade-long discussions, still ongoing, with 5 or 6 "machine linguists" who have developed variations on Latent Semantic Indexing and the application of other stochastic processes to the mapping of meaning in text. Then there was the publication of the important book "Technological Innovation as an Evolutionary Process", edited by John Ziman (Cambridge University Press, 2000). Then came a more recent introduction to the goals of M-CAM (www.m-cam.com). And then there has been the development of the SLIP Browsers and the foundational work on eventChemistry (work that is just now beginning, on a worldwide scale).

So I am NOW ready to develop a tri-level informational system that acquires language-independent (?) IP citations from Natural Language Parsing. Of course, the notion of the many-to-one and one-to-many communication manager

http://www.ontologystream.com/distanceLearning/learningSystem.htm

could control the presentation, over television and the Internet, of the results from IP evaluation and adoption processes, as mediated and facilitated by the BCNGroup and NQCG. See:

http://www.ontologystream.com/OS/Capability.htm

**** Ok, so the major point I want to make.....

There are short-term and long-term objectives that can be established. The short term does not need a quantum computer, but rather a tri-level software system running on a von Neumann computer.
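For readers unfamiliar with Latent Semantic Indexing, the core mechanism is a truncated singular value decomposition of a term-document matrix, so that documents can be compared in a reduced "concept" space rather than by raw word overlap. The tiny term counts below are purely illustrative, not drawn from any patent collection.

```python
# Minimal LSI sketch: SVD of a toy term-document matrix, then
# document similarity in the reduced latent-concept space.
import numpy as np

# Rows are terms, columns are documents (hypothetical counts).
terms = ["patent", "claim", "market", "license"]
A = np.array([
    [2, 0, 1],   # "patent"
    [1, 0, 0],   # "claim"
    [0, 2, 1],   # "market"
    [0, 1, 2],   # "license"
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                    # keep the two strongest latent concepts
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # documents projected into concept space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents sharing latent concepts score closer to 1.0.
print(cosine(doc_vecs[1], doc_vecs[2]))
```

The same decomposition applied to a large patent corpus is what the "stochastic type processes" above rely on; the choice of k controls how aggressively surface vocabulary differences are smoothed into shared concepts.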
This has now been accomplished by Don Mitchell's simple programs, whereby some elements of Inmentia's concept of tensor-type information systems are combined with SLIP and eventChemistry to produce a new computing and knowledge-representation framework based on in-memory structural holonomy (not reducible to Codd and Date's notions of databases). Moving to optical computing is a step toward making the tri-level very fast, but it does not yet move the physics out of what we, as ultra-quantum existences, can control. The notion of quantum information channels, and how the brain clearly uses quantum-level dynamics to see, is well developed in the literature by Pribram and in his many conferences at Radford and Georgetown. The issues of environmental decoherence can go away if the ground level is seen as the elements of a deconstruction of memes.

Then the scope of the project can be limited to the Nordic communities' production of IP and the controlled (for the common good) packaging and presentation of these common results into the multi-media framework that NQCG has worked out.

** So there is no reason why we cannot start today, and have an operational knowledge base and presentation stream within 6 months, now is there?

** What we need is capital to move in this direction.