

Subject: [humanmarkup-comment] RE: [eventChemistry] Information Channels in aQuantum Computer



Dear friends,

The eventChemistry group is interested in developing the correspondences
between quantum information channels (whether these are formal rather than
physical realities, or actual physical realities) and stratified reasoning /
eventChemistry (which is a formal theory, not a physical reality).

I invite those who can contribute to this moderated discussion. Feel free to
pass this note on to others.

See the white paper at:

http://www.nqcg.com

***

Sometimes it is important to point out where my understanding is incomplete.
First, I do not understand why the Q-bit was defined as a two-dimensional
matrix.  For the quantum information channel, if there is an exact
correspondence to the Schrodinger wave equation in its un-collapsed nature,
should there not be a countably infinite number of dimensions (whatever a
countable infinity is)?
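For reference, the standard answer is that a single Q-bit is a deliberate truncation of the full Schrodinger picture: a unit vector in a two-dimensional complex Hilbert space spanned by the basis states |0> and |1>.  A minimal sketch (pure Python; the function name is mine, for illustration only):

```python
import math

# A Q-bit modeled as two complex amplitudes (a, b) over the basis |0>, |1>.
# Measurement probabilities are |a|^2 and |b|^2, which must sum to 1.
def measurement_probs(a: complex, b: complex):
    norm = abs(a) ** 2 + abs(b) ** 2
    if not math.isclose(norm, 1.0, rel_tol=1e-9):
        raise ValueError("amplitudes must be normalized")
    return abs(a) ** 2, abs(b) ** 2

# Equal superposition: (|0> + |1>) / sqrt(2)
p0, p1 = measurement_probs(1 / math.sqrt(2), 1 / math.sqrt(2))
```

The countable infinity of dimensions in the wave-equation picture is collapsed, by design, to just these two distinguishable measurement outcomes.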


Paul Werbos has some extended thoughts on what this could possibly mean.

What I argue for is that:

1) the two comes from an entanglement of the substructural level of
organization with the environmental level of organization (which I call
"ultra structure" after Jeff Long's work on notational systems).  So we have
an opposition (entanglement) between substance and form, as in formative
linguistics.  The notions of escapement and dissipative processes are another
way to think about this, and then we have a thermodynamic framework, where
escapement is the local emergence of form from substance that is not
observable (and not subject to "middle" layer conservation laws), and
dissipation is from a universal (holonomic) law acting everywhere in the
middle layer at the same time.  Again, the "cause of gravity" is not
observable.  Kugler would do a much better job of explaining this than I.

2) the two might come from an oppositional scale involved in the enslavement
process that drives (controls) the point-mass escapement process from the
substructural level of organization.  In the case of a linguistic system,
there are likely 117 oppositional scales (according to private remarks (1997)
by D. Pospelov, the leading Russian semiotician) related to linguistically
based oppositional scales (love - hate, in - out, up - down, etc.).  I
suppose one could say that the I-Ching has 6 oppositional scales.  The
scatter-gather in the SLIP Technology Browser has one oppositional scale
(near, not near) derived from one analytic conjecture.

http://www.ontologystream.com/aSLIP/files/functionalLoad.htm

Like the Q-bit, we have the problem of how to resolve two analytic
conjectures (but in our case the problem is completely solvable by
scatter-gather on the **surface** of an (n+1)-dimensional sphere).  (Claim: I
am the only one who has done this in simulations, so far.)  It is a
technical solution - but one still has to have good data and good analytic
conjectures.  (This is the measurement problem in artificial spaces that I
hoped to deploy into the CERT centers this quarter.)

http://www.ontologystream.com/SLIP/files/OSISummary.htm
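One way to read the scatter-gather claim above: if every data vector is first projected onto the surface of the unit sphere, the single oppositional scale (near, not near) reduces to a threshold on the angle between vectors.  A hypothetical sketch (the names and the threshold value are mine, not taken from the SLIP code):

```python
import math

def to_unit_sphere(v):
    """Project a vector onto the surface of the unit sphere."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def near(u, v, threshold=0.9):
    """One oppositional scale, near / not near, via cosine similarity."""
    cos = sum(a * b for a, b in zip(u, v))  # both already unit length
    return cos >= threshold

u = to_unit_sphere([3.0, 4.0])
v = to_unit_sphere([6.0, 8.0])   # same direction, different magnitude: near
w = to_unit_sphere([-4.0, 3.0])  # orthogonal direction: not near
```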


***begin side remark***

Don Mitchell is working on (actual code for) an artificial eventChemistry to
resolve event compounds from the SLIP event atoms.  There is more than one
artificial chemistry (perhaps like Mars natural chemistry and Earth natural
chemistry?).  I suspect that James's PhD thesis will have one or more
eventChemistries related to modeling the emergence of terrorism cells... or
something like that.  Ray, are you following this notion of a chemistry...
it is like the diagrams that you develop in your book.

Determining functional load for Incident Management will take some
willingness of the CERT community to think out of the box - something that
they (and their management) may not be willing to do.

{{ ** This problem of Industry management failing to see relevant innovation
becomes a national security issue, since the "other side" of this New War
is thinking outside the box. ** }}

Thus we may have to bypass the industry and go to policy makers for
permission to deploy a global tool for incident management regarding cyber
war.  The generic and general (pervasive) problem of Industry management
and DoD agencies "not getting it" has become so well understood that at some
point the message WILL be heard by the White House:

http://www.bcngroup.org/area3/manhattan/sindex.htm


Determining a situational theory for functional load for IP emergence will
make someone rich (I think.)


***end side remark***



Ok, so this is to frame what I do not know regarding why the Q-bit has only
two dimensions.  (Sorry to get entangled in the political pragmatics of
moving the community forward as a whole, hopefully.)

The squareness of the matrix is a different issue.  Formally, one might say
that the squareness is simply an artifact of the fact that (it is turtles
all the way down *s) tensor products of square 2-dimensional matrices are
always square.   However, there is a notion (by Gruenwald) that an
informational tensor need not be square... at least that is what I understand
him to have said.  I do not follow his concept of the information tensor very
well.
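The closure property being invoked is easy to verify: the Kronecker (tensor) product of an m x m matrix with an n x n matrix is mn x mn, so square factors always give a square product.  A quick pure-Python check (the helper function is mine):

```python
def kron(A, B):
    """Kronecker (tensor) product of two matrices given as lists of lists."""
    rows_a, cols_a = len(A), len(A[0])
    rows_b, cols_b = len(B), len(B[0])
    return [
        [A[i // rows_b][j // cols_b] * B[i % rows_b][j % cols_b]
         for j in range(cols_a * cols_b)]
        for i in range(rows_a * rows_b)
    ]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = kron(A, B)  # 2x2 tensor 2x2 -> 4x4, still square
```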

In Latent Semantic Indexing (LSI) we have the possibility of a non-square
matrix, even though this upsets the mathematics related to diagonalizing to
a matrix with eigenvalues on the diagonal.  (Right?)   The rows can be
units at one level of organization (the paragraph) and the columns can be
units from a different level of organization (the word token or phrase).  It
has been a while since I looked at what the LSI folks do here to fix this
formal problem.
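For what it is worth, the standard LSI fix is the singular value decomposition, which is defined for rectangular matrices: instead of diagonalizing the non-square matrix M itself, one diagonalizes the square, symmetric products M^T M and M M^T.  A small check, with made-up count data, that M^T M of a non-square paragraph-by-token matrix is square and symmetric, so that eigen-analysis applies there:

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

# A 2x3 "paragraph by word-token" count matrix (made-up data).
M = [[1, 0, 2],
     [0, 3, 1]]

MtM = matmul(transpose(M), M)  # 3x3 and symmetric: eigen-analysis applies
```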

But the critical problem that everyone is stuck on has to do with scope (as
in machine ontology - like a topic map) and the related notion of
situated-ness.

http://www.ontologystream.com/prueitt/whitePapers/Situationedness.htm

How do two Q-bits combine?

For this problem we suggest that rough sets are useful...

http://www.pvv.ntnu.no/~hgs/project/report/node37.html
http://www.kbs.twi.tudelft.nl/Education/Cyberles/Trondheim/

but also that a stratified theory of physical process organization is
important.
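For readers new to rough sets: a rough set brackets a target set between a lower approximation (equivalence classes entirely inside the target) and an upper approximation (classes that merely touch it); the gap between the two is exactly what the available attributes cannot decide.  A minimal sketch with hypothetical event data:

```python
def rough_approximations(universe, equiv_class_of, target):
    """Lower/upper approximations of `target` under an equivalence relation.

    equiv_class_of(x) returns the set of elements indiscernible from x.
    """
    lower = {x for x in universe if equiv_class_of(x) <= target}
    upper = {x for x in universe if equiv_class_of(x) & target}
    return lower, upper

# Hypothetical: six events partitioned by a single observed attribute.
universe = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]

def equiv_class_of(x):
    return next(block for block in partition if x in block)

target = {1, 2, 3}  # the set we want to characterize
lower, upper = rough_approximations(universe, equiv_class_of, target)
```

Here event 3 is indiscernible from event 4, so {3, 4} falls in the upper but not the lower approximation: the attribute cannot decide it.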

***

But essential to the stratified theory is the application of science to an
understanding (by the topic maps, XML, and KM communities) of what the
experience of knowledge in fact is.  Topic maps started out acknowledging
the endophysics and exophysics of a machine representation of (1) things in
the machine, and (2) things outside the machine...  but so far the big step
has not been taken.

This big step is the development of computer technology that produces
small situated ontologies through a formative (just in time) process using
the tri-level (stratified) paradigm: {Memory, awareness, anticipation}.

Please direct comments ONLY to:
http://groups.yahoo.com/group/eventChemistry


****** copy posted below for others****


Dear Paul,

Thank you for introducing NQCG to this very interesting intellectual group.
Together with team members of NQCG, I have studied your intriguing
interpretation of quantum computation theory. Quantum information theory
makes use of the superposition principle of quantum mechanics. This
principle is what makes quantum computation such a novelty. The white
paper that you are referring to (NQCG WP#01) is meant to provide the reader
with a model for how to better understand how this principle can be used in
computer design to tremendously increase computational power. Practically
making use of this principle is the main motivation for research on quantum
computation. This paper is not intended to give a detailed model for a
quantum computer design, but rather to describe aspects of the relevant
mathematical physics theory that might lead to better insight into how to
develop a competitive quantum computer. We do not, however, believe it will
be sufficient to implement quantum mechanical theory to describe, or operate,
complex adaptive systems based on classical mechanics.

The description of the minimal voting procedure by systems based on
classical "theme phrases" differs from what we are trying to explain by our
tensor products of Q-matrices. By the tensor product of the two Q-matrices
at the top of page four (4) in our white paper, we simply try to describe
why the principle of superposition gives rise to the quadratic growth in the
number of outputs of a calculation when another binary Q-bit is linked to the
chain (linked by entanglement in a Q-bit system of finite length). These
links are what we describe by our tensor product formalism. For example,
the tensor product of the two Q-matrices at the top of page four (4) in our
white paper gives the square 4x4 matrix. This matrix can now be considered
the observable of the quantum system, with four eigenstates with respective
probabilities: | c1 b1 |, | c2 b1 |, | c1 b2 | and | c2 b2 | to be measured.
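The four eigenstates can be illustrated directly: the joint state of two Q-bits is the tensor product of their amplitude vectors, giving the four amplitudes c1*b1, c1*b2, c2*b1, c2*b2, and the measurement probabilities are the squared magnitudes of those products, which sum to 1.  A sketch with made-up normalized amplitudes:

```python
import math

def tensor(u, v):
    """Tensor product of two amplitude vectors."""
    return [a * b for a in u for b in v]

# Two Q-bits with (made-up) normalized amplitudes.
c = [3 / 5, 4 / 5]
b = [1 / math.sqrt(2), 1 / math.sqrt(2)]

joint = tensor(c, b)                  # four amplitudes: c1b1, c1b2, c2b1, c2b2
probs = [abs(a) ** 2 for a in joint]  # probabilities of the four eigenstates
```

Linking a third Q-bit would tensor again, doubling the state count to eight, which is the growth Axel describes.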

However, we notice the use of matrices in your models for the minimal voting
procedure system, but we miss the claim that these matrices are square,
which would enable a tensor product decomposition.



Sincerely Yours,


Axel P. Mustad, NQCG

Tel:                      +47 22 987 000
Fax:                      +47 22 987 001

E-mail:                   axel@nqcg.com
URL:                      http://www.nqcg.com



-----Original Message-----
From: psp [mailto:beadmaster@ontologyStream.com]
Sent: Saturday, January 05, 2002 9:17 AM
To: eventChemistry
Cc: Robert Shaw; Stu Hameroff; Douglas Weidner; Donn Milton; Dennis
Wisnosky; Abdul Halim; amchoi@msn.com; Inc. Fourth Wave Group; Karl H.
Pribram; Vfrizzel; chi@physiology.spb.su; rado@math.arizona.edu;
acs@math.arizona.edu; revonsuo@sara.utu.fi; alexei@nsma.arizona.edu;
p@dorrell.demon.co.uk; Dorothy Denning
Subject: [eventChemistry] Information Channels in a Quantum Computer


This group is a semi-private discussion currently with twenty members.
However, I, as moderator, will continue to try to attract leading scholars
whose work has relevance to these very interesting discussions.

Karl Pribram, Stuart Hameroff, please consider joining from:

http://groups.yahoo.com/group/eventChemistry

at least for a little while....  or call  me  703-981-2676  (Paul Prueitt)

***


In reading the single 6 page white paper from Axel's web site:

http://www.nqcg.com

I have a number of thoughts and ideas.

Let me outline, as these are not so easy to express.

I would think that the first principle in considering quantum computing is
to show where and how emergence is involved in each step, transformation,
observation, encoding, etc.

One may adopt a stratified view on this, since quantum computing (QC) is
from "our" point of view.  Let me establish this stratified view as NOT
being about QM but being about cross-scale entanglement (as an
abstraction) - specifically as a refinement of the full-text data-mining
technique known as Latent Semantic Indexing (abstracting the current theory
from a matrix-based theory to a tensor-based theory as a future theory).

It is too bad that Bjorn G. is not able to contribute here in the public
view (www.inmentia.com), but his work on this is truly exceptional.

Let me push this point a bit.  Suppose that there were nothing special about
the relationship between QM "states" (beables) and atom "events" seen as
collapsed beables, when abstractly considering cross-scale phenomena where
the ground events were at "our" level of observation.  One simply has two
levels of organization, one of them being un-observable due to the
non-formation of boundary conditions or initial conditions **in the frame of
reference** of the other level of organization.  (The question of an
ecological contribution from a "third level" is absent from QM - at least I
think it is, except as the problem of interference.  Paul W.?  Bob Shaw?)

Anyway, the notion arises as to whether one might regard normal computing
with the Q-matrix (see Axel's paper) as a rough set, as in the Prueitt
voting procedure

http://www.bcngroup.org/area3/pprueitt/kmbook/Appendix.htm

and Mill's logic as extended by Kugler and Prueitt (unpublished except on
the Internet)

http://www.bcngroup.org/area3/pprueitt/kmbook/Chapter9.htm

as the ground level.

An emergent computing process would then be the middle level into which a
collapse of the matrix into observables would be realized.  If there is a
top-down expectancy, then under-determined computation (otherwise leading to
a Turing-type halting problem) would be quantally decided as a
"discontinuity of the clustering process."  This is what Don and I are
trying to show in the eventChemistry work.

http://www.ontologystream.com/SLIP/index1.htm


So we might have an abstraction of the principle of quantum computing. But
the abstraction lets us consider cases where the environmental interference
(that is, the "fence" barring actual QC using physical devices) is not
present.  This is deployable NOW as a stratified "understanding" of

1) events from the Internet (cyber warfare)
http://www.ontologystream.com/aSLIP/files/stratification.htm

2) events in individual financial transactions (tracking terrorism using the
economic process records)

3) human perception (as computational simulations of each aspect of Karl's
Holonomic theory on human perception, Pribram (1991))

4) events on e-Business web sites
http://www.ontologystream.com/OS/G-PArch.htm

5) value chains in economic systems

6) Intellectual Property disclosures to the Patent and Trademark Office (see
www.m-cam.com)

7) mapping of IP in developing countries such as China

8) a process model for vetting basic innovations into adopted technology
following the model of the Cambridge Group (Don Campbell)



etc...



In place of this "distraction" we have a top-down expectancy from a
knowledge base.  Grossberg's Adaptive Resonance Theory provides the best
commonly known example of "computational" top-down expectancy.  Werbos's
little-understood advanced adaptive critic provides top-down expectancy in a
different way (a comment, Paul (W)?)


I believe that Dick Ballard's Mark 2 (and the even more exciting Mark 3)
knowledge encoders will provide the same type of top-down expectancy as seen
in Grossberg's ART, and in the quantum computing version of ART developed by
Walter Schempp (in Germany).

Axel, why is this important to your project on QC?  The answer is that the
physical problem of interference during the self-orchestrated collapse
(Penrose, Hameroff)

http://www.phys.ualberta.ca/~biophys/banff1997/abstracts/hameroff.html


can be seen in the under-determined emergence of clusters in the
eventChemistry engine.

The re-examination of Hameroff's microtubule simulations in this context is
also of interest simply (as if this were not ground-shaking enough) because
the problem of preserving the un-collapsed state via orchestration is
directly observable in biology.

The application of eventChemistry to mapping the social behavioral
expression in various settings (terrorism) is of critical importance in the
New War.  Ray Bradley's contribution to this effort is vital and greatly
welcomed.


Comments..?
















