Subject: KT 2002 and anticipatory web services
Kurt (Cagle) and colleagues,
I invite those on the cc list to join the **private** forum at: and to circulate this announcement. Please do not use this cc list when you forward or reply. This is by invitation only, not to start up a cc discussion.
To the Protege forum, Topicmaps-comment, Steve Pepper and Steven Newcomb: I ask that you consider submitting papers to Knowledge Technologies 2002 in support of the new paradigm, "Ecological Knowledge Management: issues, technology and social science."
**Please regard this communication as a late call for papers.**
***********
In your recent post, you described a number of variables at play in modeling anticipation as a web (.NET) service. These were:
Input granularity
Input data confidence
Processing latency
Model update size
Model confidence
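To make the discussion concrete, these variables could be bundled into a simple record with a gating rule on the two confidence values. This is only an illustrative sketch of one way to organize them; the field names, units, and threshold are my assumptions, not part of any actual .NET service:

```python
from dataclasses import dataclass

@dataclass
class AnticipationParams:
    """Illustrative bundle of the five variables above (names assumed)."""
    input_granularity: float      # e.g. seconds between input samples
    input_data_confidence: float  # 0.0 - 1.0
    processing_latency_ms: float
    model_update_size: int        # e.g. records incorporated per update
    model_confidence: float       # 0.0 - 1.0

def usable_for_anticipation(p: AnticipationParams,
                            min_confidence: float = 0.5) -> bool:
    """Toy gating rule: trust an anticipatory output only when both the
    input data and the model clear a confidence threshold."""
    return (p.input_data_confidence >= min_confidence
            and p.model_confidence >= min_confidence)

p = AnticipationParams(input_granularity=1.0,
                       input_data_confidence=0.8,
                       processing_latency_ms=120.0,
                       model_update_size=500,
                       model_confidence=0.6)
print(usable_for_anticipation(p))  # True with the defaults above
```

The point of the sketch is only that "confidence" enters twice, on the input side and on the model side, which is where the sense-making question below arises.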
To align your thinking with BCNGroup thinking, we need to consider deeply the implications of Robert Rosen's work on anticipatory systems. Rosen's work is available in most academic libraries, is referenced in my manuscript, and has been discussed in meetings between Peter Kugler, Bob Shaw and myself, and at other meetings of the school of thought founded by Shaw and his colleagues.
You have already made some reference to the problem of level of confidence. However, the notion of formal confidence leads to some discussion about sense making, or perhaps better stated "false sense making", when a formal model is used to reflect a natural system.
I ask us to pause here and reflect.
Peter Kugler's knowledge and understanding of this problem is most specific and has been developed into a set of lectures. These lectures can and should be made into a virtual course on "Ecological Knowledge Management", and given as a tutorial at Knowledge Technologies 2002 (March 2002). As one of the conference board advisors, I have made arrangements for a one-day pre-conference tutorial, including some help with buying Peter's plane tickets and paying the conference fees. We have also scheduled one or two sessions in this conference for presentations from "our school".
Shallow Link analysis, Iterated scatter-gather and Parcelation (SLIP) is a new technology I have developed for cyber warfare. But the technology is applicable to e-commerce and to scientific investigation of any signal structure, such as astrophysics or CERN-type data streams.
The issue that I have framed in the SLIP technology is how a "measurement" of the artificial world can be made. Let us NOT think about how perceptual measurement is made by the human perceptual system, or by biological systems through autopoiesis (Maturana's term).
Let us think only about how a computational system can construct a model of the invariances in a data stream.
The issue of correspondence to things in the real world can be treated AS A COMPLETELY SEPARATE ISSUE.
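As one toy illustration of what "constructing a model of invariances" might mean computationally, and emphatically not the actual SLIP algorithm, consider counting adjacent event pairs ("shallow links") in a stream and keeping those that recur above a threshold. Recurrence here stands in for invariance, and nothing in the construction refers to the real world:

```python
from collections import Counter

def link_invariants(stream, min_count=2):
    """Count adjacent event pairs in a sequence and return those that
    recur at least min_count times. Recurrent pairs serve as crude
    invariance candidates, defined purely within the data stream."""
    links = Counter(zip(stream, stream[1:]))
    return {pair: n for pair, n in links.items() if n >= min_count}

events = ["a", "b", "a", "b", "c", "a", "b"]
print(link_invariants(events))  # {('a', 'b'): 3}
```

The sketch keeps the two questions separate exactly as argued above: what the pair ('a', 'b') corresponds to in the world is a completely separate issue from the fact that the stream exhibits it invariantly.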
It is my hope that you will take on the task of editing my manuscript and getting it published, as a first step in developing a new perspective on knowledge management using computational means.
***
Please consider joining the Einstein Institute e-forum and making an introduction after reading some of the posts. All posts to the forum are reviewed and moderated, and each post must not include text from previous posts except as referenced in a scholarly fashion.
I have control of the session pre-review and thus have at least 30 days to decide which presentations to choose. All we need now is a topic title and author.
Respectfully,
Dr. Paul Prueitt
Founder (1997), BCNGroup.org
Founder (2001), OntologyStream.com
703-981-2676 (Virginia)