There is something powerful to be gained from changing the representation
we commonly work with, increasing the level of abstraction and
reification. Let's unpack this together:
From words to concepts:
When we increase the level of granularity in text processing from a focus
on words to working with concepts, we 'lift' the cognition level one step:
we are walking on an information rather than a data landscape. There are
fewer entities to deal with, and new emergent patterns and exciting
insights come from wandering around a concept rather than a word
landscape. Just take a look at what MeansBusiness has done in the
business content world!
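As a minimal sketch of this lifting step (the lexicon and all names here are invented for illustration, not taken from the original post), a hand-made word-to-concept mapping already shows the shrinkage in entity count:

```python
# Minimal sketch: lifting a word-level representation to a concept level
# via a hand-made lexicon. The lexicon entries are illustrative assumptions.
lexicon = {
    "car": "VEHICLE", "truck": "VEHICLE", "bus": "VEHICLE",
    "profit": "REVENUE", "earnings": "REVENUE",
}

words = ["car", "truck", "profit", "earnings", "bus"]

# Five word tokens collapse to two concepts: fewer entities to deal with,
# and patterns (e.g. co-occurrence) become visible at the concept level.
concepts = {lexicon.get(w, w.upper()) for w in words}
print(sorted(concepts))  # ['REVENUE', 'VEHICLE']
```

In practice the lexicon would come from a thesaurus or ontology rather than being written by hand, but the shape of the move is the same.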
From concepts to knowledge objects:
Now suppose we use a finite set of knowledge objects as our
representation. We move to more complex objects and entities, and we
start to work with:
Patterns: optimal solutions to recurring issues / forces
FAQs: answers to common questions
Enumerative descriptions: situation descriptors
Profiles: collections of pointers around objects or people
Distinctions: differences that make a difference
Best practices: what is known to work
Lessons learned: things to avoid, also called anti-patterns
Problem / solution pairs: working answers to recognized issues
Each of these 'knowledge objects' is then a validated encapsulation of
local experience and expertise. Representation has moved from the
individual to the community level, and there is distributed ownership,
validation and alignment.
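One way to picture these knowledge objects is as typed records carrying community-validation metadata. The sketch below is my own assumption about what such records might look like; the field names, the two-validator rule, and the example data are all invented:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a base record for any knowledge object, carrying
# the community that owns it and who has validated it.
@dataclass
class KnowledgeObject:
    title: str
    community: str                                   # owning community
    validators: list = field(default_factory=list)   # who endorsed it

    def is_validated(self) -> bool:
        # Invented rule: validated once at least two members endorse it.
        return len(self.validators) >= 2

# Two of the object types from the list above, as subtypes.
@dataclass
class Pattern(KnowledgeObject):
    forces: list = field(default_factory=list)  # recurring issues it resolves
    solution: str = ""

@dataclass
class LessonLearned(KnowledgeObject):
    anti_pattern: str = ""                      # what to avoid

# Ownership and validation live at the community level, not the individual.
p = Pattern(title="Escalate early", community="support-team",
            validators=["alice", "bob"],
            forces=["recurring outage"],
            solution="page on-call at the first alert")
print(p.is_validated())  # True
```

The point of the types is exactly the point of the paragraph: the unit of representation is no longer a word or even a concept but a validated, community-owned package of experience.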
At this (higher?) level we will then have shared meaning, reification and
distributed understanding as by-products (otherwise the proto-object
withers & dies!). The next step is to build an object-level ontology so we
have standards and interoperability. If we then apply inference (event,
activity, time, spatial logic & dependency reasoning) at this higher
level, and go across object types, will we not bootstrap ourselves to a
new landscape?
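A toy illustration of what "inference across object types" might mean in practice, under assumptions of my own (the topic-tag join rule and all data are invented): join a lesson learned with a pattern that shares a topic, yielding a candidate problem / solution pair.

```python
# Toy cross-object inference: given knowledge objects of different types
# tagged with topics, derive candidate problem / solution pairs by joining
# a lesson learned with a pattern on a shared topic. The tag-join rule and
# the example records are illustrative assumptions, not Grey's method.

lessons = [
    {"type": "lesson", "topic": "deployment", "text": "avoid Friday releases"},
    {"type": "lesson", "topic": "onboarding", "text": "don't skip pairing"},
]
patterns = [
    {"type": "pattern", "topic": "deployment", "text": "use feature flags"},
]

def infer_pairs(lessons, patterns):
    """Simple dependency reasoning: join the two object types on topic."""
    return [(l["text"], p["text"])
            for l in lessons for p in patterns
            if l["topic"] == p["topic"]]

print(infer_pairs(lessons, patterns))
# [('avoid Friday releases', 'use feature flags')]
```

A real system would reason over an ontology with event, time and spatial relations rather than flat tags, but even this flat join produces an object (a problem / solution pair) that existed in neither input type, which is the bootstrapping move the paragraph gestures at.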
It is rather like the observation that natural language evolves slowly in
comparison to generations of computer languages, where we can leverage
interactions with artifacts, embed functionality, employ visual and tacit
senses, and capture meaning and being in our digital creations.
Enumerative description comes from the work of Paul Prueitt:
http://www.voght.com/cgi-bin/pywiki?EnumerativeDescription
This post was sparked by Ed Swanstrom's remarks in the KMCoP Yahoo Group
Denham Grey
====================================================*
Homepage: http://www.voght.com/cgi-bin/pywiki?DenhamGrey *
KM Wiki - largest collaborative KM repository on the web - join us *
--Denham Grey <dgrey@iquest.net>
Learning-org -- Hosted by Rick Karash <Richard@Karash.com> Public Dialog on Learning Organizations -- <http://www.learning-org.com>
"Learning-org" and the format of our message identifiers (LO1234, etc.) are trademarks of Richard Karash.