Transaction Graphs

The purpose of this page is to discuss the development of ontologies and software enabling, among other things, "Transaction Graphs": visualizing and understanding interdependencies, suggesting requests, and supporting requests within graphs that use metadata.

Such an approach, layering folksonomies on top of a very basic ontology, may be seen as a kind of Bayesian network for expressing desired states and interdependencies in the form of narratives of proposed contracts. It enables querying, signing desired transaction contracts, and visualizing established and past interdependencies, building up the reputation of participating agents.

In many ways, it can be compared to a form of open emergent forum.

It can be used to help agents choose which interdependent transactions to support, based on their preferred outcomes and on all available information, hence enabling emergent forms of governance through a potential blend of relational dynamics across various interconnected economic networks.

Metadata produced by graphed interactions and transaction contracts can become currency (such as various kinds of IOUs) for non-linear exchanges.

It can be used as a tool to represent "Value Networks" and "Business Ecologies".

It can also be combined with other existing tools, such as "Ripple" (distributed clearing along trusted routes); a rough sketch of clearing along a route of IOUs follows below.

Ideally it can work alongside a Federated Database System.
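To make the graph, IOU, and clearing ideas above more concrete, here is a minimal sketch in Python. It is only an illustration under assumptions: the agent names, amounts, and the cycle-clearing routine are invented for this page and are not taken from Ripple or from any other tool mentioned here.

 # Minimal sketch: a transaction graph as a directed graph of IOU edges
 # between agents, with clearing along a route of obligations.
 from collections import defaultdict

 iou = defaultdict(lambda: defaultdict(float))   # iou[debtor][creditor] = amount owed

 def record_iou(debtor, creditor, amount):
     """Record a signed transaction contract as an IOU edge in the graph."""
     iou[debtor][creditor] += amount

 def find_route(debtor, creditor, visited=None):
     """Depth-first search for a chain of outstanding IOUs from debtor to creditor."""
     visited = visited or set()
     if debtor == creditor:
         return [debtor]
     visited.add(debtor)
     for middle in list(iou[debtor]):
         if middle not in visited and iou[debtor][middle] > 0:
             rest = find_route(middle, creditor, visited)
             if rest:
                 return [debtor] + rest
     return None

 def clear_cycle(agent):
     """Find a cycle of obligations through `agent` and cancel the common amount."""
     for creditor in list(iou[agent]):
         if iou[agent][creditor] > 0:
             back = find_route(creditor, agent)
             if back:
                 route = [agent] + back
                 cleared = min(iou[a][b] for a, b in zip(route, route[1:]))
                 for a, b in zip(route, route[1:]):
                     iou[a][b] -= cleared
                 return route, cleared
     return None, 0.0

 record_iou("alice", "bob", 10)   # alice owes bob 10
 record_iou("bob", "carol", 6)    # bob owes carol 6
 record_iou("carol", "alice", 4)  # carol owes alice 4
 print(clear_cycle("alice"))      # 4 units cancel around the alice-bob-carol loop

Running it cancels 4 units of mutual debt around the alice-bob-carol loop, which is the kind of non-linear exchange the metadata is meant to enable.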

Contact

Feel free to contact Dante

Excerpts from a conversation - reply from S.

XML Schema?

http://www.w3.org/XML/Schema

XSD Schema?

Schema Matching?

http://en.wikipedia.org/wiki/Schema_matching

Ontology Alignment? and Visualization:

http://en.wikipedia.org/wiki/Ontology_alignment#Visualization_Tools

Minimal Mappings

http://en.wikipedia.org/wiki/Minimal_mappings

Topic Maps?

http://en.wikipedia.org/wiki/Topic_Maps

Using Netention?

http://www.automenta.com/netention


1) there are several versions of the Netention prototype; the source code for all of them is on GitHub:

http://github.com/automenta

they are all written in Java; some of the versions are desktop applications, and some are web-server based.

i consider them all prototypes and very incomplete, so their value mainly lies in the potential they demonstrate.

if i continue working on it, it will probably be from scratch. and it doesn't need to be called Netention, as it's just a working codename... i'd feel fine if the ideas were transplanted into someone else's project too. so to be clear, i'm not attached to the Netention name for whatever is to become of its concepts: the philosophy, user-interface design, and algorithm design.

if you'd like to try the prototypes, i could try to package a runnable distribution. but there isn't much to evaluate besides what's in the videos. though i don't know if i made a video of the latest web-server prototype, so i might just make a video of that.

regarding transaction graphs

yes i suppose we are talking about the same overall process. but in Netention i wanted to explore how simple a user interface could provide that functionality. i concluded that all it needed was to essentially let someone describe what's on their mind in terms of actual (ex: i have) and virtual (ex: i want) objects.
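As a rough illustration of that actual/virtual distinction (a sketch assumed for this page, not taken from the Netention sources; the field names and the matching rule are invented), a description could simply be marked as "actual" or "virtual" and matched against the others by shared pattern and properties:

 # Sketch: "i have" / "i want" descriptions matched by overlapping properties.
 from dataclasses import dataclass, field

 @dataclass
 class Description:
     author: str
     mode: str                     # "actual" (i have) or "virtual" (i want)
     pattern: str                  # e.g. "bicycle", "housing"
     properties: dict = field(default_factory=dict)

 def matches(virtual, actuals):
     """Actual descriptions whose pattern and properties satisfy a virtual one."""
     return [a for a in actuals
             if a.mode == "actual"
             and a.pattern == virtual.pattern
             and all(a.properties.get(k) == v for k, v in virtual.properties.items())]

 have = Description("seth", "actual", "bicycle", {"city": "Brussels"})
 want = Description("dante", "virtual", "bicycle", {"city": "Brussels"})
 print(matches(want, [have]))     # a possible transaction to propose back to both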

hybrid of both an object-oriented and a prototype-oriented system

2) with regard to the existential programming article (which i haven't read in detail yet) http://existentialprogramming.blogspot.com/2010/08/not-all-properties-are-created-equal.html

i'm well aware of the shortcomings of object-oriented systems. in contrast, there are prototype-based programming languages:

http://en.wikipedia.org/wiki/Prototype-based_programming

i would consider Netention to be somewhat of a hybrid of both an object-oriented and a prototype-oriented system (i was using the term Pattern to describe a template, several of which can be utilized when describing one Object=Idea=Thought=Concept ...), in that the ontology only suggests, but does not require, which properties=aspects=details one may specify (i.e. whatever details come to mind about a particular idea).

this means that Netention ontologies could be evolved passively by a group of people utilizing the system. the ontologies would also be probabilistic, in the sense that pattern Pa is likely to have property X with 45% probability and pattern Pb is likely to have property X with 3% probability, etc... these probabilities would determine the ordering and visibility of certain user-interface parts. though, these ontologies would benefit from being seeded by existing ontologies, many of which are available, for example:

http://protegewiki.stanford.edu/wiki/Protege_Ontology_Library#OWL_ontologies
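A rough sketch of the probabilistic ordering idea described above (purely illustrative; the patterns, properties, and counting scheme are assumptions, and none of this comes from the Netention code):

 # Sketch: estimate how often each property is filled in for a pattern, and
 # order the user-interface fields for that pattern by that frequency.
 from collections import Counter, defaultdict

 observations = [                      # (pattern, properties someone filled in)
     ("Bicycle", {"city", "owner", "condition"}),
     ("Bicycle", {"city", "owner"}),
     ("Bicycle", {"owner"}),
     ("Housing", {"city", "rooms"}),
 ]

 counts = defaultdict(Counter)         # pattern -> how often each property appears
 totals = Counter()                    # pattern -> number of described instances

 for pattern, props in observations:
     totals[pattern] += 1
     counts[pattern].update(props)

 def property_order(pattern):
     """Properties of a pattern, most frequently used first, with their probability."""
     return sorted(((prop, n / totals[pattern]) for prop, n in counts[pattern].items()),
                   key=lambda pair: -pair[1])

 print(property_order("Bicycle"))      # owner (3/3) first, then city (2/3), condition (1/3)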

Netention descriptions would, in effect, be a programming language for reality. you could describe how things are, and how you would like things to be, and it would solve for whatever changes are necessary to bring your desired reality into existence. this would happen for everyone, piece by piece, so conflicts could be detected and resolved in the open, like a wikipedia of the world's desires.

Semantic Web and Netention

From a conversation on the Global Survival mailing list. Seth's reply to Tom:

Tom : "Semantics basically deals with the analysis of linguistics and the semantic web basically is formatting text in a manner that is easier for machine interpretation, to return in searches or route traffic, perhaps. Maybe somebody could correct me it I am mistaken. It does reflect maybe a small subset of something else that I think we want to pursue, the availing of relevant information to where it is needed but I think that will involve a lot more than just text."

Seth : "i think that's an accurate description. to me, "semantic web" aka "web 3.0" is a buzzword for technology that can involve graph-based data structures as opposed to, for example, trees and matrices (row x column, like SQL databases).

another important aspect of semantic web is increased "granularity" of data. for example, instead of only linking pages as the original HTML was designed, individual concepts (within and without pages) can be linked.

machine readability is achieved through an ontology, which is a way for computers to agree on the (semantic) "meaning" of the (syntactic) data.
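A small illustration of those three points, using the rdflib Python library (rdflib is not mentioned in the conversation; it is simply one common way to try this out, and the people and statements below are made up):

 # Sketch: concepts as URIs, granular links between them, and meaning fixed by
 # the ontologies (here FOAF plus a made-up example namespace) the URIs point to.
 from rdflib import Graph, Literal, Namespace

 EX = Namespace("http://example.org/")
 FOAF = Namespace("http://xmlns.com/foaf/0.1/")

 g = Graph()
 g.add((EX.dante, FOAF.name, Literal("Dante")))
 g.add((EX.seth, FOAF.name, Literal("Seth")))
 g.add((EX.dante, FOAF.knows, EX.seth))            # a link between concepts, not pages
 g.add((EX.dante, EX.wants, EX.transactionGraphs)) # granular, machine-readable statement

 print(g.serialize(format="turtle"))               # the same graph as exchangeable text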

in terms of the evolution of software on this planet, we are only beginning to experiment with user interfaces that take advantage of semantic-web technologies. Netention is one such proposal and set of prototype software that is designed to generate machine-readable semantics of personally-relevant concepts, which compose the "story" of your life and its potential (imaginary) futures. everyone's stories can then be aggregated and interlinked to form suggestions of how you can realize the futures you have described. at this point, we need more developers, user-experience designers, philosophers, and basically anyone with any real life experience to take a look at the design and either validate or invalidate its principles. if it's valid, then developers ought to help build it.

i see Netention and the Global Survival System as being two aspects or modes or user interfaces of the same system, and I have an intuition that Tom's CIM is another, as well as other projects like maybe the "Pull Platform" - though all hopefully can be seamlessly transformable into each other within a common open and resilient (p2p / distributed / decentralized / federated) architecture.

the ideal and simplest interface for Netention would involve total natural language (NL) interpretation - but it would need a way to allow a human to confirm that the input was interpreted correctly and correct it if not. in the meantime, we can use a semi-natural language interface that is somewhere between NL and RDF-like semantics to allow people to freely express their "stories" aka personal narratives.
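One way to picture such a semi-natural-language layer (an assumed sketch; the sentence patterns and field names are invented for illustration, and a real Netention interface may look nothing like this):

 # Sketch: a tiny "semi-natural language" step between free text and RDF-like
 # statements, producing something the author can confirm or correct.
 import re

 PATTERNS = [
     (re.compile(r"^i have (a |an )?(?P<thing>.+?)( in (?P<place>.+))?$", re.I), "actual"),
     (re.compile(r"^i want (a |an )?(?P<thing>.+?)( in (?P<place>.+))?$", re.I), "virtual"),
 ]

 def interpret(sentence, author):
     """Turn a short 'i have / i want' sentence into an RDF-like statement."""
     for pattern, mode in PATTERNS:
         m = pattern.match(sentence.strip())
         if m:
             statement = {"subject": author, "mode": mode, "object": m.group("thing")}
             if m.group("place"):
                 statement["place"] = m.group("place")
             return statement          # shown back to the user for confirmation
     return None                       # otherwise, ask the user to rephrase

 print(interpret("i want a bicycle in Brussels", "dante"))
 # {'subject': 'dante', 'mode': 'virtual', 'object': 'bicycle', 'place': 'Brussels'}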

to be able to express all aspects of human life, and to achieve ubiquitous machine readability, requires a rich ontology aka schema aka vocabulary. fortunately there are many available to use, and Netention should support all of them while also being a tool for editing them and creating new ones. at the same time we should find ways to simplify the diversity of the semantics for computational efficiency and accuracy, and this might be achieved through the use of "upper ontologies" like SUMO."

Vocabularies

FOAF

GoodRelations

IEML

http://www.akasig.org/2008/05/14/pierre-levy-vs-tim-berners-lee-round-01/

"The main difference between URIs and IEML identifiers is that IEML identifiers are semantically rich. They carry meaning. "


"Zero Exchange" platform development

http://zeroexchange.sourceforge.net/en/book.html

Mutually Positioning Contracts?

A parallel with a bitmap or voxel approach, with positions of URIs (agents, contracts, ...) based upon their positions relative to other elements.

Voxel

http://en.wikipedia.org/wiki/Voxel

"As with pixels in a bitmap, voxels themselves do not typically have their position (their coordinates) explicitly encoded along with their values. Instead, the position of a voxel is inferred based upon its position relative to other voxels"

Seth:

Voxels are an alternative to 3D polygon models; they can be converted back and forth, but with some loss of exactness.

check these out too:

Metaballs

http://en.wikipedia.org/wiki/Metaballs

Dante:

I wonder how this could be used to build visualization systems for, for example, databases of inter-dependent contracts?

http://wiki.blender.org/index.php/Doc:Manual/Modeling/Metas

Meta objects are implicit surfaces, meaning that they are not explicitly defined by vertices (as meshes are) or control points (as surfaces are): they exist procedurally.
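A minimal sketch of the implicit-surface idea, assuming a classic metaball-style falloff (the centres, radii, and threshold are invented): strongly related elements could visually merge into one blob, which hints at how inter-dependent contracts might be visualized.

 # Sketch: a metaball-style field from a few centres; the "surface" is wherever
 # the summed influence crosses a threshold, so nearby blobs merge smoothly.
 import math

 centres = [((0.0, 0.0), 1.0), ((1.5, 0.0), 1.0)]   # (position, radius) per ball

 def field(x, y):
     """Summed influence of all metaballs at a point (inverse-square falloff)."""
     total = 0.0
     for (cx, cy), r in centres:
         d2 = (x - cx) ** 2 + (y - cy) ** 2
         total += r * r / d2 if d2 > 0 else float("inf")
     return total

 def inside(x, y, threshold=1.0):
     """A point is 'inside' the blended shape where the field exceeds the threshold."""
     return field(x, y) >= threshold

 print(inside(0.75, 0.0))   # midpoint between the two balls: blended together -> True
 print(inside(3.0, 3.0))    # far away -> False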

Some more links

Under "Reference Maps"

http://www.delicious.com/deliciousdante/ReferenceMaps
http://www.delicious.com/deliciousdante/ProcessDimensions

and more recently:

http://www.delicious.com/deliciousdante/ReQuest
http://www.delicious.com/deliciousdante/ontology

Is there a way to (mathematically?) enable objects to be mutually defined by a folksonomical approach?

Perhaps a solution can be found through a mixture of each of these three approaches? Enabling several vectors, and enabling each object to keep a history of its changes based on a unique URI, yet have the objects positioned within multidimensional vector graphs based on their "mutual positioning"? With the possibility of zooming out into various levels of abstraction / dimensions based on some kind of multidimensional reference system? (A rough sketch of one possibility follows after the reply below.)

A reply:

the IEML Dictionary seems like a very informative and complete "top level ontology"

Netention's fundamental distinction between 'actual' and 'hypothetical' descriptions of reality is exactly IEML's 'actual'/'virtual' distinction.

(http://automenta.com/netention)

If this is the case, then Netention could be considered a very valuable application of IEML, which would also be an opportunity to bridge IEML and RDF (semantic web).

re: dynamic reference point, i think you might be discussing the difference between a subjective and objective description system? quantum physics implies the necessary involvement of an observer (subjective)
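Returning to the earlier question about objects being mutually defined through a folksonomy, here is one rough illustration under assumptions (the objects and tags are invented): each object is described only by the tags attached to it, so similarities and "positions" emerge from the tagging rather than from fixed coordinates.

 # Sketch: objects "mutually defined" through a folksonomy; an object's position
 # is just its tag set, and similarity between objects emerges from shared tags.
 import math

 tags = {
     "ex:contract42": {"food", "brussels", "weekly", "cooperative"},
     "ex:contract57": {"food", "brussels", "transport"},
     "ex:agentDante": {"brussels", "cooperative", "ontology"},
 }

 def similarity(a, b):
     """Cosine similarity between the tag sets of two objects."""
     shared = len(tags[a] & tags[b])
     return shared / math.sqrt(len(tags[a]) * len(tags[b]))

 print(similarity("ex:contract42", "ex:contract57"))  # closer: food + brussels shared
 print(similarity("ex:contract57", "ex:agentDante"))  # further: only brussels shared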

Further Links

The social-semantic web (s2w)

http://en.wikipedia.org/wiki/Social_Semantic_Web

It aims to complement the formal Semantic Web vision by adding a pragmatic approach relying on description languages for semantic browsing using heuristic classification and semiotic ontologies. A socio-semantic system has a continuous process of eliciting crucial knowledge of a domain through semi-formal ontologies, taxonomies or folksonomies.

Topic Maps

http://en.wikipedia.org/wiki/Topic_Map

The semantic expressivity of Topic Maps is, in many ways, equivalent to that of RDF, but the major differences are that Topic Maps (i) provide a higher level of semantic abstraction (providing a template of topics, associations and occurrences, while RDF only provides a template of two arguments linked by one relationship) and (hence) (ii) allow n-ary relationships (hypergraphs) between any number of nodes, while RDF is limited to triplets.
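To make the n-ary point concrete (the names and roles are invented for illustration): a three-party exchange is a single association in a Topic Maps view, while with plain RDF triples the same association has to be turned into an intermediate node with one triple per role.

 # Sketch: one n-ary association vs. its decomposition into binary RDF triples.
 association = {
     "type": "exchange",
     "roles": {"giver": "ex:alice", "receiver": "ex:bob", "witness": "ex:carol"},
 }

 # the same association as RDF-style triples, via an intermediate node ex:exchange1
 triples = [
     ("ex:exchange1", "rdf:type", "ex:Exchange"),
     ("ex:exchange1", "ex:giver", "ex:alice"),
     ("ex:exchange1", "ex:receiver", "ex:bob"),
     ("ex:exchange1", "ex:witness", "ex:carol"),
 ]

 for s, p, o in triples:
     print(s, p, o)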

Computer Supported Cooperative Work

http://en.wikipedia.org/wiki/Computer_Supported_Cooperative_Work

Over the years, CSCW researchers have identified a number of core dimensions of cooperative work. A non-exhaustive list includes:

Awareness: individuals working together need to be able to gain some level of shared knowledge about each other's activities.

Articulation work: cooperating individuals must somehow be able to partition work into units, divide it amongst themselves and, after the work is performed, reintegrate it.

Appropriation (or tailorability): how an individual or group adapts a technology to their own particular situation; the technology may be appropriated in a manner completely unintended by the designers.

Stigmergy

http://en.wikipedia.org/wiki/Stigmergy

Stigmergy is a mechanism of indirect coordination between agents or actions. The principle is that the trace left in the environment by an action stimulates the performance of a next action, by the same or a different agent.
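A toy illustration of that mechanism (assumed for this page, not a quoted model; the routes and numbers are invented): agents never communicate directly, they only read and reinforce the traces left in a shared environment.

 # Sketch: stigmergic coordination; each action reads the traces left by earlier
 # actions and leaves a stronger trace behind, with no direct communication.
 import random

 environment = {"route_a": 1.0, "route_b": 1.0}   # trace strength per option

 def act(env):
     """Pick an option with probability proportional to its trace, then reinforce it."""
     pick = random.random() * sum(env.values())
     for option, strength in env.items():
         pick -= strength
         if pick <= 0:
             break
     env[option] += 0.5                           # the action leaves a trace behind
     return option

 for _ in range(50):                              # fifty agents acting one after another
     act(environment)

 print(environment)   # whichever route got early reinforcement tends to stay ahead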

Other Tools

http://mindraider.sourceforge.net/presentations.html

MindRaider's mission is to organize not only the content of your hard drive and favorite areas of the web, but also your cognitive base and social relationships, in a way that enables quick navigation, concise representation and inferencing.

VUE = Visual Understanding Environment

http://vue.tufts.edu/

XML based

http://www.akomantoso.org/

Akoma Ntoso is a set of simple, technology-neutral XML machine-readable descriptions of official documents such as legislation, debate record, minutes, etc. that enable addition of descriptive structure (markup) to the content of parliamentary and legislative documents.

http://www.akomantoso.org/rss-manager/africa-i-parliament-action-plan-has-recently-released-the-version-2.0-of-akoma-ntoso-standard

Global Survival List

http://groups.google.com/group/global-survival?hl=en

Dante's contact:

dante -dot- monson -at- g mail