REVUE  E.J.D.E.  ISSN  1776-2960




Hypermediating sites: towards new forms of technology intelligence? Between collective intelligence and the semantic web

L. Verlaet et S. Gallot (LERASS-Céric (EA-827), Université Paul Valéry, France.)

Summary. In this article we present a new form of access to knowledge through what we call "hypermediator websites". These hypermediator sites are intermediate devices between information systems that merely digitize book culture and a "real" hypertext writing format.


Keywords: hypermediator website, hypertext, semantic web, collective intelligence, markup models, information visualization, technology of intelligence.


I.     Introduction


These last two decades have propelled our societies into the digital age and their contemporaries into hyperconnection. The advent of digital information and communication technology seemed like a great opportunity for the diffusion and transmission of knowledge. Were we not supposed to be living through a knowledge revolution? The Web pioneers would finally see their dream come true: increasing human cognitive abilities, allowing humanity to expand its knowledge and thus accelerate its evolution.

Although many efforts are being made in this direction, we must admit that the dissemination procedures adopted by new technologies remain, for the most part, very far from the issues of knowledge transmission. For some, it amounts to simple digitization, still in the grip of the bookish volume (a culture from which it is difficult to emancipate oneself after several millennia of reign). For others, it is the deviation toward simplified and standardized uses of technical tools, which partially neglect their potential to serve knowledge. Finally, let us not forget that the tools of the social Web are still resolutely based on telecommunication principles.

If technique can nowadays, in theory, do almost everything, the challenge is to rethink the dynamic organization and management of knowledge by examining the "natural" models of knowledge-construction processes, in order to reorganize them "artificially" in an adequate way. Anthropologists of communication have demonstrated that linguistic thought is interdependent with anthropological thought, the latter being related to ontological thought. Following the orchestral metaphor (Winkin, 1981, pp. 5-6), the semantics of a field of knowledge constitute, in essence, the musical notes used to structure the harmony of a culture. This synchronization reflects the culture, which depends on the environment in which the actors evolve, on their ecosystem. As Goody (2007) indicates, language is a "technology of the intellect"; in this perspective, semantics, understood as the study of a language and its system of meaning, is at the heart of this technology of the intellect, the one that has allowed humanity to evolve and organize itself through information and communication.

In this sense, our reflection meets the formal approaches of hypertext (Rety, 2005) in order to develop new forms of technology intelligence through the concept of the "hypermediator website", which is based on the principles of collective intelligence (Lévy, 1994) and operates according to the precepts and tools of the Semantic Web. After reviewing the theoretical and practical models that led us to conceive this hypermediator website, we will explain the search and browsing possibilities offered by this device.

II.     Theoretical and practical models of "hypermediator websites"

A.     Formal approaches of hypertext

With the rise of the Internet and the democratization of its tools, hypertext has become synonymous with "hyperlink". Although the latter is omnipresent on the Web and within digital devices, its use is far from what the precursors of the Web imagined when they invented the concept of hypertext: its role is now often reduced to that of an interactive table of contents or a bookmark. Following the example of Rety (2005), we believe that a return to the formal approaches of hypertext is necessary in order to scientifically reposition this concept within its early definitions. Hypertext is a way to forge constructive links between ideas, between units of meaning stemming from non-sequential writing. It has to be considered as a set of text fragments of variable granularity, semantically interconnected; in fact, the "hyperlink" should be thought of as only one part of hypertext. This non-sequentiality is described as most likely to match the associative workings of human memory, but also as the foundation of a properly digital writing (Bush, 1945; Nelson, 1981, 1999).

Following the lineage of the pioneers, we see in the man/machine interface a considerable opportunity to augment, transmit and manage knowledge. We support the idea that hypertext can account for the complex reality of a field of knowledge and that it can accompany human reasoning.


B.     Principles of Semantic Web

Aware of the excesses of the current Web, Berners-Lee and the W3C launched the Semantic Web project in the early 2000s. "The Semantic Web is an extension of the current web in which information is given well-defined meaning, better enabling computers and people to work in cooperation" (Berners-Lee et al., 2001). Its aim is simple: to give machines a semantic scope by tagging digital content using XML and organizing it with the RDF and ontology layers of its architecture. This has the effect of clarifying and organizing information content, facilitating its processing by computer systems and, thereafter, its consultation by users. Another purpose of the Semantic Web project, and not the least, is to ensure interoperability between computer systems for the sharing and reuse of information. According to its creators, the culmination of the Semantic Web is home automation, that is to say, the set of automated techniques and technologies facilitating everyday life.

We have taken a particular interest in XML, whose flexibility and manoeuvrability truly allow us to think up a hypertext system, and in knowledge organization.


C.     Principles of collective intelligence

Lévy (2002, pp. 243-245) defines intelligence as a "power of self-creation", which develops from a cognitive point of view through the "capacity for autonomous learning and evolution". It emerges from a "circular and self-generating interaction process between a large number of complex systems". In fact, "intelligence is always the result of a collective entity, which is both numerous and interdependent"; he thus underlines the pleonastic character of "collective intelligence". Lévy states that "major stages of cultural evolution correspond to mutations in the process of collective intelligence, almost always linked (in a complex way, and in a mode of circular causality) with mutations in the life of language", and insists that "language is precisely what makes culture - that is to say collective intelligence deliberately working towards its own development - possible". Lévy also raises the problems inherent in cultural prejudices postulating that intelligence is "the property of individuals", a phenomenon that has important resonances within scientific and technical communities (among others).

We fully agree with the path Lévy takes when he states that the digital sphere is a more than favourable area in which to study the knowledge emanating from collective intelligence, and thus to reveal current trends. Lévy's vision of collective intelligence is also reminiscent of the orchestra model advanced by Winkin (1981), itself inspired by the theatrical metaphor of Goffman (1975): everyone must listen and tune in to the others in order to form a melody in unison (the dominant culture), dissonances sometimes being part of the game (subsidiary cultures).

Our work draws on collective intelligence to uncover the knowledge governance that rules a given field.


D.     Principles of “hypermediator websites”

Our argument considers the governance of knowledge as an "artifact" of the human thought system, as an extension of memory. The process of knowledge transmission, knowledge governance, appears as the result of a mediation between the "reader", his system for organizing knowledge and the knowledge-organization system of a given "learned society", in which the technical tool is simply a supporting mediator facilitating the process: a tool conceived and organized by humans for humans, aiming to make access to knowledge easier. Through this governance of knowledge, we seek to develop new forms of intelligence technologies.

The principles of hypertext, of the Semantic Web and of collective intelligence form a theoretical and practical background on which we have drawn to develop the idea of "hypermediator websites". We define "hypermediator websites" as artifactual spaces of mediation that link together: a field of knowledge, non-sequential knowledge, the reader and the mediator-tagger. Such a site is an intermediary device between the digitization of book culture (the current Web) and "real" hypertext writing (tomorrow's semantically organized Web). An intermediary device because it is not intended, at least for now, to be a standalone device, but should rather be considered a "mediator" website (Davallon & Jeanneret, 2004). However, an important distinction must be made between "mediator website" and "hypermediator website". While the first acts as a portal composed of hyperlinks pointing to other external portal websites, the second is conceived as a website complementary and intrinsic to a collection of documents, offering an effective treatment of its corpus in order to extract new meaning from it.

A hypermediator website considers not only the full texts or the elements of the paratext but also the units of meaning of the concepts that compose them. As Clément specifies, it is "the conceptual layer of hypertext which is the most characteristic and which justifies the appellation of 'intellectual technology', because by organizing the data in a system that binds them together, it gives them a meaning and results in new information" (2007, p. 3). Hypermediator websites intend to allow the reader to learn, to compare ideas, to understand the interactions between concepts, and so on, and thus give rise to a reflection that is both deeper and more specific to a certain field of knowledge.

Hypermediator websites therefore aim to restore, through a specific scenarization, the knowledge stemming from the grouped intelligence of the authors represented within the digital documentary collection being processed. To reach this goal, a semantic markup of the corpus must first be conducted. This step allows us to collect not only data but information about this data. Indeed, data has no real interest on its own; it is only once collected, organized and interrelated that it takes on its full meaning and becomes information (Mazza, 2009). As we will see, the markup model we propose picks up and discloses enough contextual clues about information for it to become knowledge. The confrontation between the received information and our relevance system (Schütz, 1987), or our frame of reference (Goffman & Kihm, 1974), ensures the construction of meaning and the development of contextual intelligence.

The particularity of our work lies in the approach chosen to apprehend, process and analyze the corpus. Indeed, we chose to adapt the analysis tools generally used to study situations and systems of communication between social actors, in order to transpose them to their writings, the products of their knowledge. Furthermore, it seems important to note that we are interested in scientific literature and, more specifically, scientific articles. We share the view of Farchy et al. (2010, p. 10) that the "article retains its role of fundamental building block in modern science".

III.             Hypermediator website’s markup model

The hypermediator website is based on the markup model of the Semio-Contextual Approach of Corpus (SCAC), a model formulated from an approach based on the contextualization of information: semiotic context analysis (Mucchielli, 2005), a method of contextual and cognitive analysis of communication situations. For the purposes of this article and our demonstration, we will only present a portion of the markup model, the one dedicated to the contents of the corpus, that is, the units of meaning concerning concepts. Note that the SCAC model is a generic model which considers the same elements whether applied to the paratexts, the themes or the authors of the collection of articles (Verlaet, 2011). Nor will we address here the transposition of the semio-contextual analysis of communication situations to the SCAC markup model (Verlaet, 2010).


A.     The SCAC markup model

The SCAC markup model was designed to identify the relevant units or fragments of meaning supporting concepts, in order to meet the reader's information need in a search relating to encyclopedic or terminological issues of a field of knowledge. "Culture in the anthropological sense can be summarily understood as a grid for reading the world and therefore a matrix of representations, of knowledge, of beliefs and behaviors transmitted by the family, by the group or by society" (Max Weber quoted by Engelhard, 2012, p. 135). The contexts highlighted by semio-contextual analysis act as a reading grid that lets the meaning of a communication situation between actors emerge. We have inserted this reading grid within the SCAC markup model in order to let the meaning of the concepts found in scientific articles emerge.










Units of meaning: tags identifying a significant fragment of a concept

<norm id="concept X">
   Fragment of information stating the definition of the concept.

<stakes id="concept X">
   Fragment of information explaining the stakes and objectives of the concept.

<position holonym="concept X" meronym="concept Y"> or <position hypernym="concept X" hyponym="concept Y">
   Fragment of information marking a specification relation between concepts.

<relations a="concept X" b="concept Y" type="link">
   Fragment of information exposing a relation (other than specification).

<time id="concept X" date="2009">
   Fragment of information disclosing timeline clues about the concept.

<spatial id="concept X" lieu="France">
   Fragment of information linking place and concept.

<quote id="concept X" auteur="auteur" reference="reference">
   Fragment of information indicating a quote borrowed from an author about a concept.

Fig. 1. XML markup grid of concepts according to the SCAC model


The table above therefore describes the XML markup grid that the tagger has to follow, it being understood that the identification of meaning can only be revealed by a human action.
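To make the grid concrete, here is a minimal sketch, using only Python's standard library, of how fragments tagged according to Fig. 1 could be collected per concept. The tagged excerpt is entirely hypothetical; only the tag and attribute names come from the SCAC grid above.

```python
import xml.etree.ElementTree as ET

# Hypothetical SCAC-tagged excerpt (illustrative only; the tag names
# <norm> and <relations> and their attributes follow the grid in Fig. 1).
tagged = """
<article>
  <p>Systemics emerged in the twentieth century.
     <norm id="systemic">Systemics studies phenomena as wholes of interacting parts.</norm>
     <relations a="systemic" b="framing" type="depends on">Systemic analysis depends on the framing of the situation.</relations>
  </p>
</article>
"""

root = ET.fromstring(tagged)

# Collect every tagged unit of meaning, keyed by the concept it concerns
# (the 'id' attribute, or 'a' for relation tags).
fragments = {}
for elem in root.iter():
    concept = elem.get("id") or elem.get("a")
    if concept is not None:
        fragments.setdefault(concept, []).append((elem.tag, elem.text.strip()))
```

Here `fragments["systemic"]` would hold the `<norm>` and `<relations>` units in document order, ready for the recomposition step described below.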


B.     The mediator-tagger

The role of the tagger is very important. It is essential that he be an expert in the field, since it is necessary to fully understand the conceptual universe inherent to the scientific corpus. It is the tagger's expertise that ensures the relevance of the fragments of information. The tagger's position is also a sensitive one, because his expertise directs his reading, which is necessarily under the influence of his social framework, of his frame of reference. All the difficulty lies in mobilizing this frame of reference whilst respecting the thoughts and writings of the authors. The tagger is considered an actor who can readily understand the corpus but who is also sufficiently neutral and honest to be the mediator between author and reader. The tagger is perceived as performing a fine architect's job, binding knowledge into a shared common framework that he helps to define and, as we will see, to model.

It is not the tagger who is in charge of evaluating the reliability or validity of the scientific content; his task is focused solely on identifying relevant text fragments according to the SCAC markup grid. The reliability and validity of concepts and their interconnections are provided by the collective intelligence of the authors, through recurrence and frequency.


C.     Scenarization through “recomposed documents”

The SCAC markup model allows us to obtain different fragments which correlate to concepts; they can therefore be transcluded (Nelson, 1999). In other words, it becomes possible to decontextualize them from their original article (the source document) in order to recontextualize them within a new document, a recomposed document with a singular meaning. This document recomposition is made possible by XSLT stylesheets, which filter the fragments of information as well as sort, organize and arrange them according to the design of the device. In this case and at present, the "hypermediator website" offers "concept records" that assemble in a single interface all the fragments of information related to a concept, and categorize them according to the nature of the tags and therefore of their meaning.
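In the device itself this recomposition is performed by XSLT stylesheets; as an illustrative stand-in, the same filter-and-categorize logic can be sketched in a few lines of Python. The fragment data and source labels below are hypothetical.

```python
from collections import defaultdict

# Hypothetical fragments as (source_article, tag, concept, text) tuples.
# Keeping the source article preserves the traceability the device requires.
fragments = [
    ("Doe 2009", "norm",   "framing", "Framing is the delimitation of the situation under study."),
    ("Roe 2010", "stakes", "framing", "Framing aims to make the problem identifiable."),
    ("Doe 2009", "quote",  "framing", "To frame is to choose what counts."),
]

def concept_record(concept, fragments):
    """Recompose a 'concept record': every fragment about one concept,
    categorized by the nature of its tag, each kept traceable to its source."""
    record = defaultdict(list)
    for source, tag, concept_id, text in fragments:
        if concept_id == concept:
            record[tag].append({"text": text, "source": source})
    return dict(record)

record = concept_record("framing", fragments)
```

The resulting `record` groups the fragments under `norm`, `stakes` and `quote`, mirroring the categorized single-interface presentation of a concept record.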




Fig. 2. Example of “concept record”.

Here, the fragments of information outline the relations between the concept of "framing" and other concepts.


If, in practice, the principle of "decontextualization-recontextualization" is close to a copy-and-paste of fragments of information, it is distinguished from it by the traceability of the information fragments back to their original article. This traceability respects the intellectual property of the authors but also allows the reader to consult the source article if necessary. The scenarization chosen for presenting the fragments to readers, the concept records, follows the same direction. Moreover, it has been developed so as to accompany the reader through the information hyperspace and in the building of his knowledge.

Although interesting at this stage of development, the hypermediator website still borrows too much from the bookish volume and does not allow the reader to have a meta view of the conceptual universe of a field of knowledge.

IV.             The complex system approach

A. Fundamental principles of complex systemics: from human systems to knowledge systems

    The paradigm of complexity and the contributions of the systemic approach allow us to take a fresh look at knowledge systems, in the direct lineage of the precursors' hypertext ideas. The knowledge of a given field is not sequential: although it is structured by authors around concepts, in paragraphs and articles, it forms a connected whole; it is the institution of the field's knowledge. The holistic vision of a knowledge system and its constituent subsystems offered by the complex systems approach allows us to envisage the organization of knowledge and its visualization from a new angle.

    The complex systemic vision is generally relevant for reporting on and understanding a complex reality co-constructed by actors in a situation of interaction. The researcher models the system and the subsystems it maintains in order to provide an understanding of the structure and dynamics of the interactions that make up the situation. From our point of view, this structural-functional conception of systemics necessarily goes hand in hand with the idea of a collective construction of meaning. The system is therefore understood as an "intelligible and finalized tangle of interdependent activities"; it allows a phenomenon to be perceived in its consistency, its unity and its internal interactions (Le Moigne, 1999, p. 30), and can be described in terms of elements that are decomposable, nearly decomposable or non-decomposable.

    We take up the idea of non-decomposability in order to argue that, by studying the meanings relating to this co-construction of reality, the systemic approach may also be relevant for representing and analyzing the construction of meaning in a "relevant system", be it a communication system or a knowledge system.

In this sense, systemics, as it applies to organizing our understanding of the communicative action of human groups, can equally help organize and report on the semantic construction of a global knowledge system. To do this we base ourselves on one of the fundamental axioms raised by the Palo Alto School (Watzlawick, Beavin & Jackson, 1979). In a complex-systemic acceptation, communication consists of two dimensions: content (information) and relationship (link). It is between these two dimensions that meaning emerges, by contextualizing information and adding, through their relation, meta-communication. Thus information and its linkage, its relations, allow knowledge to be organized and given meaning with the aim of understanding it within the global system.

    This complex vision includes in particular the hologrammatic principle, which postulates that the systemic whole is included in the part and the part in the whole (Morin & Le Moigne, 1999), making central the idea of "reliance" (Morin, 1990). Our thesis is therefore the following: if complex systemics allows us to explore and understand the meanings relating to a co-construction of situational and communicative "reality", it may also be relevant for representing and analyzing the semantics of a field of knowledge. This, by taking an interest in the "information" and the "relation" between items of knowledge (micro-wholes) and in how they are organized to construct a relevant knowledge system (the whole). Each part has its singularity but is nonetheless a "fragment" of the whole; the parts are virtual "micro-wholes". The hologrammatic principle transposed to information amounts to postulating that knowledge is included in understanding and understanding in knowledge. Each item of knowledge has its own singularity, but items of knowledge are also "particles" (concepts), "plots of knowledge" that we need to organize and connect in order to make sense of them.


B.     Systemic modelization of knowledge

Whether to participate in a situation, solve a problem, understand a phenomenon, grasp a concept or build knowledge, humans organize their thoughts according to schemes. These schemes are constructed on the basis of their recurrences and allow them to recognize, to understand and therefore to "know". According to Valéry, to understand a phenomenon, humans cannot escape developing a "natural" modelization process. It therefore seems quite plausible to transpose this natural organizing phenomenon "artificially".

The systemic approach induces this modelization process of "thinking of phenomena as a system"; it then allows us to consider, design, visualize and process the organization of knowledge. Its universal character makes it possible, based on the semantics of a field of knowledge, to modelize socio-scientific thought and to reveal "patterns".

Patterns (Simon, 1979, 1981) are organizing models, recurring complex shapes (Bateson, 1979) used to represent an ontological thought. "The pattern is an organized and organising form identified by a cognitive act of perception" (Le Moigne, 1999, p. 47). Based on this ontological thought, "pattern recognition" (Ibid.) becomes an anthropological culture and then an organized and structured semantic culture. It becomes possible to think of the complex construction and restitution of knowledge systems based on patterns, which Le Moigne calls "gestalt patterned", implying the inseparability of product and process (Ibid.).

While it is undeniable that the process of semantization is inherently highly complex, unpredictable and unstable, it is nevertheless modeled upon a "common cultural contextual framework" if one refers to Le Moigne's definition: "to modelize is the act of intentionally elaborating and constructing, through the composition of symbols, models that make a complex phenomenon intelligible and amplify the reasoning of the actor planning a deliberate action within the phenomenon; reasoning aimed at anticipating the consequences of possible action plans" (1999, p. 5). This definition of the modelization process seems fruitful enough for thinking up the modelization of knowledge in a given field. For the reader, the field of knowledge is a complex system composed of texts, author networks, schools of thought, concepts and contexts, which are involved in societal effects across spatial and temporal contexts. Here the hypermediator website offers models of knowledge organization precisely to amplify the reader's reasoning, the acquisition and assimilation of knowledge, and the understanding of phenomena or action.


C.     Conceiving intellectual paths within a domain of knowledge

According to Simon, our memory works on the accumulation of traces formed by our experience, our past knowledge, which we recombine in problematic situations. These recombinations of traces constitute what he calls "pathing maps". Each recombination, depending on the problem to be solved and on the context, generates a different, more appropriate recombination based upon these traces, which "form" the basic structure of our knowledge. On this basic structure, by organizing its elements, we can build an infinite number of maps, each map then being seen as a pattern in which the problem becomes solvable or intelligible. In seeking to uncover and model these traces via the SCAC markup model, we aim, through the hypermediator website and the markup work, to present "pathing maps" in the complex system of a field of knowledge while respecting the "natural" process of discovery, design and construction of knowledge.

If the idea of a "map" does not fail to resonate with the idea of visualization, Simon describes memory as a library of symbols, patterns, shapes and metaphors that can combine, recompose to infinity and inform one another (Demailly, 1999). If human memory is not expandable, no one today denies that techniques can play the role of an external memory, even of a "universally shared library" in reference to Otlet: a library of signs, symbols and knowledge, no longer congealed in the shackles of the volume but made dynamic through networking and sharing, a form of collective memory in which it would be possible to propose courses of reading. The hypermediator website and semantic markup can concretely conceive this library; they can offer it directly as a map thanks to information visualization.

How can pathing maps be built? The SCAC markup model can point out interconnections between concepts, but above all it focuses on the typification of semantic relations. This markup model offers a structure in correlation with the units of meaning identified in the documents; the relations between concepts can provide a modelization, a mapping, of the knowledge domain. As in the systemic study of human systems, it is the recurring behavior, the relational form that repeats itself, which is of interest. However, applying systemics while omitting recessive or isolated facts would be an oversimplification. The use of statistical weighting distinguishing strong, moderate and weak recurrences, as developed in the work of Simon (1977), helps reveal the complex architecture of knowledge.
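The weighting of recurrences can be sketched as follows; the relation triples and the numeric thresholds are hypothetical illustrations, not the thresholds actually used for the article's 33-article corpus.

```python
from collections import Counter

# Hypothetical typed relations extracted from the markup: each occurrence
# of the same (concept a, concept b, link type) triple is one recurrence.
relations = [
    ("systemic", "framing", "depends on"),
    ("systemic", "framing", "depends on"),
    ("systemic", "framing", "depends on"),
    ("framing", "problem", "allows to identify"),
    ("framing", "problem", "allows to identify"),
    ("actor", "context", "evolves in"),
]

def weight(count, strong=3, moderate=2):
    """Classify a tie by its recurrence count (illustrative thresholds)."""
    if count >= strong:
        return "strong"
    if count >= moderate:
        return "moderate"
    return "weak"

# Strong, moderate and weak ties between concepts, by recurrence.
ties = {triple: weight(n) for triple, n in Counter(relations).items()}
```

Isolated relations are kept and simply marked "weak", consistent with the point above that omitting recessive facts would oversimplify the system.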

V. Hypermediator website and information visualization

Following the precepts of the systemic approach, we sought to model the conceptual universe outlined in our corpus. The latter is composed of 33 scientific articles from the field of information and communication sciences, representing over 500 pages. These articles deal with the theme of organizational communication.

The analysis of the data from the SCAC markup model allowed us to identify 149 concepts within the corpus and 100 interrelations between concepts. Among these 100 interrelations, we count 29 meronym/holonym relations, 6 hypernym/hyponym relations and 7 analogy relations; the other interrelations between concepts are associative ones. To visualize the knowledge system extracted from our corpus, we used the free software Gephi, which allows data to be processed graphically and analyzed statistically. "We are witnessing the emergence of a fourth paradigm of scientific research based on the intensive use of computation on large volumes of numerical data" (Bell, 2009, quoted by Le Crosnier, 2010, p. 54). "Data analysis covers a wide range of activities taking place at various stages of progress of these operations, including the use of databases (as opposed to sets of raw files which the databases can access), analysis and modeling, and finally, data visualization" (Ibid.).
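As a minimal sketch of the hand-off to Gephi, the weighted concept links can be written out as a CSV edge list, a format Gephi's spreadsheet import accepts (Source/Target/Weight columns). The edges below are hypothetical examples, not values from the corpus.

```python
import csv
import io

# Hypothetical weighted edges between concepts (weight = recurrence count).
edges = [
    ("systemic", "framing", 3),
    ("framing", "problem", 2),
    ("actor", "context", 1),
]

# Build the CSV in memory; in practice this would be written to a file
# and loaded through Gephi's import spreadsheet dialog.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["Source", "Target", "Weight"])
writer.writerows(edges)
csv_text = buf.getvalue()
```

Gephi can then size nodes and edges from the `Weight` column, which is how the strong/moderate/weak distinctions become visible in Figures 3 to 5.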


A.     Global system visualization

The <position> and <relations> tags inherent in the SCAC markup model allow us to extract from the corpus all the links between concepts expressed by the authors. We were thus able to modelize the global system of the concepts composing the corpus via a statistical weighting of the links (strong, moderate or weak) between nodes.



Fig. 3. Visualization of the global system


   In relation to our sample, this statistical weighting reveals only four strong concepts, whose recurrence is between 10 and 14, and 3 moderate concepts (between 5 and 9 recurrences), the vast majority of concepts (142) having a low recurrence (4 or fewer).


B.     Focussing on emergent patterns of the corpus

The statistical weighting also allows the dominant patterns of our corpus to emerge. Figure 4 highlights the interconnections existing around the strong concept "systemic". The links between concepts are more or less marked according to, on the one hand, their recurrence and, on the other, the consensus among the authors on the typification of the link.


Fig. 4. Pattern “Systemic”


We note, in this regard, the consensus on the relation between the concepts of "systemic" and "framing": for the collective intelligence inherent in our corpus, the concept of "systemic" "depends on" the concept of "framing".



Fig. 5. Pattern "framing"


   The study of the "framing" pattern highlights the relation that the concept of "framing" maintains with the concept of "problem": "framing" "allows to identify" the "problem". According to this pathing map, we understand, through the collective intelligence, that since "systemic" depends on "framing", "framing" is itself needed in order to identify the "problem".


C.     Perspectives of navigation for the reader

Thus, statistical weighting not only reveals the complex architecture of knowledge relating to a domain, but also allows the reader to focus on the patterns that compose it. These patterns are so many paths, possible pathways, for the reader. The information visualization of the overall system and of the patterns allows the reader to navigate freely within the information area according to his interests, whilst offering constructed courses of reading formalized from the collective intelligence. A hypermediator website thus offers the reader two ways of accessing information. The first is based on documents recomposed from the fragments of information, the "concept records", which are ultimately an aggregation of informational content previously categorized and kept traceable. The second is based on an information visualization of the knowledge system, which allows the reader to understand the global or specific structure of a field. The two aforementioned ways of accessing information obviously interact: the actor may at any time switch from the information visualization to the recomposed documents and vice versa. As such, the hypermediator website is fully involved in the reader's co-construction of knowledge and can claim the title of intelligence technology.

VI.             Conclusion

Our position is therefore clear: we support the idea of going back to the source, to the organization of the system of thought, to the "communication" of knowledge and ideas, and to the construction of meaning as an iterative process of knowledge construction. In addition, we consider that the organization of knowledge, the construction of knowledge, is part of a process of knowledge modeling. In doing so, we put forward information visualization as an "artificial" model close to intra-psychic processes, intended to organize knowledge and to formalize the relations between items of knowledge, offered not sequentially but through a network that takes into account the links and relations between the different units of meaning, in order to navigate through the content without losing the fundamental idea of constructed meaning.

Indeed, while technique allows us to capitalize information, the process of knowledge construction, the organization of "natural knowledge", can only be approached by taking into account the complex processes of meaning construction. Semantics is at the heart of the project: the hypermediator website, as we organize it, meets the initial ambitions of hypertext and of the semantic web, and must be considered a technology of intelligence placed at the service of knowledge. Thanks to "semantic markup", such a mediation space makes it possible to conceptualize, link, organize, capitalize, restore, enhance, and transmit a world of knowledge while taking into account, first of all, the complex human process of meaning construction.

Finally, while our current corpus is far from constituting a large mass of data, its treatment nevertheless suggests interesting perspectives for understanding and analyzing the knowledge field in question, or for querying several areas of knowledge in order to extract the concepts acting as "boundary objects" (Carlile, 2002, 2004) between these areas.
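By way of illustration only, the extraction of candidate "boundary objects" shared by two knowledge areas could be sketched as an intersection of weighted concept vocabularies; every name, weight, and threshold below is a hypothetical assumption, not part of the authors' method:

```python
# Hypothetical concept vocabularies extracted from two knowledge areas,
# mapping each concept tag to its statistical weight in that area.
area_a = {"hypertext": 0.8, "semantic web": 0.6, "markup": 0.3}
area_b = {"semantic web": 0.7, "ontology": 0.5, "markup": 0.4}

def boundary_objects(a, b, threshold=0.3):
    """Concepts carried by both areas with sufficient weight in each:
    candidates for Carlile-style boundary objects between the two fields."""
    return sorted(
        tag for tag in a.keys() & b.keys()
        if min(a[tag], b[tag]) >= threshold
    )
```

Raising the threshold narrows the candidate set to concepts that are genuinely load-bearing in both areas, which is the kind of filtering a larger corpus would make meaningful.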


[1]   G. Bateson. Mind and Nature: A Necessary Unity, Bantam Books, 1979.

[2]   T. Berners-Lee, J. Hendler, O. Lassila. "The Semantic Web", Scientific American, May 2001.

[3]   D. Berthier. Le savoir et l'ordinateur. Paris, France : l'Harmattan, 2002.

[4]   V. Bush. "As We May Think", The Atlantic Monthly, vol. 176, no. 1, Boston, 1945, pp. 101-108.

[5]   P. Carlile. "A Pragmatic View of Knowledge and Boundaries: Boundary Objects in New Product Development", Organization Science, 2002, 13(4), pp. 442-455.

[6]   P. Carlile. "Transferring, Translating, and Transforming: An Integrative Framework for Managing Knowledge across Boundaries", Organization Science, 2004, 15(5), pp. 555-568.

[7]   J. Clément. « L'hypertexte, une technologie intellectuelle à l'ère de la complexité », in C. Brossaud and B. Reber (eds.), Humanités numériques 1. Nouvelles technologies cognitives et épistémologie, Paris: Hermès Lavoisier, 2007. Online.

[8]   H. Le Crosnier. Internet : la révolution des savoirs. Paris: La Documentation française, 2010.

[9]   J. Davallon, Y. Jeanneret. « La fausse évidence du lien hypertexte », Communication et langages, vol. 140, no. 1, 2004, pp. 43-54.

[10]       A. Demailly. “Herbert Simon ou la quête des patterns pour voir et concevoir”, Synergies Monde, n°6, pp. 89-94, 2009.

[11]       J. Farchy, P. Froissart, C. Méadel. « Sciences.com – libre accès et science ouverte », introduction, Hermès, no. 57, 2010, pp. 9-12.

[12]       E. Goffman, A. Kihm, Les rites d'interaction. Paris: Editions de Minuit, 1974.

[13]       E. Goffman, A. Kihm. Stigmate : les usages sociaux des handicaps. Paris: Editions de Minuit, 1975.

[14]       J. Goody. « L'oralité et l'écriture », Communication et langages, vol. 154, no. 1, 2007, pp. 3-10.

[15]       B. Juanals. La culture de l'information: du livre au numérique, Paris, France : Hermès science publications, 2003.

[16]       P. Lévy. L'intelligence collective : pour une anthropologie du cyberespace. Paris: La Découverte, 1994.

[17]       P. Lévy. Cyberdémocratie. Paris : Odile Jacob, 2002.

[18]       R. Mazza. Introduction to Information Visualization, Springer, 2009.

[19]       J.L. Le Moigne. La modélisation des systèmes complexes, Paris, France : Dunod, 1999.

[20]       J.L. Le Moigne. « Edgar Morin, le génie de la reliance », Synergies Monde, no. 4, 2008, pp. 177-184.

[21]       A. Mucchielli. Etude des communications : approche par la contextualisation. Paris, France: Armand Colin, 2005.

[22]       T.H. Nelson. Dream Machines. South Bend, IN: The Distributors, 1974.

[23]       T.H. Nelson. Literary Machines, Sausalito, California : Mindful Press, 1981.

[24]       J.H. Rety. « Ecriture d'hypertextes littéraires : approche formelle et approche pragmatique », Actes de H2PTM 2005, Paris : créer, jouer, échanger : expériences de réseaux, 2005. Online.

[25]       F. Rouet. Le livre: mutations d'une industrie culturelle. Paris, France : La Documentation française, 2000.

[26]       A. Schütz. Le chercheur et le quotidien. Paris : Méridiens Klincksieck, 1987.

[27]       H.A. Simon. Models of Thought, vols. I & II, New Haven: Yale University Press, 1979.

[28]       H.A. Simon. The Sciences of the Artificial. Cambridge: MIT Press, 1969.

[29]       H.A. Simon. « Sur la complexité des systèmes complexes », in The Philosophy of Science Association, vol. 2, 1977, pp. 507-522. French translation available online.

[30]       L. Verlaet, La recherche pertinente sur le Web. Concevoir un dispositif d’information adapté aux activités cognitives des lecteurs. Editions Universités Européennes, 2011.

[31]       L. Verlaet, Stratégie de balisage générique pour des bases de données contrôlées : le modèle par l’ASCC. In Amos David, dir. L’Intelligence Économique dans la résolution de problèmes décisionnels, collection Hermès Sciences Publications, Edition Lavoisier, Chapitre 8, 2010, pp.187-212.

[32]       P. Watzlawick, J. Helmick-Beavin, D. Jackson. Une logique de la communication, Paris, France : Seuil, 1979.

[33]       Y. Winkin. La nouvelle communication, Paris, France : Seuil, 1981.








Article submitted September 17, 2013.

Lise Verlaet is affiliated with LERASS-Céric (EA-827), Université Paul Valéry, Route de Mende, 34 199 Montpellier cedex 5, France. (e-mail :

Sidonie Gallot is affiliated with LERASS-Céric (EA-827), Université Paul Valéry, Route de Mende, 34 199 Montpellier cedex 5, France. (e-mail :