Rhetoric, Epistemology and the 'Net: The Ethics Of Web Publishing
Marc Demarest
November 1996
We booksellers, if we are faithful
to our task, are trying to destroy, and are helping to destroy,
all kinds of confusion, and are aiding our great Taskmaster to
reduce the world into order, and beauty, and harmony
Daniel Macmillan, Memoir of Daniel Macmillan (1882)
Zeta interdimensional spacecraft will beam up people as an evacuation of the Earth. Everything from our Etheric body up will be collected (our physical body being left behind) and we will use the hybrid Zeta-human physical bodies as a 'shell' to adjust and transform to beings of Light until the new dimension is fully prepared.
Posting to an Internet list service (1996)
How Do We Know?
Information wants to be free. And information,
being free, is also dangerous, destabilizing, and potentially
deadly.
Pierre Salinger's recent attempt to use documents published on the Internet to bolster his claim that TWA Flight 800 was downed by one or more US missiles highlights one of the fundamental problems with the Web as a community of self-publishers: it is no longer possible for readers, whatever their level of sophistication or education, to believe what they read: to know, in a fundamentally actionable way, that they can act on the basis of published information without doing substantial harm to themselves or others.
I would argue that this loss of epistemological surety is due, in large part, to the fact that the 'Net, and particularly the Web, truncates an older system of publication that - whatever its limitations - acted as a significantly valuable safeguard against the publication of "information" that was, in one form or another, toxic to the reader: productive of, as Daniel Macmillan suggested, "confusion" in one form or another. The people operating inside the conventional publishing system - whatever their politics or mercenary motives - also abide, in the main, by codes of conduct and moral standards that, in aggregate, protect readers from toxic information by suppressing it prior to publication.
The Web doesn't do that: toxic information
is scattered all over the Web, in obvious and not-so-obvious forms.
Our challenge, if we want to see the Web grow and fulfill its communal self-publication potential, is to find ways of containing that toxic information without resorting to centralized censorship. We also have to understand, in rock-bottom fundamental ways, what new kinds of "reading skills" the Web requires of us and future generations, and begin to teach the hermeneutic and rhetorical skills required to produce proficient, critical Web readers.
A Thumbnail History Of The Publishing System
If we think about it for a while, it
becomes apparent that the World Wide Web represents a radical
truncation of the value chain that stands between an author --
a person with content to publish and the desire to publish it
-- and that author's audience.
As a rough historical framework, we might think of the evolution of this writer-reader relationship as one of progressive intermediation: from the direct exchange of manuscripts between writer and reader, through the rise of the printer, to the modern publishing house and its many intermediaries.
The history of the publishing value chain is, then, a history of the distancing of writer from reader, and a history of an increasingly elaborate set of filters and stabilizing mechanisms in the chain between reader and writer, most of which served primarily economic functions and operated in one way or another on the saleability of the publisher's product: the book. This network had other, perhaps less intentional, effects, which become clear when we critique the traditional publishing value chain: the way books get to the shelves of bookstores, or articles into the pages of newspapers or magazines.
The Web Versus Traditional Publishing
Seen in the light of historical schemata and critiques like this, the Web is both a radical return to a point in the value chain's evolution before the advent of the printer, and a superior system of publication.
In some very significant ways, the Web permits every author to act as her own printer, and the Web publishing infrastructure can be subjected to scrutiny and control, as the printer was in the 1700s and early 1800s, only by focusing on the point of presence (POP) provider used by the author/printer. The Web's claim to superiority over the traditional publication value chain seems to rest on the fact that the Web does not discriminate: publication is treated as a fundamental democratic right; the traditional publication value chain excludes more voices and views than it admits to public discourse (there is, after all, a strong correlation in any large social context between one's ability to publish to an audience and one's ability to make one's self heard to any effect); and the Web allows anyone with a command of a language (English, mostly) and the ability to use a few relatively simple tools (a browser, HTML) to self-publish: to be heard in a forum used by 20 million people or so worldwide.
All of this is undoubtedly good, from
a political perspective, at least at first glance. It's clear
to most people what we gain when we begin to contemplate the Web
as the basis for ubiquitous world-wide self-publication.
The questions I want to raise are these: what do we lose when that traditional value chain is truncated, and what should we do about the loss?
The Value Of The Traditional Publishing Value Chain
If the traditional publishing value chain has, over the last two hundred years or so, censored and suppressed information that ought to have been made available to the public, and participated in various ways in the publication of misinformation and disinformation, that value system has also worked to stabilize and make reliable most of what constitutes "knowledge" in the West today.
Although we like to say, "don't
believe everything you read," the fact is that we do believe
- that is to say, we are willing to act - on much of what we read.
An article in Mother Earth News about the efficacy of a particular
herb in curing headaches, for example, leads a substantial percentage
of the readership of that publication to try the herb, without
much conscious thought about whether the herb will, in addition
to curing headaches, cause brain damage or hair loss. Because
the publishing system that produces Mother Earth News has both
vetting functions (editors) and legal status, its readers can
make use of the information it provides with some degree of certainty
that the information in the article is substantially accurate, materially complete, and safe to act on.
Cases in which these rules are found not to apply - as was the case several years ago with a 60 Minutes story on Washington apples - are (a) widely reported, and (b) swiftly remedied.
From a theoretical perspective, the
traditional publishing system answers, implicitly, some very fundamental
questions for the reader about what we might call quanta
of information. Quanta of information, in our society, draw boundaries
around themselves, assert implicitly or explicitly their completeness
or self-sufficiency, and can be transported as such. Newspaper
articles, books, journal articles, Web pages, mail messages, news
postings, documents are all quanta.
About these quanta, the traditional publishing value chain provides the reader with several kinds of warranties: a rhetorical warranty (the author is who she claims to be, and is competent to speak on the subject), an epistemological warranty (the quantum is materially complete and accurate), and a warranty of redress (if the quantum proves false or harmful, the reader has some avenue of remedy).
We don't need to go any farther than the critiques of Noam Chomsky and others to understand that what is true of the traditional publishing system in the large is not true in the small. Newspapers have blatant political biases and exercise sociopolitical agendas in their editorial work; magazines more than occasionally collaborate with governments and corporations to 'spin' particular kinds of information to particular ends; and publishers are sometimes the willing agents of people and organizations with behavioral designs on readers. None of these critiques, however, can shake the fundamental epistemological tenet of Western culture: if it's in a book, it must be good and true, and if it turns out to be false, I can do something about it.
The Shortcomings Of The Web Publishing System
About a Web page, or a posting on an Internet list service, the reader cannot say the same thing.
First of all, the rhetorical warranty provided by the traditional publishing value chain is completely undone by the Web. Anyone - anyone, that is, with access to the Web - can say anything she likes about any topic, without any check of her credentials. One need look no further than the fantastical conspiracy-theory communities thriving on the Web to be convinced that people who know virtually nothing about the history of the planet are hacking, dismembering and reconstituting the world-historical record with impunity, and for an eager - if smallish - audience. Consider, for example, this excerpt from a recent list service posting:
Posted by: ray@strategicsw.com
Posted to: SNETNEWS@XBN.SHORE.NET
Posted on: Wednesday, February 28, 1996 4:31 PM
What is the New World Order?
1700's - Illuminati (Adam Weishaupt-Founder, Jesuit Priest and Freemason) Illuminati name translated to "bearers of the light" - Lumen derived from Lucifer, ancient "angel of light" spoken of in the old testament.
1800's - FreeMason/Illuminati Organizations: Rothschilds/Jacob Schiff Nathan Rothschild vows to kill Czar of Russia and his family.
1900's - Illuminati: Rothschilds/Cune, Loeb & Co. (Jacob Schiff)/Rockefellers
1913 - Federal Reserve Act put into law by Rockefellers on Dec 24 1913. Only 3 congressmen were present as it was Christmas Eve.
1913 - 16th Amendment (IRS TAXES) added to Constitution.
1917 - Czar of Russia is killed by Bolshevik revolutionaries. Lenin, Trotsky and Stalin are financially backed by Jacob Schiff with 20M in Gold (paid by Rothschilds/Illuminati).
1920 - League of Nations proposed by Woodrow Wilson.
1921 - COUNCIL ON FOREIGN RELATIONS (CFR) created by Rockefellers/New York, deemed as Illuminati Organization in US.
1921 - Royal Institute for International Affairs created by Rothschilds/London.
1929 - Rothschild/Rockefellers/Carnegie/Morgan (CFR) created stock market crash, worldwide depression ensues.
1933 - President Roosevelt (CFR) declares US. bankrupt - Signs over US. monetary power to world bankers (Rothschilds/Rockefellers - Illuminati)
1939 - Hitler Invades Poland - Financial backing by Rothschilds/Warburgs/Krupps.
1939 - Rothschild companies financially back both Hitler and Stalin for World War II.
1941 - US. enters World War II (planned by Rothschild/Schiff/Rockefeller/Roosevelt.
Rhetorically, this quantum presents itself as historical fact, when it is really an egregiously improbable
(mis)interpretation, and quite certainly dangerous to employ as
fact in most normal social contexts [1].
Secondly, the epistemological guarantee provided by the traditional publishing value chain is completely undone by the Web. Any quantum of information found on the Web has to be assumed, by a careful reader, to be both incomplete (missing information of a fundamental sort) and inaccurate (that is to say, subject to independent verification).
Finally, the Web offers its readers no real warranty of redress. If I find an Internet information quantum (like the one quoted at the beginning of this essay) that suggests we are about to be taken into the heavens by a benevolent alien race to find spiritual fulfillment, I cannot seek redress. I cannot have that Web page decommissioned - that is to say, have that content removed from circulation - unless it libels me personally. I cannot enter into the community from which the quantum originated and refute it, in part because it is part of the fundamental rhetoric of that community that anyone offering contradiction to such tripe is by definition an "asset": an agent of the conspiracy to hide, in this case, the imminent arrival of the Zeta-Reticulans. And even if I could refute the quantum within the community, I have no ability to force the author to retract his publication publicly.
The Product Of Web Publishing: Gems, Junk and Poison
Like the traditional publishing value chain, the Web publishing system has to date yielded relatively few superlative works, and a sea of crap: badly-written, unimportant, meandering, blathering, forgettable junk. That, as far as I can see, is the nature of publishing systems generically, and not something remediable.
However, the Web produces in abundance
what the traditional publishing value chain produced only rarely:
toxic information, quanta that if believed and acted on
will produce real material harm to the reader, the reader's community
or society at large.
And when the Web produces this toxic
knowledge, it cannot (as the traditional publishing value chain
nearly always does) clean itself up; in fact, the Web is structured
so as to spread, rather than contain, the toxicity.
Who Censors Whom?
These shortcomings of the Web have been used, and will continue to be used, as grounds for centralized
censorship of Web content. That is not what I want to argue
for, if by censorship one means the centralized vetting
of Web content by some organization.
What I want to argue for is, first, a new, ethical model of Web publishing, voluntarily adopted by Web authors, and second, a new model of critical Web reading, deliberately taught to present and future readers. These two things will produce a kind of censorship: readers who are capable of censoring writers by dismissing or discounting Web quanta that, on the basis of critical examination, are not "information" but misinformation, disinformation or decontextualized information that, if acted on, produces undesirable consequences for the reader or others. This distributed, reader-based censorship is the only kind of censorship I can see that is in keeping with the fundamental philosophy of the Web.
A New Model Of Information
First of all, we need to clarify the terms information, misinformation and disinformation.
Secondly, we need to revisit the issue of context. Most Web pages and Internet postings are decontextualized: their authors do not fully describe the context in which their comments should be situated (including pointers to contrary or mediating statements), and the wanton habits of forwarding and reposting exacerbate this decontextualization by allowing forwarders to "re-author" texts by clipping and chopping them, for the most part silently. Where we have, consciously or otherwise, assumed that traditional publishing systems produced contextualized information, we have to assume, as a state of nature, that all 'Net-based publications are decontextualized, and we have to provide mechanisms within the Web itself for readers to apply context to pages and postings: to do what hypertext theorists have called annealing.
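To make the notion of annealing concrete, here is a minimal sketch, in Python, of how reader-applied context might be represented as data attached to a quantum rather than embedded in it by its author. Everything here - the class names, the fields, the example URL - is a hypothetical illustration, not an existing Web mechanism:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Quantum:
        """A published unit of information: a page, posting or message."""
        url: str
        author: str
        published: str  # e.g. "1996-02-28"

    @dataclass
    class Annealing:
        """Context attached to a quantum by a reader, not by its author."""
        annotator: str
        scope: str   # "narrow" (who speaks, about what, to whom) or "broad" (discourse history)
        note: str    # the contextualizing statement itself
        counter_sources: List[str] = field(default_factory=list)  # pointers to contrary or mediating statements

    @dataclass
    class AnnealedQuantum:
        """A quantum together with the context readers have applied to it."""
        quantum: Quantum
        annotations: List[Annealing] = field(default_factory=list)

        def anneal(self, annotation: Annealing) -> None:
            self.annotations.append(annotation)

    # A reader attaches narrow context to a decontextualized posting.
    posting = Quantum(url="news:SNETNEWS/0000", author="unknown", published="1996-02-28")
    annealed = AnnealedQuantum(posting)
    annealed.anneal(Annealing(
        annotator="a-critical-reader",
        scope="narrow",
        note="A repost; the original thread and the author's credentials cannot be verified.",
        counter_sources=["http://example.org/a-refutation"],
    ))

The essential point of the design is that annotations accumulate alongside the quantum rather than altering it, so that later readers inherit whatever context earlier readers have managed to reconstruct.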
Part of this contextualization model is a distinction between narrow and broad contexts. The narrow context - who is speaking, speaking about what, speaking to whom - can often be reconstructed by a well-educated, well-equipped reader (see below), but the broad context - what discourse is this speech-act a part of, and what is the history of this discourse - can almost never be reconstructed by anyone (often including the author). As suggested below, the Web publisher has the responsibility for explicitly exposing the narrow context of any published quantum, and we need to build into the Web a mechanism for allowing readers to expose, to the extent they are able, the broader context of which any quantum is a part.
Thirdly, we need to revisit the fundamental question of veracity: can this quantum be true under any probable set of circumstances, and can it be acted on? This is somewhat beyond conventional notions of epistemology, in which justified, true belief is what constitutes knowledge. Because the Web is behavioral - because people come to the Web looking for information so as to be able to behave in a particular way - we have to ask not just "Is this justified, true belief?" but "Can I act on the basis of this quantum, and to what effect?"
That discussion will lead us quickly
to the discovery of a continuum between benign misinformation
and toxic misinformation or disinformation: a qualitative estimation
of the effect a quantum of information may have, in or out of
context, on a reader/auditor not able to evaluate the narrow or
broad context of the quantum, or the veracity of the quantum itself.
Out of this discussion about what information, in being free, has become, we would come, I think, to several inescapable conclusions about what ethical Web publication and competent Web reading require.
A New Model Of Ethical Publishing
To optimize their readers' ability to evaluate published material, Web authors should always: identify themselves and their credentials; expose, explicitly, the narrow context of any quantum they publish; cite their sources [2]; and provide their readers with some means of response and refutation.
None of these guidelines will prevent the spread of the toxic information so prevalent on the 'Net, so I would in addition suggest that some non-profit organization design and install on the Web a consumer safety system, in which Web publishers and 'Net posters voluntarily register their acts of publication and submit their materials to evaluation by their readers, and within which any reader can retrieve peer-reader ratings and evaluations of any page or posting.
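As a rough illustration of what such a system might look like, consider the following Python sketch of a voluntary registry. The interface (register, rate, evaluation) and every name in it are invented for this example; a real system would need durable storage, reader identity and resistance to abuse, all of which this toy omits:

    from collections import defaultdict

    class ConsumerSafetyRegistry:
        """Sketch of a voluntary registration-and-rating system for Web quanta."""

        def __init__(self):
            self.registrations = {}           # url -> publisher
            self.ratings = defaultdict(list)  # url -> list of (reader, score, comment)

        def register(self, url: str, publisher: str) -> None:
            """A publisher voluntarily registers an act of publication."""
            self.registrations[url] = publisher

        def rate(self, url: str, reader: str, score: int, comment: str = "") -> None:
            """A reader submits an evaluation of a registered page or posting."""
            if url not in self.registrations:
                raise KeyError("not registered: " + url)
            self.ratings[url].append((reader, score, comment))

        def evaluation(self, url: str) -> dict:
            """Any reader can retrieve peer-reader rating information on a quantum."""
            scores = [score for _, score, _ in self.ratings[url]]
            return {
                "publisher": self.registrations.get(url),
                "ratings": len(scores),
                "average_score": sum(scores) / len(scores) if scores else None,
            }

    # Usage: a publisher registers, readers evaluate, and any reader can look up the result.
    registry = ConsumerSafetyRegistry()
    registry.register("http://www.noumenal.com/marc/toxic.html", "marc@noumenal.com")
    registry.rate("http://www.noumenal.com/marc/toxic.html", "reader-1", 4, "well argued")
    print(registry.evaluation("http://www.noumenal.com/marc/toxic.html"))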
A New Model Of Reading
This new model of reading strikes me as critical to future social welfare. Although most adult readers, stumbling upon the examples of toxic information I have noted above, would immediately be able to spot them for what they are - drivel - I am convinced, by watching my son and other younger readers navigate the Web, that we are raising a generation of children so poor in critical, analytical, hermeneutic and historical skills that we will be faced, in a decade or so, with millions of readers who are not able to differentiate between the veracity of a NASA finding about life on Mars and the fantasies of the Zeta-Reticulan fans.
To remedy this situation, I suggest we need, at a minimum, to begin teaching the hermeneutic, rhetorical, analytical and historical skills that produce proficient, critical Web readers, and to make the Web's mechanics - how quanta are produced, forwarded and re-authored - part of what every reader learns.
Ultimately, the Web succeeds as the
publishing system for the next several generations only if it
is used by readers who understand how the Web operates, what it
provides, and what it cannot provide, and who further understand
their obligations as Web readers (and Web authors).
Conclusions
These remarks are incomplete, and, frankly,
motivated by fear. That people believe in Zeta Reticulans does
not cause me heartache (or amusement); that a respected journalist
and former Presidential press secretary doesn't know how to use,
and not use, information published on the Internet frightens me
to death. My objective in publishing this is simple: to begin
a discussion, to spark thought and hopefully additional work in
these areas.
People will no doubt argue that I have
made use of edge-case examples; that UFOs and Illuminati conspiracies
do not reflect the state of knowledge on the Web today. For readers,
I offer a list of sites that I believe contain toxic information,
and leave it to those readers to judge whether or not this problem
is confined to the fringes of Web discourse.
People will also argue that I have made
a difficult problem - how do we know what is true? -- too easy.
That may be true, since I am concerned not so much with truth
as with what happens when untutored readers ingest junk and act
on it. It is certainly easy to make this sort of problem infinitely
more difficult by playing epistemological games - how, for example,
do I know that the person who published the tripe on Zeta Reticulans
isn't telling the truth? How do I know the Rothschilds don't control
the world monetary system? How do I know the Holocaust really
happened? How do I know that Darwin is right? How do I know that
sticking a coat hanger in one's ear is a bad thing to do?
The practical test is a simple one: if we can act on a published quantum of information without damage to ourselves, others or our social groups, then it is admissible to discourse. If we cannot act on it, the quantum is benign (and pointless). If acting on it harms us - whether that harm is intellectual confusion or death - or harms others, the quantum is toxic information, and ought to be expunged from public discourse.
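Stated mechanically, and granting that the judgments of "actionable" and "harmful" are exactly the hard part, the test reduces to something like this hypothetical Python encoding:

    def classify_quantum(actionable: bool, harmful_if_acted_on: bool) -> str:
        """One possible encoding of the practical test for a quantum of information."""
        if not actionable:
            return "benign"      # cannot be acted on: pointless, but harmless
        if harmful_if_acted_on:
            return "toxic"       # acting on it damages the reader, others or social groups
        return "admissible"      # can be acted on without damage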
I challenge anyone to demonstrate that a country in which Zeta Reticulan-facilitated exodus is a matter of fact, acted on by the citizenry, can play a meaningful, productive role in the world, produce healthy, well-adjusted citizens, conduct meaningful scientific or social endeavors, or do anything other than degenerate into a society of glossolalian dingbats.
NOTES
[2] Citations continue to be the best
mechanism, on the Web, to judge the veracity of postings and pages.
Scurrilous tripe nearly always has no citations beyond the unnamed
"informed sources" that are the sine qua non
of bullshit.
[3] These e-texts are an example of toxicity operating at a silent level. The way a text becomes an e-text is determined by copyright laws. People interested in, say, putting the work of Joseph Conrad on the Web will be looking for versions of his texts no longer subject to copyright restriction. The fact that the sources of these texts - say, the first edition of his first novel, Almayer's Folly - are hopelessly polluted, when compared to the manuscript or to later, authorially-edited editions, by bad printing, silent editorial emendation, wholesale omission and other kinds of errors is almost never pointed out by the e-text creator. In fact, it is very difficult to find e-texts that even explicitly cite their origin, or discuss the textual variants associated with the e-text. Those variants are often significant, as is the case with, say, Shakespeare or Thomas Wolfe. Admitting bad literature to the electronic canon pollutes the canon, and can materially mislead readers.
Last updated on 06-22-97 by Marc Demarest (marc@noumenal.com). The authoritative source of this document is http://www.noumenal.com/marc/toxic.html