Twitter and Facebook

When I went to Twitter today, it displayed a dialog

We were hoping you could help us make it easier for people to discover
their friends and colleagues on Twitter. Review your settings below to
make sure the people you care about can easily find you.

asking me to update my name, bio, location and email fields.

This suggests that both Twitter and Facebook are insecure about each other, seeing strengths in the other and weaknesses in their own service. Twitter feels threatened by Facebook's focus on a true circle of friends and colleagues. Facebook feels threatened by Twitter's capacity for marketing and building followers in public.

It suggests they may eventually become very similar in the features they offer, with Twitter integrating photos, video, and circles of friends, and Facebook making its content more public (which it is already doing). Perhaps both sites will give users more control over who can see what content.


Twitter and A Flock of Seagulls, Publishing in a Networked World

I'm not going to name the site that got me started writing this post. It's a sentiment I've seen on many sites with a traditional publishing orientation. They follow the old tradition from the age of print, where all submitted works must be "not published elsewhere," requiring "first print" rights and demanding that every "reprint" (copy) cite the publisher as the place of first publication (what is this, vanity?).

These guidelines ignore the reality of the new age of immediacy, of information abundance, of venue abundance: the network. There is no scarcity in publication, and no value in "first publication" or artificial scarcity on the network. The document is the conversation; the conversation is the document. The old publishing world is gone; stop trying to hang on.

The attitude simply does not fit with a universe of networked information being shared and reshared by millions of people, winding its way in bits and pieces and fits and starts through the social network of friends, family, colleagues. The network is the world of social publishing.

Why? Because it is too difficult to find works online among billions of documents and uncounted trillions of ever-expanding words. You simply cannot search for things you do not know exist. The social network trades in attention, which is necessary to discover what exists, through your social contacts.

It just does not make sense to "publish" a work to a certain location (or a physical book), then try to get everyone to come read it through clever marketing. It makes no sense to prevent copying, since copies are the method by which information spreads through a social network. The idea of scarcity and exclusivity makes no sense at all in a socially networked world, unless by exclusive you mean being friends with the author.

The network, by the way, does not really need to worry about this issue of citation, since there is usually a trail back to the original author, through a 'retweet path' (if dutifully or automatically maintained) or through carrying authorship information with the work through the social network (as I've talked about here before).

As a poet, I publish nearly every poem I write immediately to the social network, so I can't give anyone "first rights" to it; moreover, the notion is meaningless. I noticed the Haiku Society of America states, at least for some submissions, that "The appearance of poems in online discussion lists or personal Web sites is not considered publication," a much more adaptive policy.

What happens on Twitter is more like a flock of seagulls, making all references to publication, first publication, or second publication utterly meaningless, as we tweet to others and they tweet back at us, retweeting and retweeting. I suppose the next thing is that they will want "first tweet" rights. I understand the goal is to keep your publication fresh, but that simply does not fit reality. It says something about a publishing world where the consumer needs to be reassured they are not being "cheated" by receiving old goods turned over from elsewhere, similar to the way "shovelware" became a problem in the 1990s CD-ROM publishing era. I suppose the same problem exists with bloggers and twitterers who merely repeat what others write, but I just don't see the problem. In a networked world it costs nothing to unfollow or unfriend a source.


The new manuscript culture

In a 'manuscript' culture, the distinction between written and verbal text is not as sharp as it is in a culture dependent on printing. It appears we are heading into a new era of manuscript culture as socially networked content emerges and comes to dominate, as documents become conversations and conversations become documents. In a manuscript culture, such as the period in the West before the invention of movable type, or in China before printing became universal, manuscripts offered a 'more fluid transfer of information': the copyist (think of retweeting or sharing information socially) could make changes to the text, purposeful or inadvertent, could leave sections out or add new ones, and could combine text with illustrations (as in illuminated manuscripts, perhaps similar to Storybird).

(Reference: http://goodlifezen.com/wp-content/uploads/2007/11/the-road-to-nowhere.pdf)


Twitter outage creates panic

According to CNN, the Twitter outage left users feeling as if they had lost a limb or left home without their cell phone. It has been suggested that Twitter needs competition to provide alternatives when an outage occurs, as they inevitably will. There were (or are?) a couple of alternative social microblog services available (is Jaiku still running?). Of course, this won't help if multiple sites are attacked at once.

What would help is Google Wave. This outage is an incredible opportunity to demonstrate the potential resiliency of a "federated" or distributed social media system. Content in the Google Wave universe is independent. Every user can have a copy of a bundle of posts, comments, and content on their own device. Multiple copies can exist on different servers. It could be possible for a group to continue working, or at least work offline with their content, during an outage, and then, when the connection is re-established, the changes can be merged back into the conversation. This is what we can do with email now. We can read messages in our inbox (as long as it is not webmail) even when the network is unavailable. We can keep messages around as long as we want. We can write draft replies, take notes, reuse content through quoting or editing the text we receive, and at any time later forward it to others or send the revisions back to the sender.
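
As a rough illustration of that email-like resilience, here is a minimal sketch of merging an offline replica of a conversation back into a shared one. This is not Google Wave's actual protocol (Wave uses operational transformation on wavelets); the Post fields and the merge rule are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Post:
    post_id: str      # unique id assigned by whichever client created the post
    author: str
    timestamp: float  # seconds since epoch, as reported by the client
    text: str

def merge_conversations(local: list[Post], remote: list[Post]) -> list[Post]:
    """Union two replicas of a conversation and order them by time.

    A naive merge: every post survives, duplicates (same post_id) are
    collapsed, and ordering falls back to post_id when timestamps tie.
    """
    by_id = {p.post_id: p for p in local}
    for p in remote:
        by_id.setdefault(p.post_id, p)
    return sorted(by_id.values(), key=lambda p: (p.timestamp, p.post_id))
```

The point is only that a replica carried offline can always be reconciled later, the way an email client syncs a local inbox.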

What did I do during the panic? I just waited for Twitter to come back up. I only post once a day (if I am feeling up to it).


You can't just put content behind a blank wall

I caught a discussion of News Corp's new plan to get users to pay for online news content. It will be difficult to sell news online because there are so many fragmentary ways to get the news for free. If any scheme for getting online users to pay for news is to work, it has to be easy. No matter what online news sources do, they must make paying for the news easy and transparent, like iTunes. As easy as putting a coin in a paper box at the corner bus stop. The pricing is not as important as the convenience.

Also, the customer must have a feel for the worth of the content before they buy, or they must get a cheap bulk subscription so the content is cheap enough to take the irrelevant, incomplete, incompetent, or useless along with the relevant, complete, competent, and useful. I hate sites that put up a poorly written summary and a login or subscription screen. It breaks the rhythm of navigation on the web when a link leads to nothing. It stops you cold and punishes the user for following a link. It would be a sad web of balkanized content with links as obstacles. If content is to be shuttered behind closed doors, it must be quick and easy to open those doors with some kind of universal pass like OpenID connected to a micropayment system.

It started me thinking again about how to get online users to pay for content. You can't just put content behind a blank wall and expect it to work. No one will ever find it, be able to search for it, or have search engines index it. It's not enough to provide a metadata summary the way a bibliographic catalog does. Metadata will never be the answer to our search problems, at least not as long as humans are responsible for providing it. Nearly everyone ignores metadata, fails to include it, or includes incomplete or incorrect metadata. Who is going to keep all this metadata up to date? No, this is unworkable. Metadata must be generated automatically from content, and that is subject to a high error rate using current technology.

The approach Google Books takes gets much closer to a real solution to the problem of hidden content. Instead of trying to describe the content using faulty and hard-to-maintain metadata, why not grant access to a sample of the content? This gets much closer to a successful model for selling content online. When I read a book in Google Books I get a random sample of pages around my keywords. Each user receives a custom sample of content tailored to their interests and needs. In my experience, reading a few pages of a book without restriction, as I would in a bookstore, gives a feel for the content. I am more likely to buy the book if it proves useful repeatedly over several searches. Yes, sometimes I find what I want in the sample pages, but I generally bookmark the source, take down the title in my notes, and will cite the source in any work derived from the information gleaned for "free", which I think is a fair exchange for a citation and a link.
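
For illustration, a toy version of that kind of keyword-anchored sampling might look like the sketch below. This is not Google Books' actual algorithm; the function name, the per-user seed, and the page limit are assumptions.

```python
import random

def preview_pages(pages: list[str], keyword: str, limit: int = 5,
                  user_seed: int = 0) -> list[int]:
    """Return up to `limit` page indexes whose text contains `keyword`.

    A different `user_seed` gives each reader a different sample around
    their search terms; purely an illustration of the idea above.
    """
    hits = [i for i, text in enumerate(pages) if keyword.lower() in text.lower()]
    rng = random.Random(user_seed)
    rng.shuffle(hits)
    return sorted(hits[:limit])
```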

I do not understand the hostility and opposition to Google Books. I am willing to pay less but buy more books in electronic form for reference purposes. If I find an interesting book in Google Books, but it is not one I would pay $30 for in hardback, I would pay $10 to download it to my book reader. If I have to pay $30 for one book, it is going to be the one I value most and need the information from most, one I want to keep around for a lifetime, not a casual read or reference work.

There are books I would buy on the reader as convenient portable references. I would buy more ebooks at lower cost to fill out my "search space" of texts on my ebook reader. If a book adds to the information I have available on a subject, but only partially or tangentially, I can't afford a $30 hardback, but I can afford three $9 works related to my subject to add to the search space on the reader.

An idea I had a long time ago, when I was wondering how to pay for hosting my first website, was the "vanishing page" model. This would work a bit like PBS, where content slowly disappears unless readers pay a small fee to keep it available. The individual reader does not pay for each content page; instead, similar to donations to PBS, a small number of readers or viewers pays for free access by others (this actually gives the donor a feeling of superiority: if it were not for me...). Mechanically, the web page would be publicly available to all readers and search engines, but a count of page views would be kept. Each time the page is viewed, the number of views or days left would be decremented by some amount. A button to make instant micro-payments would be displayed on the page, along with a thermometer showing how close the page is to being removed from the site. If enough people donate, days (or credits; it could be a ratio of views to donations, similar to BitTorrent) are added to the life of the page; if not, it is replaced by a summary and a button to start donating again.
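
A minimal sketch of those mechanics, with invented names and an arbitrary credits-per-cent ratio, might look like this:

```python
class VanishingPage:
    """Sketch of the 'vanishing page' model described above.

    Each view spends one credit; a donation buys credits back.  When the
    credits run out, the page is replaced by its summary and a prompt to
    donate.  Names and ratios are illustrative, not a real system.
    """
    def __init__(self, html: str, summary: str, credits: int = 1000):
        self.html = html
        self.summary = summary
        self.credits = credits

    def view(self) -> str:
        if self.credits <= 0:
            return self.summary + "\n<!-- donate to restore this page -->"
        self.credits -= 1
        return self.html

    def donate(self, cents: int, credits_per_cent: int = 10) -> None:
        # Donations extend the page's life; the ratio could instead be
        # tied to views-per-donation, as the BitTorrent comparison suggests.
        self.credits += cents * credits_per_cent
```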

What we need are ingenious "social engineering" methods to get people to buy content online, similar to the ones used to manage "soft security" on wikis. We need soft methods, like Google Books, which gives readers a peek into books that might interest them.


Mixing Conversation and Story

I realize now that the real problem I have been working on, off and on for ten years, is 'conversation' versus 'story', particularly as it applies to journalism. In a way, conversation and story are like oil and water; they do not like to mix. Yet stories are filled with dialog, or conversations, so why is it that journalistic stories cannot contain dialog? Well, when it is an interview, they do. So what we need is a network tool that seamlessly integrates conversation (interview, written dialog, transcript) with story (narrative, reportage, essay, and analysis). Google Wave looks like the closest technology to achieving this flexible confluence of conversation and story, with even the potential for our conversations and stories to be both mobile and distributed. If every smart phone adopted Google Wave, and given that it works similarly to email, which mobile computing already provides as a robust and well-known commodity service, it promises quick adoption while avoiding any centralized monopoly.

I envision the same tool being used by a reporter to do an interview (dialog) and for personal self-expression (dialog in the manner of Twitter, sharing only little bits of information, such as links). An interview consists of dialog: little snippets of information associated by place and time. This has the form of Twitter messages, but a chat application is much better suited to an interview than Twitter, so some new mechanism must be created to accommodate flexible use, moving between story and conversation, between longer and shorter posts, between collaborative and single-authored posts.


Google Wave and Portable Social Media

A quick observation about Google Wave.

I wrote some time ago about the problem of social media losing its social context as it moves around the digital universe. I thought some mechanism should be created to make the social context pertaining to a unit of social media portable, so it moves along with the content. It appears that Google Wave associates the people who pertain to a document (the authors, editors, people with access to view or edit the content, etc.) with the content in a portable way, through its "wavelets" concept.

It seems possible to share or transfer a piece of collaboratively authored content across the Wave system and into other systems with its social context intact. If so, this is a revolutionary step in the evolution of information technology. It gets my vote as the first technology I've seen that could truly be called Web 3.0.

It would only be right, if you downloaded an image from such a Wave-based system to your PC, that it would somehow preserve the social context, perhaps with an XML sidecar or embedded metadata, like the EXIF standard for photographs. The content could then be uploaded back into a Wave ecosystem with its social context intact, possibly even after local edits.
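
As a sketch of what such a sidecar might look like, the snippet below writes a hypothetical XML file next to a downloaded image, listing the people associated with it. The element names and the ".social.xml" suffix are inventions for illustration; no such standard exists in Wave or EXIF.

```python
import xml.etree.ElementTree as ET

def write_social_sidecar(image_path: str, participants: list[dict]) -> str:
    """Write a hypothetical sidecar (image.jpg -> image.jpg.social.xml)
    recording the people associated with a downloaded piece of content."""
    root = ET.Element("socialContext", source="wave")
    for person in participants:
        # Each participant carries an id and a role (author, editor, viewer).
        ET.SubElement(root, "participant",
                      id=person["id"], role=person.get("role", "viewer"))
    sidecar_path = image_path + ".social.xml"
    ET.ElementTree(root).write(sidecar_path, encoding="utf-8",
                               xml_declaration=True)
    return sidecar_path
```

On re-upload, a Wave-aware system could read the sidecar back and restore the participant list, which is the round trip described above.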


Is Your Life Poetry or Nihilism?

ReadWriteWeb asks this question. Poetry is reflective. Journalism should also be reflective (if all journalism were like C-SPAN, we would be better off for it). I am sure we could, and perhaps will, find ways to mine activity feeds for patterns and other useful information. That may find uses in many fields and places in life, perhaps even in medicine. But the real reason there is so little reflection on the web is simply that the structures and tools of the web encourage shallow interaction: quick posts, short content, quick reads, quick writes. This is an area I've given some thought to and posted about on this blog.

What is required is not some new gizmo for finding patterns in bits of trivial data, but tools that encourage people to slow down, to be reflective, and to create meaningful content. My idea, presented here before, has been a "quick-slow" system. This system would recognize the importance of brief, concise posts when things are happening (like when you've just landed safely in an aircraft with the landing gear stuck and want to tell your friends or the world) as well as longer, slower, more reflective posts. It would allow users to post concise messages the way Twitter does, but those messages could be expanded on, by extending the text or by associating longer texts with them. The idea is not entirely new. About ten years ago, I played with a prototype application trying to combine blog and wiki elements. Later, I discovered a more successful project to combine blog and wiki; such an application is called a bliki.

What I propose is a system like Twitter, which retains its immediacy through a connection to text messaging (cell phones) and the "stream of concise posts" format, yet also provides a way to extend those posts meaningfully. Perhaps a user's followers could be allowed to edit the extended content, creating a community of editors and contributors.
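
A minimal sketch of the data model behind such a quick-slow post, with illustrative names and a simple editor whitelist, might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class QuickSlowPost:
    """Sketch of the 'quick-slow' idea: a Twitter-length post that can later
    grow a longer, reflective body, editable by approved followers.
    Field names are illustrative only."""
    author: str
    quick_text: str                     # the immediate, concise message
    slow_text: str = ""                 # the reflective expansion, added later
    editors: set[str] = field(default_factory=set)

    def expand(self, user: str, text: str) -> None:
        # Only the author or an approved follower may extend the post.
        if user != self.author and user not in self.editors:
            raise PermissionError(f"{user} may not edit this post")
        self.slow_text = text
```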

What we really need is to encourage people who grew up "network native" to slow down and think before they write, or at the very least, if they have to capture an event or thought with quick, impressionistic strokes, they or others should be able to return later, after reflection, to revise. A kind of "slow news" for journalism, akin to the slow food movement, asking people to sit down and think a while before they write. This may be asking too much of journalism, but a quick-slow approach could support both quick impressions (what's new) and reflection (analysis). Moreover, this could support a collaborative approach that mixes reportage (the initial concise post, possibly with a picture) and analysis (the associated post, perhaps by an analyst).

The poet Basho revised his haiku many times over the years, sometimes refining the wording and other times writing a new poem depicting the same experience from a different aspect. This kind of revision and reflection should be encouraged and supported by technology. Haiku are an ideal model: brief, concise, experiential, yet through juxtaposition and many hours of careful writing, they convey higher truths.

I see a number of people writing on Twitter in haiku form, quite a few of whom are just arranging prose in haiku form and really have no understanding of haiku as an art form (poetry has to say something to be poetry, and say it in a way that affects us). To be clear, there is a new form of haiku practice emerging on Twitter, akin to the impressionist movement in painting, where haiku are written on the spot and posted to Twitter from a cell phone. This is a new development in haiku, since most haiku are written down long after the poet has left the place of experience (not always; Basho sometimes wrote haiku and left them behind, but nearly all the haiku that reach us were probably revised many times long after he had visited the location). It bears watching.


New Tools for Men of Letters (or Not)

"The art of conversation, with its counterpart the dialogue as a literary form for presenting ideas, has also declined since the days of Galileo, while the art of advertising has advanced. Advertising is easily recognized as the literary form that most completely responds to the technique of the printing press, because it demands, above all else, a numerous and receptive "public" of readers."
New Tools for Men of Letters
The Yale Review, Spring 1935.
Sounds a lot like Twitter, does it not? The success of Twitter is largely due (as has been generally true of web services) to the possibilities inherent in the medium for promotion and self-promotion, or advertising. Now, since helping independent farms survive is another fascination of mine, I believe using Twitter for self-promotion may be beneficial, but it is important to recognize how much our tools are influenced by advertising. It is also important to note how technology shapes culture. Technology often defines what is possible in art or culture, and then shapes its direction and expression (think of the woodcut or the electric guitar and the idioms of graphic art and music that sprang from those technologies). So Twitter is not always good for us, like eating too much cake, because it is a medium that "demands ... a numerous and receptive 'public' of readers" and authors who meet that demand. Of course, all good authors keep the audience in mind while they write, but Twitter and concise social messaging systems orient our writing and conversation toward the jangle of advertising.

We hear a lot of talk about conversation on the web, but it seems very lacking in real conversation. I learned recently that Ward Cunningham, when he originally envisioned the wiki, believed people would begin with conversation and then shape the results into an article, which would then be refined collaboratively. As it turned out, authorship on most wikis occurs in reverse, with articles being started and then shaped by conversation (if we are lucky).

Ever since I made my first foray into the world of networked content and community, starting with bulletin boards and then moving onto the web, I have been fascinated by the idea of capturing the expertise and knowledge "lost" in conversations. Forums, discussion groups, bulletin boards, message systems: all formats for conversation are ephemeral. When a person asks a question on a help forum, the answers they receive are generally lost. The web made it possible to ask a search engine a question and bring up one of these threads of conversation archiving that knowledge.

Much of the knowledge of experts falls into the category of "folk wisdom" or "folk knowledge." This may upset some rationalists who believe all knowledge is found in books, the mechanisms that "separate us from the medieval" by storing knowledge without the requirement of memory. The reality is that many of the solutions to the problems coders face daily are not written down in books. A book is generally written by an academic about generalities or abstract theories, or it is a technical cookbook about a particular language or technology. Many of the solutions to little quirks and bugs, solved with small tricks or algorithms, are passed from one coder to another by oral tradition: sharing code, looking at other people's code, or in forums. Coding is not the only profession or practice in which this process occurs, but it serves as an example.

One of the great problems of the web is how to capture the knowledge being generated by this process of dialog about small problems. It is a Long Tail problem. It is an "exponential" problem because it consists of very small parts that add up to a larger whole, which exercises a large influence over our lives (think of the software controlling an aircraft or a medical robot). It does not just apply to coding, but to any knowledge.

Not only would it be good to capture this knowledge in a better way than just stumbling on a solution in a forum or blog post, it might prove beneficial to author a work "conversation first," like the old carpenter's adage to measure twice, cut once (of course, real carpenters use a template, but that is another story).

In a way, Twitter achieves the conversation part but, as I've observed before, lacks the means to capture the essence of a valuable conversation (other than favoriting tweets). A first step would be to allow favorite tweets to be organized by tag and then browsed like a social bookmarking site. The better solution would be to enable Twitter users to create a wiki page for extending the thought or observation in a tweet collaboratively, perhaps allowing followers to edit the content. That is the idea I will be working on, if I can get some time away from farmfoody.org and folkstreams.net activities.
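
The first step, grouping favorited tweets by hashtag so they can be browsed like a social bookmarking site, could be as simple as the sketch below. The dict layout and the "untagged" bucket are my assumptions, not anything Twitter provides.

```python
import re
from collections import defaultdict

HASHTAG = re.compile(r"#(\w+)")

def favorites_by_tag(favorites: list[dict]) -> dict[str, list[dict]]:
    """Group favorited tweets by hashtag for tag-based browsing.

    Each favorite is assumed to be a dict with a 'text' key; tweets
    without a hashtag land under 'untagged'.
    """
    index: dict[str, list[dict]] = defaultdict(list)
    for tweet in favorites:
        tags = HASHTAG.findall(tweet["text"]) or ["untagged"]
        for tag in tags:
            index[tag.lower()].append(tweet)
    return dict(index)
```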


By Twine or By Time?

I ran across an interesting answer in an interview about Twine:

[Nova Spivack] I think the above solution would work for this too. Basically you are asking for a new view of the content – not “by twine” or “by time” but “by popularity” or “by relevance to me”.

Notice the question being posed. What he is asking is: why don't you like the view our "intelligence" provided; why do you insist on these existing, simplistic views like by time or by popularity?

The last is odd. "Relevance to me" is the primary criterion for all information I want to receive, even if I don't yet know it is relevant, such as when a person I follow on Twitter shares something I've never seen before and would never have found on my own. Do you understand? Even that is relevant to me. Everything I want is relevant to me.

I understand what they mean, though. They mean serendipity: like overhearing a snatch of conversation on Twitter by seeing posts from friends of your followers, people you do not follow yourself. But that is still relevant to me; you're just increasing the chaos in my information feed. Perhaps what we need is a "volume control" on chaos in information filtering systems.

Moreover, I suspect that humans, being humans, really want to order information in the ways they are familiar with, the ways their brains were shaped to process information by evolution (hmm, this is a new kind of "design" process, contradictory to the meaning of design, yet it seems appropriate to say designed by evolution). The upshot is that people still want to order things by time or popularity. What other measures are there than the ones we've known?

Authorship: when we buy a book because the author's name is on the spine or cover in 96-point type, we are buying authority.

Sharing: when we "hear it through the grapevine" from our friends, another high-trust information source.

Recommendation: some finding aids are a form of recommendation, as when we used to go to the reference desk librarian and ask for a book on a subject. This is a kind of sharing.

Look at the role trust plays in gathering and accepting information. Yet, we trust the smartness of crowds (or at least the smartness of cliques) at Wikipedia. I use it all the time and find the information is always a good starting point, usually reliable for technical information.

With trust comes the opportunity for abuse of power: the power of authority to stifle innovation and knowledge, or to sustain false views. Think of how the anthropologists' view of Amazonian civilization, maintained for a hundred years, turned out to be completely wrong and opposite to reality, despite the application of the "scientific method" and mountains of "evidence", all chosen and selected by a reductionist process, which only knows what it measures and can only measure what it sees.


Trouble in dead trees and inky fingers land

Newspapers and thinking the unthinkable

An excellent analysis of the situation newspaper-based journalism finds itself in.

I like the idea of micro-payments for content, such as New York Times articles. The only problem I have with it, and the reason I would be reluctant to use it, is simply that I have to pay for the article before I've read it. Even if I saw an excerpt, it might not be enough to determine whether the article is worthwhile. A solution to this problem might be found in social networking. I usually read articles my friends share with me (by sending a link in email or chat). I would be much more willing to pay for an article they recommend. Keep the price low and integrate with a social sharing system, and it might work, as long as the payment is an "easy button."

The greater problem is that content and authorship are changing radically as digital content becomes available through the network, with unlimited perfect copying and access without distribution. What we are seeing is a working out of the "small pieces loosely joined" paradigm described a decade ago. The newspaper started as a handwritten piece of paper passed around coffee shops in Enlightenment London. I see nothing sacred about its continued existence.

The problem with journalism online, of course, is that Twitter is the new journalism, but the content is too brief, chaotic, and frequently idiotic. Micro-blog formats do encourage conciseness and sharp thinking, but they also promote a hyperactive and fragmentary view of subjects. As I wrote on my blog, there needs to be a "slow thought" or "slow news" movement (like the Slow Food movement), which you might say is what blogs already give us, but not really.


Stackoverflow.com

There is a good article on ReadWriteWeb about the principles driving the development of stackoverflow.com, a site where programmers get help with their coding problems.

I was particularly struck by the design points where Spolsky highlights the frustration created by wrong answers and obsolete results.

I can remember when I was able to circumnavigate the web through a search engine for the topic of the history of photography. It was that small. I could see everything there was to see about the history of photography online in a week, a week of drudgery wading through duplicate results page after duplicate results page, until I was sure I had seen everything about my topic. Although the results were filled with a fair amount of junk and duplicates, I was still able to find a single web page if it contained sufficiently unique keywords. Until about a year before Google emerged, I had relied on AltaVista to take me back to a web page in one go, when, for example, I could not remember where I had found a code solution on some obscure personal page. Then the search engines began to fail me, and single pages I had found before became nearly impossible to find again; but eventually search engine technology improved, and with Google you could find that one blog page with the code. That was, for a time, the solution to the problem of finding things.

Spolsky is right to observe that the problem now is that search fails to distinguish between correct and incorrect answers, and between current and obsolete answers to technical questions.

When I first started programming using Microsoft Visual C++ (I was just a dabbler), I had a question about how to render bitmap graphics. I turned to the library of articles and code intended to help developers. I was happy when search quickly turned up an article on how to introduce bitmaps into your application. After an hour or two of reading, it slowly dawned on me that the author was not talking about what I was familiar with, Microsoft Foundation Class applications. I was seeing unfamiliar code and unfamiliar techniques. I glanced up at the date. The article was from the mid 1990s. It was about coding in C under Windows before MFC was introduced. The first, supposedly most relevant, document search had brought up from MSDN was completely obsolete and about coding without an application framework. I had wasted hours reading the wrong articles.

Stackoverflow.com is an example of a great site. It is well designed; the developers learned the lessons of the last fifteen years of web technology and applied them. It is a clean, beautifully presented, and well organized site. I have to admit they did right what I failed to do with phphelp.com, which started with many of the same goals. They had the courage to go ahead with "soft security," collaborative editing, and surfacing and valuing content through a user voting system. Of course, with that volume of content and edits, such tools are necessary. What two humans could watch and police such a flow of content while doing their day jobs? User-contributed and user-curated content is the only rational answer.
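
For what it's worth, the surfacing side of such a voting system reduces to a simple ordering rule. The sketch below is a generic illustration, not Stack Overflow's actual ranking logic; the field names are assumptions.

```python
def surface_answers(answers: list[dict]) -> list[dict]:
    """Order community answers the way a voting system surfaces content:
    the accepted answer first, then by net votes (upvotes minus downvotes)."""
    return sorted(
        answers,
        key=lambda a: (not a.get("accepted", False),
                       -(a.get("upvotes", 0) - a.get("downvotes", 0))),
    )
```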

(By the way, it would probably be better to describe their principles as being informed by behavioral economics, or an evolutionary branch of that field, than by anthropology or social psychology. The way people use voting systems to surface content, the way "soft" social engineering strategies are employed on wikis, and so on, seem close to the phenomena studied by behavioral economics, which is not only about financial choices.)


Graphic Recording

I didn't know it, but all my life I've engaged in "graphic recording" when it came to exploring new ideas or learning. I never went as far as the artists who made a series of recordings for the sustainable agriculture and food conference, but my subjects were technical, and I was a technical kid growing up, so my "confections," as Tufte calls them, were more mathematical, graphical, and textual in nature. I used them to illustrate things to myself, like working out visually how cycles represent waveforms in musical instruments. Now I see them as graphic recordings. I was a bit ashamed of them, since I thought they meant I wasn't a good learner, and I tried to suppress or limit them. That was a mistake.

The drawings are simply wonderful, and I was put onto them by Brenda Dawson, who tweeted about the graphic recordings made for the March 29, 2009 Inaugural National Symposium on Food Systems and Sustainability at the University of California, Davis. How much better a "presentation" these graphic recordings make than a PowerPoint presentation!

These drawings are a lot like my vision for an information system, called Strands, which would be as thick and filled with complexity as the Talmud and as visually expressive as these graphic recordings. If only the web could be like this. When I think of Twitter and Tabloo, if they could be combined, I think we'd be close. Tabloo enables users to create visual narratives (through the structure, relationships, size, and aspects of images), and Twitter enables users to create conversations out of small fragments of thought flowing continually.


Twitter and the Principles of Illusion

It is worth noting that the two guiding principles of illusion are "suppressing context" and "preventing reflective analysis" (according to Tufte, in Visual Explanations). The first also applies to the ubiquitous photographic image, nearly all of which appear without context, a situation that apparently few people find troubling. A good example of the phenomenon is the iconic image from the Vietnam War of the Viet Cong operative being summarily executed by a South Vietnamese police chief. The photographer who took the picture often wished he hadn't, because of the damage the image did when used out of context (as it usually was). Several iconic images from the Vietnam War were frequently presented without context. It was left up to the viewer to interpret them, and it may very well be that people at the time did not want to know the context, enabling them to press the image into the service of their political aspirations or personal, psychological needs. Visual media are inherently weak at providing context.

The emergence of email, web discussion forums, short messages, and video sharing, all network-native forms of communication, creates an environment hostile to reflective analysis. What is needed to alleviate this trend is a movement akin to the "slow food" movement, perhaps a "slow media" movement, asking people to slow down, consider context, and think reflectively within a networked information ecosystem. The content of a Twitter stream can be informative, but it can also be trivial, and despite its benefits, it does not encourage reflective analysis. I personally find that tweets (Twitter messages) are frequently a touchstone for an innovative thought, connecting me to something I did not know and probably would not have known had someone not passed on an interesting web link or thought out loud. But it would still be nice to pull wisdom from the ether by capturing tweets in some reflective and expandable form.

Although not yet a visual medium, these concise messaging and blogging systems are most attractive to television journalists. A quick turn to Twitter before the commercial graces many newscasts. These context-free nuggets are ideally suited to a medium described as a "wasteland," and it troubles me that networked content has been so eagerly adopted by television news shows. It points out the need for reflection and context in networked short-message content.


I have explored this theme before (see the Twitter Wiki and "quick-slow" bliki articles previously). The question is how to accommodate the fragmentary, context-free units of networked content while encouraging expressions of context and reflection to balance them. It is a daunting task, because people often do not see a need for context or reflection and are often unwilling to bother with the story behind a photograph or take the time to expand on a short message.

We need to accommodate the uses for which short messages are legitimate and beneficial (such as the conciseness they encourage; concise writing requires reflective analysis before posting, since you must know your subject well to pare it down to its essentials, and wordiness often just adds confusion; we must also be prepared for abuse of longer text forms connected to short text forms). But we must also make it possible for reflection to take place. The "quick-slow" approach to networked content systems encompasses this. We can then turn the two principles of illusion to our advantage by encouraging their opposites: context and reflection.


Biological Construction and Networked Content Creation

The order and symmetry of biologically created structures, such as an egg or the human body, are expressions of how correctly those biological systems worked to construct the natural artifact. Biological organisms are collections of cells cooperating with each other. The order and correctness are an expression of the success of the collaboration.

An egg comes out more egg-like when the biological processes working to make it cooperate and collaborate more correctly in its construction. I believe this has implications for the collaborative processes operating in networked software development and information science. The biological process of construction is inherently different from the one humans have inherited from their tool-making and industrial heritage. What will we make of it?


Where are we going?

The issue of whether people should pay for forums came up on dpreview. With the current economy, I expect how to pay the bills will be a growing question for many web services.

The problem with forums is that there is perfect competition. Anyone can set up a forum and run it for next to nothing. If one forum decides to charge a fee, the users can flee to another forum. The only reason they might stay is the audience. For example, photographers pay to host their photographs on Flickr primarily because it provides a rich audience of people who love to look at still photographs. Flickr is the Life and Look magazine of our time; it is the revival of the great picture magazines, not because of its technology (that helped orient the site in the right direction to succeed; just look at the abject failure of Picasa to be social, too little too late). Flickr just happened to be where most people who like to look at pictures gathered, mostly because of its blog-like streams of ever-changing pictures and its social tools. It is easier to pay a small fee to use Flickr (perhaps even to "read" it) than it would be to overcome the "capital" costs of changing sites. Flickr users have a lot invested in Flickr, and it might just cost less to stay and pay than to move elsewhere. Besides, there is nowhere else to move. The closest thing I could see to Flickr would be for every photographer to put up their own photo blog software and then join photoblogs.org, which would become the "magazine" and "social hub." This is a distributed vision of photo sharing online. I used to wonder which would be successful. But it really was simple: Flickr did it all for you, some for free, a little more for pay, well worth it to promote your photography.

Despite the somewhat juvenile and absurd environment of Flickr with regard to art photography (you know, the dozens of people giving out "Great Photograph" awards to pedestrian, derivative, and mediocre images, mostly to promote themselves or because they are too young to know what a derivative image is), it is useful to professional and art photographers because Flickr is where the eyeballs are. It attracts people who still love still photography, which in this age of video is a bit of a miracle. However, photographs can make the world sit still long enough for people to pay attention, and that is a very similar experience to poetry, which at least in part exists to draw attention to things. I've heard from professional photographers that they get an order of magnitude more requests for work through Flickr than through the professional portfolio sites.

One reason, perhaps the principal one, that Henri Cartier-Bresson and other great photographers became well known was that their images were published in the great picture magazines. When television came along, the picture magazines went into decline. Photojournalism began its long decline at this time, for the simple reason that people could learn about their world visually through television, a more attention-grabbing and free medium (the barrier to entry for television was lower: you didn't have to be intelligent to watch it, a good example of where a low barrier to entry is destructive to society). Without the picture magazines, it was no longer possible for a photographer of acknowledged artistic merit to become known and for their images to have significance in society. The audience was gone. Flickr re-establishes this audience.

So the question still stands: will people in the future pay for their online content? Pay to create it? Pay to consume it? What is happening now? People are already paying to create content. They pay for a Flickr account with better tools. They pay for services to create graphics, 3D art, and property in virtual communities. A few sites charge for reading content, but not many. But given human history and the recent past, when most content was paid for, in newspapers, books, and magazines (television excepted), it seems reasonable to assume the free ride will be over someday.

There may be a tipping point at which a non-pay site is no longer competitive. When most good content has gone to pay sites and the community of interest willing to pay for that content is consuming all it can (this is what happens with books and magazines today), the other sources will be driven out in a kind of perfect competition. The free sites will be filled with garbage and what passes for content on local cable access.

The network is not the old traditional world of libraries and publishers. It will be different. Project Gutenberg. Open source projects. The collections of enthusiasts sick and tired of the crap shoveled out by the traditional content and software businesses, who have taken it upon themselves to produce quality products where the marketplace would not or could not. This is an order of magnitude different from the pre-networked world, where people could not work together, providing little bits of effort or expertise to collaboratively create a cultural artifact. This is entirely new, and we don't know where it's going.

As an aside, the idea of tipping or donation comes up. Frustrated with no way to fund my original website, I considered taking a modern high-tech variation on the PBS approach. I considered (in the 1990s) creating a content management system where each article would have a countdown timer displayed like a reverse donation thermometer. If you didn't contribute something to the article, it would count down; when it reached zero, the page would be pulled from the site. Of course, the ability to cache networked content presents a threat to such schemes: the Wayback Machine can regurgitate considerable missing content, and so can the Google search cache. What about caching? If Wikipedia's funding were to dry up and blow away today, would its content still remain available in a myriad of niches around the network? On people's computers, disks, servers here and there, in caches? Would it evolve another life in a peer-to-peer environment? Will all information become distributed over billions of cell phones and have no location at all?


Twitter is a 'starfish' enabler

Twitter is a 'starfish' enabler. That is what makes Twitter powerful and empowers those who use it. It puts individuals at the center of the star.

Twitter friends (followers) are more like information flows you choose, organizing the flow of information for yourself and others, curating, editing, and creating, than friends on other social networks, which are more passive, something you collect or at most a space to explore. This is because friends/followers bring content to you automatically. It is the flows of information resulting from following that make Twitter different from other social networks.

I didn't know much about Twitter when we started designing Farmfoody.org and thought it was something to do with short text messages on cell phones. I am currently integrating Twitter into farmfoody.org, after having considered a Facebook-style social feed model and finding it overly complex and confusing. We need as low a barrier to participation as possible. Farmers don't have time for complex systems: blogging, social feeds with posts and comments and threads, six different types of publishing, bold and italic.

Neither do people standing at a farm stand with a bag of white corn tucked under their arm have time for complexity. It turns out that the social bulletin system we were envisioning two years ago exactly describes the information flows in Twitter. The way your friends' (followers') tweets (messages) aggregate on the Twitter home page is identical to how we envisioned messages from our users collecting on the user's profile page. In our bulletin system, all the friends of a user receive a bulletin, similar to the "home page" on Twitter, creating an information flow. The only difference is that bulletins are like craigslist ads and expire. That original requirement for bulletins to operate as classified ads with an expiration date, similar to craigslist, held us back. I should have looked into Twitter integration then, since we would not have needed to develop a system of our own.
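
A minimal sketch of that information flow, aggregating unexpired bulletins from the people a user follows into one reverse-chronological stream, might look like this. The data layout and the 'posted'/'expires' field names are illustrative assumptions, not farmfoody.org's actual code.

```python
import time

def home_feed(user: str,
              follows: dict[str, list[str]],
              bulletins: dict[str, list[dict]]) -> list[dict]:
    """Collect unexpired bulletins from everyone `user` follows, newest first.

    `follows` maps a user to the people they follow; `bulletins` maps an
    author to their posted bulletins.
    """
    now = time.time()
    feed = [b for author in follows.get(user, [])
            for b in bulletins.get(author, [])
            if b.get("expires", float("inf")) > now]
    return sorted(feed, key=lambda b: b.get("posted", 0.0), reverse=True)
```

Drop the expiration check and this is essentially the Twitter home page; keep it and you have the craigslist-style bulletin board described above.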


A Twitter Wiki

As the popularity of short, fragmentary messages grows, I have become concerned that the public conversation may lose the capacity for thoughtfulness and reflection. At the same time, I would caution those who condemn Twitter or other systems based on micro-content not to throw the baby out with the bath water. The long-form newspaper article found in the New York Times or Washington Post contains a lot of material used to provide background for the reader, often at the end of the article. Not only is this text boring and redundant to the knowledgeable reader, it takes up precious space. The one thing the web is good at is connecting one piece of knowledge to a broader context of other pieces of knowledge. There is no sane reason to keep repeating background and further-reading material in a long-form newspaper article when, on the web, a writer can simply link to the information.

The brief, concise texts of micro-content can be connected to many other sources of information, some just as concise (a kind of "blizzard" of small pieces loosely connected) as well as to longer, deeper, and more reflective sources. This loose, disjoint, and connected type of writing is simply the network-native way of writing and connecting information. It is beneficial, as long as both kinds of writing and forms of content are available and can be connected.

My concern is really with lowering the barrier to entry, enabling and encouraging those longer, deeper, and more reflective forms of writing. I recognize that there are benefits to shorter, more concise writing, which leaves redundant, expansive, or source material hidden (properly) under a link or connected through a network of tags or a network of people. Perhaps we will see fewer long texts divided up by headings and sections and more small texts connected together through search, tags, and links into a variety of wholes, determined by the user's interests and needs.

About ten years ago, I was fascinated by the idea of a long text (article, book, etc.) entirely constructed of fragments, similar to the kind of texts you see posted on Twitter today, which could be freely rearranged like the magnets used to write poetry on refrigerator doors. I imagined that instead of writing a large text as a single coherent whole, the way books have always been written, the pieces of information on a topic could be combined to create a "book" in innumerable ways by rearranging those pieces.

It would be like taking all the paragraphs in a book, shaking them out onto the floor, and then allowing or enabling those pieces to be rearranged for each reader or interest. The pieces would be tied together by keyword or by search result, and only lastly by links. I coded a small prototype application called Strands to test the idea, but work and life caught up with me and I shelved it. I was and am still surprised by the ease and rapidity with which people have adopted Twitter.
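
A toy version of that rearrangement, selecting and ordering fragments for one reader by keyword overlap, might look like the following sketch. This is not the original Strands prototype, just an illustration of the idea.

```python
def assemble(fragments: list[str], interests: list[str]) -> list[str]:
    """Assemble a linear 'book' for one reader from standalone fragments.

    Keep only fragments that touch the reader's keywords, ordered by how
    many of those keywords each fragment contains.
    """
    def score(fragment: str) -> int:
        text = fragment.lower()
        return sum(1 for kw in interests if kw.lower() in text)

    relevant = [f for f in fragments if score(f) > 0]
    return sorted(relevant, key=score, reverse=True)
```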

Not only are people using Twitter, despite the fragmentary nature of its texts, they are participating creatively in shaping the technology and usage of this kind of system based on fragmentary texts.

The use of tagging emerged spontaneously from the user base. Using "hashtags", brief texts can be connected to media, such as images and video, with the tag at the center of a network of content.

Also, I've noticed users are starting to fit the tag word into their text. Some examples are:

"Young Nebraska farmer explains how limiting direct payments would affect his #farm at www.nefb.org"
(Tweet from http://twitter.com/farmradio)

and

"farmanddairyGet four issues of #Farm and Dairy FREE! Click on the big promo on our home page: http://www.farmanddairy.com/"
(Tweet from http://twitter.com/farmanddairy)

At the heart of my Strands prototype were small texts connected by keywords. I wanted to create the lowest possible barrier to entry, so a user could create a keyword (essentially a tag; I called them "strand words") just by writing it into the text. In this system, what was essentially a tag was created by writing it (texts were scanned on post or edit for the presence of tags, and any new ones were added to an index), which is hauntingly similar to how people have started using tags on Twitter. They started out adding the tags to the end of a message, but have now begun incorporating them directly into the flow of text. I hesitated to continue working in this direction on Strands, partly because I expected people would find the tags sprinkled through the text troublesome.
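
That indexing step can be sketched as follows. I am assuming a hashtag-style marker here for clarity; the original strand-word syntax may have differed, and the class is an illustration rather than the prototype's code.

```python
import re
from collections import defaultdict

STRAND_WORD = re.compile(r"#(\w+)")  # assumed marker for a strand word written inline

class StrandIndex:
    """Whenever a text is posted or edited, scan it for strand words and
    update the index mapping each word to the posts that contain it."""
    def __init__(self) -> None:
        self.index: dict[str, set[str]] = defaultdict(set)

    def on_post_or_edit(self, post_id: str, text: str) -> None:
        for word in STRAND_WORD.findall(text):
            self.index[word.lower()].add(post_id)
```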

My current interest is in providing tools or ideas that will encourage and enable a society addicted to short messages, however beneficial they may be, however native to the networked way of writing and reading in a connected fashion, to engage in greater contextualization and thoughtful reflection; to enable collecting some of the knowledge quickly flying by in the "Twitterverse" into slower, more reflective pools of knowledge, like eddies on the edges of a fast-flowing stream.

The first tool I want to build is a "Twitter Wiki" enabling anyone to associate a text of any length with a tweet, and anyone to edit it. If I have the energy, I will post any experiments on my site or at least attempt to describe them.
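
A minimal sketch of the underlying data structure, a revision history of arbitrary-length text keyed to a tweet, might be as simple as this (the names are hypothetical):

```python
from dataclasses import dataclass, field
import time

@dataclass
class TweetWikiPage:
    """An editable text of any length attached to a tweet, where every edit
    by any user is kept as a (timestamp, editor, text) revision."""
    tweet_id: str
    revisions: list[tuple[float, str, str]] = field(default_factory=list)

    def edit(self, editor: str, text: str) -> None:
        self.revisions.append((time.time(), editor, text))

    @property
    def current(self) -> str:
        # The latest revision is the page as readers see it.
        return self.revisions[-1][2] if self.revisions else ""
```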
