Why Twitter is just an awful amalgamated mess for me

Twitter is described as new media and, more specifically, a new medium. It's a replacement for RSS, or so many say, eliminating the need for most blogs, etc. Here's why I don't buy that. When news flows to me, I want to be able to scan it from the point I left off to the point I pick it up again. The real-world analog is knowing that yesterday's news was in yesterday's paper and today's news is in today's paper. If I want to skip ahead and ignore yesterday's news, I can, but if I want to go back, that's an option as well. When I'm done, I can throw away the papers as acknowledgement that I've read them.

In the RSS world, most readers have already implemented this type of workflow, but Twitter clients universally treat Twitter as a stream and don't give you an easy way to establish what you've read and what you haven't. Ironically, the last link even advocates that this is good for workflow, and I'd agree if you consider the amount of traffic I used to get in my RSS reader when blogs were as full of inane chatter as Twitter is today. But with that noisy traffic gone, RSS has become an extremely high signal-to-noise way of consuming the news, and the "time value decay" that Tom Tunguz describes really doesn't work for me when I'm consuming items I actually want to read.
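To be concrete about what I mean by "workflow," here's a minimal sketch of the read/unread tracking most RSS readers already do: remember which item GUIDs you've seen and only surface the new ones. This is an illustration, not any particular reader's code; the feed URL and storage file are hypothetical, and it assumes the feedparser library.

    # Minimal read/unread tracking by GUID; feed URL and file are placeholders.
    import json
    import feedparser

    SEEN_FILE = "seen_guids.json"

    def load_seen():
        try:
            with open(SEEN_FILE) as f:
                return set(json.load(f))
        except FileNotFoundError:
            return set()

    def unread_items(feed_url):
        """Return only the items we haven't seen, then mark everything read."""
        seen = load_seen()
        feed = feedparser.parse(feed_url)
        new = []
        for entry in feed.entries:
            guid = entry.get("id") or entry.get("link", "")
            if guid not in seen:
                new.append(entry)
            seen.add(guid)
        # Persist the read state, the equivalent of throwing away
        # yesterday's paper once you've gone through it.
        with open(SEEN_FILE, "w") as f:
            json.dump(sorted(seen), f)
        return new

    for entry in unread_items("https://example.com/feed.xml"):
        print(entry.get("title", "(untitled)"))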

For a stream of "chat," which is essentially what Twitter is, where I'm going to assume the communication is very low value, I think it should behave like old-school IRC and chat rooms, or like Campfire for the hipper web guys. In that model, conversation organizes around topics. You don't have a bunch of people just shouting and hoping others will follow them; people individually join conversation topics, some choosing to stay there and establish an identity for themselves and their handle around that topic, others flying from topic to topic. Either way, Twitter doesn't work in that model. Sure, there are hashtags, but frankly, every time I follow a hashtag I rarely see a conversation. It's more like that scene from Monty Python's Life of Brian where you walk down the street past all these soothsayers and prophets spouting prognostications… that's what a hashtag feels like to me.

I'd like to see a decentralized protocol that gives us an IRC for the web. Short messages, but sent and hosted on our own servers, with the protocol bringing the conversation together so you can subscribe to the entire "channel." Perhaps rather than being standing topics, channels would organize around a particular news item. We could all contribute to the discussion, and the discussion would remain archived at that particular URL, but the content would be contributed from each of our individual servers. If for some reason we wanted to remove our portion of the conversation, we'd simply delete it from our server. Web pages already routinely pull content from dozens of sites, so this doesn't seem infeasible. I think Dave Winer has some very similar ideas, and I see him writing about them, but I'm at a loss to know whether we have the same idea. I wish there were one place where the new vision for Internet conversation he's working on was articulated succinctly.
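To make the idea a little more concrete, here's a rough sketch of how a topic "channel" could be assembled from feeds hosted on everyone's own servers. This is my own illustration, not Dave's design or any existing protocol; every URL and field name below is made up, and it assumes the feedparser library.

    # Hypothetical sketch: each participant hosts their own short messages
    # as an ordinary feed; the topic page merges them into one conversation.
    import time
    import feedparser

    # Each person publishes from their own server; deleting your feed
    # removes your portion of the conversation.
    participants = [
        "https://alice.example.org/topics/some-news-item.xml",
        "https://bob.example.net/topics/some-news-item.xml",
    ]

    def assemble_channel(feed_urls):
        """Pull everyone's messages and order them into one 'channel'."""
        messages = []
        for url in feed_urls:
            feed = feedparser.parse(url)
            author = feed.feed.get("title", url)
            for entry in feed.entries:
                messages.append({
                    "author": author,
                    "when": entry.get("published_parsed") or time.gmtime(0),
                    "text": entry.get("title", ""),
                    "source": url,
                })
        return sorted(messages, key=lambda m: m["when"])

    for msg in assemble_channel(participants):
        print(f'{msg["author"]}: {msg["text"]}')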

One other thing I've been meaning to mention: why isn't anything developed as a standard anymore? When I was first coming onto the Internet, everything used open standards. Now everything is proprietary. I recently read Dave Winer's piece from 1994 called "Bill Gates versus The Internet," which isn't really all that relevant to this discussion, except that he said:

The next versions of Windows, Macintosh and OS/2 are all Internet clients, with the standards supported — Gopher, WAIS, FTP, Telnet, Mosaic, news groups, etc. It’s an incredible thing because none of the platform vendors had any say in the definition of these standards!

In 1994, the standards were winning, and now it's time to start the next wave of new standards. The type of conversation I'm envisioning starts with a news topic or a blog post and organizes itself, with content fed from any open source, and is discoverable on the web, indexable, and permanently referenceable. I think we should build it. Where can we bootstrap the conversation to build it? Is Dave already building that?


My FireAnt Story

So, if you haven't seen the news, FireAnt was acquired by Sonic Mountain (Odeo). You can read recaps of the news on two of my favorite blog networks, NewTeeVee (run by Om Malik) and TechCrunch (run by Mike Arrington).

I came to be involved in FireAnt through my connections to Jay Dedman and Josh Kinberg. We had some discussions at Vloggercon in July of 2005, which extended into the following months, about my helping them get FireAnt off the ground. I had started a project I was calling MediaFeedr, which would poll RSS feeds, examine any links, and then generate a new RSS 2.0 feed with enclosures for downloading into FireAnt. The theory was that you could put any feed into MediaFeedr and come out with any linked content as enclosures. In reality, it never really got out of testing, but the initial feedback was good and I was proud of the code and the idea.
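The core idea can be sketched in a few lines. This is an illustration of the approach, not the original MediaFeedr code; the media extensions, MIME type, and input URL are placeholder assumptions.

    # Sketch of the MediaFeedr idea: poll a feed, look at each item's links,
    # and emit a new RSS 2.0 feed where media links become <enclosure> tags.
    import re
    import xml.etree.ElementTree as ET
    import feedparser

    MEDIA_EXTENSIONS = (".mp4", ".mov", ".mp3", ".avi")

    def to_enclosure_feed(source_url):
        src = feedparser.parse(source_url)
        rss = ET.Element("rss", version="2.0")
        channel = ET.SubElement(rss, "channel")
        ET.SubElement(channel, "title").text = src.feed.get("title", "MediaFeedr")

        for entry in src.entries:
            # Collect every link in the item, then keep the ones that
            # look like downloadable media.
            links = [l.get("href", "") for l in entry.get("links", [])]
            links += re.findall(r'href="([^"]+)"', entry.get("summary", ""))
            media = [l for l in links if l.lower().endswith(MEDIA_EXTENSIONS)]
            if not media:
                continue
            item = ET.SubElement(channel, "item")
            ET.SubElement(item, "title").text = entry.get("title", "")
            ET.SubElement(item, "link").text = entry.get("link", "")
            for url in media:
                # MIME type and length are simplified for the sketch.
                ET.SubElement(item, "enclosure", url=url, type="video/mp4", length="0")
        return ET.tostring(rss, encoding="unicode")

    print(to_enclosure_feed("https://example.com/some-blog/feed"))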

Jay and Josh needed a directory. Josh had put together some rudimentary code to implement server-side components tying together the Mac and PC versions of FireAnt, but while Josh is an excellent visionary and a good leader, he is by his own admission a pretty poor coder. I took the best of what I had and the best of what Josh had developed, and we built a videoblogging directory along with some really innovative server-side features to go with the video aggregation clients. We spent months developing it, and we released it to the public on January 24th of 2006 (initial TechCrunch coverage can be found here). Ironically, we were competing directly with Odeo at the time for one of the best directories available on the web. It was developed with AJAX technology, which at the time was still fairly new and required a lot of hand-coding of JavaScript.

I was incredibly proud of the work I had done, but even by that point it was becoming obvious that the things we had thought were important weren't what the market felt was important. YouTube had become huge in the course of a year, and Flash-based web video was where the traffic and the money were. The idea of aggregating different forms of video was falling by the wayside (Flash was incredibly hard to play in a PC-based client, and for the most part no sites supported RSS 2.0 with media enclosures). After a successful launch, but with only a limited amount of video content to be had through podcasting, I left in March of 2006, shortly before Katie was born, to pursue other opportunities and to limit my work schedule so I could spend time with my newborn child.

What went wrong, then? I've had over a year to reflect on this, and I think I can boil it down to a few choice areas where we went wrong:

  • Too much focus on the business and not enough focus on the technology
    • We brought in BizDev people very early in the process, in fact before I even officially joined the company.
    • Our BizDev people were unsuccessful at selling the technology. The simple fact is, they were opportunists looking to make a quick buck who didn't believe in the company beyond thinking they had a gravy train to ride. The early stages of a startup should focus on the technology first and the business second.
  • Poor initial design of the business and ownership structure
    • The initial design of the business was a five-way partnership between two visionaries, two developers, and one business development guy. First of all, equal partnerships never work. There was no clear leader and far too many chiefs without enough Indians. When I was brought in, the initial founders were reluctant to give up more of their ownership since it was already fairly diluted as it was.
  • We bet wrong
    • We bet people wanted offline content and simple aggregation of feeds across many websites across the Internet.  Fact was, people wanted one destination in their web browser to view content.  YouTube won, we lost.

There were great people involved in the founding of the company, but there were just too many. The next startup I do will have a clear leader, a core set of technology people, and we'll worry about making money last. There just isn't enough of a small company to split it seven ways. It should be split three ways, with a quarter left over for those who come later. The development people, the ones doing the work to get the technology off the ground, should come first. I'm slightly bitter that I worked hundreds of hours and at the end of the whole story ended up with virtually none of the company. The technology I developed was critical to the company's initial success, and I felt from the beginning that even though my work was highly valued, the ownership percentage was never ponied up. This is probably why I left early and didn't stick with the project. I think had I stuck with it, and not run out of personal funds, we probably could have been much more successful. There were also numerous problems with the client development founders having to work day jobs. Thanks to my severance from Cingular, I was the best positioned financially at that time to work for no money, and I was rewarded the least.

While this may seem harsh to the people who were involved with the company, I want to point out that I feel no ill will toward the people I worked with. Mistakes were made all around, and I have the highest respect for Josh, Jay, Daniel, and Erik, who were involved in the project during my tenure. They are all excellent people, and I'd work with all of them again. I note these things largely for my own reference, and I point them out so that if I were ever to team up with these people again, we can have an open and honest discussion of our mistakes so we don't repeat them. This was a learning experience for all of us, and I hope that some time in the future I can find a way to work with these people again.

I'd especially like to point out Josh's effort. Josh stuck with FireAnt from the beginning to the end. He sacrificed far more than any of the rest of us, even delaying his wedding so that he could see this through. I consider Josh a close personal friend, and I'd jump at the chance to work with him again. He is an excellent person of the highest moral caliber. He has endured personal threats and personal hardship, and he finished this project while the rest of us moved on. I have the utmost respect for the sacrifices he made, and I tip my hat to the Sonic Mountain team, who got the best part of FireAnt when they got Josh, more than any technology we developed.

You can still see the technology I developed for FireAnt at getfireant.com. Some of our more unscrupulous shareholders stole fireant.tv as part of a petty personal squabble, but at least it's still available there. To those of you shareholders who were involved in that, shame on you. Being involved in a small company with no revenue is about sacrifice, dedication, and the pursuit of a vision, not about cashing out. Stealing money, lying, and personal threats are no way to end a failed startup, and I hope you feel ashamed of your behavior. You know who you are.

 Jay’s thoughts can be viewed here.  Josh’s thoughts can be viewed here.


Web 2.0 Marketing

Tom has a good post over on his personal blog about marketing in the Web 2.0 age. Something we’re finding a lot of people are missing, especially in Arkansas, is how to integrate their web presence into their existing marketing strategies.

It's especially fascinating to be bringing the web to people who have essentially skipped the last 10 years of the Internet, and trying to bring them up to what people are calling Web 2.0. People, even in Arkansas, are either going to get that the Internet is changing everything about the way they do business, from marketing to customer interaction, or they're going to go out of business. Tom's a leading mind in this area, IMHO, right up there with the best of them.


The Local Web Experiment: Fort Smith, Arkansas

A while back, I wrote about what I'm calling the Local Web. The Local Web, in my mind, is a group (an infinite number of groups are possible) that arranges its interconnectedness by sharing a geographical point of reference, traditionally a Metropolitan Statistical Area, or MSA. The Local Web is already built in many of the larger cities, with directories and vertical search engines that let you search for stuff in major metropolitan areas, but a good percentage, if not the majority, of Americans live outside of a major metropolitan area. The connected netizens in those areas are being largely overlooked by the current major initiatives to create localized web experiences.

I'm starting an experiment in a town that should be the perfect size. My hometown is Fort Smith, Arkansas, a town of about 80,000 with about a quarter million in the MSA. Billions of dollars of business are done here every year, and many companies here ship worldwide. For doing business in town, however, most people still reach for the phone book. The reason, of course, is that you can spend days Googling around for information about Fort Smith businesses without finding much but spam sites. No one in this town has made a concerted effort to make sure information about the businesses people want to deal with is easily found on the web.

So, I'm starting an experiment. I'm going to organize a blogger meetup to start. I've already found several local bloggers, and I'm going to find or create more. I'm going to organize them and try to get them to write about the businesses and other activities (softball, church, whatever) they take part in locally and where they do them. I'm going to try to incent people to create links from site to site across town, making information more easily indexable by the search engines so that when you search for something in the area you don't end up at a spam site. We will be holding the meetings at Kirkham Systems of Fort Smith.

Once this is going strongly, I, along with the staff of Kirkham Systems, am going to start showing the results to local businesses, convince them they should have a website with a blog, and incent them to link to the people they're doing business with and write about their experiences with them. The goal is to create an interconnected web of links focused on this geographical area, so that if you end up at the Kirkham Systems website you'll find annotated links to the people we do business with, and when you end up there you can find the people they do business with.

If I'm right, by the time I'm done, Google will be a far more interesting resource for finding information about businesses, things, and places in Fort Smith, Arkansas than any other resource, anywhere. This may seem boring to people who live on the coasts and can find a well-designed, well-organized website for even the most local of businesses, but for the large portions of the country that have been ignored by the companies attempting to organize information on the web, I think this will be a large step forward. No one here understands or cares about this yet because they haven't been educated about what it can mean for their businesses, themselves, and their community. My goal is to educate everyone here.

The Local Web is long overdue.


Some discussion in the comments

There's some discussion going on in the comments of my last post. Check it out; I think we might be in for an interesting discussion.


My friend Raymond…

My friend Raymond is getting some attention for his love of OPML over on Dave Winer's Scripting.com here and here. As is typical of format geeks, there's a debate in Raymond's blog comments about why you'd use OPML over XHTML ordered and unordered lists. When are people going to realize that 99% of people don't care? I've been involved in more format discussions than I care to remember, and in the end the reason RSS and OPML will become popular is that Dave Winer goes to the effort of developing tools rather than writing specifications and hoping someone will write tools for them. The format in the end doesn't really matter much; it's just a way to format data. There have been thousands over the years, and as long as everyone can read it, the rest is just syntax and semantics.

Josh and I were having a debate the other day about whether using a pseudo-protocol like fireant:// was an acceptable solution for one-click subscribe in our aggregator. Most of the other aggregators are fighting over feed:// or some other specific file format (like iTunes pcast files). Why should we worry about all this when all we want is to enable easy one-click subscribe for people who already have our software? Josh's concern is that the geeks will be upset over our use of a protocol that's not really a protocol (of course, no need to remind people that feed:// isn't a valid protocol either) instead of doing it through a file or some other method that's more robust. Sorry, it works. The facilities are already in the OS and the browser to handle it, so why not use them? Is it a hack? Yeah, so what? It works!
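To give a sense of how little machinery this takes on the client side, here's a hypothetical sketch of what happens after the OS hands a fireant:// link to the aggregator. The URL structure below is an assumption for illustration, not FireAnt's actual scheme.

    # Sketch of a fireant:// handler: strip the fake scheme and subscribe
    # to the real feed underneath. The URL format here is hypothetical.
    import sys
    from urllib.parse import unquote

    def feed_url_from_pseudo_protocol(raw):
        """Turn fireant://example.com/videos.xml into a real feed URL."""
        if raw.startswith("fireant://"):
            return "http://" + unquote(raw[len("fireant://"):])
        raise ValueError("not a fireant:// link: " + raw)

    if __name__ == "__main__":
        # The browser/OS passes the clicked link as the first argument.
        print("Subscribing to", feed_url_from_pseudo_protocol(sys.argv[1]))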

The same people who would be upset about us using fireant:// as a protocol are the ones who'd be upset that people are using OPML rather than XHTML-formatted unordered and ordered lists. Hello? Who fucking cares. The user cares that it works! We spend far too much time debating the merits of one format over another and a lot less time than we should making sure the software works for the end user. This is why Dave Winer continues to succeed in getting formats adopted: unlike the Atom folks, who have spent years making the most robust and best-documented format but have no reference implementation, he ships tools people can use. Why is Microsoft Word the default format for exchanging documents and not OASIS? Because of the software people use. Why is RSS the preferred format for exchanging feed information? Because there was software that worked when the format was introduced that everyone could use as a reference implementation.

There's also something to be said for simplicity. OPML and RSS are simple. Perhaps the specs are not complete and don't cover all the use cases, but I can code something up to work with them in a matter of hours. I investigated the Atom Publishing Protocol, and it would take me a couple of days to do a full implementation. By contrast, I have done a full MetaWeblog implementation in a couple of hours.
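To put "a couple of hours" in perspective, posting through the MetaWeblog API is a single XML-RPC call. A minimal client sketch, with a placeholder endpoint and credentials:

    # Minimal MetaWeblog client: one XML-RPC call creates a post.
    # The endpoint URL, username, and password are placeholders.
    import xmlrpc.client

    server = xmlrpc.client.ServerProxy("https://example.com/xmlrpc")

    post = {
        "title": "Hello from a MetaWeblog client",
        "description": "The whole client fits in a few lines.",
    }

    # metaWeblog.newPost(blogid, username, password, struct, publish)
    post_id = server.metaWeblog.newPost("1", "user", "password", post, True)
    print("Created post", post_id)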

Dave Winer can be an ass, but I give him credit where credit is due. The people who spend so much time complaining about him are excellent at complaining and not so good at getting things done. For that, I look to Dave.


New York Times misses it again

The New York Times has run an article on vlogging. Like most of the mass media coverage of vlogging, they've once again missed the point. We are not TV! Chuck Olsen has summarized the problems best on his blog. I wonder when the general public will begin to realize that reporters and the mass media generally spend very, very little time becoming acquainted with the subject they're writing about before publishing. It would have taken less than a couple of extra hours to contact the people mentioned in the article to get sound bites and a decent overview of what the community is about, like Wired did, but I guess since they're the NY Times they need not bother with such trivialities. Sad.