2004-02-16 - Tim O'Reilly, O'Reilly & Associates. An interview conducted by Alain Buret.

FOSDEM - First and traditional question: please introduce yourself...

Tim O'Reilly - Founder and CEO of O'Reilly. Most people know us from our book publishing, but we also run conferences such as the Open Source Convention (in Portland, Oregon, July 26-30, 2004), the O'Reilly Emerging Technology Conference (February 9-12, in San Diego, CA), and the O'Reilly Mac OS X Conference. In addition, we publish a number of technology-related websites, collectively known as the O'Reilly Network (www.oreillynet.com), but also including sites such as www.perl.com, xml.com, ONjava.com, ONdotnet.com, openp2p.com, and java.net. The last is particularly interesting -- it's a "private label" site with Sun branding, but operated by O'Reilly. Sun valued our editorial independence and the quality of our work, and so they asked us to build a site for Java that would be more accessible to other Java players than Sun. This has become a new line of business for us. We're now in discussions with a number of companies to operate their developer-relations sites.

Overall, we try to describe our corporate goals more broadly than any of our individual businesses. I read a book called Built to Last, which described how great companies have "big hairy audacious goals." So I sat down and wrote my own: "Changing the world by capturing the knowledge of innovators." That is, we try to find interesting people at the cutting edge of technology and then amplify their effectiveness by sharing what they know, and helping other people to follow them. We've been doing this for years, starting by writing documentation for widely used but poorly documented internet utilities, then the X Window System, then the internet, then Perl, Linux, and other open source technologies. Now, I'm focusing a lot on the next-generation applications that run on top of Linux and the Internet.
You can see that in books like Google Hacks, Amazon Hacks, and the like, but also in the subject of my talk at FOSDEM, which is a meditation on the way that these internet applications are changing the rules of the computing game.

FOSDEM - How and why did you start publishing books?

Tim O'Reilly - We were a technical writing consulting company. We were exposed to Unix at one of our clients. Since it had originally been developed in a research setting, a lot of the programs were poorly documented. So we started writing about them in between our paying contracts. We figured we might as well put our time to good use, so we wrote manuals for things like vi, make, curses, and uucp. And then we figured we might as well try to sell them.

FOSDEM - Why did you start publishing Open Source content?

Tim O'Reilly - Even though the early Bell Labs Unix license wasn't open source, a lot of the programs we still use in Linux today were part of the early Unix distributions as well. And, like Linux, they were built by volunteers inspired by the collaborative architecture of Unix, which made it easy for distributed teams and individuals to add pieces to Unix. Of course, many of these pieces came from Berkeley, which did eventually explicitly make them open source. But we were writing about these things back in the mid-80s (Unix in a Nutshell was first published in 1984, Learning the Vi Editor in 1986). Our first conscious exposure to the philosophy of free software came from the X Window System. We started documenting it in 1988. Of course, the way we did it -- adding proprietary enhancements to the free documentation -- was seen as a problem by folks like Richard Stallman, but it was accepted enthusiastically by Bob Scheifler, the creator of X and the head of the X Consortium. This circumstance really illustrated the fundamental split in the open source/free software community very early on.
Some folks were ideologically committed to free, while others were using free as a vector to get their ideas and code adopted. The latter group hoped for and planned for proprietary enhancement, and saw it as a good thing.

Not long after, we got involved with Perl and Linux (Programming Perl was published in 1991, Running Linux and the Linux Network Administrator's Guide in 1992 or 1993 -- I don't remember exactly). And here we followed the same philosophy that had guided us with X: we took our cues from the authors of the software or the underlying documentation. In the case of Perl, we used a strategy that has worked very well -- we incorporated a lot of the free online Perl docs into the book, but also used the book process to update the docs. And that's a process that has continued ever since, and has resulted in a lot of good free documentation for Perl as well as a commercial offering that's been very good both for O'Reilly and for Larry Wall. In the case of the Linux Network Administrator's Guide, the author really wanted to put the book out under the GPL, so we did. We had some problems with competing publishers putting out the same material, and online availability of the material in exactly the same format depressed sales of the book, so I don't think that this is as effective a strategy as the one we adopted for Perl, but it's what the author wanted, and we were willing to try it. We've done it numerous other times over the years.

My goal is to create the maximum value for users, developers, and everyone in the software ecosystem. And I try to find the mix of techniques that will produce the best result. If an author is motivated by money alone, we may need to do a proprietary-only book (since in my opinion that does create maximum dollars), but in other cases, where availability of the information at no cost is a priority to the author, we'll go that route instead. Whatever gets the best book written.
FOSDEM - The O'Reilly website is a huge site, not only selling books, but also providing lots of information. Do you think it's a key to your success?

Tim O'Reilly - We definitely believe that having the web presence, and the conferences, are a huge part of our success. But it's more appropriate to say that the books, the web presence, and the conferences are all part of an integrated business model in support of our big goal. Another key part of it is an advocacy-based style of marketing. We don't market our products so much as we market the ideas and issues underlying our products. We first took this approach in 1992, when we published a book called The Whole Internet User's Guide and Catalog. At about the same time, we hired the former director of activism for the Sierra Club to do PR for us. He taught us to focus on the issues, and that our products would follow. We took his advice, and never looked back. From pushing the World Wide Web in 1992, to fighting with Microsoft over internet standards in 1995, to taking up the banner of open source in 1997-1998, to fighting with Amazon over the 1-Click patent, to trying to get people to think about the deeper significance of the next generation of internet applications (which I've now been working on for the past five years), we focus on the ideas that are changing the world, and try to figure out how to advance those ideas in the right direction.

FOSDEM - Publishers would like to sell electronic books: do you think it's a good idea? Do you plan to release eBooks or anything like that?

Tim O'Reilly - We do sell electronic books, in a variety of formats. My favorite is our Safari service (safari.oreilly.com). My premise is that ultimately, we have to get beyond the standalone book. Once you have a huge database of technical content (which is what we have now in Safari), you can create a whole new set of interfaces to that content.
The first implementation of Safari simply gave you a reproduction of the book experience online (with a subscription-based business model), but we're now exploring new services based on the Safari back-end. These include a system for college professors to build custom textbooks, mixing material from Safari with their own materials, and creating a print-on-demand textbook for each class. (This is in beta now and will be released later this year.) We also have a web services API for Safari, which is about to be released, and which will allow integration of Safari content into development environments and software help systems. It will also allow our users to surprise us with new applications that we didn't ourselves think of -- which is part of the attraction of open source and open APIs as well.

FOSDEM - We all know that you don't have a "favorite" book, but is there a book you would like to publish?

Tim O'Reilly - Actually, I do have a few favorites. Unix Power Tools was my all-time favorite book to work on. I loved the format. Our current Hacks series is in some ways a descendant of that book -- small, bite-size pieces of useful knowledge, in a kind of hyperlink format. It was inspired by the web. I also have a remaining fondness for one of my first books, Managing UUCP and Usenet. It played an important role in popularizing one of the technologies that got the early software-sharing community off the ground -- and, for that matter, played an instrumental role in the spread of what we now know as the internet.

FOSDEM - Introduce in a few words what you're going to talk about during your presentation...

Tim O'Reilly - My fundamental premise is that the world we all grew up in -- the world of both Microsoft and the Free Software Foundation -- is fundamentally challenged by the internet.
The internet (not Linux) is the greatest triumph to date of the open source approach, yet it has changed the rules of software deployment so fundamentally that many of the techniques embraced by the open source community as first principles don't necessarily give the desired results. We need to reinvent open source in the age of the internet. My talk gives some suggestions for what we need to think about.

FOSDEM - Right after your talk, we will have RMS talking... it's a well-known fact that he's against the idea that people "sell" knowledge/information. Do you think he's too extremist in his opinions?

Tim O'Reilly - Yes. I love Larry Wall's comment, "Information doesn't want to be free. Information wants to be valuable." In 2001, I wrote an essay on that comment for Nature (the well-known UK science magazine). I should just point you to that essay. It's at: http://www.nature.com/nature/debates/e-access/Articles/oreilly.html. Hmm... the site says it's down for maintenance. So here's the content of the piece, in case Nature's site is still down when you put this interview up online:

"Information doesn't want to be free. Information wants to be valuable." I first heard this gem from Larry Wall, creator of the Perl programming language. Like many other open source software authors, from Linus Torvalds to Tim Berners-Lee and his spiritual descendants at the Apache web server project, Larry discovered that one way to make his information (his software) more valuable was to make it free. By making his software freely available, Larry was able to increase its utility not only for himself (because others who took it up made changes and enhancements that he could use), but for everyone else who uses it, because as software becomes more ubiquitous it can be taken for granted as a foundation for further work. The Internet (based on freely available software including TCP/IP, BIND, Apache, Sendmail, etc.) demonstrates clearly just how much value can be created by the distribution of freely available software.

Nonetheless, it is also clear that others (with Bill Gates the paradigmatic example) have found that the best way to make their information valuable is to restrict access to it. No one can question that Microsoft has created enormous value for itself and its shareholders, and even its critics should admit that Microsoft has been a key enabler of the ubiquitous personal computing on which so much of our modern business world depends. What many people fail to realize is that Larry Wall and Bill Gates have a great deal in common: as the creators (albeit with a host of co-contributors) of a body of intellectual work, they have made strategic decisions about how best to maximize its value. History has proven that each of their strategies can work. The question, then, for the creators of information, is one of goals, and strategies to reach those goals. The question for publishers and other middlemen who are not themselves the creators of the content they distribute is how best to serve those goals. Information wants to be valuable. Publishers must focus on increasing the value, both to its producers and to its consumers, of the information they aggregate and distribute.

I am neither a practicing scientist nor a publisher of scientific journals, but as a book and web publisher who works on a regular basis to document widely available "infrastructure" software (both free and commercial), I am daily confronted with decisions akin to those reflected in the debate now being carried on in these pages. Because I publish books about free software, the people best qualified to write about it are often the authors of the software. Like scientists, those authors often have the widest possible dissemination of their software and information about how to use it, rather than the greatest economic gain, as their ideal.
They'd like to see the documentation they write distributed freely along with the software. At other times, though, software authors see documentation as an afterthought. They'd rather not deal with it, and hope that someone else will. In those cases, the question of compensation often comes into play. Will a third party who is motivated chiefly by money earn enough from this book to justify the time spent writing it? In helping authors to navigate this discussion, I try to bring them back to their goal. Is it maximum dissemination of information? Is it earning enough to justify the work?

(I should note that the jury is still out on whether making the text of a book freely available helps or hurts sales of a print book. There is evidence on both sides. In some cases, such as Eric Raymond's book The Cathedral and the Bazaar, free distribution of the content created the "buzz" that allowed us to publish the same material successfully in print. In other cases, such as our initial publication of the Linux Network Administrator's Guide, sales were reduced because other companies republished some or all of the book at lower cost, which they could do because they had no development costs or royalties. However, over time this problem abated: the target audience recognized that those publishers were not adding value, and their products were eventually marginalized.)

I see many parallels between the work of free software authors and the work of scientists. In most cases, both are more interested in making sure their work is disseminated than in maximizing their return from it. In most cases, the target reader is a peer of the author. Publishing is designed to enhance reputation as well as to spread the word. Publishers must be careful to keep prices fair, lest they be seen as taking advantage of the goodwill of their authors, gouging the very customers who also produce their content.
In this kind of environment, you have to ask about the role of the publisher as middleman. No one who started as a self-published author and gradually developed all the infrastructure of publishing (as I did) can question the enormous added value that a publisher brings to the table. This value includes editing (which starts with content filtering -- the choice of what to publish and what to refuse -- and extends through content development and quality control), manufacturing of the physical product, marketing, sales, distribution, and collecting and disbursing money.

In the early days of the World Wide Web, the rhetoric was that anyone could be a publisher. After all, with cheap, ubiquitous web servers, the cost of printing and inventory was minimized. There was a great deal of talk of disintermediation. In a few short years, the reality has turned out quite otherwise. It's quite easy to put up a web page, not so easy to discover it. At bottom, the job of publishing is precisely mediation: mediation between a huge class of potential authors and an even larger class of potential readers. Simple mathematics dictates the rise of multi-tiered distribution chains, in which publishers aggregate authors, various types of resellers aggregate readers, and wholesalers aggregate publishers for resellers and resellers for publishers.

The same multi-tiered distribution has emerged on the web. Betting on this logic, my company created the world's first web portal, a site called GNN (the Global Network Navigator), in early 1993. We sold the site to AOL in 1995, and they later folded it into their main service, but the vision of web aggregators (i.e. publishers) has unfolded pretty much as I imagined it. Many people with their own web pages end up writing for better-established web sites; those sites are further aggregated for readers by search engines, directories, and other portals like Google, Yahoo!, or AOL.
In fact, web publishers now employ full-time workers to ensure that their pages are listed on these gateway sites, much as printed-book publishers employ salespeople. A large proportion of internet advertising has come from web sites trying to get better visibility for their product.

However, the web does bring another wrinkle: the ability of groups to self-aggregate. The core functions of publishing, from content filtering to audience aggregation, can be performed by a group of interested users. This is particularly true when there is already a well-defined target community. This can be a disruptive force in the publishing marketplace. So, for example, sites like CNet and ZDNet spent tens of millions of dollars building and promoting portals for technical information on the web, while two college students built a site called Slashdot ("News for Nerds. Stuff that Matters.") into a similarly powerful market presence simply by inviting their readers to submit, organize, and comment on their own content. Interestingly enough, though, as Slashdot has grown in popularity and evolved into a real business, it has needed to add more editorial staff to filter the submissions of a growing marketplace of readers who now recognize that exposure via Slashdot is a powerful marketing tool. In short, even a community-centric effort ends up recreating some of the fundamental dynamics of publisher as middleman and aggregator.

What this evolution illustrates is that publishers will not go away, but that they cannot be complacent. At bottom, the job of publishing is to connect authors who have something to say with readers who want to hear it. Publishers must serve the values of both authors and readers. If they try to enforce an artificial scarcity, or charge prices that are too high, or otherwise violate the norms of their target community, they will encourage that community to self-organize, or new competitors will emerge who are better attuned to the values of the community.
FOSDEM - What are you expecting from your talk at FOSDEM and from the interactions with other people present at the event?

Tim O'Reilly - I'm hoping to spend more time with the many wonderful open source hackers in Europe. I also want to get a feel for the uptake of Linux in Europe. It seems to me that we're at a great turning point. This is very exciting. I also want to get the message of my talk out to as many open source developers as possible. I believe that it's critical for us to come to grips with the way the computer landscape is changing, and not keep fighting old battles from the 1980s and 1990s.
© FOSDEM 2003-2004