Glints of Profession

Recently I caught a few tweets from Patrick Welsh where he briefly mentioned pair-programming with his young daughter while she was home sick from school. In particular the second one referenced being "green", as in all tests passing. It struck me on a couple of different levels, even beyond the "awww" factor.

(Let me not understate the "awww" factor. Stories like this set anticipation brewing in me. I can't wait to be able to share my passion for programming with my own son or daughter. Hopefully they'll listen better than I did when my dad attempted to enact the rituals of fatherhood with me at his workbench, under the hood of the car, and in the kitchen. Sorry Dad! If I could do it again, I'd pay attention this time around.)

This brief snapshot of their interaction conjured visions of the apprentice at the elbow of the master craftsman, absorbing subtle waves of well-worn trade wisdom. We're talking about more than just "measure twice, cut once" type of stuff here. The equivalent of that might be "fail fast". Very good, very fundamental advice, to be sure. But testing and TDD are at least one level above that.

At the level of TDD, we're dealing with explicable, transferable techniques. Not just muscle memory, but practices. Practices that, when followed, guide the hands of the initiate on the path toward skill, consistency, and quality. To me, this is a huge indicator of mature and professional behavior on the teacher's part. And such things are themselves markers of the continuing evolution of programming from simply a job to a profession.
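To ground what "green" means for anyone who hasn't sat through a test run: here is a minimal sketch of one red-green TDD step, written in Python with the standard unittest module. The function and test names are invented for illustration, not taken from Patrick Welsh's session.

    import unittest

    def add_tip(subtotal_cents, tip_rate_percent=15):
        """Return the bill total in cents with the tip included (toy domain logic)."""
        return subtotal_cents + subtotal_cents * tip_rate_percent // 100

    class AddTipTest(unittest.TestCase):
        # Written first and watched fail ("red"), then the function above
        # was filled in until the run came back "green".
        def test_default_tip_is_fifteen_percent(self):
            self.assertEqual(add_tip(2000), 2300)

        def test_custom_tip_rate(self):
            self.assertEqual(add_tip(2000, tip_rate_percent=20), 2400)

    if __name__ == "__main__":
        unittest.main()  # zero failures is the "green" in the tweet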

Compare these types of hallmarks to the ad hoc, piecemeal, on-demand way that most programmers still learn the tricks of the trade. Most of us are still tossed into the job with no mentorship to speak of. We're lucky if we've been taught theory in school, but that is incomplete knowledge at best. (It's the difference between knowing how a car works and being able to build one.) And a relevant degree isn't even the most common route into programming anyway. Most programmers learn through the work alone, maybe supplemented by information gleaned from Stack Overflow questions, or blog posts where people have documented their own unique travails. Lessons are learned through individual trial and error, with the occasional vicarious horror story as a warning.

This story is still the common one, though happier alternatives are emerging. Many of us make quite a show of seeking to make programming a true profession. There's a Craftsmanship Manifesto. There are apprenticeships. There are actual, literal journeymen, as well as prolific, sought-after masters. But these are all bright trappings, and it is quite easy to lose track of the substance.

Profession is not found in titles and roles, or documents of good intent. Profession is not even found in the practices themselves, for TDD does not a craftsman make. Profession is found in the fact that there ARE practices which are known to improve the quality and consistency of work. Profession is found in the wisdom that is passed on by mentors, in concert with the mechanical skills of the job. Profession is found in the common platform of discipline, integrity, and mutual education upon which quality work, worthy of pride, is built.

I haven't seen what happens in Obtiva's apprenticeships, or in Uncle Bob's classes, or when Corey Haines pairs with his hosts. I trust that they do bear forward, at least in part, the evolution of our chosen trade. But in the words and excitement of Patrick Welsh's little apprentice, I can see for myself the rough diamond of profession glinting in the light.

Community Service

As I see it, programming questions tend to take one of two forms:
  1. How do I do X?
  2. How do I do X in a clean, maintainable, and elegant way?
Matt Gemmell has a great response to questions of the first form. I have nothing to add to his excellent post on that topic.

But I think it's crucially important to make a distinction between the two forms. The second form is a far more difficult question, asked far less often, and good answers to it are even rarer. When questions of this form are asked in forums, people are usually given a link to a page describing a pattern, or, if they are really "lucky", a page with a sample implementation of that pattern. To be fair, patterns usually are the answer to these types of questions. But what we find written up on the web, or even in books, is most commonly pathetically oversimplified, stripped of context, and often without guidance on which supporting patterns are necessary to obtain the benefit, or when alternatives may be preferable.

Essentially, most developers are left with no choice but to apply Matt's answer for form 1 to questions of form 2, in a much less information-rich environment. I contend that, while this may be one of those proverbial activities that "build character", it is ultimately more likely to harm their immediate productivity--and possibly even their continued professional growth.

What we end up with is a pandemic of developers trying to hack out pattern implementations, discouraged by the fact that the pattern seems to make no allowance for deviations or complications in the shape of their actual problem. Worse, developers are often dismayed to find that the one pattern they were told to use is merely the tip of a huge iceberg of supporting patterns, without which the first may actually be more problematic than an ad hoc solution. Most often the developer in this position concludes that their deadlines will never allow them to go through a painful trial-and-error process on every one of these patterns, and accordingly drops back to the ad hoc solution.
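To make that iceberg concrete with a hypothetical sketch (the pattern choice and every name below are mine, not drawn from any particular forum answer): a textbook Repository looks trivial on its own, but trying to use it in earnest immediately surfaces the supporting concerns that one-page write-ups tend to skip.

    from abc import ABC, abstractmethod

    # The "tip of the iceberg": the Repository interface as the write-ups show it.
    class OrderRepository(ABC):
        @abstractmethod
        def add(self, order): ...

        @abstractmethod
        def get(self, order_id): ...

    # An in-memory implementation is easy, and it's roughly where most articles stop.
    class InMemoryOrderRepository(OrderRepository):
        def __init__(self):
            self._orders = {}

        def add(self, order):
            self._orders[order["id"]] = order

        def get(self, order_id):
            return self._orders[order_id]

    # The submerged part: a database-backed version immediately raises the questions
    # the sample implementations gloss over -- who opens and commits the transaction
    # (a unit of work), how the repository is injected so calling code stays testable,
    # and how queries beyond get-by-id are expressed.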

It's time that we acknowledge that software development, whether you consider it an engineering discipline, an art, or a craft, has a history--albeit a short one. Things have been tried. Some work, some don't. There do exist "solved problems" in our problem space. To say that every developer should try and fail at all these efforts on their own ignores and devalues the collective experience of our community. Worse, it stunts the growth of the software development industry as a whole.

Yes, one can learn these things by trial and error. Yes, the understanding gained in this way is deeper and stronger than that gained initially by being tutored on how to apply the solution. And yes, there's a certain pride that comes with getting things done in this way. But this is not scalable. Putting each person forcibly through this crucible is not a sustainable strategy for creating experienced, productive, wise programmers. Every hour spent grappling in isolation with a question of "why won't this pattern do what I've been told it will" is an hour that could be spent creating functionality, or, heaven forbid, solving a new problem.

That is why those of us who have managed to obtain an understanding of these "solved problems" must be willing to shoulder the responsibility of mentoring the less experienced. Being willing to explain, willing to discuss specifics, mutations, deviations, exceptions. Willing to discuss process, mindset, methodology. These are the things that make the distinction between a programmer and a software developer, between a software developer and a software engineer.

The internet may very well not be the appropriate place to seek or provide this type of mentoring. I suspect it's not. And unfortunately, there are too many development teams out there buried in companies whose core competency is not software, consisting solely of these discouraged developers and lacking an experienced anchor, or even a compass. There are online communities that attempt to address the problem at least partially. ALT.NET is one such, for .NET technologies. But there really is no substitute for direct, personal mentorship.

So I would encourage young developers out there to seek out more experienced developers, and ask them the tough, form 2 questions. And I would even more strongly encourage experienced developers to keep a watchful eye for those in need of such guidance. Maybe even consider proactively forming a local group and seeking out recruits. Be willing, able, and happy to provide this guidance, because it benefits all of us. Every developer you aid is one less developer creating an ad hoc solution which you or I will be condemned to maintain, overhaul, or triage somewhere down the line.

Is There a Place for Deletionism in Wikipedia?

Just dipping a toe in the water with something too big for Twitter--a topic I definitely have a few thoughts on, and which I'm willing to pitch out onto the internet.

Yesterday evening and this morning I watched (Or overheard? No idea what the appropriate term here is...) an exchange over Twitter between Tim Bray and Jeff Atwood about "deletionism". Tim's thoughts on the matter were apparently too strong to be held within the restrictive confines of Twitter and so he wrote a strongly worded post on his blog as well.

The post is obviously... passionate. But putting aside the bulk of the post that simply expresses that passion, he makes a couple really strong points.

The first point is that deletionists are just your garden variety elitists.
"the arguments from the deletionists are jargon-laden (hint: real experts use language that the people they’re talking to can understand)"

The other point really could have comprised the whole post and, IMHO, would have been a sufficient argument all by itself.

"What harm would ensue were Wikipedia to contain an accurate if slightly boring entry on someone who was just an ordinary person and entirely fame-free? Well, Wikipedia’s “encyclopedia-ness” might be impaired... but I thought the purpose of Wikipedia was to serve the Net’s users, not worry about how closely it adheres to the traditional frameworks of the reference publishing industry?" [emphasis added]

This, to me, undermines the deletionists' whole platform. That is what Wikipedia was supposed to be: a new model of information archival. A model not subject to the sensibilities of some authoritarian arbiters of what is "notable" or "interesting". And the deletionists are unequivocally trying to undermine this goal, by deciding what "deserves" to be archived.

I say, if they want to take that old dead-tree encyclopedia model and just port it to the web, they can go find their own site, and their own content, to do it with. I've even got a couple recommendations for them.

Identity Crisis in Computer Science Education

A while back, the seeds of a post started rolling around in my head, inspired by my lack of satisfaction with the preparation my education provided me for a programming career. But I didn't quite know what I thought. Then not long ago, Joel Spolsky presented a radical repackaging of Computer Science degrees, supposedly geared toward producing better programmers, and my thoughts started to gel.

I knew what my personal complaint was, and I knew that it was connected to a larger problem with the state of computer science education in general. But I didn't know exactly what the larger problem was, let alone have any ideas worth sharing on what might be done about it.

Now that seemingly the entire remainder of the blogosphere has weighed in on this and related topics, I think I am finally ready to throw my two cents in. I'm going to barrage you with links in the following paragraphs, to ensure that I credit everyone I read who assisted me in coming to my final conclusions. Feel free not to click through, but be aware that they represent a rich cross-section of an important discussion.

It took me several weeks to realize that what we have going on is essentially a three-way tug-of-war between people in different regions of the vast sphere of software development, who need very different things out of their workers, and hence out of their education. Below I will give a run-down of some of the claims made, expressing the different forces pulling on CS graduates these days. You'll quickly see it's no wonder that the schools are so confused....

The Artisan Programmer

Joel laments the uselessness of theory courses in computer science curricula, saying "I remember the exact moment I vowed never to go to graduate school" and then proceeding to recall a terrible experience he had with a Dynamic Logic class. Jeff Atwood insists that real-world development environments need to be in place and mandatory, including, but not limited to, source control, bug tracking, deployment, and user feedback. Then, as mentioned above, Joel proposes offering BFAs in software development, to make darn well sure that none of the academic (in the pejorative sense) theory stuff gets mixed in unnecessarily. The upshot of most of these points is that computer science / programming degrees should spend as much time as possible teaching people what they need to know to go into a career in software development, writing business software or software products.

The Computer Scientist

Brian Hurt comes in from another direction entirely, and in the process makes some very good points about the true purpose of higher education. He lays the blame for the flood of single-language programmers entering the workforce at the feet of schools that do exactly what Joel and Jeff are asking for. He makes some great points. And while he sounds more than a little reminiscent of the classic Joel post about the perils of Java schools, his argument is much more thorough than just blaming the tools. Chris Cummer joins this party, wishing that his theory foundations had been firmer, and making an excellent analogy to the difference between someone who studies a language and someone who studies language. We also have the respectable Raganwald, who, although he has admirably pointed out good points from all sides, doesn't shy away from offering his opinion that programmers ignore CS fundamentals at the risk of their own career advancement.

The Software Engineer

But that's not all. Several people have weighed in from yet another direction. Robert Dewar and Edmond Schonberg wrote one of the posts that started off this blog firestorm. Along with alluding to a sentiment similar to Hurt's and Cummer's, they heavily criticize the state of software engineering education for focusing too much on how to use specific, limited tools when more sophisticated ones are available. They claim that understanding the latter will allow software engineers to easily pick up whatever other tools come along and direct them to appropriate purposes. Ravi Mohan stops short of calling the education satisfactory, instead simply and sensibly pointing out that an engineer who doesn't use standard engineering tools, such as system modeling, isn't really an engineer. Mohan comes on a little too strong for me in the comments, but the posts themselves (which have a precursor and a successor, and should soon have one more) are worth reading. Like the others, he makes valid points.

Resolving the Crisis

Mark Guzdial is one of the few people who really puts his finger near the pressure point--though maybe not near enough to feel the pulse beneath it. At the risk of quoting a little too heavily....
Rarely, and certainly not until the upper division courses, do we emphasize creativity and novel problem-solving techniques. That meshes with good engineering practice. That does not necessarily mesh with good science practice.

Computer scientists do not need to write good, clean code. Science is about critical and creative thinking. Have you ever read the actual source code for great programs like Sketchpad, or Eliza, or Smalltalk, or APL 360? The code that I have seen produced by computational scientists and engineers tends to be short, without comments, and is hard to read. In general, code that is about great ideas is not typically neat and clean. Instead, the code for the great programs and for solving scientific problems is brilliant. Coders for software engineers need to write factory-quality software. Brilliant code can be factory-quality. It does not have to be though. Those are independent factors.
And there it is.... Different environments require different mindsets, approaches, and philosophies. Research requires one mindset and philosophy of work, engineering requires another, and in-the-trenches programming requires yet a third.

When a person suffers from a personality fracture, the resolution is often to merge the personalities by validating each as part of a whole. Fortunately, since we are not dealing with a person, we have the freedom to go another direction: make the split real and permanent.

Associate of Science in Computer Programming

To fill Joel and Jeff's need, the student who wants to work in the craft of software development / computer programming, who wants to be an artisan, needs an appropriate degree. It needs to provide exposure to the generalized ideas behind the programming platforms and tools they will have to deal with for the rest of their career. Lose the lambda calculus, the compiler-writing projects, etc. These things are not necessary for getting stuff done in the trenches. But students do need to be exposed to the fundamental generalities that pervade programming. And they need to be prepared to learn at an accelerated rate while in the field; that just comes with the territory. Focus on core programming skills like program analysis, debugging, and testing practices. Introduce industry-standard tools (emphasizing generality and platform independence) such as source control, bug tracking, etc.

I think a two-year associate degree is perfect for the code-monkeys and business programmers who just love to dig in and mess around with code, and don't want to concern themselves with the bigger picture. Especially with these jobs increasingly being pushed offshore, computer science grads are rapidly being priced out of the market. An associate degree is cheap enough to be worth the investment for a lower-paying programming job, and it doesn't carry the overhead of theoretical content the student may not be interested in learning. It should be noted, though, that this type of programming job enters the realm of the trades, with all the associated benefits and drawbacks.

If you're looking for a more well-rounded individual capable of moving up out of this position into a lead role, or even management, then a four-year Bachelor of Science (or Joel's BFA, though I'm skeptical of that) may be a viable option as well.

Bachelor of Science in Computer Science

There's not much to say about this degree, because if you look at all the schools that are famous for their CS degrees, this is pretty much what you'll find. Lighter on general studies, heavy on theory, heavy on math. Light on tools because the students will be expected to find (or make) tools that work for them. Light on specific language education because students will be expected to adapt to whatever language is necessary for their problem domain.

This is a degree that will produce people primed to go on to master's degrees and doctorates. They will end up in research, "disruptive" startups, or working on new languages, OSes, etc. This degree is designed for people who want to work at the edge of things, who want to solve new problems and push the boundaries. They are the people upon whom the burden of pushing the state of knowledge in CS into the next era will be placed.

Bachelor of Science in Software Engineering

I am hesitant to propose this degree, because I am not certain that the practice of Software Engineering has evolved to the point where we have four years' worth of general knowledge that's worth teaching and that won't be out of style by the time the students graduate.

It seems that some people, when they talk about Software Engineering, are talking about architecture and design, while others are talking about process, resource allocation, estimation, etc. To be frank, I don't think the former qualifies as a true engineering discipline--at least not yet. I don't know how much of the program modeling that Ravi Mohan talks about is going on out there in the industry. I suspect it happens more on really big projects, and maybe in digital security. The second kind of engineering people think of, however, seems very similar to what we see in manufacturing, with industrial and process engineers: people who gain an intimate knowledge of the domain, and then figure out ways to make everything run more smoothly, more efficiently, and with higher quality.

I can definitely see some education possibilities here, though I am not sure myself how to fill out the whole degree. It should at least encompass a good portion of the Associate of Science in Computer Programming, because these students need to understand the intricacies involved. I can also see this degree teaching some of the more established measurement and estimation techniques found among the industry's experienced software project managers. More management-related topics such as resource allocation, planning, product design, and feature negotiation might fit in well here too. Different project processes, testing/QA models, and of course the ability to keep up to date with technologies and platforms are all par for the course, since all of it is critical for making decisions in the industry.

Conclusion

I really, honestly believe that Computer Science education as a whole needs a makeover. It needs more structure, more integrity in the vision of what each degree means, across schools. When someone holds one of these degrees, you need to be able to reliably assume they have learned certain things, regardless of what school they went to. Many of the degrees currently on offer don't satisfactorily prepare their students for any of the possible careers discussed above. I'm not saying students' hands need to be held from enrollment right on through to their first job. That's not the purpose of college. The purpose of college is to provide a cohesive education, directed toward some relatively well-defined goal of capability and knowledge. Today this is tragically non-uniform at best, and absent altogether at worst.

So I see plenty of room in software development education for a clarification of purpose and a readjustment of goals and curricula: a few different tracks, each geared toward a distinct section of the sphere, with different goals and different responsibilities. And if we resolve to use existing terminology with some respect for the historical meaning of the words, we can re-use our existing nomenclature. But there can be no more of this muddy slurry of computer science, the craft of programming, and software engineering all overlapping in claims of purpose, treading on each other's territory without care. Everyone can have what they are asking for. They just need to accept that no one can claim the "one true way".

I am not a Computer Scientist

Prepare yourselves. I have an embarrassing and melodramatic admission to make.

My career is a sham.

Although my degree and education are in a field that is typically referred to as "computer science", I am not actually a "scientist". Nor do I "practice science". But I won't be satisfied to go down alone for this charade. I'll go on record saying that I am convinced that, for the vast majority of people who were educated in or work in the field of "computer science", the ubiquitous presence of the word "science" in proximity to our work or education is a tragic misnomer.

I don't know how long this has been on my mind, but I know almost precisely when I became conscious of it. It was a couple months ago. I was newly exposed to devlicio.us, and perusing the blogs hosted there, when I came across a post by Bill McCafferty about a lack of respect and discipline in our field.

Early in the post, Bill reveals an injustice he encountered during his education.

...When I started my undergrad in this subject, I recall reading articles debating whether it should be called a science at all. Gladly, I do not see this argument thrown around much anymore.

I think I am probably not going to make the exact argument here that he disagreed with back then. The things we all studied in school are definitely part of a nebulous field of study that may rightfully be called "computer science". As Bill points out,

"From Knuth's classic work in The Art of Computer Programming to the wide-spread use of pure mathematics in describing algorithmic approaches, computer science has the proper foundations to join other respected sciences such as physics, concrete mathematics, and engineering. Like other sciences, computer science demands of its participants a high level of respect and pursuit of knowledge."

I have no argument with any of this. He's right on. Donald Knuth (who is indeed my homeboy, in the sense that we share a hometown) studied and practiced computer science (which, if you know anything about Knuth, you'll recognize as an almost tragic understatement). And thousands of people who have followed in Knuth's footsteps can lay the same claim. However, that's not me. And the same goes for more than 99% of the programmers in the field today.

Computer science suffers from the same kind of misnomer as many other disciplines that have adopted the word "science" into their names: political science, social science, animal science, food science, etc. And it seems that most such fields, if not all, have done so because the very validity of the field of study was at some point subject to severe criticism. So we take on the term "science" to get it through people's heads that there is a root in formal practices and honest intellectual exploration. But to then blanket every profession that derives from this root with the term "science" is a misappropriation of the term.

I can think of a number of examples.... The programmer working for a bank to develop its website, or for a manufacturing company to manage its transaction processing system, is no more necessarily a "computer scientist" than the election commentator is necessarily a "political scientist". When someone gets an electrical engineering degree and goes on to design circuits for a living, we do not say he "works in electrical science". We say he is an electrical engineer. When someone gets a technical degree in mechanics and then goes on to support or produce custom machinery, we do not say he "works in mechanical science". We say he is a mechanic, or a technician. Why, then, when someone gets an education that amounts to a "programming degree" and then goes to work doing programming, do we say that he "works in computer science"? It's a uselessly vague and largely inappropriate label.

By contrast, if you have a doctorate in computer science, I'm prepared to say you deserve the label. If you write essays, papers, articles, books, etc. for use by the general practitioner, you probably deserve the label. If you do research, or work on the unexplored fringes of the field--if you are exploring the substance and nature of the information or practices that the rest of us simply consume and implement, then in all likelihood you deserve the label.

Please, please understand that I am by no means belittling the value of our work, or the nobility of our profession. Often we simply consume the information produced by true "computer scientists". But we transform it from theory into practice. We resolve the concrete instances of the abstract problems that the true scientists formally define. We take the pure thought-stuff produced by scientists and turn it into tangible benefit.

This is not trivial. It is not easy. It deserves respect, discipline, study, and care. But it is not "practicing science".

I should say in closing that I am not as upset about all this as the tone of this post might imply. I don't even really have a big problem with the use of the word "science" to refer to a field of study or work that largely does not include research-type activities. I don't like it, but I accept that it happens. But "computer science" has a problem that other similar "sciences" don't. When someone says they work in "political science" or "food science", you can make a guess as to the type of work they do, and it's hard to be significantly incorrect. Though maybe it's my outsider's naïveté that allows me to make this claim. At any rate, "computer science" as a field is so broad and vague that I don't think the term communicates a useful amount of information. But you wouldn't know that by talking to programmers, who seem only too ready to attempt to take hold of the term and own it for themselves.

I think this is one small facet of a larger and far more critical issue in our field, which I fully intend to write more about very soon. But until then, let's take the small step of starting to consider what we really mean by the popular but often ambiguous terminology we use when discussing our profession.

I work in the field of computer science. This tells you nothing except that I am unlikely to be a prime specimen of the wondrous human physiology. But.... I am a programmer. I have a degree in Computer Engineering. I am interested in programming theory. I work as a software development consultant. And now, you know something about what I know and what I do.

Now what about you?

Update: I forgot to note that in McCafferty's blog entry, he himself uses "trade" terminology to categorize different levels of reading materials--which betrays the uncertain nature of programming as a profession. We certainly wouldn't say that a carpenter works in the "wood sciences", would we?

A new kind of Democracy

Friday night I was listening to the Boston-based NPR show On Point. The episode was about how reading affects the brain. It was interesting, but that's not what I want to talk about. There were two guests on the show: the main guest, a professor of child development at Tufts, and another, a professor of educational communication and technology at Wisconsin's very own UW-Madison. The latter was brought in as a counterpoint to the main guest's fear that people spending more time with videogames and on the internet were missing out on many of the amazing benefits that reading bestows upon the brain.

Given the UW prof's area of focus, the conversation eventually came around, briefly, to blogs. She claimed that just as the Gutenberg printing press enabled the "democratization of knowledge", blogging will bring about another fundamental shift--one where, as before, some very nice things will be left behind, forgotten by most, in sacrifice to the emergence of a new mode of interaction that will bring about its own useful evolutions and societal impacts. I found this a tremendously interesting proposition, and hoped they would follow the line of thought a bit further, but unfortunately the host had latched onto some earlier point and swung back around to it, never to return.

I'm not sure I heard another word of the broadcast anyway, because my mind was off and racing.

Many people have already staked a claim on the blogosphere as the next frontier in journalism. Rejoicing has already begun that journalism will migrate into the blogosphere and morph into something new and different than ye olde journalisme, bringing enlightenment to all and ushering in the Age of Aquarius.... Certainly it will offer a new value proposition, but I am not yet convinced of the extent to which blog journalism will supplant traditional journalism. Regardless of that, however, these people are thinking too narrowly.

Let's go back for a minute to that statement about "democratized knowledge". I've heard the turn of phrase several times in reference to the rise of the printing press, but never stopped to think about what is actually being expressed there. The history of human communication seems to me to track very closely the history of significant "advancements" in civilization, and with a bit of thought it becomes abundantly clear why this should be.

When speech and body language were all that was available, the spread of knowledge was a social affair. You needed at least two people, in close proximity, interacting more or less directly. The advent of writing freed knowledge from the bounds of proximity in space and time. But consumption was serial, because each recording required a huge time investment: first one person, then another, then another... if the recording was not lost or stored away in a private library. The printing press resolved these issues. It became trivial to make multiple copies of a manuscript. Copies could be made as long as there was demand for them, and if some were destroyed, others would survive.

The printing press made knowledge available to all those who desired to consume it. It was no longer the privilege of rich or powerful men. If you could read, you could learn, regardless of whether someone was available to teach you personally. One man's idea could be made available to thousands, across time and space, and generally without concern for the social position of the consumer. This is what is meant by the democratization of knowledge, and it really was a significant turning point in human history.

I contend that the rise of blogging is the next great evolution of human communication. Sir Isaac Newton is quoted as having said "If I have seen farther, it is only by standing on the shoulders of giants." How do we know they were giants? Because they had the time and resources to make recordings of their ideas, and people valued them enough to demand and preserve them. In the past there were generally two ways you could get your ideas disseminated to humanity at large. If you were privileged enough to have the time and resources, you could finance a recording of your idea regardless of its perceived value. Or if you really believed in your idea, you could make a big sacrifice to get it recorded, and pray that people valued it enough to compensate you for your investment.

With blogging, these restrictions are largely abolished. If you have access to a computer, you can create a blog, for free, as I've done here. You can record your ideas as quickly as you can type them, and once you post them they are instantly available for billions of people to simultaneously search for, stumble upon, consume, discuss, and share.

Newton and Einstein stood on the shoulders of giants. With blogging, we no longer have need of the giants. For you and I to see farther, we can assemble a great pyramid of normal human beings, each with a small contribution that in aggregate has potential that the "little people" of the past could only dream of participating in.

We can also move beyond ideas and into expression and experience. Blogging allows us all to share with the world our unique experiences, our viewpoints, our disasters, our epiphanies, our humanity. No matter how mundane our individualities may be judged by most, we now have the potential to find the other people out there who are weird in the same ways we are. With increasing frequency and ease we can now connect online with someone else who has shared our joys and sufferings, our beliefs and needs. Someone with whom we can identify and be comforted. Or we can peer into the "mundane" individualities of others, and experience the wide breadth of simple but beautiful diversities of humanity. These are often things that are difficult to get published in print, unless the presentation is particularly engaging, or the story is something that we expect will touch large groups of people. Online these barriers do not exist. And we need not simply consume the others' narratives either. We are empowered to interact with the speaker and participate in their narrative.

I could go on into more specifics, but instead maybe I will dedicate another post to that sometime down the line when I've thought about it a bit more.

To put it succinctly, Gutenberg's printing press enabled the democratization of the consumption of knowledge. With the emergence of blogging, we are witnessing the democratization of the production of knowledge and of expression of the human experience.

Maybe not the beginning of the Age of Aquarius.... But I think it's a bit of a "big deal" nonetheless. Though I am probably late to the party on this.