Sunday, December 27, 2009

Groovy XML parser for the iTunes Music Library

I work in the Video On Demand field and so am very interested in the distribution of content selection. In other words, which content gets played and how often.

A few years ago the standard thinking was that most people were watching a small set of popular content. This was known as the 80/20 rule: 80% of people were watching 20% of the content. Then the notion of the Long Tail came along, which said that although some content would be popular, the set of content that got at least some plays (i.e., the long tail of the graph) was quite large.

Being a bit of a geek, I decided to write a Groovy-based program to analyze the data in my iTunes library (even though this week I switched to the Motorola Droid).

iTunes keeps all of its information about your music, the play counts and so on, in a file called library.xml. This file can get quite large, so I decided to go with a SAX XML parser approach to minimize my memory consumption. The program reads the library.xml file, extracts the artist, song title, and play count, and emits a CSV file that can later be read by Excel.

The program is shown below:

import javax.xml.parsers.SAXParserFactory
import org.xml.sax.*
import org.xml.sax.helpers.DefaultHandler

class MyHandler extends DefaultHandler {
    def tempVal = ''
    boolean expectingArtist = false
    boolean expectingSong = false
    boolean expectingPlayCount = false
    String artist
    String song
    Integer playCount
    def outFile = new File('\\musicPlays')

    void startElement(String namespace, String localName, String qName, Attributes attrs) {
        tempVal = ''    // reset the accumulator for each new element
    }

    void endElement(String namespace, String localName, String qName) {
        if (expectingSong) {
            song = tempVal
            expectingSong = false
        }

        if (expectingArtist) {
            artist = tempVal
            expectingArtist = false
        }

        if (expectingPlayCount) {
            playCount = Integer.parseInt(tempVal)
            expectingPlayCount = false
            String thisLine = artist + '; ' + song + '; ' + playCount + "\n"
            println(thisLine)
            outFile.append(thisLine)
        }

        // In the plist format a <key> element names the value element that follows it
        if (tempVal.equalsIgnoreCase('Artist')) {
            expectingArtist = true
        }
        if (tempVal.equalsIgnoreCase('Name')) {
            expectingSong = true
        }
        if (tempVal.equalsIgnoreCase('Play Count')) {
            expectingPlayCount = true
        }
    }

    void characters(char[] ch, int start, int length) throws SAXException {
        // SAX may deliver an element's text in several chunks, so accumulate rather than assign
        tempVal += new String(ch, start, length)
    }
}

def handler = new MyHandler()
def reader = SAXParserFactory.newInstance().newSAXParser().XMLReader
reader.contentHandler = handler

def inputStream = new FileInputStream('\\library.xml')
reader.parse(new InputSource(inputStream))
inputStream.close()


This program let me analyze my listening preferences. It turns out I have about 8,000 songs in my library, of which I've listened to about 3,200 at least once.

I've listened to about 260 songs at least ten times and about 1,000 songs at least five times.

Wednesday, December 9, 2009

The Killer App for E-Book Readers is Smell

After resisting the Kindle for a long time because of DRM fears (I don't like to be locked into a single vendor) I got a Sony eReader a few months ago. I love it, and I'm reading a lot more because it's such a nice reading experience.

Many people have said to me that they are reluctant to go to E-Readers because they miss the smell of books.

So, I propose E-Book-Smell Strips! These would be strips containing the smell of physical books that you would stick to your E-Reader.


You could even charge more for "Hard Cover Smell"!

Wednesday, November 4, 2009

This month's issue of Pragmatic Programmers Magazine

This month's issue of the Pragmatic Programmer's magazine is out and as always it is interesting, and in an unexpected way. This magazine continues to look at things a bit differently. For example, there is an article on hand-writing a letter...remember when people did that? It's interesting to look at what has been gained and lost by the transition.

I have an article in this month's issue on interruptions, specifically interruptions of people who think for a living. I explore the idea that most current attempts to limit interruptions are doomed to failure because they place the cost of the interruption on the wrong person. Go read the article to learn more!

Wednesday, October 28, 2009

Mocks Run Amuck

I just spent 2 hours debugging a problem that turned out to be tied to a lazy use of mocks.

In our core code we have a number of Mock objects defined for the key objects in our system. These are moderately functional mocks in that they support a fair bit of state. They are Good. They should be used.

I ran into an issue where I believed I was setting some state on a mock, and some later code appeared to be making the wrong decision based on the state of the mock.

It turns out that someone had needed a mock and didn't bother to see that a really good mock class already existed. So they created their own completely non-functional mock class, and they made it an inner class of the test class to boot. So, when I set the state on the mock I was actually calling a no-op method of the bad mock.
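
In rough outline, the trap looked something like this (the class and property names here are invented for illustration, not our real code):

// The real, stateful mock that already lived in our core test code:
class MockThing {
    def status    // Groovy generates a getter and setter that really store the value
}

// The duplicate, which someone buried inside their own test class...a silent no-op
// (shown top-level here for brevity):
class NoOpMockThing {
    void setStatus(s) { }    // quietly forgets whatever you set
    def getStatus() { null }
}

def mock = new NoOpMockThing()
mock.status = 'ready'
assert mock.status == null    // "my" mock would have remembered 'ready'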

It took me a long time to come to grips with the fact that "my" mock object was misbehaving and not remembering state. I didn't think of this as an option because I had written that mock class and the state-preserving code was trivial. So trivial that it could not be failing...unless I was actually using someone else's mock object.

Before writing a class, even a test-related class, you ought to look around a bit and see if the class already exists. In this case that's especially easy, since the file will be named MockThing.java (or .groovy), and especially since Eclipse can find your class so easily.

Come to think of it, this new no-op mock class was probably created by Eclipse. Normally I'm a fan of tools making it easier to do things, but not when it's the wrong thing to do!

Wednesday, September 9, 2009

Recharging Yourself; getting an E-Reader or a netbook

Having written several blogs about why I don't like the pricing model of the Kindle book reader I've been very interested to read about Sony's new e-book readers, especially as you can download books to it from your local library. I've played with the $300 Touch model and quite like it. The only real competition to it for me is a netbook. If you're not familiar with netbooks, they are basically really small (8 or so inch) laptops with limited disk and memory; designed to view documents and web pages but not really to do authoring.

The standard discussion goes like this: e-readers are single purpose devices, netbooks are general purpose devices. E-readers are better for actually reading books but can't do much else. Netbooks are less suited for reading but can do more. For techies like us the argument that no one would want to look at an LCD screen for hours at a time is kind of funny since that's what we do all day, every day.

I was about to get a netbook, especially since much of what I read are PDF versions of technical books from Manning Press or The Pragmatic Programmers, when a senior colleague made the following remark:

"Brian, we are thought workers, designers, creative people. We have an obligation to read non-technical material so that we can keep approaching problems from unexpected directions and keep bringing creativity to tasks. You should get the device that makes reading fiction easier".

That really struck me. So, after posting this entry I'm driving to the Sony store to buy the Touch E-Reader!

Sunday, September 6, 2009

Classes = Structs plus Closures.

I was leading our Groovy book club this week and had an insight. We have Java programmers, C/C++ programmers and script writers so we try to use a wide range of examples to illustrate points.

We were discussing closures, which can be challenging for people unfamiliar with them. I was making the point that closures are objects, which means they are instances of classes. In particular, they are basically classes with a (generally) single method and no data members.

That made me think about Structs, which are basically classes without methods.
So, I had the thought that Classes = Structs plus Closures.
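
A minimal Groovy sketch of the equation (the class and closure names are mine, purely for illustration):

// A struct: data members, no behavior
class Point {
    int x
    int y
}

// A closure: (generally) a single method, no data members
def distanceFromOrigin = { Point p -> Math.sqrt(p.x * p.x + p.y * p.y) }

// A class: data members plus behavior -- a struct plus closures
class Circle {
    Point center
    int radius

    boolean contains(Point p) {
        def dx = p.x - center.x
        def dy = p.y - center.y
        Math.sqrt(dx * dx + dy * dy) <= radius
    }
}

assert distanceFromOrigin(new Point(x: 3, y: 4)) == 5.0
assert new Circle(center: new Point(x: 0, y: 0), radius: 6).contains(new Point(x: 3, y: 4))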

I'm not sure if this is a helpful insight yet, but anytime I can think of things in a new way I like to explore it a bit. I think the C/C++ programmers in the book club found it helpful, perhaps because they're already used to thinking about different types of objects such as structs. So, thinking about Closures as another kind of class was not such a stretch for them.

Thursday, September 3, 2009

New issue of Pragmatic Programmer's magazine

The third issue of the new magazine PragPub from the Pragmatic Programmers is now available at http://www.pragprog.com/magazines.

In the interest of full disclosure I'll say that I have an article in this issue, so I encourage you to go read the issue just for that.

If you're a programmer you probably already know about the Pragmatic Programmers and likely have some of their books, but you may not know about the magazine. With authors like Kent Beck, Dave Thomas, Andy Hunt (and me!) it's a worthwhile investment of your time.

The unexpected thing is that many non-programmers will find a lot of value here as well. One of the regular columns is called Get A Life and is about the various ways to find balance in life and recharge. I'm an apprentice instructor at a Tai Chi school and am passing this article around to the staff because it's so relevant.

Take a few minutes and check out the magazine, especially my article (with Dave Koelle) titled: "And Your Bugs Can Sing"

Thursday, August 27, 2009

Thinking about levels of Risk and Releases

One aspect of releases in the real world that seems to get little mention is the reality of working on more than one of them at a time. Many of us work in shops with a main branch and then multiple named releases hanging from various places off of the main branch. This is standard.

So, the inconvenient truth is that when I make a fix in release 2.3 it probably needs to also go in release 2.3.1, release 2.4, release 3.0, and "main" (whatever you call your root branch). I tend to think of those as pass-through integrations. Someone made a fix in a past release, likely for a real customer (so it was done quickly), and then wanted to make sure it got propagated to all of the other releases that might correspond to any existing or future customers.

Now, the awkward question is: do you think the developer ran a full regression test on every release they checked the code into? Officially we all probably say yes, but in reality the answer is "I made sure it compiles" if you're lucky.

Depending on your branching strategy and the level of automation of your regression tests you may not need all builds deployable at all times or you may simply not be able to afford it even if you want it. If your tests are not automated it may simply be too expensive for each checkin to cost a developer-day of regression testing.

So, the two key questions are: how much risk do you have and how much can you afford?

A proposed approach to dealing with these questions is as follows. Each time a developer makes a checkin to a branch, the risk that the branch has destabilized increases by a quantum. Each time QA runs the regression suite, the risk is reduced. Make the invalid but simplifying assumptions that all checkins add a constant risk and that the regression suite reduces the risk to zero.
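
Under those assumptions the bookkeeping is trivial. Here's a back-of-the-envelope Groovy sketch (the branch names and numbers are invented):

// Risk per branch: +1 quantum per checkin, reset to zero whenever QA
// runs the full regression suite against that branch.
def riskByBranch = [:]

def checkin = { String branch ->
    riskByBranch[branch] = (riskByBranch[branch] ?: 0) + 1
}
def regressionRun = { String branch ->
    riskByBranch[branch] = 0
}

checkin('2.3'); checkin('2.3'); checkin('main')
regressionRun('2.3')
checkin('2.3')

println riskByBranch    // [2.3:1, main:1]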

Imagine a graphical tool that shows a graph of the accumulated risk associated with each branch. This would allow you to answer the question of how much risk you have with each branch. Put another way, it lets you know how much deferred work would be required should any particular branch need to be released.

The Atlassian tool suite might already do this but I suspect that any development shop with a code management system and decent script writer could create a system for building such graphs.

Once you have the ability to measure your risk you can make an informed decision about how much to accept. You might decide that some branches should be regression tested every time they’ve accumulated x checkins, while other branches only need testing every y checkins. The point is that you get to decide.

Tuesday, August 18, 2009

Spark: The Revolutionary New Science of Exercise and the Brain, by John Ratey, MD

This book is strongly recommended reading for anyone who uses their brain.

This book starts with the assertion that conditioning the body is just a side effect of exercise, and the real benefit is the changes that exercise causes in and on the brain. Many people have made the case that our current lifestyle doesn't match the conditions we evolved for, but the assumption is that too many calories and not enough running only affect the heart and lungs. Ratey argues that part of what we evolved for was hunting and gathering that required speed, stamina, cunning and fine motor control. We lose all of that when we're sedentary.

At a Chicago high school, gym class was transformed into exercise class, with an emphasis on effort as measured by heart rate monitors. This allowed even unfit kids to succeed, by working hard enough to elevate their heart rate. This school's obesity rate is now 3%, compared to the national average of 30%. They are also setting records for academic performance.

This book has a lot of chemical names in it, and one of the most important is brain-derived neurotrophic factor (BDNF). BDNF causes an increase in synaptic connections, which is how we learn. This molecule is produced in the hippocampus and studies show that exercise causes an increase in its production. Ratey calls BDNF "Miracle-Gro" for the brain. BDNF doesn't make you smarter, but it creates an environment conducive to forming new connections. So, if you exercise and then learn something, it's easier for the brain to encode the learning.

We used to think that we were born with all the brain cells we'd ever get, but we now know that neurogenesis occurs all the time. New neurons are born and then have 28 days to get connected into a network. If they don't, they die. Exercise increases both the rate of neurogenesis and the ability of new neurons to make the connections they need to survive.

Ratey goes on to say that the benefit can be enhanced by combining aerobic exercise with complex activity such as Tai Chi. "The more complex the motions, the more complex the synaptic connections". Although these neurons are associated with exercise, they can be recruited for other tasks (such as learning).

When under stress we produce cortisol which tells the hippocampus to selectively process data and memories (so as to focus on the stressor). While beneficial in the short term, in the long run it actually causes the non-stressor related nerves to degenerate and lose connections. This makes it harder for the non-stressor memories to be accessed, which can lead to more perceived stress. Exercise on the other hand causes a reduction in the production of cortisol, which can break the cycle.

The book does get somewhat repetitive as it explains the positive effect exercise has on the neurochemistry of anxiety, depression, ADHD, addiction and aging. On the other hand you really can’t fault the author for providing so much good news. Each of these conditions represents a deviation from normal brain chemistry, and exercise is a strong force for bringing the brain back into its normal condition of plasticity. Which is to say that our brains were designed to be adaptable, and exercise creates the right chemical soup for the brain to swim in to enhance that flexibility.

Wednesday, August 5, 2009

Another class of Bugs that Are Hard to Unit Test

Imagine the set of classes listed at the bottom of this post. The point is that we're performing a calculation using an expensive test, but when using a derived class we can make use of an inexpensive test. Both tests will give the same result, but clearly it's better to use the inexpensive test if we can. How can we write a unit test that verifies that we've used the inexpensive test (and used it correctly)?

Several possibilities come to mind. One option is to write a unit test that times the call to the method. This is pretty clearly not a good idea, as it leaves the results hostage to the load on the test system.

Another approach is to embed a flag in the class that gets raised when the expensiveTest method is called. That's not a terrible idea, but it does put test code in the actual production code. A slightly better idea is to create a test subclass that implements the flag in its own version of expensiveTest.

@Test
public void testQuickReject() {
    final boolean[] expensiveTestCalled = { false };

    DerivedClass foo = new DerivedClass() {
        @Override
        boolean expensiveTest(Thing thing) {
            expensiveTestCalled[0] = true;
            return super.expensiveTest(thing);
        }
    };

    // thingList holds only Things that the quick test should reject
    foo.pickBestThing(thingList);

    // if the inexpensive test was used, the expensive test never ran
    assertFalse(expensiveTestCalled[0]);
}

This leads to an observation about the effect of testing on code design. The approach being taken only works if the expensive test is a separate, and thus overridable, method. If the code representing the expensive test were just a bunch of inline code then there would be nothing for the test class to override. One could certainly go back and do an Extract Method refactoring on the code, but if the class had been designed with testing in mind from the start that would not be necessary.


class BaseThing {
    Thing pickBestThing(List<Thing> thingList) {
        Thing bestThing = null;

        for (Thing thing : thingList) {
            if (quickReject(thing))
                continue;

            if (expensiveTest(thing))
                bestThing = thing;
        }
        return bestThing;
    }

    boolean quickReject(Thing thing) {
        return false; // no-op: the base class has no inexpensive test
    }

    boolean expensiveTest(Thing thing) {
        // stand-in for the slow but always-correct test
        return true;
    }
}

class DerivedClass extends BaseThing {
    boolean quickReject(Thing thing) {
        // some real, inexpensive test that only applies in the derived-class case
        return thing == null; // placeholder so the sketch compiles
    }
}

Thursday, July 30, 2009

Sharpening Your Sword

We’ve recently reinstituted our technical book club, even though we’re probably busier now than ever before. I sent an email asking our technical staff if they thought there would ever be a time in the future where we would be less busy. If not, were they willing to permanently suspend their own technical growth? Interestingly enough that email generated a lot of interest in our book club!

Choosing which book to study is an exercise in learning about balancing your needs with the needs of the company and the other developers. Having recently attended JavaOne I can think of lots of interesting topics: alternate JVM languages (Groovy, Scala, Ruby), frameworks (Spring Roo, Spring DM, OSGi, Jigsaw), GUI languages (JavaFX, Flex), tools (Maven, Eclipse goodies), and general books (“Concurrency In Practice”, “Pragmatic Thinking and Learning”).

To some degree you need to build consensus around the book choice but remember that your choice doesn’t have to be perfect. You may also find as I did that most people are busy enough to defer the choice to you.

Having selected the book we’re taking a slightly different approach to studying it than we have previously. In the past we all read the book chapters offline prior to the meeting and then discussed the chapter during the meeting itself. When questions would arise we’d sometimes open up a laptop and try something but it was largely a text centric discussion. Since our current book is discussing a language that’s new to most attendees we’re going to be much more laptop and code-centric. Our first meeting in fact will be devoted to getting the language and associated tools installed on everyone’s laptops so that we can all type along with the examples in the book. The goal is to make this more than an academic exercise. We’ve all attended trainings that were interesting and yet covered technologies that we never touched again. We’re aiming with the code-centric approach to fairly quickly add this new language to the tool box of our developers and testers.

Some of the practical steps in that direction include:

Installing the IDE plug-in for the new language
Modifying our main system build to compile classes written in the new language
Adding a HelloWorld class in the new language to the source tree to ensure that the modified build scripts actually work.

And probably most useful, pick an existing but non-critical bug/feature request in our product to fix by adding a new class in the language. I think that this set of steps will make the new language real for people.

As a side comment, I’ll say that I like asking interview candidates what books they’ve recently read. It can be fairly revealing about the ongoing educational habits of the candidate. It can also reveal how they react to an unexpected question, and one that they might have answered “wrong”.

Learning from your unit testing mistakes

By now one hopes that we don't have to convince developers of the need to write unit tests, and to use a tool like Cobertura to enforce some level of code coverage. And yet, there are other steps that really ought to be taken, steps that I don't see many development shops taking.

Even with fairly high levels of code coverage (e.g. 80% or greater), we still find defects in our code. The galling thing is that we find defects even in the code that Cobertura tells us we tested. How is that possible?

The first step in this process is a painful and often time consuming step, but one that can be very revealing. The next time a defect gets reported against 'tested' code I suggest you stop and write a unit test that finds that exact defect. (Now, many shops require a unit test for all defects so this step isn't new for them, but lots of other shops don't require this step).

The next step is the most interesting: reflect honestly on why you did not write that unit test already. Write down the reason in your engineering notebook (you do keep an engineering notebook, right?) Over time you may detect patterns in the types of tests you do and don't tend to write.

A fairly common case is where mainMethod() ought to have a call to subMethod() but doesn't. The defect is that subMethod() is not called. Even if you have individual tests that execute every line of mainMethod and subMethod, you will not detect the missing call unless you go further. Presumably subMethod accomplishes something useful, something that your tests of mainMethod could detect.

The question really morphs from line coverage to results analysis. As sad as it may seem, many developers do not really understand that unit tests are useless until you test the results. The following test increases your coverage numbers but doesn't actually test anything:

@Test
public void myTest() {
    int results = mainMethod();
    // results is never checked, so this "test" can pass no matter what mainMethod does
}
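
For contrast, here's roughly what a result-checking version might look like (written in Groovy, since that's the language we've been adopting; mainMethod is the method under test from the snippet above, and the expected value of 42 is invented for illustration):

import org.junit.Test
import static org.junit.Assert.assertEquals

class MyTest {
    @Test
    void myTest() {
        int results = mainMethod()
        assertEquals(42, results)    // fails unless mainMethod really returns what we expect
    }
}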

There are static code analysis tools such as PMD that will flag unit tests without asserts. Try adding one of these to your build…but don’t be surprised if many of your unit tests turn out not to be tests at all.

A question that has bothered me about all this is what to do about stub modules. We sometimes have to implement an interface (a bad interface) that has dozens and dozens of methods, only a few of which are actually necessary. Your concrete class has to have stub implementations of the unused methods, which leads to a code coverage decision. You can write unit tests for these no-op methods, but since they don’t do anything you have nothing to test (which may be hard to explain to your static analyzer). Or you can skip testing these modules, which may reduce your coverage numbers enough to get your coverage enforcement scripts to complain. I’d like to see an annotation of some sort, like @STUB, that told Cobertura to ignore a method or class when calculating its coverage metrics.

Thursday, June 4, 2009

JavaOne Thoughts and Observations

It's now the second-to-last day of JavaOne 2009 and I've had a chance to reflect a bit on what's been going on. First, I have to talk a bit about my experience as a speaker at the conference.

- On Tuesday we got interviewed by Java Talk Radio (who knew there was such a thing!); the interview is available at: www.log4jfugue.org/javaone_interview.mp3.

- I had a conversation with Rod Johnson (inventor of the Spring Framework) and he later tweeted that he thought Log4JFugue was a "brilliant" idea.

- Dave Koelle and I gave our presentation yesterday and it was extremely well received. The room was full, not a single person left during the presentation, we got laughs and applause in the appropriate places, and several people approached us afterwards asking to collaborate with us.

- I got to play my Native American Flute during the presentation to illustrate a musical point...and I really enjoyed performing. I think I should do more of that.

- This morning at breakfast I sat down at a table where some people were already sitting, and I said hello. One of the people looked at me and said "Hey, you're that Log4JFugue guy!". That was fun, and it turns out the person wants to work with me on some extensions to the product.

Switching gears to the conference itself, several key ideas stand out.

- cost cutting was painfully present in lots of small ways: the conference backpack was very cheap, the food quality was much reduced from previous years, most of the entertainment was recorded rather than live, and most significantly the conference attendance was way down.

- there didn't seem to be an overall theme to the conference. I think the official theme was ubiquity but it didn't feel pervasive.

- no one would touch the question of whether there will be a JavaOne next year...even Larry Ellison, who made a guest appearance, ducked the question. It's possible that this is due to a "quiet period" required by the upcoming merger, but uncertainty is scary.

- dynamic languages, or more generally other languages that run on the JVM, continue to gain in popularity. I think this is a really good thing. When I went to school engineers were expected to be fluent in multiple languages (assembler, Fortran, Pascal, Prolog, etc.). But after a while it seemed that C/C++ and then Java took over, and a generation of programmers grew up thinking that they only had to learn a single language. It feels like things are coming full circle, back to the notion of using different tools for different tasks. So Java 7 won't have closures? No worries, just write the method needing the closure in Groovy.

Tuesday, April 7, 2009

Super Survey - An idea for a new kind of College Class

Super Survey is an idea generated by my somewhat eclectic college education but one that I think might have broad appeal.

I spent my freshman year at Kenyon College where I expected to be a biology major. I had placed out of Intro Biology by getting a good score on my Advanced Placement Biology test but somehow got shut out of the next biology class and so ended up taking Intro to Philosophy...and discovered that I loved it.

I transferred to Wesleyan University after freshman year and continued to take philosophy classes as well as computer classes. I spent a junior year semester at Goldsmith's College of the University of London. All in all I took classes in 14 different departments:

athletics (fencing)
anthropology
biology
chemistry
economics
english
geography
government
math
music
philosophy
political science
psychology
religion

I got to thinking that most people didn't get the opportunity to take such a broad range of classes, or if they did it happened during senior year. I can't imagine only discovering my passion at the end of my college career.

Super Survey is the simple idea of having each class session be the introductory lecture of a different department. In a 3-day-a-week class with a 13-week semester you can be exposed to 39 or so departments. The class would be pass/fail based on attendance, so the effort for the students would be low. Each department only has to give one lecture, so the effort for them would be low as well; though they have an incentive to make it a good lecture.

The smaller departments have a special incentive to give a good lecture, as they can almost always use more students. If a school has more than 39 departments they could ration participation in the class to give preference to the departments with lower enrollment.

When I look at a course catalog I'm struck by the departments I never touched (at least as an undergraduate); at least that list is shorter than the list of departments where I did study:

art history
astronomy
dance
history (don't get me started on how badly we teach history, see * below)
linguistics (took post-grad)
physics (took post-grad)


* In junior and senior high school we studied the Revolutionary War every year. We never studied any non-US history, and we never studied even US history after the revolution. I can tell you when the War of 1812 happened, and who the Spanish-American War was between, but that's about the limit of my knowledge. For a high-end school system (Winchester, MA) that's pathetic.

When I tried to take history in college I found the classes to have a micro-focus such as "The Japanese Imperial Court from 1901-1902" (I'm not making up that title). I was looking for (and am still looking for) what I'd call Meta-History...what Jared Diamond writes about...the large scale sweeping trends of history.

Tuesday, March 31, 2009

Package Scope, the often forgotten good choice

Scoping refers to how software programs control which parts of the system can see which other parts. Back in the day, most things were global, which meant that everything could see (and touch) everything else. This made writing the code easy, until you had a variable suddenly change out from under you. You then got to spend long hours figuring out who might have changed it...and you had to look everywhere because everyone had access.

Programming languages such as Java provide a number of scopes that can be specified to control who can see a particular variable. Ask the average Java programmer how many types of scope there are and they'll probably say three: public, private and protected. Public scope is like the old global...a public object can be seen and touched by anyone. Private scope is basically the opposite...a private object can only be seen by the class that owns it. Conventional wisdom says that most things should be private. Protected scope is a variation on private. A protected variable can be seen by the class that owns it or by any class derived from that class.

That's the answer you'll get from the average programmer, but it's the wrong answer.

The right answer is that there are four scopes in Java. The scope that most people forget about is package scope. An object with package scope can be seen by any other object in the package. This is a surprisingly useful scope, so it's a bit surprising that so many programmers ignore it. Part of the reason is that while the other three scopes each have keywords, you get package scope by not specifying any scope at all. Personally I think the language should have included a package_scope keyword to make this clear, but that's water under the bridge.

Why is package scope so interesting?

Consider a software system with two classes: One and Two. Assume that we want to expose two methods to the outside world, begin() and end(). Further assume that each class needs to call a method in the other class. The following is the sort of code that most people would write.

package sample;
public class One {
    public void begin() {
        new Two().internal2();
    }

    public void internal1() {
    }
}

package sample;
public class Two {
    public void end() {
        new One().internal1();
    }

    public void internal2() {
    }
}

Most people would tend to make the two internal methods public so that the two classes could use them. The problem with that is that it makes the two internal methods part of the publicly accessible API of the overall system. A user of this package has no particular clue that they should or should not call those internal methods.

A better approach is to remove the "public" in the declaration of the two internal methods. This leaves them at package scope, which is just fine. The two classes are in the same package and so have access to all package scope methods. The two internal methods are now, however, absent from the publicly visible API of the system, which is as it should be.

So, the rule of thumb should be:
default to making classes and methods package scope
make methods only used within a class private
and, only grudgingly, declare the hopefully small number of truly public methods as public.

Your users and maintainers will thank you.

Monday, March 2, 2009

My JavaOne proposal was accepted!

My proposal to give a talk at JavaOne with my friend David Koelle was just accepted.
We'll be giving the following presentation in San Francisco in June.

Abstract: Would you like to create Java™ technology-based programs that play or create music but don't know where to begin? Come to this session to learn all about JFugue, an open-source API that enables you to program music with ease. With its simple but powerful API, new UI components, and cool features, JFugue promotes creative music programming and exploration. For example, what if you could listen to what your application has been trying to say to you? Learn about Log4JFugue, which combines the power of Log4J and JFugue to turn your application's logging into a real-time song. By listening to your application, your pattern-matching brain can detect subtle changes in behavior that would normally be lost in a sea of log messages. The intended audience for this technical session is developers at any level who are interested in writing musical programs or who would like to use more parts of their brain to increase their productivity. In the session: • Learn how to get and use JFugue • Learn about some advanced and exciting features of JFugue, including new ones • Learn about Log4JFugue for turning your log files into songs

Monday, February 23, 2009

Re-experiencing content, followup

I've had several interesting conversations about re-experiencing content since the last post.

One friend asked about the phenomenon of people getting stuck on a particular content and watching / reading it over and over again. While this is usually associated with younger children who can watch the same episode of Thomas the Tank Engine multiple times a day, it sometimes occurs with adults as well. I think these are cases of the person looking for comfort through familiarity. When life seems unpredictable it can be comforting to revisit a content that won't surprise us. For small children most of life is unpredictable. Try to imagine how weird the world must seem before you figure out object permanence! I know that when I'm sick I'll tend to dig out an old movie to watch. This might also be why we watch certain movies (It’s a Wonderful Life) every year at a special time.

I’d distinguish this from watching a movie a few times to hear or memorize more lines, or to experience it again with the ending already known. I’ll confess to having watched some Monty Python movies too many times so as to memorize the lines from a scene or two. Inexplicably this is sometimes viewed as an accomplishment. :-)

Another reader wrote me to say: "These are very interesting matters. I haven't thought about them in an organized way before. I can watch a movie a few times (not multiple times, unless there's a lot of time in between). I've always supposed that the reason I want to listen to music over and over again is that I cannot really retain music--I mean I can remember short songs etc, but I can't unroll a whole symphony in my head, just parts of it, although musicians can replay the whole thing in their heads. I think I don't have the capacity. On the other hand, words stick in my head like glue. I have hundreds, maybe thousands, of word things, in my head but not a whole symphony or sonata. I remember songs, but probably because of the words. Do you think the content thing may have something to do with one's capacity for enjoying a particular medium? Some people are visual, verbal, or whatever? Interesting things to ponder"

Another person said that for them music could be re-experienced many times because they were different each time they heard the song. When it was pointed out that they were also different when they re-read a book, they said that experiencing a book or movie was more immersive than listening to a song...there was less room for them. Perhaps listening to music is more of a participative thing while books and movies are more like being an observer? I don't necessarily share this belief but I can understand it. After all, I can listen to my iPod while coding at the office, but they tend to frown on people watching movies there!

I wonder how this relates to the concepts of Popular vs. Long Tail content? One school of thought says that most people will be watching the same small set of popular contents (the new “House” or “Heroes” or “Battlestar”). The other school of thought says that most people will be off in their own corner watching their favorite old episode of “I Love Lucy”, “Hogan's Heroes”, or “F Troop”. For those of us designing Video Servers the difference matters. I’ve never heard the issue of whether individuals re-watch a given content brought up in the Popular vs. Long Tail discussions. Perhaps it should be.

Sunday, February 15, 2009

The Kindle and the cost/payoff of experiencing content again

In thinking about the Kindle (Amazon's electronic book reader) I got thinking about a larger question about how people experience content. I'm one of those people who can watch the same movie multiple times, and I tend to accumulate tapes and DVDs of my favorite movies. My wife has never understood this, once she's seen a movie she's done with it and has no interest in seeing it again. We've both tended to think the other one was rather peculiar. We were at a brunch today and I tried to enlist support for my way of thinking but found surprising diversity.

Some people viewed movies as one-shot experiences but would read a book multiple times, others viewed both movies and books as one-shots, but most people viewed music as something to experience over and over again.

So I asked people: what's different about experiencing a song from experiencing a movie?

Several people said that while movies are based on a narrative, they did not see music that way. Several said that they did not feel that songs told a story (some of the others of us found that idea shocking...of course a song tells a story...for us). So for these people a movie is boring the second time around because they know the story, but a song is just an experience...so repeating it is still rewarding...perhaps like eating ice cream? The fact that you've eaten ice cream before doesn't diminish your enjoyment of more ice cream.

I wonder if there is a different kind of answer underlying this: limited time. I can listen to a song in 2-3 minutes, watching a movie takes about two hours, and reading a book might take 5-10 hours. Who has that kind of time for a repeat?

On the other hand, there is a payoff for getting involved in a long story...you get to know the characters. I tend to dislike short stories because why should I invest time in a story that won't return my investment? Think of the angst many people feel when a beloved TV series ends (The Sopranos, Friends, M*A*S*H, etc.). There has been an investment, and the end of the story brings an end to the return on that investment. For some of us, re-watching a movie gives us some extra return on the time we spent on the content in the first place.

The link back to the Kindle for me is that for many people the fact that Kindle does not contain your existing library of books is simply not an issue. Many people view their existing library of books as more or less dead storage; perhaps to be rarely accessed at some time but certainly not in need of ready access. From this point of view loading the Kindle with only new material is just fine. I suspect that this sort of person would also be happy with a Kindle-like device with very limited storage, perhaps only enough to hold the books you were currently reading.

Saturday, January 24, 2009

How Digital Cameras remind me of the old B&W days

Recently my Tai Chi instructor asked me to take pictures of my class for the school's web site. His digital camera has some shutter lag, which makes photographing people in motion a hit-or-miss proposition. So, I took a hundred or more shots, knowing that I only needed a couple of good ones.

As I was doing this I was reminded of my old days learning to shoot black and white photos, (which happened long before there was such a thing as digital photography) and I was struck by the similarities.

I learned photography from an old friend, long since passed away, named Christopher Resnik. Chris taught photography from the ground up. We bought 100-foot rolls of 35mm film and made our own film cartridges. Cameras back then were fully manual and Chris wouldn't even let us use a light meter...preferring to train our eyes to read the light and determine the exposure. To this day I can look around and know that such and such a day requires an f/8, 1/250-second exposure with 400 ASA film. Today most people probably don't even know what those terms mean.

Since we bought film in bulk and did all of our own darkroom work it was quite inexpensive, and so we shot a lot. Chris used to say that if you got one or two good shots out of a 36-exposure roll you were doing well. When we found an interesting subject we'd take a number of shots of it...from various angles, with various exposures and with various perspectives. We might take a whole roll of film on a single subject. Taking 100 Tai Chi shots reminded me of that.

Back in the days before digital cameras a roll of color film might cost $3-$5 to buy and then perhaps $10-$12 to process. With that price model you were careful how you spent your photos and a roll of film might sit in the camera for months or a year.

We've now come full circle in a way. Photos are basically free, especially since many people never make prints but instead view their photos on a digital picture frame or online via Snapfish or KodakGallery.

A missing element for most digital photographers, however, is the culling process. Back in the day, after processing a roll we'd stay in the darkroom evaluating our photos and would throw out 90% of them. We shot 36 pictures of that flower so that we could find the best one, but then we left the other 35 shots on the darkroom floor. Today most digital photographers take the 36 photos, post them all to Snapfish and go on their way. After a few months they have hundreds or thousands of pictures, most of which are awful, and they have no idea where the dozen or so gems are.
I'm sure there's a metaphor in here for the pace of modern life but it escapes me. I do however like when life comes full circle.

For my own pictures (and I have a 3-year-old son, so I take a lot of pictures), I've developed a culling process. When I plug my digital camera or my iPhone into the computer it creates a new directory where all the pictures go. I then create a subdirectory called so_so_shots. I immediately look at each picture and if it isn't great it gets moved to the so_so_shots directory. Since I'm just moving them rather than deleting them I can be quick and ruthless in the decision. This lets me easily manage a much smaller set of much better pictures.

Tuesday, January 20, 2009

Why I don't want a Kindle yet

The Kindle is Amazon's book reader and it's an amazing device, but one that I don't want yet.

The Kindle has an astonishingly readable screen (using electronic ink rather than a display tube or LCD). This means that once it's painted a page no electricity is needed to maintain the image, so it's always on, and the battery life is very good.

So why don't I want one of these cool devices that would let me take all/most of my books with me? Because Amazon wants me to pay for all of those books again.

As I look at my book shelves I see hundreds of books, most of which were purchased from Amazon. They know this. And yet, when I buy a Kindle I have to turn around and repurchase all of those books again if I want to put them on the device. And I have to pay what amounts to full price.

When you buy a Kindle, Amazon should give you an electronic copy of all the books you've purchased from them, say in the last couple of years. Until they do something like that, the advertising line that you can take all your books with you simply isn't meaningful.

Thursday, January 8, 2009

A Tutorial for Adding Groovy to a Java Project

I program primarily in Java and have done so for close to ten years. Before that I programmed in C++, C, Pascal, Prolog (yes, I had a real job using Prolog!), and even Fortran and assembler. So I know that languages wax and wane in their importance. With that in mind, I decided to add a class or two written in the Groovy language to one of my side projects.

For those not familiar with Groovy, it's one of a surprisingly large group of languages whose source code is different from Java and yet runs on the Java runtime. Java source code is compiled into Java bytecode, which is executed by the Java Virtual Machine (JVM) on your computer. Groovy source code is also compiled into Java bytecode, which then runs on the very same JVM. This allows the two languages to interoperate in interesting ways.

As a starting point for this tutorial I assume you have an existing Java-based project, use Ant to build your system, and use Eclipse as your development environment. None of that is required for Groovy, but it is a fairly standard setup and it's what I use, both at home and at work.

The steps are basically
a) download and install the Groovy GDK
b) add a how-to-compile-groovy task to your Ant build.xml (the task is included in the GDK)
c) add the groovy-all.jar file to your library path
d) add the groovy-plugin to your Eclipse project
e) enable groovy-nature in your Eclipse project
f) write a Groovy class and use it.


Overall this should take no more than twenty minutes or so. One of the really cool things is that once you have a Groovy class you can use it from your Java classes just like any other class...your Java code has no idea it's using a class written in Groovy.
Now let's look at the steps listed above in cookbook manner.

a) go to http://groovy.codehaus.org/Download and select the latest stable release (currently 1.5.7). Click on the download link and follow the very simple instructions.

b) Add the following task to your build.xml to define how to compile Groovy code. You then need to actually compile your Groovy source files: a standard build.xml will have a compile target that invokes the javac task to compile your Java source files, and I changed mine so that the compile target was empty but depended on java_compile and groovy_compile targets. See the full definition of the groovyc task options here: http://groovy.codehaus.org/The+groovyc+Ant+Task
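
In case it helps, the relevant build.xml fragments look roughly like this (the target names, paths, and classpath reference are from my setup, so adjust to yours):

<taskdef name="groovyc"
         classname="org.codehaus.groovy.ant.Groovyc"
         classpathref="project.classpath"/>

<target name="groovy_compile" depends="java_compile">
    <groovyc srcdir="src" destdir="build/classes" classpathref="project.classpath"/>
</target>

<target name="compile" depends="java_compile, groovy_compile"/>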


c) The Groovy GDK includes a groovy-1.5.7-all.jar file. Add this jar to your library and execution paths.

d) From Eclipse, select Help, Software Updates, Find and Install, Search for new features to install, then Next. Now select New Remote Site. Enter Groovy as the Name and http://dist.codehaus.org/groovy/distributions/update as the URL. Press OK and you should find Groovy added to the list of sites to include in the search. Press Finish. It should find Groovy; check the features you want and press Next. Accept the license terms and press Finish. At this point the plugin will be downloaded and installed.


e) From Eclipse select your project, right click and select Groovy->Add Groovy Nature. Eclipse will now understand files with the .groovy extension.

f) Remember that any Java file is also a Groovy file, so if you want to start with baby steps just rename one of your Java files to have a .groovy extension. Eclipse and Ant will recognize it, compile it to bytecode, and then allow you to use it.


You could then take another small step and remove most of the semicolons, because Groovy doesn't require them. If your class has import statements from common Java packages you can probably remove them as well, since Groovy automatically imports them.
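
For example, a tiny Java class renamed to Greeter.groovy (a name I made up for illustration) might end up looking like this after those baby steps:

// Greeter.groovy -- compiles to ordinary bytecode, callable from Java like any other class
class Greeter {
    String greet(String name) {
        "Hello, ${name}!"    // no semicolons needed; the last expression is the return value
    }
}

Your Java code can then call new Greeter().greet("world") with no special ceremony.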

Clearly you then want to write a class that actually uses Groovy features but that's beyond the scope of HelloGroovyInAJavaProjectWorld.


Enjoy.

Monday, January 5, 2009

Tai Chi, Software Design and simplicity

It's always interesting when different aspects of life intersect: in this case Tai Chi, software design, and simplicity. One of the lessons I've been learning in my Tai Chi studies is the value of simplicity and minimalism. Most of the flaws in my technique end up being attempts to do too much. My motions tend to be too large and often too complex. Such large and complex motions are relatively easy to defeat.

In software as well there is the truism that the code you don't write doesn't contain any bugs. There are many possible refactorings that can increase the quality or robustness or performance of a program, but code that no longer exists takes no time to run, doesn't need to be learned and doesn't obscure the structure of a program.

The trick of course is that simplicity is hard. My Tai Chi Shifu (instructor) gives me simple instructions from which I somehow deduce complexity. Only after continued study am I sometimes able to get the flash of "oh, that's all I have to do?" as I see the simple and powerful move hiding inside my attempts.

Even though I know this to be the case, I do not seem to be able to apply it to new moves I learn. There seems to be a need for the learning to be a process. The simple move doesn't seem to be obvious, so I flounder about adding unnecessary nuances.

This makes more sense to me in software design, perhaps because I've been doing it for so long. In software I do have the ability to move fairly quickly to a simple design, and to spot the extraneous complexity in most designs. In software the simple code path usually seems to leap out of the spaghetti I often encounter.

One way of looking at this might be via the Dreyfus model of learning (see Pragmatic Programmers). The Dreyfus model describes five stages of skill:
Novice, Advanced Beginner, Competent, Proficient and Expert.

Novices need to be given a cookbook description of how to perform a task, while Experts simply do the task and might not even be able to describe how or why they are doing it. The inability to describe why can be a problem. I've had bosses ask me why I think design X is better than design Y. My answer that Design X "sings" or is "beautiful" while Design Y "feels" clunky does not tend to convince them.

I remember watching a documentary about an about-to-retire Cotton Inspector trying to teach his apprentice how to select the best cotton. The expert said to rub the cotton between your fingers and feel when it was "good". The apprentice had no idea what that meant but it was the best answer the expert could give him.

I think that most people stay in their comfort zone and thus stay within whatever skill level they've achieved. While I might claim to be in the expert group in software design I'm certainly no higher than advanced beginner or competent in Tai Chi. Hopefully this makes me a better instructor as I can relate to the students just starting to study Tai Chi.

Whether all of this eases my learning curve with Groovy or Scala (new computer languages) remains to be seen :-) It does however remind me of the quote by Blaise Pascal: "I have made this letter longer than usual because I lack the time to make it shorter".

Saturday, January 3, 2009

software haiku

I wrote this haiku a couple of years ago to describe the software I was working on and it seems pretty appropriate to Wabi Sabi. For non-software developers spaghetti is an adjective used to denigrate code whose logic seems to go in all directions at once with no hope of untangling it.

Code in Haste no Pattern
Spaghetti's not food for the brain
The mind recoils, I weep