Eric DeLabar

Hammering Screws: Programmers and Tool Blindness

In my last post I told a half-truth by ending with “If you need me I’ll be uninstalling Eclipse.” Honestly, I only removed it from my laptop because I rarely do any real Java development directly on my laptop, and should I need a quick code editor I have TextMate, which handles most of my coding needs pretty simply. However, the commotion that the statement caused is what I’m going to address in this post.

If all you have is a hammer, everything looks like a nail.
-Bernard Baruch

To continue with my tools theme I’m going to address what I’ll call “tool-blindness,” the mentality that the tools you have and know how to use are perfect for every situation. In other words, if the tools you have require you to hammer screws then by-god you’re going to hammer screws.

Recently there has been a grass-roots developer movement at my employer to switch from Ant to Maven. I love Maven; it’s leaps and bounds better than the way we were using Ant. (Notice I didn’t say Ant in general; I’m only planning on starting one holy war with this post!) My last four projects have used Maven, and it makes the build and deploy process significantly easier, especially when it comes to dependency management. However, everyone’s favorite Java IDE does not play well with Maven because Maven’s standard directory layout is significantly different from Eclipse’s convention. People have attempted to alleviate this problem by using the M2 plug-in and following the instructions here, but the result still feels like you’re forcing Eclipse to do something it doesn’t want to do. Add that to the fact that the version of Eclipse we’re using crashes at least once a day, and you should be able to see why I’m looking for alternatives.

In short, our metaphorical hardware (the build process) has changed from nails to screws, and our hammer (Eclipse) is no longer the best tool for the job. However, we’re still using the hammer because it’s what everyone knows how to use, and only a few of us are not tool-blind enough to look for something else. It also doesn’t help that the alternatives that play nicely with Maven, namely IntelliJ IDEA and TextMate-and-a-shell, are not “free as in beer.”

As I mentioned in the comments of my last post, Eclipse has been feeling a little more like this tool than this one. So maybe it’s time to ask which tool you really want on your tool belt.

In an interesting turn of events, Alex Vazquez over at Wufoo/Particletree is going through the same process with a different perspective. Alex is moving, based in part on recommendation and in part on language choice, from Java on Eclipse to PHP on TextMate, and he seems to be liking it so far. Alex does, however, sum up Eclipse rather nicely, and coincidentally within my theme:

The right development environment can save a programmer countless hours and is like a hammer in the carpenter’s tool belt. Since my background was in Java, my preference was for large sledge hammers and my development environment of choice was the de facto Java IDE Eclipse. It has a number of amazing features like autocomplete, refactoring and hundreds of plugins for every task imaginable. It’s no secret Java requires mountains of code, but Eclipse was made to move mountains.
-Alex Vazquez

I think Alex really nails (pardon the pun) the fact that if you’re going to be doing Java enterprise development you need an IDE that can handle it. You need something that generates the code and provides the refactoring tools and autocompletion to make it possible to “move mountains.” However, I’d like to pose a question to all Java developers who use Eclipse: how much of Eclipse do you really use? Besides refactoring, code completion/generation, and cvs/svn/scm integration, is there anything else you couldn’t live without? Anything else that TextMate doesn’t do? (Besides run on Windows; we’ll save that tool for a different day.) Look at all of the stuff Eclipse does that you don’t use; is the added bulk really worth it? How much memory is your Eclipse process using right now? (Mine’s at ~254MB, 5x more than the next largest memory footprint, and my Eclipse process is basically idle.) Just my two cents; please form your own opinions, after all I’m just a kid who couldn’t possibly have any experience.

Maintenance

A quick welcome to everyone passing through via DZone! Please subscribe, I’d love to have you back!

If you’re new here, please excuse the mess; it’s still a work in progress since the content has been my number one priority. Speaking of content, I finally replaced the default WordPress about page, so go check mine out if you’re interested!

Coding Your Fingers Off - Hand Tools, Power Tools, and Programmers

I have read quite a few posts recently on the lack of quality programmers, web or otherwise, available in the current market. I’ve even written a post myself on some of the “differences” in the technology stack between now and when I started programming professionally just four years ago. Some people are saying we need to encourage children to become programmers, others are questioning the languages that are taught in schools, still others are criticizing the things that are not taught (or encouraged) during secondary education. I’m going to question how things are taught.

I spent my first year of college at RIT, not to downplay my last three years at Muhlenberg, but everything I really needed to learn I learned in three quarters at RIT. Computer Science 101-103 had labs in a Sun Unix lab. We wrote Java code using Emacs from a shell. We compiled it from that shell. We checked it into RCS from that shell. We ran diffs from that shell. We submitted our completed assignments from that shell. We loved that shell, whether we wanted to or not.

We did not have an IDE, not in today’s sense anyway; there was no code completion, no refactoring tools, no visual SCM merging tools. In the process we learned Unix: we learned how to grep, how to use sed and awk, telnet, ssh, and command-line ftp. We learned how the internet worked by first learning how a network worked. We learned to write code, use a computer, and use the internet with the functional equivalent of hand tools. In the process we learned and understood how and why it all fit together.

As a matter of illustration, I’m reminded of the Home Improvement television show that was on when I was a kid. In it, Tim Allen plays Tim Taylor, the host of a cable TV tool show called Tool Time. He has an assistant, Al Borland, played by Richard Karn. On the show, Tim’s motto is “more power,” which usually leads him to the biggest Binford Tools power tools, disastrous projects, and eventually the emergency room. Al, on the other hand, is more of a renaissance man, appreciating the beauty, elegance, and simplicity of hand tools and the wood they’re used on. Although I don’t think it was ever stated, Al never ended up in the emergency room. Which character would you hire to work on your house?

But I digress. We’re seeing more and more computer science grads who have worked only on Windows. They’ve used Eclipse and Visual Studio. They know how to use the very basic IDE functionality with the mouse, and they live and die by ctrl+c and ctrl+v. They were given power tools at the very beginning of their careers, and now quite a few of them have figuratively managed to cut their fingers off. They’re crippled programmers because the “more power,” here’s-a-monstrous-power-tool-that-does-everything-you’ll-ever-need-really-fast attitude has effectively removed their ability to operate the simple tools that solve their problems in an elegant manner. They’re afraid of the shell because they don’t know how to use it, but they’re not afraid of the IDE because it has a big, shiny button that promises to make their life easier if they press it. Power tools in the real world have warnings about loss of life and limb if operated incorrectly. Sadly, power tools in the digital world do not.

So, I propose my solution. Bring the hand tools back into the classroom. Eliminate IDEs from the educational system. Teach students to use the shell, and with it the tools of our hacker forefathers. Let them nick their fingers with a hand saw instead of cutting them off with a circular saw. Encourage them to use open source. Encourage them to contribute to open source. The web in general runs on it; they should know how it gets made, and know how to give back to the community that has made a large part of their future pay possible. Teach them Emacs or Vi. Give them cvs, svn, or git, and teach them to read a diff. Make them create a website and share what they’re learning, or at least participate in the forums of some pet open source project. Do them a favor and scare the ones who aren’t meant to be doing this out of the profession. If they don’t have the passion to persevere they need to find something else to do. Ladies and gentlemen of academia, I ask you for one thing: stop manufacturing cookie-cutter, power-tool graduates and start nurturing artisan master programmers.

Thank you. If you need me I’ll be uninstalling Eclipse.

Border Weirdness in Internet Explorer

While helping a friend rework his Vintage Board Games site (rework not live yet), we came across an interesting IE bug. In a nutshell, in some cases, IE was placing a CSS background image relative to the outside of an element’s border instead of the inside.

The simplified markup of the bug and CSS are as follows:

<div class="content">
	<div class="left"></div>
	<div class="right"></div>
	<div style="clear: both;"></div>
</div>


.container {
	width: 420px;
}
.content {
	border: solid 10px #3570d6;
	background: white url( background_invert.gif ) left center no-repeat;
}
.left {
	float: left;
	width: 200px;
	height: 300px;
	margin-left: 25px;
	background-color: green;
}
.right {
	float: right;
	width: 100px;
	height: 200px;
	margin-right: 25px;
	background-color: cyan;
}

Basically, it’s a two column layout with the columns wrapped in a div that has a large border. (That div also has a background image set on it. The .container div seems extraneous in this example but was a requirement for the layout.) The desired rendering of this markup should look something like the following: (Note: the black/brown box is the background image.)

[Image: ie_correct.gif]

But in IE, we get this:

[Image: ie_bug.gif]

If you don’t trust my images, please try for yourself.

We quickly found two solutions to this problem, the first involved altering the alignment of the background image to be center instead of left:

...
.content {
	border: solid 10px #3570d6;
	background: white url( background_invert.gif ) center center no-repeat;
}
...

This is how we actually solved the problem on the site. The second solution I found while attempting to narrow down the cause of this problem. For this solution we simply set a min-height on the .content div:

...
.content {
	min-height: 1px;
	border: solid 10px #3570d6;
	background: white url( background_invert.gif ) left center no-repeat;
}
...

I’m assuming this is some sort of hasLayout issue, and that giving the div a min-height (height in IE6, accomplished with conditional comments in my example) also gives it layout, but I honestly have no idea what causes this. Anybody have any thoughts?
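For reference, the IE6 fallback mentioned above can be sketched with a conditional comment. The selector matches the example CSS; treat this as an illustration rather than the exact markup from the live site:

```html
<!-- IE6 ignores min-height but treats height as a minimum when content
     overflows, so serve it a height only IE6 can see. -->
<!--[if lte IE 6]>
<style type="text/css">
	.content { height: 1px; }
</style>
<![endif]-->
```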

The Parable of the Plumber and the Programmer

One day a small business owner, known to dabble in the do-it-yourself arts, was reading the visitor statistics from the website that his nephew had made him. Disappointed by the long page loads, low traffic, and cross-browser inconsistencies, the small business owner went to his kitchen for a glass of water. He turned on the faucet and nothing happened. A minute later, water started pouring out from under the sink. Quickly, he turned off the water, rushed back to his computer, looked up the name of a local plumber, called, and had a plumber dispatched immediately. By chance, he also saw an advertisement for a local “low-cost, satisfaction guaranteed, web developer” and called him as well.

Twenty minutes later, they both arrived. Apologizing for the timing and offering a cup of coffee, the small business owner asked if the web developer wouldn’t mind waiting while he got the plumber started. The web developer, figuring he’d only had six cups of coffee today, decided a little waiting was not a problem.

The small business owner took the plumber to the water shut-off valve, and the plumber quickly shut it off. While discussing the solution, the small business owner told the plumber that he would like all of the pipes from the valve to the sink to be replaced with PEX pipes instead of the existing copper. Concerned, the plumber warned that there was nothing wrong with the existing copper pipes and that he was unsure if PEX was even allowed by local commercial building codes. The plumber even offered to first double-check the codes, but in a hurry to get back to the web developer before he drank all of the coffee, the small business owner told the plumber to just do it. He blindly signed the contract, completely ignoring the written warning about violating building codes, and left the plumber to do the job.

Relieved to have one problem scratched from his list, the small business owner left the plumber and returned to the web developer, who had already drunk all of the coffee. Annoyed, the small business owner began making a new pot of coffee while explaining the problem with his website. Having heard the same thing a thousand times before, the web developer began his speech about web standards, usability, accessibility, and information architecture. Within seconds, the small business owner’s eyes had glazed over, and he began ignoring everything the web developer said.

A few hours passed. The plumber finished the job, the small business owner signed the web developer’s boiler-plate contract with 100% satisfaction guarantee, and both the plumber and the web developer left the small business owner to finish his list of things-to-do.

About a month later, the web developer returned to the small business owner’s office to show him the finished product. Coincidentally, the city building inspector arrived at the same time. Learning his lesson about web developers and coffee from the last visit, the small business owner told the building inspector that the work was done down the hall in the first room on the right, and left him to his job.

The small business owner and the web developer went to a computer and the web developer showed the small business owner the finished product. Unimpressed by the lack of motion, flashing, and changing colors the small business owner said that the new site was “nice, but needs something.” The web developer, who truthfully knew this was coming, began regretting his 100% satisfaction guarantee. By the time the small business owner was done stating his “satisfaction requirements,” the site had lost its information architecture, the typography was small and low-contrast, the flash-intro was back, and bigger than ever, and the back end programming was changed from a lightweight CMS to a large enterprise framework that the small business owner’s friend had heard about at a trade show. Concerned for his portfolio, but tired of this particular client, the web developer left the small business owner’s office; defeated and hoping to just get the job done as soon as possible.

Glowing from his victory against the tasteless web developer, the small business owner went down the hall to find the building inspector. The building inspector was sitting in the corner of the room finishing up his report on the plumbing changes. He looked up from the paper and told the small business owner that all of the PEX pipes had to be removed because it was against code in a commercial building. Aghast, the small business owner quickly called the plumber to chastise him for performing such a blatant building code violation. Unimpressed, the plumber simply referred the small business owner to section 4c of their signed contract where it stated that the small business owner had been informed of all building codes to the best of the plumber’s knowledge and that any specific requirements performed by the plumber per the customer were the responsibility of the customer. The plumber also stated that he would be glad to come out and replace all of the PEX with brand-new copper, under a brand-new contract.

About a month later the “improved” website was live, still suffering from the same long page loads, low traffic, and cross-browser inconsistencies. The web developer, however, had already folded his company and taken up a new career as an apprentice to the local plumber.

The Morals

Almost everything we do as web developers is according to some specification or recommendation, yet it’s still very easy to ignore them all and still produce a “functioning” website. We compete in an amazing marketplace against everyone from the boss’s nephew to Fortune 500 companies. Because of this, highly skilled web developers working at small companies, or even as freelancers, often make stupid promises, including 100% satisfaction guarantees. These guarantees do not help anyone: the client is rarely satisfied, the end result isn’t worthy of the web developer’s portfolio, and none of the original problems have been solved.

Plumbers have the law, the union, air-tight contracts, years of collective experience, and the fact that there’s no such thing as WYSIWYG plumbing software to keep average Joes from calling themselves master plumbers.

The Soapbox

I’m not sure if it’s a question of attitude, experience, or the fact that there are parts of the plumbing profession that literally stink, but people rarely tell a plumber how to do his job, and I doubt many plumbers would even let someone if they tried. To the same extent, a plumber’s work is usually not up for scrutiny in the public eye; in fact, other than in basements and under sinks, it’s rarely even visible to the property-owner’s eye.

So what’s my point? With the number of very talented web developers and designers available worldwide, the web has the potential to be a very beautiful, very usable, and very accessible place. However, the web wasn’t built by graphic artists, typographers, interactive designers, and information architects; it was built by tinkerers, inventors, and do-it-yourselfers, kind of like the first indoor plumbing. We have a long way to go before there are laws requiring best practices, and professional reputations that allow us to say “this is the way I’m going to do it, and it’s non-negotiable.” But that doesn’t mean we should stop educating our customers and return to table-based layouts and excessive use of the blink and marquee tags. It means there’s a light at the end of the tunnel. Until we reach that time, sell the fact that you’re an expert, not the fact that you’re guaranteeing satisfaction. Let your work speak for itself; a well-designed and well-implemented site will just work and will lead to a satisfied customer, whether or not it’s been guaranteed in writing.

Found Code: Optimizing Large Form Performance in JavaScript

As I’ve covered before, ill-used JavaScript can lead to some serious performance problems, most of which are caused by simply not thinking about what the code is really doing. Recently I came across a site that provided digital photo printing. This site had a nice interface that allowed me to upload close to three hundred photos. On the resulting page, each photo was displayed with all of the available sizes as input boxes, which looked something like this. I liked the interface, but came across a very serious problem: the event handlers that updated the totals box ran on the keyup event and recalculated the total of the entire form! This worked fine with ten or twenty photos, but the 300 that I provided brought my browser to a screeching halt.

I’ve taken the liberty of creating a very simplistic mock-up of the form and a simplified version of the JavaScript, which is available in my examples section. The demo uses Firebug and Firebug Lite for logging just like I did in my dollar function article, and the benchmark class from that article as well. The site’s JavaScript was a bit more complex and actually did an AJAX lookup of the price on each keyup, but I’m more concerned with the JavaScript performance here, so I simplified the code to something like this:

var PhotoSelector = Class.create();
PhotoSelector.prototype = {
	initialize: function( name ) {
		var init = new Benchmark();
		init.start();
		console.debug( "Beginning Initialization!" );
 
		$A(document.getElementsByTagName("input")).each( 
			function( inp ) {
				inp.value = 0;
				if( Element.hasClassName(inp,"qtyInput") ) {
					inp.onkeyup = this.recalculate.bindAsEventListener( this );
				}
			}, this
		);
 
		init.end();
		console.debug( "Initialization Complete in " + init.inMillis() + " milli(s)." );
	},
 
	recalculate: function(e) {
		var calc = new Benchmark();
		calc.start();
 
		$("fourby").value = 0;
		$("fiveby").value = 0;
		$("eightby").value = 0;
		$("wallet").value = 0;
 
		var inputs = $("pictures").getElementsByTagName("input");
		for( var i = 0; i < inputs.length; i++ ) {
			var totalId = inputs[i].id.match(/([a-z]+)[0-9]+/)[1];
			var total = $(totalId);
			total.value = parseInt(total.value) + parseInt($(inputs[i]).value);
		}
 
		calc.end();
		console.debug( "Recalculation Complete in " + calc.inMillis() + " milli(s)." );
	}
}
var ps;
Event.observe(window,"load",function(e){ps = new PhotoSelector()});

Basically, on window load, this code grabs every input element, sets its value to zero, and binds an event handler to it. The event handler runs on key up and loops through every input box in the “pictures” list, and updates the totals inputs at the top of the page. As I said above, this code works fine with 20 pictures, but it starts getting slow around 300, and becomes almost unusable at 1000. Care to try 10,000? (Be careful, it crashes my browser!) To test it, simply enter values in the photo inputs and watch the totals boxes increment.

The main problem with this code comes from the recalculate function. Problem number one is my personal pet peeve: the dollar-sign function is called at least six times. Six times wouldn’t be terrible for the entire page, but it’s called at least six times on every key-up event! Problem number two, the biggest problem, is the fact that this code re-crawls what amounts to the entire DOM every time the event fires. Obviously, the larger the DOM, the more time this takes.
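To make the total-box lookup concrete, everything is keyed off the input ids. A minimal, DOM-free sketch of that mapping (the specific ids like "fourby12" are assumptions based on the example markup):

```javascript
// Each photo input id is a size prefix plus a photo number, e.g.
// "fourby12" is the 4x6 quantity box for photo 12. The prefix names
// the total box that the value rolls up into.
function totalIdFor(inputId) {
	var match = inputId.match(/([a-z]+)[0-9]+/);
	return match ? match[1] : null;
}

console.log(totalIdFor("fourby12")); // fourby
console.log(totalIdFor("wallet3"));  // wallet
```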

So, how do we fix it? Well, here’s how I fixed it, I’ll explain the details below:

var PhotoSelector = Class.create();
PhotoSelector.prototype = {
	initialize: function( name ) {
		var init = new Benchmark();
		init.start();
		console.debug( "Beginning Initialization!" );
 
		this.old = 0;
		var totals = {
			fourby: $("fourby"),
			fiveby: $("fiveby"),
			eightby: $("eightby"),
			wallet: $("wallet")
		};
		totals.fourby.value = 0;
		totals.fiveby.value = 0;
		totals.eightby.value = 0;
		totals.wallet.value = 0;
 
		$$(".qtyInput").each( 
			function( inp, index ) {
				inp.onfocus = this.enter( inp, this ).bindAsEventListener( this );
				inp.onblur = this.recalculate( inp, totals, this ).bindAsEventListener( this );
			}, this
		);
 
		init.end();
		console.debug( "Initialization Complete in " + init.inMillis() + " milli(s)." );
	},
 
	enter: function( inp, me ) {
		return function(e) {
			me.old = parseInt(inp.value);
		}
	},
 
	recalculate: function( inp, totals, me ) {
		var type = inp.id.match(/([a-z]+)[0-9]+/)[1];
		var total = totals[type];
		inp.value = 0;
		return function(e) {
			var calc = new Benchmark();
			calc.start();
 
			// Apply only the change since focus; adding the raw new value
			// would double-count anything entered before this edit.
			var newVal = parseInt(inp.value);
			total.value = parseInt(total.value) + ( newVal - me.old );
 
			calc.end();
			console.debug( "Recalculation Complete in " + calc.inMillis() + " milli(s)." );
		}
	}
}
var ps;
Event.observe(window,"load",function(e){ps = new PhotoSelector()});

To solve problem number one from above I created a simple object in initialize that stores references to all of the total input boxes, so updating a total is now a simple associative-array lookup. Problem number two is mainly solved by recording the original value of the input on focus (the enter closure) and then comparing it to the new value on blur (the closure returned by recalculate). Because we’re working with the difference between the two values on blur, we can update only the one affected total instead of recalculating the entire form. I made one final tweak, mainly to make solving problem number one easier: recalculate now returns a handler specific to the given input, so the event handler itself never needs to call the dollar function.
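The focus/blur bookkeeping boils down to a DOM-free idea worth isolating; the names here (makeCounter, focus, blur) are mine, not from the original code:

```javascript
// Remember the value when editing starts, then adjust the running
// total by the difference when editing ends. No re-summing required.
function makeCounter() {
	var total = 0;
	var old = 0;
	return {
		focus: function (value) { old = value; },          // field entered
		blur: function (value) { total += value - old; },  // field left
		total: function () { return total; }
	};
}

var qty = makeCounter();
qty.focus(0); qty.blur(3);  // user types 3
qty.focus(3); qty.blur(1);  // user changes it to 1
console.log(qty.total());   // 1
```

Each keystroke’s cost is now constant, no matter how many inputs are on the page.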

So, comparing these in my regular, not-very-scientific fashion, I came up with the following results. I chose to measure the startup time, which grows with the size of the page, as well as the event-handler time. I measured these times across a pretty wide range of picture counts, and across a few browsers.

Safari (OS X)

Pictures   Optimized Load   Optimized Handler   Unoptimized Load   Unoptimized Handler
10         4 ms             0 ms                6 ms               3 ms
50         17 ms            0 ms                14 ms              13 ms
100        33 ms            0 ms                24 ms              26 ms
1000       365 ms           0 ms                452 ms             178 ms

Internet Explorer 7 (Windows Vista)

Pictures   Optimized Load   Optimized Handler   Unoptimized Load   Unoptimized Handler
10         56 ms            0 ms                48 ms              9 ms
50         238 ms           0 ms                213 ms             76 ms
100        457 ms           0 ms                424 ms             235 ms
1000       4642 ms          0 ms                4584 ms            28110 ms

28 seconds!? Why!?

Firefox 2 (Windows Vista)

Pictures   Optimized Load   Optimized Handler   Unoptimized Load   Unoptimized Handler
10         12 ms            0 ms                8 ms               7 ms
50         45 ms            0 ms                31 ms              30 ms
100        87 ms            1 ms                60 ms              59 ms
1000       985 ms           3 ms                584 ms             581 ms

The results pretty much speak for themselves, but there is one caveat: notice the initial load time. Since event handlers still need to be assigned to each input on the page, the more inputs there are, the longer the page load takes, and the load time is even slightly slower on the optimized page. Be sure to consider this time, possibly by capping the number of inputs displayed, since the setup code is very processor-intensive and appears to hang the entire browser while processing. Obviously, these fixes become more important as the number of inputs grows, but any speed increase when the user is directly interacting with the page is a good one!
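One way to attack that per-input setup cost (not something the code above does, just a hedged alternative) is event delegation: attach a single listener to the container and route events by the target’s id. The routing logic itself needs no DOM, so it can be sketched on its own:

```javascript
// One container-level listener replaces N per-input listeners; the
// handler table is keyed by the same size prefix used in the input ids.
// In a browser you would call the returned function from a single
// container onkeyup handler, passing event.target.id.
function makeDelegate(handlers) {
	return function (targetId) {
		var match = targetId.match(/([a-z]+)[0-9]+/);
		if (match && handlers[match[1]]) {
			handlers[match[1]](targetId);
		}
	};
}

var hits = [];
var route = makeDelegate({ fourby: function (id) { hits.push(id); } });
route("fourby7");              // handled
route("poster1");              // no handler registered, ignored
console.log(hits.join(","));   // fourby7
```

With delegation, load time stays flat no matter how many photo inputs the page renders, because nothing is bound per input.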

Psychology, Avatars, and My High School Yearbook

At one point in high school I went to a workshop on making a great yearbook. It was interesting, got me out of a day of school, and really helped with the two pages of the book I actually got to work on. Besides that, I took out of it a few facts about portrait composition, mainly that looking right is generally associated with looking into the future and looking left into the past. Recently there’s even been some talk about Obama and his “looking into the future” pose, which definitely follows these guidelines.

But seriously, The Onion aside, every self-respecting “Web 2.0” site out there has some sort of avatar available to its users. What does your avatar say about you? Well, I’m no psychologist; in fact I didn’t even take the course on it in college, but here’s what I think anyway.

  • Facing Right - Already covered, looking into the future. Possibly optimistic, probably at least positive in nature.
  • Facing Left - Again, already covered, looking into the past, possibly back through history. Nostalgic. May or may not be negative, sometimes may seem a little jaded.
  • Looking Straight Ahead - Sometimes dominant, sometimes playful, sometimes creepy. As with the next two, this seems to have more to do with the rest of the facial expression.
  • Looking Up - If straight ahead, can be submissive or unconfident; if facing right, increases the positive feel; if left, seems narcissistic.
  • Looking Down - If straight ahead, definite dominance; if left or right, shows increasing negativity in the past or future view.
  • Serious - Boring, possibly pompous; only has an avatar to get 100% on LinkedIn profile or because boss or some self-help personal-branding article said so.
  • Goofy - Fun, party-animal, not ready to settle down now, or potentially ever. Probably not the best worker-drone code-monkey but could make a great rock star.
  • Picture of Somebody Else - Personally, I find this quite creepy. Usually very hard to actually identify these, in my experience when meeting somebody in the real world and finding their avatar doesn’t match I get a little turned off. This can be pulled off ironically by using an obvious-not-you type of photo as more of a caricature, but I have only seen this pulled off well a few times.
  • Picture of Somebody Famous - The more famous the better; if the picture is not recognized by the majority of your audience it simply becomes “Picture of Somebody Else” and therefore loses its power. May also appear very stalker-like or just plain juvenile if done in a fan-boy manner.
  • Professional Portrait - See “serious” above, screams real estate agent, lawyer, sales person, or scammy-internet marketer.
  • Candid - Probably my favorite type. Seems sincere and real.
  • Group Photo - Lack of self-confidence or own identity, possibly critical of own appearance.
  • Caricature - Makes me wonder about self-confidence, but could be done in irony. Potential for inside jokes here. A good caricature plays on flaws, so this may say positive things about self confidence.
  • Animal(s) - Awe!!!1 OMG look at the cute kittens and ponies! I can haz cheesburger? Seriously. Unless you’re twelve or doing it for purely and obviously ironic reasons I usually look down on this as somebody I’m not even interested in listening to.
  • Logo or Mascot - Definitely used for self-branding or maybe just branding in general, if done well can be a great asset, but may also appear very spammy if done poorly. May say things about self-image or confidence, especially if hiding behind anonymity.
  • Object - Like an animal, logo, or picture of somebody else, this gives the user complete anonymity and can become very hard to read-into. Lots of room for irony or playfulness can allow personality to shine through, but if not done well it may just turn into a poor inside joke.

Now, please understand, I’m not necessarily criticizing ANY use of an avatar in one of these manners; in fact there is probably somebody out there that has a very good reason for using a pony for their avatar, and I’m sure they do it well. I’m just saying this is what my first impression of you is based on the two seconds I’ve had to see your avatar.

Well, what do you think? Am I right? Am I completely wrong? Did I just insult you and your mother with my analysis of your avatar? Just to remind you, I have no scientific backing for doing this, which I’m guessing makes it a better analysis, but if you disagree, tell me why I’m wrong.

Google Giveth and Google Taketh Away

Michael Martinez over at SEO Theory recently posted an interesting article on contract law, terms of service, and how they apply to the web. It's worth a read and raises some interesting points, but my main beef is with the loosely stated complaint about Webmaster Guidelines, specifically those of everyone's favorite search engine. This complaint, and variations on it, have been bouncing around the SEO/M and affiliate communities for a while now, and I've heard more than enough whining on the subject. Frankly stated, it seems they don't like the fact that they actually need to work in order to keep their spam profitable on the search engine result pages.

Google, as with most other search engines, is a business. In order to actually remain in business, contrary to popular belief on the web, they need to make a profit. In most web business models profit is proportional to the number of users. So far, at least in my opinion, all of this is web business 101. Google has its users for one reason: as of now it's arguably the easiest and most accurate search engine available. Google will retain those users as long as it remains the easiest and most accurate search engine available.

Easy is something that Google has down pat; you can't get much easier than a single input and a button. Accuracy is where it gets interesting. In order to remain accurate, Google needs to be resistant to manipulation. Their algorithm needs to return the most relevant and authoritative content possible, and that means excluding spam. If you're not publishing the most relevant and useful content out there, you don't deserve to be listed, let alone rank on the first page.

For better or for worse, the bulk of SEO exists to manipulate the search engines, and if you think otherwise you're seriously deluding yourself. Don't get me wrong, I believe SEO is absolutely necessary; if you don't at least try to be listed in the search engines, there's a pretty good chance your site will never be found. However, SEO is only the start: it's the framework to build your content upon. Good SEO establishes a solid base for accessibility, findability, and information architecture, which is a good thing. Good SEO, however, is not magic. If you do it the Google-approved "right" way, it will probably take a fairly long time to reach a specific ranking, but once you have it, it should be pretty difficult to lose. Taking a shortcut and ignoring the webmaster guidelines may prove useful and in some cases successful, but it comes with the underlying risk of being delisted altogether.

Basically, what I'm saying is that SEO, be it black hat or white hat, is a gamble. It's a simple question of risk versus reward, and it relies very heavily on your business model. If your business model is to make a quick buck over the short term, by all means, go black hat, but don't complain when you're discovered and your profit dries up. However, if your business model is to make a long-term name for yourself or your business, go white hat, take your time producing quality, relevant content, and rely on Google to keep the spam from appearing ahead of you in the SERPs.

Either way, Google will continue doing what they do: producing, to the best of their ability, the most relevant SERPs for a given query, and changing their algorithm whenever necessary to make it happen. Instead of complaining about the Webmaster Guidelines, thank Google for them; without them you'd be shooting in the dark. Instead of complaining about the quality guidelines, thank Google for them; the higher the consistent quality of the ads and results Google displays, the greater the chance they'll be clicked on.

Complaining about and attempting to change Google's practices on these points will not help you in the long run. Take a second and think about this: if Google lowers its quality-control standards to appease the SEOs and affiliate marketers, Google becomes less useful to the end user. If Google becomes less useful, fewer end users will actually use it. If fewer end users actually use Google, you have fewer potential customers, and like it or not, your profits are going to shrink as well.

Programming In Pants

This topic has been bouncing around my head for a while, but an article I stumbled across on Digg brought it to the front. In this article the author states:

My favorite variation of this is the concept that your pants make you a better programmer. If I wear khaki pants to work it makes me a better worker than if I wear denim pants. Though I don't have a client-facing position, it still makes me more effective if my pants come from Banana Republic.

-Sara Chipps

I work for an East Coast software services company that has yet to realize that breaking the spirits, creativity, and individuality of its programmers is not a good thing. Conceptually, I love my job, or at least what I do; however, that has nothing to do with where I work. I have no problem with rules; in fact, I realize in most cases they are a necessity. However, when the rules exist only because that's how it's always been done, I start to get a little annoyed.

We have a dress code: business casual. On Monday we can wear jeans, as long as they fit properly and are not torn or frayed. Casual day is on Monday because that's the day the country club is closed, so the bosses can wear jeans and not have to change before they go to lunch. In the "summer" (a vague term not actually defined anywhere) we can wear khaki shorts. Until this year, shorts were only allowed on Tuesdays and Thursdays, had to be crisply ironed, and cargo shorts were unacceptable (good luck finding shorts without cargo pockets any place your grandparents don't shop). During the "summer," casual Mondays no longer apply; you're allowed to wear khaki shorts, so why would you want to wear jeans in an office that averages a temperature of about 64 degrees Fahrenheit? Although, I guess it is nice that I'll be comfortable for the twenty-foot walk from the door to my car after work. On occasion a big new customer will be brought into the office and we'll be asked to dress extra nice for the resulting dog-and-pony show, further proving that the standard day-to-day business casual serves no purpose.

So, an entire whiney paragraph about our dress code, boo-hoo, poor programmer how will you ever survive?

This is not a question about surviving, it’s a matter of thriving. I’m as much an artist as I am a scientist or a mathematician, and scientifically speaking I’m a better, more creative artist when I’m comfortable than I am when I’m not. A collar does not make my code faster. Brown shoes do not make me a better designer. Khaki pants certainly don’t make me a better programmer. I’m not going to say that my code will be directly improved by wearing a t-shirt, sneakers, and jeans, but wearing them will improve my morale and enhance my creativity, and that has a chance of improving my code, or at least my desire to be here.

But I digress. This is not about a dress code; it's about a mentality, both management's and the employees'. They call it herding cats for a reason. Programmers are a different breed, and if you think you can prove otherwise, you're not dealing with the right kind of programmers. Look at the companies out there making a fortune in this sector, the ones that attract the best and brightest developers. Is there a dress code at Google? Facebook? Apple? I don't think so. These companies provide creative benefits like free lunch, dinner, snacks, and sodas; interesting, non-conformist workspaces; car washes; dry cleaning; and bike repair, to name a few. They think outside the cube. If you keep your talent happy, there is a higher chance of keeping your talent, unless of course you enjoy nurturing rock stars until they're good enough to leave you in the dust for a greener pasture.

Letting Google Help With Your Site Performance

Let's face it, cross-browser JavaScript and AJAX without a helper library or framework is pretty difficult. However, these libraries can be pretty hefty when it comes to page download size, especially if served uncompressed and un-optimized. Most of the libraries have a statement in their docs that says something along the lines of "grow up and compress your JavaScript," but not all of us have sufficient access to our web hosts to actually be able to do that. Enter Google. Google's AJAX Libraries API serves as a content distribution network providing pre-compressed versions of the web's favorite Web 2.0 JavaScript libraries.

Access to these libraries is quite simple; in fact, loading Prototype can be accomplished with the following four lines of code:

<script src="http://www.google.com/jsapi"></script>
<script>
	google.load("prototype", "1.6.0.2");
</script>
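One thing worth noting: when you depend on the library being present, the API also provides google.setOnLoadCallback so your code runs only after the requested library has loaded. A minimal sketch; the callback body and the "status" element are my own placeholders, not part of the API:

```html
<script src="http://www.google.com/jsapi"></script>
<script>
	google.load("prototype", "1.6.0.2");
	// Don't touch Prototype ($, $$, Ajax.Request, etc.) until this fires.
	google.setOnLoadCallback(function() {
		// Hypothetical example: update an element once Prototype is available.
		$("status").update("Prototype is ready.");
	});
</script>
```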

Pretty simple, but let’s look at some metrics.

With Google, no cache:       66 KB,  982 ms
With Google, local cache:     4 KB,  980 ms
Without Google, no cache:   252 KB,  2.14 s
Without Google, local cache:  0 KB,  1.19 s

As you can see, the Google version is significantly faster and smaller on the initial load; subsequent loads are a little less clear-cut, but still slightly faster. However, this test was not all that scientific: I only have the ability to measure this with Firefox and Firebug, I only ran it once, and my network speed can vary significantly from request to request. Despite all of that, testing this across a few other browser/OS combinations does reveal a pattern where the Google AJAX Libraries API pages do feel faster, even if only by a fraction of a second. I've provided my test pages, with Google and without Google, for you to perform your own tests, and I'd love to hear what other people think and see their results.

The verdict: I like it, but I have a few caveats. First, as pointed out to me by the Unscrutable Designer, you're relying on a third-party site to host your scripts; in this case that means you need to trust Google to not be evil. Personally, I do, but that is a decision to be made on a case-by-case basis. Second, you have to trust the reliability of your content distribution network: can you risk your JavaScript functionality if Google's server goes down? Thanks to progressive enhancement, having no library should be basically the same as having no JavaScript, so a properly implemented site should still function; this is not a show-stopper for me personally.
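If that second caveat worries you, a belt-and-suspenders option is to check whether the library actually arrived and, if not, fall back to a copy on your own server. A rough sketch, assuming you keep a local copy at /js/prototype.js (that path is hypothetical); since google.load, called during the initial page parse, injects the script inline, the library should already be defined by the time the next script block runs if the CDN responded:

```html
<script src="http://www.google.com/jsapi"></script>
<script>
	google.load("prototype", "1.6.0.2");
</script>
<script>
	// If the CDN failed, Prototype won't be defined; pull in the local copy.
	if (typeof Prototype === "undefined") {
		document.write('<script src="/js/prototype.js"><\/script>');
	}
</script>
```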

Now, one last thought. Keep in mind that this is not the optimal solution, since it still makes a separate request for each library you load and the libraries themselves are not further optimized, but it does bring with it an interesting benefit: the more sites that use this service, the higher the chance of a local cache hit on one of these files, which of course means one less download.