December 24, 2011

Your Esoteric Language is Useless

You'd think that programmers would get over these ridiculous language wars. The consensus should be that any one programmer is going to use whatever language they are most comfortable with that gets the job done most efficiently. If someone knows C/C++, C#, and Java, they're probably going to use C++ to write a console game. You can argue that language [x] is terrible because of [y], but the problem is that ALL languages are terrible for one reason or another. Every aspect of a language's design is a series of trade-offs, and if you try to criticize a language that is useful in one context because it isn't useful in another, you are ignoring the entire concept of a trade-off.

These arguments go on for hours upon hours, about what exactly is a trade-off and which languages arguably have stupid features and what makes someone a good programmer and blah blah blah blah SHUT UP. I don't care what language you used; if your program is shit, your program is shit. I don't care if you wrote it in Clojure or used MongoDB or used continuations and closures in whatever esoteric functional language happens to be popular right now. Your program still sucks. If someone else writes a better program in C without any elegant use of anything, and it works better than your program, they're doing their job better than you.

I don't care if they aren't as good a programmer as you are, by whatever stupid, arbitrary standards you've invented to make yourself feel better; they're still doing a better job than you. I don't care if your Haskell editor was written in Haskell. Good for you. It sucks. It is terribly designed. Its workflow flows about as well as a blob of molasses on a mountain in January. I don't care if you are using a fantastic stack of professionally designed standard libraries instead of re-inventing the wheel. That guy over there re-invented the wheel the wrong way 10 times and his program is better than yours because it's designed with the user in mind instead of a bunch of stupid libraries. I don't care if you're using Mercurial over SVN or Git on Linux using Emacs with a bunch of extensions that make you super productive. Your program still sucks.

I am sick and tired of people judging programmers on a bunch of rules that don't matter. Do you know functional programming? Do you know how to implement a LAMP stack? Obviously you don't use C++ or anything like that, do you?

These programmers have no goddamn idea what they're talking about. But that isn't what concerns me. What concerns me is that programmers are so obsessed over what language is best or what tool is best or what library they should use when they should be more concerned about what their program actually DOES. They get so caught up in building whatever elegant crap they're trying to build they completely forget what the end user experience is, especially when the end user has never used the program before. Just as you are not a slave to your tools, your program is not enslaved to your libraries.

Your program's design should serve the user, not a bunch of data structures.

December 2, 2011

The Irrationality of Idiots

If Everyone Else is Such an Idiot, How Come You're Not Rich? - Megan McArdle
I run around calling a whole lot of people and/or things stupid, dumb, moronic, or some other variation of idiot. As the above quote exemplifies, saying such things tends to be a bit dangerous, since if everyone else were an idiot, you should be rich as hell. My snarky reaction, of course, would be that I'm not rich yet (and even then, "rich" in the sense of the quote is really just a metaphor for success, however you define it for yourself), but in truth there are very specific reasons I call someone an idiot, and they don't necessarily involve actual intelligence.

To me, someone is an idiot if they refuse to argue in a rational manner. If you ignore evidence or use nonsensical reasoning and logical fallacies to support your beliefs, you're an idiot. If you don't like me calling you an idiot, that's just fine, because I acknowledge your existence about as much as I acknowledge the existence of dirty clothes on my bedroom floor. It's only when there is such a pile of dirty laundry lying around that it impedes movement that I really notice and clean it up. In the case of suffocating amounts of stupidity, I usually just go somewhere else. The rest of the time, stupid people can only serve to grudgingly function in a society, not take part in running it. This is because designing and running a society requires rational thinking and logical arguments, or nothing gets done.

I can get really angry about certain things, but I must yield to opinions that have a reasonable basis, if only to acknowledge that I might be wrong, even if I think I'm not. Everything I say or do must have some sort of logical basis, even if it originated from pure intuition. So long as you can poke legitimate holes in an accepted theory, you can hold some pretty crazy opinions that can't be considered illogical, though perhaps still incredibly risky or unlikely.

All the other times I call someone an idiot, I'm usually being lazy when I should really be calling the action idiotic. For example, I can't legitimately call Mark Zuckerberg an idiot. If I call him an idiot, I'm not forming a legitimate opinion; it's probably because he did something that pissed me off and I'm ranting about it, and you are free to ignore my invalid opinion, at least until I clarify that what he did was idiotic, not him. Of course, sometimes people repeatedly do things that are just so mind-bogglingly stupid that it is entirely justified to actually call them a moron, because they are displaying a serious lack of bona fide intelligence. Usually, though, most people are entirely capable of rational thought, but simply do not care enough to exercise it, in which case their idiocy stems from an unwillingness to use rationality, not a lack of actual intelligence.

I bring this up because it seems to be a serious problem. What happens when we lose rationality? People can't compromise anymore, and we get a bunch of stupendously idiotic proposals borne of ignorance that no longer has to pass through a filter of logical argumentation. All irrational disputes become polarized because neither side is willing to listen to the other, and the emotions intrinsically tied to the dispute prevent any meaningful progress from being made. Society breaks down in the face of irrationality, because irrationality refuses to acknowledge basic realities, like the fact that people are different.

Well gee, that sounds like our current political mess.

I am an aggressive supporter of educational reform, and one of the things that I believe should be taught in schools is not only rational thought and logical arguments, but how rational thought can complement creativity and irrational emotions. We cannot rid ourselves of illogical beliefs, because then we've turned into Vulcans, but we must learn, as a species, when our emotions are appropriate, and when we need to exercise our ability to be rational agents. As it is, we are devolving into a prehistoric mess of irrational demands and opinions that only serve to drag society backwards, just as we begin unlocking the true potential of our technology.


December 1, 2011

The Great Mystery of Linear Gradient Lighting

A long, long time ago, in pretty much the same place I'm sitting in right now, I was learning how one would do 2D lighting with soft shadows and discovered the age-old adage of 2D graphics: linear gradient lighting looks better than mathematically correct inverse square lighting.

Strange.

I brushed it off as artistic license and perceptual trickery, but over the years, as I dug into advanced lighting concepts, nothing could explain this. It was a mystery. Around the time I discovered microfacet theory I figured it could theoretically be an attempt to approximate non-Lambertian reflectance models, but even that wouldn't turn an inverse-square curve into a linear one.

This bizarre law even showed up in my 3D lighting experiments. Attempting to invoke the inverse square law resulted in extremely bright and dark areas that looked absolutely terrible, and yet the only apparent fix I saw anywhere was calculating light via linear distance, in clear violation of observed light behavior. Everywhere I looked, people calculated light on a linear basis, everywhere, on everything. Was it the equations? Perhaps the equations being used operated on linear light values instead of exponential ones, and so only output the correct value if the light was linear? No, that wasn't it. I couldn't figure it out. Years and years and years would pass with this discrepancy left unaccounted for.

A few months ago I noted an article on gamma correction and assumed it was related to color correction or some other post-process effect designed to compensate for monitor behavior, and put it as a very low priority research point on my mental to-do list. No reason to fix up minor brightness problems before your graphics engine can actually render everything properly. Yesterday, though, I happened across a Hacker News posting about learning modern 3D engine programming. Curious if it had anything I didn't already know, I ran through its topics, and found this. Gamma correction wasn't just making the scene brighter to fit with the monitor; it was compensating for the fact that most images are actually already gamma-corrected.

In a nutshell, the brightness of a monitor follows a power curve, not a line (with an exponent of about 2.2). The result is that a linear gradient displayed on the monitor is not actually increasing in brightness linearly. Because it's mapped onto that curve, the displayed brightness actually increases as a power of the stored value. This is due to the human visual system processing luminosity on a roughly logarithmic scale. The curve in question is this:

[Figure: the gamma response curve of a typical monitor]
Source: GPU Gems 3 - Chapter 24: The Importance of Being Linear
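
To see what that curve does in practice, here's a quick sanity check (a minimal C++ sketch, assuming the common pure power-law model with an exponent of 2.2 rather than the exact piecewise sRGB curve):

    #include <cmath>
    #include <cstdio>

    // Approximate displayed brightness of a stored pixel value in [0,1],
    // using the pure power-law model with gamma = 2.2. (The real sRGB
    // transfer function is piecewise, but 2.2 is a close fit.)
    double displayed(double stored) { return std::pow(stored, 2.2); }

    int main()
    {
        // Feed the display a mathematically linear ramp of stored values...
        for (double v = 0.0; v <= 1.0; v += 0.25)
            std::printf("stored %.2f -> displayed %.3f\n", v, displayed(v));
        // ...and the displayed brightness comes out dark and non-linear:
        // a stored 0.50 displays at roughly 0.22 of maximum brightness.
        return 0;
    }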

You can see the effect in this picture, taken from the article I mentioned:
[Image: two gradient strips - the top a raw linear-value gradient, the bottom a gamma-corrected one]

The thing is, I always assumed the top gradient was a linear gradient. Sure, it looks a little dark, but hey, I suppose that might happen if you're increasing at 25% increments, right? WRONG. The bottom strip is a true linear gradient1. The top strip is a literal assignment of linear gradient RGB values, going from 0 to 62 to 126, etc. While this is, mathematically speaking, a linear gradient, what happens when it gets displayed on the screen? It gets distorted by the CRT gamma curve seen in the above graph, which makes the displayed brightness follow the power curve. The bottom strip, on the other hand, is gamma-corrected - it is NOT a mathematical linear gradient. Its values go from 0 to 134 to 185. As a result, when this corrected curve is displayed on your monitor, its values are dragged down by the exact inverse curve, resulting in a true linear ramp in brightness. An image that has been "gamma-corrected" in this manner is said to exist in sRGB color space.
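
If you want to reproduce that corrected strip yourself, the encoding side is just the inverse power. This is again a sketch assuming the pure 2.2 approximation, so the numbers land within a step or two of the strip's 0, 134, 185 values (the small difference comes from the piecewise sRGB curve versus pure 2.2):

    #include <cmath>
    #include <cstdio>

    // Gamma-correct a linear intensity in [0,1] by applying the inverse
    // of the 2.2 display curve, then quantize to an 8-bit value.
    int encode(double linear)
    {
        return (int)std::lround(255.0 * std::pow(linear, 1.0 / 2.2));
    }

    int main()
    {
        // A true linear ramp encodes to roughly 0, 136, 186, 224, 255.
        for (double v = 0.0; v <= 1.0; v += 0.25)
            std::printf("linear %.2f -> stored %d\n", v, encode(v));
        return 0;
    }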

The thing is, most images aren't linear. They're actually in the sRGB color space, otherwise they'd look totally wrong when we viewed them on our monitors. Normally, this doesn't matter, which is why most 2D games simply ignore gamma completely. Because all a 2D engine does is take a pixel and display it on the screen without touching it, if you enable gamma correction you will actually over-correct the image and it will look terrible. This becomes a problem with image editing, because digital artists are drawing and coloring things on their monitors, and they try to make sure that everything looks good on their monitor. So if an artist were visually trying to make a linear gradient, they would probably make something similar to the already gamma-corrected strip we saw earlier. Because virtually no image editors linearize images when saving (for good reason), the resulting image an artist creates is actually in sRGB color space, which is why simply turning on gamma correction will usually make everything look bright and washed out, since you are normally using images that are already gamma-corrected. This is actually a good thing due to subtle precision issues, but it creates a serious problem when you start trying to do lighting calculations.

The thing is, lighting calculations are linear operations. It's why you use Linear Algebra for most of your image processing needs. Because of this, when I tried to use the inverse-square law for my lighting functions, the resulting value that I was multiplying onto the already-gamma-corrected image was not gamma-corrected! In order to do proper lighting, you would have to first linearize the gamma-corrected image, perform the lighting calculation on it, and then re-gamma-correct the end result.
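
In code, that fix looks something like this - a minimal per-channel sketch, again assuming the pure 2.2 approximation; real engines do this per pixel in a shader, or let the GPU hardware do the conversions, as described below:

    #include <cmath>

    // Decode an sRGB value to linear light, and re-encode linear light
    // back to sRGB (pure 2.2 power-law approximation of the sRGB curve).
    double srgb_to_linear(double c) { return std::pow(c, 2.2); }
    double linear_to_srgb(double c) { return std::pow(c, 1.0 / 2.2); }

    // Light a gamma-corrected texel with physically correct inverse-square
    // falloff: linearize first, do the lighting math in linear space,
    // clamp, then re-gamma-correct the result for display.
    double light_texel(double srgb_texel, double intensity, double distance)
    {
        double linear = srgb_to_linear(srgb_texel);
        double lit = linear * intensity / (distance * distance); // inverse-square law
        if (lit > 1.0) lit = 1.0; // clamp before re-encoding
        return linear_to_srgb(lit);
    }

    int main()
    {
        // Example: a mid-gray sRGB texel lit by a unit light 2 units away
        // comes out around 0.27 in sRGB space.
        double out = light_texel(0.5, 1.0, 2.0);
        (void)out;
        return 0;
    }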

Wait a minute, what did we say the gamma curve was? It's $$x^{2.2}$$, so $$x^{0.45}$$ will gamma-correct the value $$x$$. But the inverse square law states that the intensity of a light is actually $$\frac{1}{x^2}$$, so if you were to gamma-correct the inverse square law, you'd end up with: \[ \left(\frac{1}{x^2}\right)^{0.45} = \left(x^{-2}\right)^{0.45} = x^{-0.9} \approx x^{-1} \]
That's almost linear!2

OH MY GOD
MIND == BLOWN

That's it! The reason I saw linear curves all over the place was because it was a rough approximation to gamma correction! The reason linear lighting looks good in a 2D game is because it's actually an approximation of a gamma-corrected inverse-square law! Holy shit! Why didn't anyone ever explain this?!3 Now it all makes sense! Just to confirm my findings, I went back to my 3D lighting experiment, and sure enough, after correcting the gamma values, using the inverse square law for the lighting gave correct results! MUAHAHAHAHAHAHA!

For those of you using OpenGL, you can implement gamma correction as explained in the article mentioned above. For those of you using DirectX9 (not 10), you can simply enable D3DSAMP_SRGBTEXTURE on whichever texture stages are using sRGB textures (usually only the diffuse map), and then enable D3DRS_SRGBWRITEENABLE during your drawing calls (a gamma-correction stateblock containing both of those works nicely). For things like GUI, you'll probably want to bypass the sRGB part. Like OpenGL, you can also skip D3DRS_SRGBWRITEENABLE and simply gamma-correct the entire blended scene using D3DCAPS3_LINEAR_TO_SRGB_PRESENTATION in the Present() call, but this has a lot of caveats attached. In DirectX10, you no longer use D3DSAMP_SRGBTEXTURE. Instead, you use an sRGB texture format (see this presentation for details).
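
For reference, the DirectX9 route boils down to something like this (a sketch, not a complete program; 'device' is assumed to be an already-created IDirect3DDevice9 pointer, and stage 0 the sRGB diffuse map):

    #include <d3d9.h>

    // Enable gamma-correct rendering on a D3D9 device: linearize sRGB
    // texture reads on stage 0, and re-encode to sRGB on write.
    void EnableGammaCorrection(IDirect3DDevice9* device)
    {
        device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, TRUE);
        device->SetRenderState(D3DRS_SRGBWRITEENABLE, TRUE);
    }

    // Bypass the sRGB conversions, e.g. for drawing GUI elements.
    void DisableGammaCorrection(IDirect3DDevice9* device)
    {
        device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, FALSE);
        device->SetRenderState(D3DRS_SRGBWRITEENABLE, FALSE);
    }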


1 or at least much closer, depending on your monitor's true gamma response
2 In reality I'm sweeping a whole bunch of math under the table here. What you really have to do is move the inverse square curve around until it overlaps the gamma curve, then apply it, and you'll get something that is roughly linear.
3 If this is actually standard course material in a real graphics course, and I am just really bad at finding good tutorials, I apologize for the palm hitting your face right now.