Monday, June 25, 2012

Minority Report: 10 Harmful Years Later

Wired took a look at the impact of the movie version of Minority Report, ten years later.

The year was 1999, and Steven Spielberg was preparing to turn Philip K. Dick’s short story “The Minority Report” into a $100 million action movie starring Tom Cruise. There was just one problem: The story was set in the undated future, and the director had no idea what that future should look like. He wanted the world of the movie to be different from our own, but he also wanted to avoid the exaggerated and often dystopian speculation that plagued most science fiction.

...To mark the 10th anniversary of Minority Report’s June 21 release, Wired spoke to more than a dozen people who were at the so-called “idea summit” that delved deep into the future. As participant Joel Garreau recalls, “I don’t think many of us knew what the fuck we were getting ourselves into.”

The tech from Minority Report that people remember most -- the scenes of Tom Cruise waving his hands to navigate a computer interface -- is probably some of the most harmful ever filmed.

Like voice control and television wristwatches, controlling our devices with three-dimensional physical gestures seems like a good idea. However, like those other technologies, its usefulness is limited.

While the image of Tom Cruise waving his arms all over the place like an amphetamine-boosted symphony conductor makes for dramatic cinema, it turns casual computing tasks into physical tasks exhausting for a healthy person and impossible for a disabled person.

The lowly computer mouse and its cousin, the trackpad, are marvels of efficiency. By moving one hand (or finger) a couple of inches, one can navigate thousands of pixels' worth of computer interface.

The precision of mice and trackpads is unparalleled as well. Mice and trackpads are precise to within several pixels. When considering touch interfaces, accuracy drops by an order of magnitude: Apple recommends that touch-based interface targets measure no less than 44x44 pixels. When considering three-dimensional, gesture-based, Minority Report-style interfaces, accuracy drops by another order of magnitude.1
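The precision gap between these input methods can be sketched with Fitts's law, which predicts pointing difficulty from target distance and size. The target widths below are illustrative assumptions of mine (echoing the 44-pixel figure above), not measured data:

```python
import math

def fitts_difficulty(distance, width):
    """Fitts's law index of difficulty, in bits: log2(D/W + 1).
    Bigger targets and shorter distances mean easier pointing."""
    return math.log2(distance / width + 1)

# Hypothetical target widths for a 500-pixel pointing movement:
mouse = fitts_difficulty(distance=500, width=4)      # pixel-precise cursor
touch = fitts_difficulty(distance=500, width=44)     # 44px touch target
gesture = fitts_difficulty(distance=500, width=440)  # coarse 3D gesture

# Smaller hittable targets mean higher difficulty but denser interfaces,
# which is why the mouse supports the richest UIs:
print(mouse > touch > gesture)  # → True
```

Each order-of-magnitude loss in target precision shaves a few bits off the index, which translates directly into bigger, sparser on-screen controls.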

Three-dimensional, gesture-based navigation clearly does have its uses. Microsoft's Kinect has shown that it can be very useful for specially designed games and for low-precision tasks like simple media playback control.

Three-dimensional gestures are clearly a part of the future, but they are not the future. A variety of input methods (the command line, mice, keyboards, touch interfaces, gesture interfaces) will continue to be used, each fulfilling a role.

1 Speculation. I couldn't find hard data on this. It's tough to dispute, though.

Friday, June 15, 2012

This Developer's Life: Dinosaurs and Fortran

I've always had a weird semi-fascination with Fortran.

I don't even know what Fortran looks like, actually.

For years, I lumped it in with ancient languages like Cobol that are gone and not missed in the slightest. While there's always at least one weird guy living in a cave somewhere to prove you wrong, I don't think anybody misses Cobol.

Fortran, though, is apparently a different beast. It's not a general-purpose programming language, exactly. It's more like a thing that hardcore science and math dudes use to crunch numbers.

Apparently, Fortran has two interesting properties.

  1. Battle-Tested, Bulletproof Libraries. The kind of "bulletproof" you only get after a couple of decades of hardcore, NASA-launches-spaceships-with-this-shit, the-stock-market-runs-on-this-shit use.
  2. Disgustingly Parallel. Apparently it scales to about as many processors as you can throw at it, with no real extra work required. Hundreds, thousands. The kind of warehouse-filling computers that predict weather or whatever.
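No Fortran appears in this post, but the "throw more processors at it" property described above is plain data parallelism: the same numeric kernel mapped over independent chunks of data. A minimal Python stand-in (my sketch, not anything from the episode):

```python
# Rough sketch of data parallelism: a pure numeric kernel is mapped over
# independent chunks of the input, so adding workers adds throughput.
from concurrent.futures import ProcessPoolExecutor

def kernel(chunk):
    # Stand-in for a number-crunching kernel (e.g. a physics step per cell).
    return [x * x + 1.0 for x in chunk]

def run_parallel(data, chunk_size=1000, workers=4):
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(kernel, chunks)  # chunks run concurrently
    return [y for chunk in results for y in chunk]

if __name__ == "__main__":
    data = [float(i) for i in range(2_000)]
    # Parallel and serial results are identical; only the wall clock differs.
    assert run_parallel(data) == kernel(data)
```

Because the chunks share no state, there's no locking and no real extra work: the parallel and serial paths compute exactly the same answer.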

This episode of This Developer's Life features several segments, including one about a fresh-out-of-college kid who found himself in a job learning Fortran. After laughing at the language and attempting to convert his employer's codebase to C#, he became a Fortran convert.

Monday, June 11, 2012

Doing It...

A Hacker News comment:

"I run Windows on my MBA for when I'm travelling. Apple make fantastic hardware; but their saccharine UIs make me retch.

One constant, though: over the past 10 years, I've moved my life further and further into the Cygwin command line, so that I'm insulated from the frippery going on at the edges; with my setup, I'm approximately as at home on Linux, Solaris, Mac and Windows. I'm not optimistic based on what I've seen of Windows 8's direction."

Hey, look. I do a lot of Windows development work on my Mac in a Windows VM too. And some of the recent OSX UI stuff is regrettable.

But running a VM in OSX (a real BSD Unix) so that you can run Cygwin (a sorta-Unix sorta-emulation layer) in a virtual machine?

Doing it wrong.

Wednesday, May 23, 2012

VMWare Fusion and Excessive Idle CPU Usage

I've stuck with VMWare Fusion (currently at 4.02) over the years because I find it much more stable than Parallels. However, I've noticed that my virtual machines always have rather high CPU usage under Fusion. My Windows Server 2008 R2 VM has always consumed about 40% CPU usage on the host side, even when at close to 0% CPU usage on the guest side.

This is a problem. It makes my Macbook Pro run hot and puts a serious dent in my battery life.

Things I ruled out by trial and error:

  • It's not a RAM issue - I have 16GB of RAM, 4GB of which is dedicated to the Windows VM, and I'm not seeing paging on the guest or host.
  • Removing the guest's virtual USB device is one suggested fix I've seen people offer. This seemed to improve CPU usage by several percent, but nothing significant.
  • Manually changing the virtualization engine didn't help.
  • Enabling or disabling hard disk buffering didn't help.
  • I disabled as many services as possible in the Windows 2008 R2 guest, and confirmed via Sysinternals' Process Explorer that nothing was chewing CPU or doing significant I/O in the background.
  • Enabling or disabling 3D acceleration in Fusion's virtual machine settings didn't help.
  • Enabling or disabling Aero on the guest didn't help.

In the end, you know what worked? I changed VMWare Fusion's settings for the virtual machine and reduced the number of virtual CPUs from 4 to 2. This took idle host CPU usage from ~40% down to ~20%, a figure I consider much more reasonable.
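For reference, the vCPU count in Fusion's settings UI corresponds to a single line in the virtual machine's .vmx configuration file. To the best of my recollection the key is numvcpus (verify against your own .vmx, and only edit it with the VM powered off):

```
numvcpus = "2"
```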

I'm not sure exactly why this worked. FWIW, this is on an early 2011 MacBook Pro with a 4-core Sandy Bridge i7 CPU. There are 4 physical cores and OSX "sees" 8 virtual cores. Therefore, my virtual machine is probably now running on a single physical core. I'm sure that has something to do with it.

Tuesday, May 22, 2012

New Software News: No More Aero Glass, GitHub for Windows, Coda 2

Bye, Aero Glass. Microsoft announced that it's phasing out the Aero Glass UI in Windows 8. The new interface is flatter and sharper. I like the direction they're taking.

“This style of simulating faux-realistic materials (such as glass or aluminum) on the screen looks dated and cheesy now, but at the time, it was very much en vogue,” [Jensen Harris, the Director of Program Management for the Windows User Experience] writes in the blog post titled ‘Creating the Windows 8 User Experience.’

GitHub For Windows. GitHub for Windows is now available. I've only played around with their OSX client very briefly, but it's friendly and works.

Coda 2. Over on the OSX side of things, Coda 2 is finally about to ship. I'm a big fan of Coda 1 for certain things - it's great at editing remote files via FTP/SFTP. (But isn't that kind of an outdated mode of development?)

Catching my eye in Coda 2: Git support, easier color scheming, and a CSS editor that appears to have great support for creating gradients and other CSS effects. There's a built-in MySQL management GUI, which is cool, but also a little bit "five years ago" - it seems like people are either moving "down" to SQLite or NoSQL databases, or "up" to a more fully-featured RDBMS like Postgres.

According to Cabel from Panic, "Coda 2 will be $75 ('upgrading pricing for everyone') for a while, Diet Coda will be $19. After the sale of course." The sale he's referring to is the 24-hour sale on May 24th when both apps are 50% off.

Tuesday, May 8, 2012

HP ZR2740w Monitor Update

In an earlier article I mentioned being pretty excited about ordering this monitor.

Hated it. Completely unacceptable monitor. I sent it back for a refund. However, it might work for you. Let me explain.

The anti-glare coating on the ZR2740w is unbelievably bad when you're looking at light-colored backgrounds. The anti-glare coating is so thick and coarse that the screen actually looks filthy. If you know what text looks like on a dirty monitor, then you know what the ZR2740w looks like.

Will It Work For You? It might, if you're not using any software that uses light backgrounds. If this is strictly a gaming machine, or if you're a coder who spends all day in customizable terminal windows or IDEs with dark color schemes, maybe it's worth a shot.

But then again, while ~$600+ is cheap for a monitor of this size and resolution, that's still a ton of cash to pay for something that's going to make anything on a light background look like shit.

"Mastered for iTunes" Revisited

Since my previous article on Apple's Mastered for iTunes program, more information has come to light. We now know what "Mastered for iTunes" actually means! NPR sums it up best; check the addendum at the end of this article.

"...I spoke again with Bob Ludwig, the mastering engineer quoted in the story, who has submitted "Mastered for iTunes" tracks to Apple. He says the company is simply providing mastering engineers with tools that allow them to see how songs mastered at 24 bits will clip (that is, distort audibly) when they go through the standardized AAC encoding process. The uncompressed files are then submitted to iTunes, which creates lossless versions before encoding the songs as 256 kbps AAC files for sale in the iTunes store.

...Why is this significant? Because the fact that Apple retains the lossless versions of the high-quality studio masters means that iTunes, at any time it decides to, can begin selling higher-quality encodes, or even lossless files."

Ars Technica chimed in with their opinion. With the aid of some professional audio engineers, they concluded that "Mastered for iTunes" can make a positive difference, though it should be noted that not all of the audio engineers agreed with each other.

Is Everybody Missing The Point? Kind of. A lot of the discussion has centered around the fact that it's almost physically impossible for us to hear the difference between 24-bit and 16-bit audio, or 96khz and 44.1khz audio. While true, that misses the real point of high resolution audio.

Whenever audio is transformed, data can be lost. It's just a mathematical reality. By default, audio goes through quite a few steps in the pipeline before making it to your ears. iTunes' volume control, its Sound Check and Sound Enhancer features, and the built-in equalizer all play a role. So do the volume controls built into Windows/OSX, as well as other sound "enhancements" performed by your audio device.

With high-resolution audio, there's simply more room for error - all those little rounding errors likely won't add up to something your ears can detect. However, with 44.1khz/16bit ("CD-quality") audio, there's not much room for error: 44.1/16 is just good enough to cover the range of normal human hearing, and excessive audio processing quickly adds up to something our ears can detect.
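That argument can be sketched numerically. The toy model below is my own construction -- it treats every processing stage as a gain change followed by a round-off, which real DSP is not -- but it shows 16-bit audio accumulating more error than 24-bit through the same chain:

```python
import random

def chained_error(bits, gains, x=0.5):
    """Apply a chain of gain changes, rounding to `bits` of precision
    after each stage, and return the error vs. exact arithmetic."""
    scale = 2 ** (bits - 1)
    exact = quantized = x
    for g in gains:
        exact *= g
        quantized = round(quantized * g * scale) / scale  # per-stage round-off
    return abs(quantized - exact)

random.seed(42)
# 200 hypothetical processing chains of 20 small gain changes each:
trials = [[random.uniform(0.9, 1.1) for _ in range(20)] for _ in range(200)]
avg16 = sum(chained_error(16, g) for g in trials) / len(trials)
avg24 = sum(chained_error(24, g) for g in trials) / len(trials)
print(avg16 > avg24)  # → True: 16-bit accumulates more round-off error
```

At 24 bits, each rounding step is roughly 256 times smaller, so the accumulated error stays far below anything audible even after a long chain of volume controls and "enhancements."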

In many ways, it's exactly like working with lossy JPEGs. JPEGs are fine for viewing and can be nearly indistinguishable from uncompressed master photos, but once you start editing JPEGs extensively all of the artifacts pile up pretty quickly.

Sunday, March 18, 2012

Apple: Keeping It… Modest?

While Apple certainly promotes itself as a premium brand, one thing they do not do is change their hardware designs frequently.

Without close inspection, nobody knows you're using a MacBook Pro from 2008 and not one from 2012.

It's even harder for somebody to tell if you're using an iPhone 4s or an original iPhone 4 from two years ago. HTC alone has introduced what, literally twenty designs in that timespan? Thirty?

With the exception of IBM/Lenovo's iconic Thinkpads -- which I also love -- Apple holds on to their external designs longer than anybody in the industry.

You could make a case that Apple actually has the most modest designs of any PC or smartphone manufacturer today. Were Apple to ever ditch the big glowing Apple logo from their laptop lids (not likely, of course) it wouldn't even be close.

Please note that I spend less than two hours a week watching television, and well over forty hours a week using a Mac and an iPhone. In contrast, I see perhaps thirty seconds of Apple advertising a week.

So I'm talking about actual Apple hardware and not the yuppified marketing image they present in their TV ads. If you watch a lot of television and see a lot of Apple ads, and don't own any Apple products, you'll probably feel differently - but just know that your opinion is based more on marketing than the physical reality of their products.

Friday, March 16, 2012

Ruby: Staying "In The Zone" With Code Completion

Nothing breaks my flow of thought like a bunch of compiler errors -- or worse, subtle runtime errors -- because I misspelled an identifier name somewhere in my code.

As dynamic languages like Ruby have gained in popularity, we've often had to choose between robust, Intellisense-sporting environments like Visual Studio and the dynamic languages we really love.

Out of the box, Sublime Text 2 (OSX, Windows, Linux; $59; free unlimited evaluation of Sublime Text 2 during its prolonged beta period) has some excellent code completion for identifier names and built-in language constructs. There's an important shortcoming, however: the editor only "knows" about identifier names that appear elsewhere within the current file. A variable declared in File1.rb is invisible to the editor in File2.rb.
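The limitation is easy to model: a completer that scans only the current buffer never sees names defined in sibling files, and a project-wide index is just the union of per-file scans. (This is my toy illustration, not how Sublime Text or SublimeCodeIntel is actually implemented.)

```python
import re

def identifiers(source):
    """Crude scan for candidate completion targets in Ruby-ish source."""
    return set(re.findall(r"[A-Za-z_][A-Za-z0-9_]*[?!]?", source))

file1 = "def total_price(cart)\n  cart.sum\nend"   # File1.rb
file2 = "puts total_pr"                            # File2.rb, mid-typing

per_file = identifiers(file2)             # what a current-file-only completer sees
project = per_file | identifiers(file1)   # a project-wide index

print("total_price" in per_file)   # → False: File1.rb is invisible
print("total_price" in project)    # → True: the cross-file union fixes it
```

Real completion engines do much smarter (scope- and type-aware) analysis, but the cross-file indexing step is the part Sublime's built-in completion lacks.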

SublimeCodeIntel is a promising attempt to fix that shortcoming in Sublime Text 2. Based on my simple, initial tests, it works. There still appears to be work to be done, as the code completion dropdown randomly fails to appear at times.

Another alternative for IDE-like Ruby development is JetBrains' RubyMine (OSX, Windows, Linux; $69), now on version 4.02. RubyMine aims for the full IDE experience, as opposed to the smart-and-extensible-text-editor approach of Sublime Text 2.

And then there's the venerable TextMate (OSX only) which may or may not see itself replaced by TextMate 2 if the author ever gets around to it. While I love TextMate, I've never found the code completion to be particularly useful.

While I like Sublime Text 2 the best for Ruby code completion, the saga of TextMate's pseudo-abandonment makes me awfully wary. Like TextMate, Sublime Text 2 is largely (if not entirely) the work of a single developer. What happens if he tires of the project, or other life circumstances prevent him from devoting himself to it?

This is by no means an exhaustive list. Please let me know if you've got a favorite of your own.


Finally: Affordable, High-Resolution Monitors?

Update, 5/1/2012: I posted an update on the ZR2740w. Long story short: This is a completely unacceptable monitor; don't buy it.

How much screen real estate do you need?

Every programmer has a different style. Command-line gurus are making the most of their screen real estate by using tools like tmux to tile several terminals together.

Others, by necessity or choice, have multiple space-gobbling GUI applications open at once. This is my reality, and shuffling through six or seven overlapping windows has a huge potential for interfering with my fragile mental focus. Dealing with multiple too-large windows on a too-small screen is like trying to do one's taxes on a tiny airline seat tray… maddening!

Fortunately, monitors with resolutions greater than 1080p have finally started to come down from the $999 price point. A number of eBay resellers have started to offer bare-bones, 2560x1440 27" S-IPS Korean displays for shockingly low prices right around $400. Quite a few forum members have ordered these displays and have had generally good results.

Unfortunately, those displays are so bare-bones that they lack multiple inputs. None of the $400 displays offer DisplayPort compatibility, which means that Mac owners would need a pricey DisplayPort-to-dual-link-DVI adapter. Anecdotally, a minority of users have reported flaky results with MonoPrice's $69 adapter and even worse results with Apple's $99 adapter. Since the Korean eBay displays are essentially unreturnable, I wasn't willing to take the risk.

Luckily, salvation may be in sight. AnandTech reviewed the HP ZR2740w - a 27" S-IPS display that runs at 2560x1440. Long story short, this is their conclusion:

If all you really want is a good display for your PC and you don't need to hook up multiple devices, the ZR2740w is an excellent choice. For such users we recommend it with very few reservations and present HP with our Bronze Editors' Choice award.

Did I order one? You bet.

Tuesday, February 21, 2012

What Does "Mastered For iTunes" Mean?

It's not empty hype -- but there's no guarantee you'll be able to hear the difference.

Modern albums are typically recorded at sampling rates and bit depths far higher than those supported by CDs or typical home listening equipment. Whereas a CD stores 44.1khz/16bit digital audio, music has typically been recorded at 96khz/24bit for quite a while now.

In the past, record labels typically supplied the 44.1khz/16bit masters to online music retailers like iTunes, who then converted them into AAC (iTunes) or MP3 (everybody else) format for sale. Therefore music was lossily converted twice:

Non-"Mastered For iTunes" (3 steps)
(#1) 96khz/24bit studio master recording → (#2) 44.1khz/16bit CD master → (#3) 44.1khz/16bit MP3/AAC version for online sale

Clearly, the middle step is a bit of a waste of time. "Mastered For iTunes" recordings skip the middleman, so to speak.

"Mastered For iTunes" (2 steps)
(#1) 96khz/24bit studio master recording → (#2) 44.1khz/16bit MP3/AAC for online sale

Mathematically, this holds water. It's a fact: all other things being equal, less information is discarded this way. Whether or not you'll be able to hear the difference is another story.
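Here's a toy numerical sketch of that claim. I'm modeling the lossy encode as a plain coarse quantizer -- a 12-bit round-off standing in, very loosely, for AAC -- which is an oversimplification, but it shows why the extra conversion step can only discard information:

```python
import random

def quantize(x, bits):
    """Round x (in [-1, 1)) to the nearest representable value at `bits`."""
    scale = 2 ** (bits - 1)
    return round(x * scale) / scale

random.seed(0)
samples = [random.uniform(-1, 1) for _ in range(10_000)]  # "24-bit master"

# "Mastered for iTunes" path: master straight into the (12-bit) encode.
direct = sum(abs(quantize(x, 12) - x) for x in samples)
# Old path: master → 16-bit CD intermediate → (12-bit) encode.
chained = sum(abs(quantize(quantize(x, 16), 12) - x) for x in samples)

print(chained >= direct)  # → True: the intermediate step never helps
```

Per sample the two paths usually land on the same final value; but whenever they differ, the chained path is the one that drifted, so its total error can only be equal or worse.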

You're particularly unlikely to hear a difference if the studio master was recorded, mixed, or mastered poorly as a part of the ongoing "Loudness Wars." From a fidelity standpoint, these recordings are essentially pre-ruined before they ever reach step #2 or #3 in either scenario. =)


Is My Saw Sharp Enough?

"Saw-sharpening" is shorthand for this cautionary tale of a frustrated lumberjack.

A man stumbles upon a lumberjack in the mountains. He stops to observe the lumberjack, watching him feverishly sawing at a very large tree. He notices that the lumberjack is working up a sweat, sawing and sawing, yet going nowhere. The bystander also notices that the saw the lumberjack is using is about as sharp as a butter knife. So he says to the lumberjack, "Excuse me, Mr. Lumberjack, but I couldn't help noticing how hard you are working on that tree, but going nowhere." The lumberjack replies with sweat dripping off of his brow, "Yes... I know. This tree seems to be giving me some trouble." The bystander says, "But Mr. Lumberjack, your saw is so dull that it couldn't possibly cut through anything." "I know," says the lumberjack, "but I am too busy sawing to take time to sharpen my saw."

I typically fall prey to just the opposite: too much time obsessing over tools and -- I fear -- not enough time actually chopping down trees. That's not entirely misguided; in a fast-changing industry like software development, it's easy to "relax" for four or five years and find yourself terminally behind the curve.

Just ask those 50 year-old COBOL programmers who were laid off from a bank somewhere and can't find work any more because of "ageism."

Right now, I think I'm in a pretty good place. I'm mostly current on my tools, and I'm mostly focused on work instead of scanning Github and the Visual Studio Extension Gallery for new gems and extensions every day.


Thursday, February 16, 2012

But Don't Take My Word For It

Stevenf from Panic has a great write-up on Gatekeeper and code signing in OSX Mountain Lion. Panic is a leading third-party, independent software developer for OSX. They make Coda, Transmit, and a few other well-regarded apps. The majority of their apps are not available through the App Store, so they're quite sensitive to the needs of those who wish to independently distribute software.

Overall, he's extremely positive about the changes in OSX Mountain Lion. He does mention one important caveat about feature parity between App Store and non-App Store software. (I agree with his concerns there.)

Relax! Apple Doesn't Want To Lock OSX Down Like iOS

Before reading this article, you'll want to be familiar with how Gatekeeper operates on the OSX Mountain Lion beta. Macworld has a concise overview with screenshots.

Let's start by examining the reasons why iOS doesn't allow you to run unapproved third-party software.

  1. Thirty percent. Apple certainly benefits from taking a 30% cut of software sales made through the App Store. (It should be noted, of course, that you can publish free software via the App Store as well)
  2. Carrier network limitations. Bandwidth-intensive apps could overwhelm wireless networks already struggling to keep up with demand. This is why applications such as Skype, or even Apple's own FaceTime, don't let you videoconference over cellular networks. Apple has extraordinary leverage with carriers, but it simply can't sell a device that would overwhelm the networks it relies on.
  3. Because they can. iOS is a new platform, with no history of allowing you to run unapproved third-party software.

Of the above reasons, only #1 is relevant to OSX.

There's no doubt that a 30% cut of all OSX software revenue would appeal to Apple. Gatekeeper, however, really has nothing to do with funneling OSX software through the Mac App Store. Once you get your free developer certificate from Apple, you can distribute your signed software any way you like, with no obligation to funnel it through the App Store.

Apple has also made it quite easy to run unsigned applications. You can disable the check for signed applications entirely via a one-time setting in Security & Privacy under System Preferences. From the Macworld article:

"If you want Mountain Lion to run every app under the sun, you can just change the setting to Anywhere."

Or, you can override Gatekeeper on a per-application basis. From the Macworld article:

"Finally, it’s important to note that because Gatekeeper uses the File Quarantine system, it only works the very first time you try to launch an app, and even then only when it’s been downloaded from an app on your Mac like a web browser or email program. And once an app has been launched once, it’s beyond the reach of Gatekeeper.

Combine this with the ease of overriding Gatekeeper by using the Open command and it’s clear that Gatekeeper in Mountain Lion isn’t intended to be some sort of high-security app lockdown. It’s just a tool to encourage people not to run software they don’t trust. If they really, truly want to run an app, Mountain Lion won’t stop them."
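Putting the quoted rules together, Gatekeeper's decision boils down to a handful of boolean checks. Here's a toy model of that logic (my simplification based on the Macworld description above, not Apple's actual implementation):

```python
def gatekeeper_blocks(app, policy="identified-developers"):
    """Toy model of when Gatekeeper intervenes: only quarantined,
    never-launched apps are checked, and the user can always opt out."""
    if policy == "anywhere":            # user disabled the check entirely
        return False
    if not app.get("quarantined"):      # not downloaded via a browser, etc.
        return False
    if app.get("launched_before"):      # past the first launch
        return False
    if app.get("user_override"):        # right-click → Open
        return False
    return not app.get("signed")        # only unsigned downloads are stopped

# A signed download passes; an unsigned one is blocked only the first time.
print(gatekeeper_blocks({"quarantined": True, "signed": True}))   # → False
print(gatekeeper_blocks({"quarantined": True, "signed": False}))  # → True
```

Every branch except the last one is an escape hatch, which is the whole point: this is a nudge, not a lockdown.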

The only remaining worry is: Are we on a slippery slope? Is this merely Apple's first step towards a total iOS-style lockdown? We can't rule that out. The downsides to such a lockdown would be tremendous: developers, power users and early adopters would flee OSX in droves… and, even if we assume Apple doesn't care about good will, remember that those are precisely the people Apple needs to write iOS software using its OSX-only iOS developer tools.

I do agree wholeheartedly with Lloyd from Mac Performance Guide on several points. Lloyd lays out several possible worst-case scenarios:

  • "Apple disables, removes or forbids an application because of Congressional pressure. No Rule of Law, no due process, just arbitrary removal or shut-down of an application. This already happened once last year. It doesn’t matter whether you agree or disagree with what any particular app does; that’s not the point. Policies in this regard can change at any time."
  • "A repressive country (Iran, China, Syria, etc) decides that it doesn’t like the certain apps being used by Undesirable Elements. So it demands that Apple disable those apps. Apple doesn’t want to risk its commercial market there, so it disables those apps."
  • "Hackers penetrate Apple’s systems. All applications of any kind worldwide could be suddenly disabled and/or removed. If a key cryptographic provider (RSA) can be compromised, it can happen anywhere. And that’s just what has been made public; assuming all such compromises are disclosed would be extremely naive. An exploit of this magnitude would be like an Olympic gold medal for hackers— highly attractive."

The likelihood of those scenarios is nearly irrelevant to me; I'm not comfortable with a private company (or a government) having that level of arbitrary control over my computer. However, as noted above in Macworld's Gatekeeper feature, the impact of those worst-case scenarios seems extremely small: you could simply disable Gatekeeper!

That's why I'm sticking with OSX, and I'll most likely give Mountain Lion a shot when it's released. Though, maybe not until it reaches 10.8.1…

Monday, January 23, 2012

New Study Reiterates Importance of Diet for ADHD Patients

This study by Drs. Millichap and Yee provides further evidence for the importance of proper diet for ADHD sufferers. (Unfortunately, the full text is not online.)

From the abstract:

The recent increase of interest in this form of therapy for ADHD, and especially in the use of omega supplements, significance of iron deficiency, and the avoidance of the "Western pattern" diet, make the discussion timely.

Diets to reduce symptoms associated with ADHD include sugar-restricted, additive/preservative-free, oligoantigenic/elimination, and fatty acid supplements. Omega-3 supplement is the latest dietary treatment with positive reports of efficacy, and interest in the additive-free diet of the 1970s is occasionally revived. A provocative report draws attention to the ADHD-associated "Western-style" diet, high in fat and refined sugars, and the ADHD-free "healthy" diet, containing fiber, folate, and omega-3 fatty acids.

As I've learned about ADHD over the years, one thing continues to stand out.

The factors that help us focus are the same as the factors that help others focus.

This similarity can confuse ADHD skeptics. After all, they ask: wouldn't a crappy diet affect anybody's ability to focus?

The difference is: for us, the stakes are often higher. For us it can be the difference between a productive day and a day when we're unable to focus on anything at all.