Thursday, December 22, 2011

Techies. Gadgets. Oh No.

Christmas is one of the worst times for technology. Everything happens so fast, people are short on time, and purchases are driven by sales and social norms rather than careful consideration - like buying a $99 "tablet" for somebody because mom heard they're cool. Technology isn't a goal. It's a way to enjoy the journey a little more and get to where you really want to be: being a fulfilled person. Purchases, technology or otherwise, should be careful and personal. You are an amazing person and your journey is important. I'm guilty of straying from this path many times. We bought a GPS navigation system for my father and my wife this Christmas. My goal was for them to enjoy the peace of mind and simplification of life that owning a GPS has given me. I'm not sure if I accomplished that. At least initially, I think I may have filled their lives with one more piece of intimidating technology.

Wednesday, December 14, 2011

Apple Proves The Amiga Was Right

No, I don’t actually believe that. That was just my attempt at crafting a headline even more blatantly eyeball-grabbing than Is Apple Making More Advanced Chips Than Intel? 

Interesting article, though you know it isn’t exactly going to be the most interesting journalism in the world when it comes from a site called “Cult Of Mac.”

What I do enjoy is watching the eternal struggle between general-purpose chips and specialized chips that do a few things really well at the expense of everything else.  There’s no right or wrong there, of course – it’s just the tradeoff that has to be made when any hardware is designed. 

Modern smartphones are a pretty awesome demonstration of the potential of specialized chips, with lots of specialized little co-processors performing specific jobs real fast with very little power required. As a kid, I remember wondering how a 3 MHz Super Nintendo could do things that a 20 MHz 386 PC couldn’t do – specialized chips, that’s how.

Those old lumbering dinosaurs, though – the powerful general-purpose CPUs – that’s where computing moves forward.  That’s where code gets wrung out of brain cells.  The “app” you’re running on your handheld marvel was probably written by somebody sitting at one of these ancient beasts.

Tuesday, December 6, 2011

Windows People: Scott Hanselman's 2011 Tools List!

As far as I'm concerned, there are exactly two capital-letter Events in the world of development-related blogging.

The first, and most well-known, is the epic review that John Siracusa writes for Ars Technica when each new version of OSX is released. Here's his OSX Lion review. These reviews are so eagerly anticipated that people train for them - via epic training montages.

The second - and my personal favorite, actually, despite my general move away from Windows - is the list of Windows power tools that Scott Hanselman posts every couple of years. His 2011 list is now up.

If you do any sort of work on Windows, check it out. Hanselman is one of the most insanely productive and talented Windows developers on the planet. This list is full of battle-tested tools. No fluff. You are guaranteed to find at least one genuinely useful thing you didn't know about before.

At the very least, scan the list and file a few things away in the back of your mind. Even if you don't need anything today, you'll know to check it out later when the need arises.


Tuesday, November 29, 2011

Converting Ruby (Unix) Time to Windows File Time

Had to figure this out for a project. One of those things that took a little while, but was simple once I dug a little.

A Windows file time is "a 64-bit value that represents the number of 100-nanosecond intervals that have elapsed since 12:00 midnight, January 1, 1601 A.D. (C.E.) Coordinated Universal Time (UTC)." Ref.

In contrast, Ruby stores times the Unix way: "Time is stored internally as the number of seconds and microseconds since the epoch, January 1, 1970 00:00 UTC." Ref.

require 'date' # so Date/DateTime inputs respond to #to_time

# Difference between the Windows epoch (1601-01-01) and the Unix epoch
# (1970-01-01), in 100-nanosecond intervals: 11_644_473_600 seconds * 10_000_000
EPOCH_DIFF_100NS = 116_444_736_000_000_000

# Accepts a Time, Date, or DateTime; sub-second precision is truncated.
def rubytime_to_windows_filetime(t)
  (t.to_time.to_i * 10_000_000) + EPOCH_DIFF_100NS
end
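
For completeness, the reverse conversion is just the same arithmetic run backwards. A quick sketch - the helper name windows_filetime_to_rubytime is my own invention, not anything standard:

# Convert a Windows file time back to a Ruby Time (UTC).
def windows_filetime_to_rubytime(filetime)
  Time.at((filetime - EPOCH_DIFF_100NS) / 10_000_000).utc
end

# Sanity check against the Unix epoch itself:
rubytime_to_windows_filetime(Time.utc(1970, 1, 1))     # => 116444736000000000
windows_filetime_to_rubytime(116_444_736_000_000_000)  # => 1970-01-01 00:00:00 UTC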

Saturday, October 29, 2011

"Don't Call Yourself a Programmer"

It's been Slashdotted, but here's an article I agree with: Don't Call Yourself A Programmer, And Other Career Advice.

I've seen this point raised before, and it's a good one. My job is developing software, and programming - actually sitting down at the computer and typing out lines of code - is probably 25% of that job.

Lots of other excellent advice in that article too, such as:

You are not defined by your chosen software stack: I recently asked via Twitter what young engineers wanted to know about careers. Many asked how to know what programming language or stack to study. It doesn’t matter. There you go.

Do Java programmers make more money than .NET programmers? Anyone describing themselves as either a Java programmer or .NET programmer has already lost, because a) they’re a programmer (you’re not, see above) and b) they’re making themselves non-hireable for most programming jobs. In the real world, picking up a new language takes a few weeks of effort and after 6 to 12 months nobody will ever notice you haven’t been doing that one for your entire career.

This is one of those articles I've saved and will hand out to anybody that ever asks me about this career that I love so much. (So that's… what, three or four people over the next fifty years? Haha…)

Tuesday, October 25, 2011

Adobe, Your Installers Are Awful

Just spent several hours troubleshooting Adobe's Creative Suite 5.5 installer. It kept erroring out for me with the following unhelpful message:

"please insert disk AdobeDesignPremium5.5-English to continue"

Huh? Well, it turns out that their installer gets really confused if you run it from anywhere but the C: drive. To be clear, I was doing a standard install to the C: drive - the installer just can't be run from, say, a USB flash drive.

What I love is that I found the answer on The Pirate Bay. Mind you, I'm installing a legit licensed copy of Creative Suite. But Adobe's own resources had no information for me. Apparently nobody at Adobe owns a USB drive.

Maybe Adobe is confused -- somebody should tell them that just because Flash is dying, that doesn't mean that USB flash drives are dying.

The New PC Experience

A coworker bought a new Acer laptop at Best Buy. I'm setting it up and installing Office and Creative Suite for her. Now, is this a perfectly usable piece of hardware? Yes, mostly, if you don't count the trackpad.

But boy, do they cut just about every possible corner on a $500 PC laptop. Uneven screen lighting, unusable trackpad, bloated with crapware, crappy keyboard, ugh. I'm not that much of a snob, though. This PC will, basically, get you from Point A to Point B.

Why Do Automated Updates Require Me To Click "Okay" For Three Hours? Downloading all of the required Windows updates took two or three hours. The current state of Windows system updates is downright magical compared to the way they were, say, ten years ago. Remember the days of Windows patches and hotfixes that had to be applied in exactly the right order, lest you render your OS more or less ruined?

The current system is commendably foolproof. But why do I have to hit "okay" to authorize a Windows Update reboot so many times? The procedure seems to be:

  1. Launch Windows Update
  2. Pull down and install all available updates
  3. Wait between 5 and 30 minutes
  4. Click "Yes" to allow a reboot (or wait for the timer)
  5. System reboots
  6. Oh, hey, a bunch more updates are now available, presumably because the last batch fulfilled a bunch of prerequisites/dependencies. Windows Update will find these in the background, eventually, or you can manually launch Windows Update in case you actually want your system updated, you know, now. Return to Step 1. Repeat four or five times.

What's aggravating is that this could be totally automated. There is no reason to even involve me, and no technical reason why there couldn't be a "Download and install every possible update and reboot as many goddamn times as you need to, and let me know when you're finished" button.

OSX is basically guilty of the same omission.

Sunday, October 23, 2011

Celebrating Steve

Apple has posted the video of the employees-only "Celebrating Steve" event held last week at Apple's main campus.

Very moving.

Tim Cook's speaking voice reminds me of Fred Rogers, and I mean that as a compliment.

Note: It appears you need Safari to view the video. When I visited the page in Firefox or Chrome, I got a misleading "available soon" message.

ARM's New Cortex A15 CPU: Destined For MacBook Air?

Jon Stokes, formerly of Ars Technica, has a new article up: Meet ARM’s Cortex A15: The Future of the iPad, and Possibly the Macbook Air. It's a typical Jon Stokes article about CPU architecture - and that's a good thing. Lots of talk about pipeline depths and so forth. His conclusion, based on "paper" evaluations of the A15 and Intel's upcoming products, is that the A15 is going to be great but it won't be compelling enough for Apple to consider switching some of their notebooks to that architecture.

Why RISC? The question of a switch to an ARM-based RISC architecture for Macs is tantalizing. The short version of a long story is that, all other things being equal, RISC architectures are always going to offer more performance-per-watt than CISC architectures like Intel and AMD's x86 chips. That's why you don't see x86 chips in cellphones and you very rarely see them in consumer electronics.

Why Didn't RISC Win The First Time? In the 80s and 90s, everybody was going RISC - except Intel, who needed to keep their desktop x86 CPUs backwards-compatible. Intel was able to defeat RISC desktop chips like PowerPC with their economic might. Even though their x86 chips needed more transistors and more power to match RISC competitors, Intel's massive profits, economies of scale, and industry-best chip manufacturing processes allowed them to maintain a performance lead and sell those x86 chips more cheaply than their RISC competition.

Why Might ARM Win This Time? The increased power consumption of Intel CPUs is no big deal on the desktop, where a few extra watts don't matter much. But with everything going mobile, it matters again. And unlike Microsoft, Apple already has an operating system that's poised to run on multiple architectures.

So it's a little disappointing to read that Stokes doesn't feel ARM's latest CPUs are enough to start nibbling away at Intel's presence in mainstream notebooks. The thought of dropping the "CISC tax" is awfully appealing.

Friday, October 21, 2011

A "Techie" Resolution

A techie -- or worse, a gadget hound -- is somebody who's in love with buying, using, and owning electronic things as an end and not as a means.

Let's not ever let ourselves be "techies."

Let's be people who solve problems and create new works with the best technology possible… and the least technology possible. Technology doesn't have to be electronic: a hatchet is the best technology possible for chopping small amounts of wood.

Bruce Sterling has good advice on what to buy and own.

"Sell – even give away– anything you never use. Fancy ball gowns, tuxedos, beautiful shoes wrapped in bubblepak that you never wear, useless Christmas gifts from well-meaning relatives, junk that you inherited. Sell that stuff. Take the money, get a real bed. Get radically improved everyday things.

The same goes for a working chair. Notice it. Take action. Bad chairs can seriously injure you from repetitive stresses. Get a decent ergonomic chair. Someone may accuse you of 'indulging yourself' because you possess a chair that functions properly. This guy is a reactionary. He is useless to futurity. Listen carefully to whatever else he says, and do the opposite. You will benefit greatly.

Expensive clothing is generally designed to make you look like an aristocrat who can afford couture. Unless you are a celebrity on professional display, forget this consumer theatricality. You should buy relatively-expensive clothing that is ergonomic, high-performance and sturdy.

Anything placed next to your skin for long periods is of high priority. Shoes are notorious sources of pain and stress and subjected to great mechanical wear. You really need to work on selecting these – yes, on 'shopping for shoes.' You should spend more time on shoes than you do on cars, unless you're in a car during pretty much every waking moment. In which case, God help you."

Saturday, October 15, 2011

Checklisting Is A Big Problem

How often have you heard (or read) a conversation like this?

"The voice recognition in the iPhone 4S seems pretty amazing."

"What's the big deal? [Some other operating system] has voice recognition too."

This is bad. The difference between a good voice recognition system and a bad one is huge. The same goes for lots of other features: GUIs, touch interfaces, and so forth. They're not things you can check off on a checklist. It's like comparing two restaurants by treating bread as a checklist-able item.

"Wow. The fresh-baked bread at Restaurant XYZ is amazing. Light, flaky, still steaming hot when they serve it."

"What's the big deal? They have six-packs of hotdog rolls at 7-11. Why pay more?"

What's depressing is that a lot of smart, knowledgeable people treat technology that way. I feel like it's almost the dominant way of thinking, even among the technology-savvy. How do we change this?

Friday, October 14, 2011

The Briefest iOS 5 Review

It's great.

You could argue that the new features are mostly things that ought to have been there ages ago, and you'd probably be right, but none of that takes away from the fact that this is an excellent release.

A Thought To Live By

This is a technology blog, not a personal blog, but -- remember the commandments of this space? -- we consider humanity more important than technology.

So here's something for the weekend: a comic from that old zen master, Charles Schulz. What's life, if we're not appreciating the small and beautiful things around us?

[Peanuts comic: Snoopy and the falling leaves]

Wednesday, October 12, 2011

Home Computing Without Steve Jobs

I'm not sure it happens, and if it does, the results aren't pretty.

In the 1970s, the idea of home computing seemed ridiculous. Even the hobbyists like Woz who wanted computers in their homes weren't thinking much beyond their hobby and other hobbyists; Jobs was the one who saw that these could be people-centric devices and that everybody should have one in their living room.

Even if all Jobs did was market the darn things better than anybody else, that's still indispensable. Without everyday consumers buying computers and pumping billions of dollars into the industry, very little of the last several decades of innovation would have happened.

But he went way beyond that, being intimately involved in every project he ever participated in, bringing a healthy dose of the liberal arts with him - music! typography! color!

Without Jobs… eventually, yes, computers would have probably gotten small and affordable enough to be in people's homes. But would anybody have wanted them? Without Jobs, that prospect would have been about as appealing as having your own forklift, cash register, or timecard punch at home. Even if you could, why would you? Look at how terrible the IBM PCjr was -- and they made that thing after they had Apple's people-oriented computers to ape.

Commandments

A few guiding principles for this space.

1. We are human beings, and computers are here for us. Software and hardware should make our lives better. Everything will be evaluated within that context.

2. We practice pragmatism around here. All other things being equal, free (libre) software is better than a closed alternative - but only for pragmatic reasons: open-source software is more likely to receive iterative improvements, feedback, and bug-testing. Closed-source software that makes our lives better (see #1) is to be admired.

3. Operating systems and programming environments are like martial arts. They have strengths and weaknesses relative to one another but whichever one(s) you pick, you're going to be a badass if you pour enough effort into it.

4. Design is important. Design is how a thing works, not how it looks. A command-line environment can be well-designed and elegant.

5. Performance is fun but it is a means, not an end. More megahertz and megabits per second are good, but only if they help you do cool stuff better.

Tuesday, October 11, 2011

The Introduction Everybody Will Scroll Past

I'm John "Booty" Rose, software developer.

I'm located in the Philadelphia area. I work in a variety of technologies: SQL, .NET, and, increasingly, Ruby.

People know me best as the creator of OtakuBooty.com, a profitable -- and ridiculously fun -- social networking site for nerds.

These are my views on technology.