Category Archives: Technology

HP C7000 Bladecenter: awesome new test lab in one unit!

Last year I picked up a surplus HP C7000 Bladecenter (generation 1, complete with BL460c and BL480c G1 blades, fibre channel, Cisco switches, etc) to run OpenStack test loads. At about 150 kg, I paid just under £3 per kg, which is pretty damn good for a self-contained test lab (near scrap metal prices, actually).

It took me a while to get somewhere, because my super-cheap home-friendly rack wasn’t deep enough. Eventually I acquired a second-hand Kell Systems 24U silent rack (now sold as the APC NetShelter CX) and plopped it in the corner of the living room (after taking the front door to the flat off the hinges). Now I’m building up a new routing and switching core for the house, and enabling the C7000 at the same time. The rack also came with a rackmount APC SmartUPS with external battery pack, which is a nice addition.

I know the C7000 isn’t new, but let me tell you, from someone used to bootstrapping with generic white boxes, it is a pretty cool piece of hardware. I just spent an hour going through basic technical documents to get ready to put everything into service, and I have to say I’m impressed. For the price of two rackmount servers I’m putting 10 machines online, complete with remote management, Cisco switching, high-speed interconnects, and full redundancy. HP clearly know what they’re doing.

Total cost so far — Bladecenter, Blades, Switching, UPS/PDU, silent rack, spares — including shipping, about £1,200. I’ve paid more for a laptop. Resale value (if I wanted to) probably three times that. But seeing as the newer HP BL servers are the same form-factor, and second-hand prices for something slightly older are extraordinarily reasonable, incremental upgrades aren’t expensive. So my test lab is set for the next several years.

HP engineers, my hat’s off to you. The architecture of the C-Class (or whatever your marketing wonks call it) is amazing, especially at second-hand prices. For anyone looking to test or start up with some in-house hardware (yes, I know, ‘the Cloud’, etc), take a look at a second-hand C7000.

When I get it all running, I’ll post some pictures. But, since the rack looks like decent office furniture, nothing is exciting until you open it up.

Now, anyone have a stack of HP driver DVDs they need to get rid of?

Pi-Top First Impressions

So, my Pi-Top arrived last week. Today I took everything out and put it together. Here are some initial observations:

Slick packaging, well thought-out, and very professional. It looks like a real product the minute you open the box.

This is a cool educational device that will help kids tinker, but it is definitely a work in progress. It is not, however, the Arm-based laptop I’ve wanted for years. When I get some time I’ll fire up my Samsung Arm Chromebook and see what modern Ubuntu looks like on it.

Each ‘education’ drive seems to want to roll its own UI. Sugar was pretty radical, but the Pi-Top OS just seems like a launcher on top of Raspbian, and is buggy out of the box. I haven’t used a Kano, so I can’t comment there. I do wonder if there’s any real benefit in not sticking to a standard interface; they’re not creating value here, and including LibreOffice would also probably be a good idea.

The keyboard and trackpad really suck. I mean, really suck. There is no way to touch-type, keypresses often get lost, and the touchpad to the side thing isn’t working for me. I was going to write a review of the Pi-Top on it, but I’m already so frustrated with the keyboard that I don’t think that’s possible. It is the main method of input, so it would be nice if it worked well. Perhaps it will get better the more it’s used. For the record, the OLPC keyboard also sucked.

I’ll kick the tyres on it for a few days, then write up a more detailed review.

Moving to the public cloud? Yes, you still need operations staff.

A quick note following yesterday’s Google Compute Engine outage, and the Amazon S3 outages caused by DNS changes slightly more than a week ago: it’s important to remember that moving to the cloud still requires operations (sysadmin, devops, whatever we want to call it).

There is a belief that moving to the public cloud allows companies to outsource most, if not all, of their operations staff. But there is a very real danger in abrogating ops responsibility.

If you have outsourced to the cloud, do you have a disaster recovery plan? What happens if the systems that are ‘too big to fail’ do just that?

I’m not saying the cloud is bad; it enables companies to get to the web with next to no capital expenditure. But that doesn’t mean it is the be-all and end-all, and if you’re not taking care of operations in-house, it’s very likely that you will regret it.

Here’s some further interesting reading: TechTarget Cloud outage report 2014.

Do Not Throw Away That Laserwriter, Ramsay Wood!

Yesterday I was having dinner with Ramsay Wood at the Priory Tavern. We hadn’t seen each other for a little while, and were having a nice catch up. Then he said something which offended my every sensibility.

‘I have to throw away my Laserwriter,’ he said. ‘My guy came around to install a new printer, and he couldn’t figure out how to make it work. He said something about the drivers.’

‘That doesn’t make any sense,’ I said. ‘Drive me over to your house, and then back home after I fix it. I’ll have it done in 15 minutes.’

As a bit of history, I used to work with Ramsay, then for Ramsay, and part of that was to help him out with his Mac and small network. I told him to buy the aforementioned Laserwriter more than a decade ago (second-hand — those things were expensive). It is a Laserwriter 16/600 PS, one of the last couple of models made, definitely more than 15 years old, and a total workhorse. I also knew that Ramsay couldn’t have printed more than a couple thousand pages — nothing for this printer — in the last 10 years.

A bit of an aside, here. I expect that even if Ramsay was printing 10,000 pages per year, this printer would likely still be working. But my lovely Lexmark X544 colour laser multifunction printer lasted only two years, and not very many pages, at that. And why, Lexmark, does the scanner stop working if the printer fuser needs to be replaced? But I digress.

What followed was some simple troubleshooting. The printer was on the wrong IP subnet (confirmed via the config page that prints at power-up). I corrected that via telnet (there aren’t any Macs old enough to run the Laserwriter Utility at Ramsay’s), and configured the computers with the new IP using the generic PostScript driver (really, new guy, no driver for a PostScript printer?). I also took a minute with Ramsay’s new Sky router to make sure that there would be no IP conflicts by setting DHCP static assignments (Sky’s brilliant netadmins specify all IPs from .2 to .254 for DHCP, and use a default admin password of ‘sky’). After that, the Laserwriter was back online. I also set the new iOS-friendly printer to a static IP, so it would, you know, continue to work even if someone else asked for the same IP. Total time elapsed: about an hour.
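The wrong-subnet problem above is easy to check for yourself. Here’s a minimal Python sketch using the standard library’s ipaddress module; the addresses are hypothetical examples, not the ones from Ramsay’s actual network:

```python
# Check whether a printer's IP is on the same subnet as the router's network.
# A host on the wrong subnet is unreachable without a route between the two.
from ipaddress import ip_address, ip_network

def on_same_subnet(host_ip: str, network_cidr: str) -> bool:
    """Return True if host_ip falls inside the given network."""
    return ip_address(host_ip) in ip_network(network_cidr, strict=False)

# Hypothetical example: the router hands out 192.168.0.x, but the printer
# is stuck on an old 10.0.1.x address from a previous network.
print(on_same_subnet("10.0.1.25", "192.168.0.0/24"))
print(on_same_subnet("192.168.0.25", "192.168.0.0/24"))
```

The first check prints False (wrong subnet, so the printer never answers), the second True (reachable once it’s given an address inside the router’s range).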

What lessons were hammered home by all this?

  1. Sometimes I miss having simple problems that I can fix myself. Although I’m not sure that I miss doing desktop support for a living.
  2. My current printer is a POS compared to this one, which is definitely more than 15 years old. Increasing revenue with consumables and repairs is definitely part of the business model.
  3. Ramsay’s new guy definitely didn’t go looking too far into the IP side of things. Or he didn’t understand how things work.
  4. Sky definitely do not care about making their router setup sensible or secure. Even O2 was better.
  5. Time to get the Powerbook G3 Pismo from the office and figure out what’s wrong with it. Who has a copy of MacOS 9 kicking around?
  6. I am not as good as I think I am, but it still was fun!


PS — Anyone have A/UX floppies and a clean ROM for an SE/30? It has ethernet….

A bit of PC history (personal computing, that is)

I just finished reading Walter Isaacson’s biography of Steve Jobs. If you’re interested in the personal computer revolution, you should read it. It’s an insightful look at one of the men who definitely helped shape the digital world. I also realised that, as of April 2014, I’ve been running Linux on the desktop for more than 10 years.

Both of these made me think a lot about my computing history. Maybe this isn’t interesting to everyone. Maybe it is. I’ll leave it up to you to decide.

I originally wrote a blow-by-blow description of every computer I owned, and how each affected my development as a computer user, person, and proud geek. Then I thought more about the stages of my computing life, and through decoupling the individual items, I think there are more interesting themes. For those that are curious, the original list of primary workstations I’ve used is below.

The first part of my computing life was, like many of my generation, running games and basic word processing. My first computer was an Apple IIc. Complete with an Imagewriter II, it was an efficient platform for word processing and strategy games that ripped off Star Trek. I guess I spent a lot of time on it, but didn’t do much. I got it when I was 10, and immediately started making AD&D character sheets.

My second computer was a DEC Rainbow 100 with an XT attached via a slave interface. It was a hand-me-down from my mother, and frankly it was a POS. But it had a hard drive, so it was relatively more awesome than my IIc. I think I had DOS 5 on it. Not that it could do much (even less than my IIc). I don’t recall having much in the way of software.

My third computer was a 286, complete with a new monochrome monitor. I think it even had a modem for the BBSs. 5.25″ floppy, still. One of my friends showed me that MODs were cool, and we soldered a headphone jack onto the speaker cable so I could play music. MODs are still cool, BTW. I had DOS and Windows 3.1, and I probably had MS Works or something like it. It was okay for school work, but not inspiring. Still, it marked the transition of my computing life from simple user to tinkerer, someone who pushed beyond the boundaries of what was easy to do with computers.

Two of my friends had 486 boxes back then. I was envious. They also had SVGA and soundcards, which meant they played games, and I didn’t. I still don’t, much.

I started working on the high school newspaper the summer between my sophomore and junior years. They had these funny little microwave oven looking computers that all the students had to use. They were monochrome, with a 3.5″ drive, and could print to a laser printer. And they were really expensive, so we were told not to screw around with them. There were two more capable computers, with colour screens. Those were for the advisers, of course. This was my first experience with Macintosh.

At first, I was reluctant. How could Macintosh be good, if it looked so little like other computers? But, with Aldus Pagemaker (the Ragemaker), a laser printer, an art waxer, and some backlit layout tables, I began to see what we could make with computers. And I began to love Macintosh. So much that I bought my own Macintosh IIsi, sporting 9 MB of RAM and an 80 MB hard drive, hooked up to a 12″ colour monitor (my first). I made newspapers with that computer. We installed a network and made one of the older Macs a server. We made money, had fun, and won awards. And we were just high school students. The Mac empowered us to do more. By the time I went to university, I was fully in love with Macintosh.

And thus began the second part of my computing life. I like to call it advanced tinkering. The Mac made things like networking — which was frankly awful on WinTel machines — easy.

One of the first things I did at university was to buy one of the new Powermacs. An 8100/80 — the top of the range. There was a good deal through the university, and I had some money. And I knew that it would be even better than my IIsi. I collected it from the university bookstore, and disassembled it on the hood of my car in the parking lot to ‘see how it worked.’

And it was better, in every way (except for the case). I had been on the Internet before, but only with shell access. I added a 14.4 Kbps modem to my 8100, and with my university SLIP connection, I was on the Internet from home. Ragemaker was running faster than ever, and I discovered the joy of Filemaker Pro. I was doing more things with my Mac. I traded the 8100 for a Powerbook 520, and enjoyed my first Mac laptop, and also the first Mac I didn’t love. I traded it for the second Mac I didn’t love: a 7200/75, one of the first PCI-based Macs.

I call this period of my computing life: when Macs sucked. It lasted a good long time, even though I remained a Mac user, and advocate, for another 10 years or so. The 520 had a crappy passive-matrix monochrome screen and an underpowered processor. The 7200 was totally unstable — bad, rushed hardware combined with an ageing operating system. Even the 8100 probably wasn’t all that good, but I had never owned a new computer before, so it was exciting.

I was hanging out with an interesting crowd in those days, the slo.punks, a group of computer science students, systems administrators, and, well, authentic individuals. I also worked in computing on campus — I didn’t come from money, and computing paid a lot more than other jobs. And through these social and work roles, I began to run into Unix. I didn’t understand how these computers — which often had fewer resources than my Powermac — could handle multiple users, and operate on the Internet so well. I needed to replace the 7200, which I managed to sell off, but what could I replace it with?

Normal people would have bought a Powermac 7500 or maybe an 8500. They were well regarded. But I knew the Mac back to front, and frankly this was not a good time for Mac hardware or the OS. Clones were coming in and muddying the waters. So I thought about it for a while.

During that while, I cobbled together a 386 and installed Slackware Linux. My hard drive was too small to install X Windows, so I made do with a text console. It got me by for a few weeks, and reminded me how much I hated PC hardware. Remember IRQ conflicts, Himem, and all that other crap we put ourselves through to make computers work? Well, I had seen Microsoft Plug and Play in operation at the university, and I was not impressed. So PC clones, Windows, and by extension Linux, were out of the running. If the Mac was out, and PC clones were out, then I was looking at Unix workstations. I was not rich enough to buy new, and neither Sun nor SGI impressed me. But something black, cubic, and monochrome did. So I set off to Mill Valley to pick up my NeXT (040) Cube.

I call this part of my computing life, Unix, and the rebirth of Steve Jobs.

It should have been obvious that the combination of a GUI with a fanatical eye for detail and Unix would lead me to NeXT. And, while I was stepping down significantly in processing power — the NeXT was discontinued when I was in high school — the integration of GUI and Unix was the most exciting computing experience of my life. I also began to inherently understand the Unix Way — small, elegant solutions could be linked together to solve larger problems. On the Mac, I had dozens of very complicated programmes. On NeXT, most of my computing revolved around the command line.

That Spring, I applied to study for a year in Sweden. Taking a NeXT Cube to Europe in economy class didn’t seem a wise idea. I managed to buy an ex-demo Apple Powerbook Duo 280C for the trip. Leaving my beloved NeXT in California, I began to install applications to make the Mac as NeXT-like as possible. Greg’s Browser gave me Miller Columns. Dragthing gave me something like a dock. All in all, the Duo 280C was a great portable computer for the time that I needed it. But, when I got back to California I sold it, and went back to my NeXT. Over the following two years, I moved off the Cube onto a Turbo Colour Station (faster, with colour). I also had a desktop Mac (7500/100) for DTP and Filemaker. This was just as MacOS X Server was coming out, and the NeXTies in the city (and there were more than a few of us) were excited that we might get a modern version of NeXTStep/OpenStep for Mach.

I finished university and moved to London for the first time. I knew that I would be between places, and that I needed a portable computer. Thinkpads were very expensive, as were Twinheads (those were the brands that were certified for OpenStep). So, while I was still a student, I purchased a Powerbook G3/266 (Wallstreet II). The G3 was a big upgrade in processor power, and with two batteries loaded, it was a real road warrior. As much as I loved the NeXT, I didn’t intend to come back to California any time soon, so I sold my Turbo Colour Station (as well as my desktop Mac). MacOS was at version 9 (System 9 to some of us in those days), and I was making some money with Filemaker, BBEdit, and Webstar. As a NeXT fan, when I looked at MacOS X, all I could see were what I thought were UI design mistakes — like a single dock for launching apps, and one for running apps. There were lots of small annoyances, and frankly all the software I used would run in emulation on MacOS X. I didn’t think that was a great idea at the time.

But time marches on. I went back to California, replaced my Powerbook with a newer version, and embraced MacOS X as much as I could. I separated launching and running applications by moving the dock and using Dragthing. It was a good laptop, which was unfortunately stolen when I lived in San Francisco (along with my 5 GB iPod, my favourite ever backpack, and my stainless steel espresso mug). In San Francisco, I acquired another Turbo Colour Slab, as well as a Turbo Cube. I also had a prototype G4 Mac on loan from a friend that served as an in-house Web and Filemaker server. I replaced the lost Powerbook with an iBook, since I needed something portable for travel to the UK — during my years in San Francisco I was chronically under-employed (it was the dotcom bust, after all), and did most of my consulting work for UK contacts.

Eventually, one of those contacts turned into a real job. So I left (my heart) in San Francisco, and moved full-time to London. Not long after, I sold my iBook, and bought a Powerbook G4. The iBook wasn’t a bad computer, but it wasn’t as good as the G4. Or so it seemed on paper. To be honest, both of those computers, along with my subsequent Powerbook G4 Aluminium, were not great. They all had hardware problems, which necessitated AppleCare.

And let me tell you, if there is one word for the reason I stopped buying Apple computers, it’s AppleCare. If you were lucky enough to have AppleCare in Europe, you could send your computer away for repair in Amsterdam. It would probably only take two weeks or so. I wasn’t rich enough to have a second computer for when my primary one needed repairs, so I looked around for another solution, much as I had years before when I started with NeXT after a string of disappointing Macs.

Also, something was happening to Apple in those days. Apple was fiercely in the camp of small developers, with a huge and active community of freeware, shareware, and commercial companies. But something was turning inside Apple, and they began a long run to eventually control their application marketplace. Even back then, I knew that taking the ideas small developers had used and making system software that did the same thing wasn’t a good idea. Also, the one application which could have kept me on the Mac — OpenOffice — wasn’t ported to MacOS X, and Apple didn’t care to do so.

So, a lot of people said that Dell support was exceptional. And if your hardware isn’t exceptional, but your support is, well, does it really matter? I bought a Dell Latitude D600 and started trying to get a decent desktop Linux running on it — I still didn’t like the Windows UI very much. I went through Debian, Mepis, and eventually landed on Ubuntu Warty Warthog. Ubuntu took the software library of Debian, packaged a good-enough UI, and made everything work.

I call this part of my computing life Linux. It was an evolutionary change from what came before, but what a change.

Now, the UI and software were not as complete as MacOS, or NeXT, but Linux is all about options, both in application and user interface. Most of the options were, and are, somehow better than Windows, and somehow worse than the Mac. But they were free, both as in beer, and as in speech. All the time I spent on the Mac, people were passing back and forth pirated and cracked applications, and on Windows it has always been worse — just go to any ‘developing market’ computer store and you can buy every application you need for $2, complete with malware which will steal your passwords, if not your bank accounts and identity. I honestly don’t know why people bother.

And so began my time with Linux — mostly (K)Ubuntu and Mint — through four Dells, two Asus Eee PCs, a Lenovo Thinkpad (service not nearly as good as Dell), and now back to Dell with an XPS 13 Linux Edition (Sputnik). I also had a Macbook Air for a month or so, but not having the flexibility of Linux was too confining — my wife has that computer, now. I also have a Samsung Chromebook that I bought to run Ubuntu on — Arm is awesome — but hardware support is sketchy, so I went back to ChromeOS (itself a build of Gentoo Linux).

I was incredibly fortunate to begin my computing life as the personal computer came to maturity, and even more so to see the mainstreaming — at least amongst technical people — of the opensource and free software movements. The change from mainframe and minicomputer days to the personal computer was a liberation of computer users. The opportunity to participate in the FOSS revolution was likewise a revolutionary moment. Today’s resurgence of the mainframe model as cloud computing is perhaps a step back, but at least there is a healthy market for competition on where and when individuals can store their data — if they choose to do so at all.

With the recent confirmation of mass data collection — which isn’t surprising to most technical people, even if it is disappointing — we sit at the cusp of another groundshift in computing. Whether it means that users will be empowered and own their own data, or whether they will give up all rights to the people who run cloud services, is anyone’s guess. But at least today we have an option.

For me, I have slowed my changes over the years. I’ve never twiddled every setting, and there are times that I miss the combination of elegance and convenience that the NeXT had. It could be that I don’t have to miss it, though — FOSS gives freedom and choice that we never thought about before. The citizen-programmer is not far off, and as long as we remember that we are making a purchase every time we trade privacy for convenience, there should be a way forward that is convenient, elegant, and useful.

Or we could all just go buy a NeXT. But there are precious few of them left.

A somewhat incomplete list of my workstations over time:

Apple IIc
DEC Rainbow 100 w/ XT slave
286/12 (upgraded MB to 286/25)
Mac IIsi (12″ monitor)
Powermac 8100/80 (16″ monitor, 14.4K modem)
Powerbook 520 20/500
Powermac 7200/75
386 (Slackware)
NeXT 040 Cube
Powerbook Duo 280c
Color Turbo Slab
Powermac 7500/100
Color Turbo Slab (Polycon)
Powerbook G3/266 (Wallstreet II)
Powerbook G3/333 (Pismo)
Turbo Cube (SF)
iBook G3/500 (Icebook) (SF)
Prototype Powermac G4 (SF)
Powerbook G4 (Titanium) (London)
Powerbook G4 (Aluminium)
Dell Latitude D600 (Ubuntu Hoary)
Dell Vostro 1400
Asus Eee PC 900
Asus Eee PC 1000HE
Dell Vostro V13
Apple Macbook Air 11
Lenovo Thinkpad X220
Samsung Chromebook Series 3
Dell XPS 13

Palmtops, Mobile Phones, and the like:

Newton 2000
Palm Pilot IIIxe
Newton 2100
Nokia Brickphone (maybe a 5110?)
SE T20e Tomb Raider Edition
SE T68i
SE Z600
Nokia E61
Nokia E71
Nokia N810
Nokia N900
Nokia N9
Nexus One
Nexus S
Samsung Galaxy S Duos
Nexus Galaxy
Nexus 4
Nexus XT1033 Dual-SIM
Moto G XT
Nexus 7
Nexus 7 2013
Blackberry Z10
T2Mobile Flame FirefoxOS Reference Phone

Servers and Museum Pieces:

Mac SE/30 (w/ Ethernet)
Raspberry Pi (the cluster grows at home)
HP Microserver
HP Tower Server
Apple TV (2)
Pivos Xios Android Set Top Box


Cobalt Qube
NeXT Turbo Dimension
Quadra 700
SPARC and HP NeXTSTEP workstations
SGI Indy

Confessions of a Facebook Re-entrant

It’s only fair to admit that I am back on Facebook. I was quiet about it for some time, but the fact is that some of my family only mention news on Facebook, and looking over my wife’s shoulder started annoying her. So, I’m back (and have been for a long time, just hidden as my dog, who is probably my best feature, anyway).

Google: Don’t Be Evil is Bullshit (Even if They Mean It)

Less than a month after I left Facebook behind for reasons including privacy concerns, Google has announced, openly and for all to see, changes to its privacy policy. We are all used to endless and incomprehensible software licenses; to clicking on ‘Agree’ when we don’t even read what we’re agreeing to. You might want to take a minute to look at this move, though.

You Don’t Think Enough About Data Privacy

Google’s privacy policy changes can be summarised: all their various services will share personal data. It seems innocent enough, but perhaps you should take a look at what that means.

The average Google user has handed a lot of personally-identifying information over to Google:

  1. Google Search: Internet search attempts (what you search for, maintained for at least two years)
  2. Google Search: Internet search results (what you actually clicked on)
  3. Google Search: your web presence (any time you are identified on the web)
  4. Gmail: your email (especially ‘personal’ email, which you might go to lengths to not put through a ‘work’ server)
  5. Android: phone records (usage patterns and other meta-data, if nothing else)
  6. Android and Google Latitude: your physical location
  7. Google Maps and Google Navigation: where you are and where you are going
  8. Google+: personal interactions and friendships

So, the average Android user with a Google account has willingly handed over all this information about themselves. And all this data is analysed, correlated, reported on, and synthesised, ostensibly to better target ads for you.

Don’t Be Evil is Bullshit (Even if They Mean It)

Google has been at the forefront of data search, retention, and analysis for more than a decade. Their web search tool, along with a host of other services, is genuinely useful to a lot of people. It is also genuinely useful to Google, law enforcement, and marketing firms. Your Internet searches, combined with targeted web search, email (especially personal email), phone calls, physical location, and social networking, give Google an incredibly detailed idea of who you are. Their stated reason for this is to target ads to you. However, at the same time, they are gathering, collating, and analysing data about you in a way that was impossible before. They are a potential one-stop shop for marketers (which is their explicit aim), as well as US law enforcement and security services. To say nothing of data leaks, whether by intrusion, employee mistake, malice, or the activities of a party that claims the data from Google (the US or UK governments, with their amazing data protection records, immediately spring to mind).

According to Google CEO Eric Schmidt, if you want privacy, you must be doing something wrong. “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place,” Mr. Schmidt said in a CNBC interview in 2009. Much like Facebook’s Mark Zuckerberg, Mr. Schmidt believes in a future without privacy. That he makes millions of dollars from the private details he wants you to reveal is clearly just a coincidence.

What Can You Do?

If you decide to do something right now, you can remove your search history from Google. The EFF has a how-to that is pretty easy to follow. They also have some ideas on protecting your search privacy which you might find interesting.

You might decide that there is too much data being collected about you altogether. It is difficult to avoid in an increasingly interconnected world, but you could ask yourself if you really get much out of social networking and other sites that you willingly give data to. Remember that if you are not paying for a product (or service) you are the product. Servers, power, and Internet connectivity are not free!

There are some people who will argue the damage is done: the genie is out of the bottle. Yes, once something is on the Internet, it will be available probably forever, one way or another. But we can always limit the ability of corporations, malicious individuals, and government entities to gain further information about who we are and what we do. Turn off the flow of information and they may still know something about you, but the picture becomes much less clear.

The most important thing you can do is: think about what you are doing online, understand what it is that you are offering up, and decide if it is worth the service you receive.

Facebook contacts export

Here’s a quick note about how to get your contact list (emails, phone numbers, maybe more) out of Facebook.

Firstly, Facebook doesn’t let you export. So there’s that.

You can connect Yahoo mail, and/or Hotmail to Facebook. There are pretty straightforward tools on the Yahoo and Hotmail sites to do this. Hotmail will also let you connect to Linkedin, if that’s something that you want to do.

Once you have them in Yahoo or Hotmail, you should be able to export them, right? Wrong. As part of the process, Facebook locks the export ability. You can edit every Yahoo contact to remove the Facebook block, but that’s a lot of manual work. And, despite the Yahoo documentation, just adding contacts to a list does not make them exportable.

But you can install Yahoo Mail and Hotmail on your Android phone (I used my Nook Color, but it’s the same idea). The applications will let you sync contacts to your phone’s local address book.

Once you have local contacts, you can use the built-in export tool to save them as a single-file vCard.

Copy the vCard to your computer, and then you can import it (Linux users: Evolution is way better at this than Thunderbird).
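If you want to sanity-check what actually made it into the export before importing, a vCard file is just text. Here’s a rough stdlib-only Python sketch that pulls names and contact fields out of a multi-entry .vcf file; it ignores the line folding, encodings, and grouped properties that real-world vCards can contain, and the sample contact is made up:

```python
# Minimal vCard reader: collect each BEGIN:VCARD..END:VCARD entry
# into a dict of property name -> list of values.
def parse_vcards(text: str) -> list[dict]:
    cards, current = [], None
    for line in text.splitlines():
        line = line.strip()
        if line.upper() == "BEGIN:VCARD":
            current = {}
        elif line.upper() == "END:VCARD":
            if current is not None:
                cards.append(current)
            current = None
        elif current is not None and ":" in line:
            key, value = line.split(":", 1)
            key = key.split(";", 1)[0].upper()  # drop params like TYPE=CELL
            current.setdefault(key, []).append(value)
    return cards

sample = """BEGIN:VCARD
VERSION:3.0
FN:Jane Example
TEL;TYPE=CELL:+44 20 7946 0000
EMAIL:jane@example.org
END:VCARD"""

for card in parse_vcards(sample):
    print(card["FN"][0], card["EMAIL"][0])
```

Running it over the real export instead of the sample gives a quick count of how many contacts survived the Yahoo/Hotmail round trip, and which ones lost their phone numbers or emails along the way.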

I will come back and clean this up soon, but if you need a way, this worked for me.

Facebook: goodbye, and why

Today is the last day to have a Facebook account that’s not updated to the new timeline layout. Tomorrow, they automatically update everyone, which requires some work from the users to keep privacy settings the same. This all sounds pretty unimportant on Facebook’s site, but some of the UK tabloids aren’t so sure.

The way this change has come down has made me think a lot about privacy and social media. While there’s no doubt that Facebook has changed the way we communicate forever, I’ve decided to delete my account. I was a heavy Facebook user, but the more I think and read about it, the more convinced I become. Everyone should think about what Facebook is and what they do, and consider deleting their account too. Here’s how. And here’s why I’m doing it:

The reaction over the ‘upgrade’ to timeline isn’t about the timeline feature at all. It’s that when we release any information on Facebook (or any other site, for that matter) it no longer belongs to us. We don’t pay Facebook, and they have to make money somehow. We’re all giving them something much more valuable than money: data.

Facebook is building a future without any explicit right to privacy. It is all very good if that’s what you want, but you have to understand: if you are not buying a product, you are the product. Someone has to pay for data centres, equipment, and staff, and you certainly don’t get a company rumoured to be worth $100 billion providing a free service with no plan for profit. Novelist Charlie Stross has a thoughtful piece about that on his blog, which I encourage you to read. Suffice it to say that your holiday pictures, ‘like’ buttons, status updates, and other self-published data are the richest marketing and intelligence-gathering resource the world has ever seen.

And just in case you have a Facebook account but 'don't use it': you don't need an active account to have privacy concerns. Merely having an account that others can 'tag' in pictures and posts is bad enough. Facebook will grow a history of you that you have no control over (though, frankly, you have no control over what Facebook does with the information you give them anyway).

I have a friend who thinks that being on Facebook is like being pregnant: since he's already on it, there's no point in going back. His data 'isn't important' because it's self-screened, and he 'can't get it back anyway.' Well, maybe, but maybe not. It's never too late to start thinking about what you're doing, and perhaps decide to make a change. You can stop giving data away. Turning over mass amounts of information about yourself, be it to Google (think about it: web searches, IM, email, calendar, phone records, and their own social network; they know all about you), Twitter (yes, they can track you in 140 characters, probably more easily), or Facebook (the most obviously uncaring of the club, and also the most powerful), is probably unwise. But should we try to do something about it?

I am, at least until I'm done thinking about it. The journey of a thousand miles begins with a single step. You should at least think about what you're doing, too.


Computers are not magic. Nothing in technology is. Computers are tools that follow the operator's instructions. Commands can be complex, and underlying code can contain bugs that send incorrect instructions, but in the end there is no room for interpretation. Computers cannot 'get mad' at you, or have a bad day, or do something they're not told to do. Not yet, anyway.

I deal with technical problems every day. Generally, I try to solve them, with methodical and unambiguous techniques that produce quantifiable results. That’s a complicated way of saying I work on a problem until it’s solved and I know why it happened. This is a cornerstone of my entire professional life, and the process is simple. No matter how complex your problem, troubleshooting follows the same steps:

  1. You have a problem. Define it.
  2. Identify the variables (what can change, especially what you can change).
  3. Change one variable.
  4. Test. Do you still have the problem? If not, stop: you have solved it.
  5. Change your previous variable back.
  6. Change another variable.
  7. Test.
  8. Repeat until a solution is found. Eventually, you will be rewarded.
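
The steps above can be sketched as a loop. This is only an illustration of the one-variable-at-a-time discipline; the names here (`variables`, `problem_persists`, and so on) are hypothetical stand-ins for whatever you can change and however you test in your own environment.

```python
def troubleshoot(variables, apply_change, revert_change, problem_persists):
    """Try each candidate change in isolation until the problem goes away."""
    for var in variables:
        apply_change(var)           # step 3: change exactly one variable
        if not problem_persists():  # step 4: test
            return var              # solved: this variable was the cause
        revert_change(var)          # step 5: restore the previous state
    return None                     # no single variable explains the problem
```

The key design point is the revert on every failed attempt: the system returns to a known state before the next change, so when the problem finally disappears you know exactly which variable was responsible.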

Sometimes you get a complicated problem, with interaction between multiple variables. But that’s when your process has to be absolutely methodical and boring. Even if the system is burning down around you — especially if the system is burning down around you — you must stay calm; troubleshooting takes as much time as it takes.

I bring this up because, over the years, I have encountered a staggering number of technical people — engineers, computer scientists, systems and network administrators — who do not manage to be methodical, for one reason or another. Many people in the IT field don’t have a good troubleshooting process, and waste a lot of time and effort as a result — both their own, and that of those they work with (like me). Even if they solve a problem, they won’t know the cause, won’t be able to recreate the problem, cannot come up with a permanent fix, and cannot apply this experience to future problems.

Sometimes these folks are highly pressured and attempt everything they can think of at once. Sometimes they ‘don’t care what the problem is, as long as it’s fixed.’ Many times they simply do not have a background in or experience of problem solving, and also don’t understand what benefits a step-by-step process brings. But a cool head, methodical work habit, and good documentation, combined with sensible precautions (you did back up, right?) will always yield the desired results. Rushing and not knowing why things are working will only lead to problems down the road.

I would like to thank the science teachers I had in California public schools, who taught me how to design an experiment at an early age. I'm not sure if it was third grade or seventh, but valid experimental procedure has become my ingrained response to solving technical problems. Without it, I wouldn't have had a good job at university, wouldn't have managed a technology career, and would not have the life I lead today. My hat is off to you, my former teachers. Here's hoping there are still some people out there teaching the basics.