Random Short Take #6

Welcome to the sixth edition of the Random Short Take. Here are a few links to a few things that I think might be useful, to someone.

OT – Digital Movie Consumption Still A Bin Fire – News At 11

This article’s a little different from my normal subject matter, but I felt the strong urge to have a bit of a rant, and explore some feelings, so buckle up. Digital content distribution (particularly for feature films) as it relates to consumers has been a mess for some time. It still is in my opinion. I wanted to work through some of my issues with it in this article. I don’t have a lot of answers, so if it’s resolution you’re after, you’re in the wrong place.

 

Background

It’s been a long time since video tape was the de facto mechanism for film consumption for the average punter. Unlike VCRs, DVDs (and Blu-ray) were readable on computers at around the same time they became available to the consumer to watch on standalone devices plugged into televisions. DVDs also came with a bunch of protection mechanisms that were pretty easily thwarted (if you were adept at searching the Internet). As a result you could take feature films and store them in a digital format relatively simply. So why not just distribute those files to consumers?

For some reason we’re okay with treating the storage and distribution of music differently to movies. To wit, the iPod was massively successful in the market, but movie storage devices (even after we got past the capacity limitations of the time) have struggled to gain traction, commercially or legally. Even legitimate content delivery services like Kaleidescape were, in my opinion, crippled by the licensing requirement to have the physical discs in the unit when they played files from their internal storage.

It took a long time for companies to get behind the idea of distributing movies in a digital format. Studios focused on using Digital Rights Management (DRM) to cripple consumption in a way that seemed positively hostile. In some instances it felt like they were not terribly interested in you actually consuming the film in a fashion that was simple or convenient. Movie studios to this day seem mighty afraid of putting content in the digital realm. This isn’t necessarily unwarranted, with tools like AnyDVD lasting a lot longer (and doing a lot more cool stuff) than anyone had imagined. I think some of this focus on making things difficult stemmed from the idea that consumers were merely accessing a license to consume the content, and the transport mechanism could be determined by the content owners.

The problem with this is that people think of films in much the same way as they think of books. They have the idea that once they’ve purchased the media, that should be sufficient to consume the film forever. Content owners (the studios, really) are pretty happy for you to think that, but they were also chuffed when we transitioned from VHS to Laserdisc to DVD to Blu-ray (and now, potentially, some new UHD variant). I have The Way of the Dragon on a variety of formats at home. I’m an edge case perhaps, but what about that copy of Throw Momma from the Train that you have on VHS? You probably don’t have a working deck anymore, but I’m sure you’d like to dip back into a cinematic masterpiece every now and then, wouldn’t you?

The other problem with digital content distribution was that, once the studios decided to go ahead with it, it was Apple versus the world in terms of distribution standards. In much the same way that Blu-ray was pitched against HD-DVD, Apple’s iTunes was promoted as a superior delivery mechanism. And it can be, as long as you’re all in with Apple, and happy with the content catalogue they have in place. Disney was also guilty of this approach. But if there’s stuff you want to watch that isn’t part of their ecosystem, you need to look at alternative methods of consumption. Like streaming, for example.

 

But Not Everyone is Streaming

Bandwidth is a problem in Australia. It’s a first world problem, to be sure, but it’s still a problem for a lot of people. A common connection type is ADSL1 or 2+, and fibre to the home was killed off in a political stoush that we should all be ashamed of. But I digress. In any case, things aren’t overly fast, and streaming content options are fairly limited (it’s a small market). Since its launch in Australia, Netflix has been steadily improving its content catalogue, but it’s nowhere near as extensive as the one in the US.

It’s for that reason that I still buy movies on Blu-ray. And I get access to “Digital Copies” of movies along with these discs. In the olden days, these were often files I could import directly into iTunes off a separate DVD. Sometimes they were DRM-protected wmv files that I couldn’t really play anywhere except on a Windows PC. Nowadays they are primarily UltraViolet-based redemption codes. This makes sense, as a lot of computers don’t have optical drives any more. I don’t use UltraViolet services as my primary consumption mechanism, as I tend to watch movies on a big screen connected to an Apple TV running Plex. But from time to time (particularly when travelling on long-haul flights) I’ve found the ability to load up a reasonably sized file on an iPad or laptop to be very convenient, particularly when the in-flight entertainment system fails.

 

UltraViolet

The idea behind UltraViolet is / was pretty cool. People realised a few things about content distribution. Firstly, studios weren’t always going to agree on which service to use for distribution, or which device the content could be consumed on. And sometimes you wanted to change the way you consumed your media. So the narrative changed from media or streaming to licensing, and you were granted rights to consume the content you wanted, ostensibly on any platform you liked. Sounds like a great idea, and even in Australia a number of content providers jumped on board. I found the redemption process fairly straightforward, although I didn’t like how some studios insisted on me handing over my details in order to gain access to the titles (after I’d already created accounts with UltraViolet and a provider of my choosing). I found the number of standalone devices that actually supported UltraViolet titles to be pretty small, despite what the FAQs said. I had the most success consuming content via the website of one of the providers, rather than using an app on an Apple TV or similar.

Is it still working? Sort of. If you read through the change notice of this FAQ you’ll notice a bunch of providers slowly disappearing from Australia and around the world. Again, I’m an edge case, consuming content in a small market. But it seems like just when every Blu-ray has a standardized electronic rights copy included we’ve slowly started to take away the ways to consume those copies. Well, that’s what I thought at first, but apparently there’s something else, potentially better, happening.

 

Movies Anywhere

A service recently launched called Movies Anywhere. It was originally launched in 2014 as Disney Movies Anywhere, and was rebranded and re-launched in the last month. The idea is that it ties together your content licenses from any number of providers and systems and allows you to consume them on a unified platform. That’s about all I can tell you, because it’s US-based and not available anywhere else. I’m not going to turn this into an ad for the service, because I can’t tell you how well it actually works, and whether it really does what I want it to do. But it does seem to tick a number of boxes in terms of linking a number of disparate services together.

 

So What’s the Problem?

I like the idea of being able to pay for content once and having access to it for a long time. I still have a Laserdisc player, but a lot of people don’t. So they’ve re-invested in media over and over again. This makes sense if you follow the progression of technology (and improvements in playback quality), but when we have better mechanisms to access content (such as digital storage) it makes less sense that we should continually pay for the same thing over and over.

The problem, as always, is that any time we get close to having some cool tech available to do what we want, it gets restricted to a specific region. By the time this stuff gets to Australia, the rest of the world has moved on and we’re left with patchy support for what are considered legacy services. Or we get the service at launch but don’t get the full product. This is usually because of existing licensing agreements, differences in copyright law, and all kinds of other complicated reasons. Some of these reasons are even, well, reasonable. But it’s still annoying, and I think the Internet just serves to amplify that annoyance. I don’t really know how to solve the problem either. The studios will continue to do what they do until consumers stop consuming. And I think there are enough people going along with this that they won’t need to stop any time soon. I still think it’s a bin fire, and that’s a shame. Of course, my kids also think it’s weird that I still purchase content on media, so what do I know?

HTPC – Replacing The PC With macOS

The Problem

My HTPC (running Windows 7 Media Center) died a few months ago after around 5 or 6 years of service. It was a shame, as I had used it as a backup for my TV recordings and also to rip optical discs to my NAS. It died just as I was about to depart on a business trip, and I couldn’t be bothered trying to work out why it had failed. So I gave the carcass of the machine to my nephew (who’s learning about PC hardware things) and tried to put something together using other machines in my house (namely an iMac and some other odds and sods). I’m obviously not the first person to use a Mac for these activities, but I thought it would be useful to capture my experiences.

 

Requirements

Requirements? Isn’t this just for home entertainment? Whatever, relaxation is important, and so is understanding your users. We record a lot of free-to-air TV with a Fetch Mighty box, and sometimes things clash. Or don’t record. “Catch-up” TV services in Australia are improving a lot, but our Netflix catalogue is nowhere near as extensive as the US one. So I like to have a backup option for TV recording. The HTPC provided that. And it had that cool Media Center extender option with the Xbox 360 that was actually quite useable and meant I didn’t need a PC sitting in the lounge room.

From a movie consumption perspective, we mostly watch stuff using various clients such as AppleTV or WDTV devices, so the HTPC didn’t really impact anything there, although the way I got data from Blu-ray / DVD / HD-DVD / VCD was impacted as my iMac didn’t have a Blu-ray player attached. Okay, the SuperDrive could deal with VideoCDs, but you get my point.

So, in short, I needed something that could:

  • Record free-to-air TV shows via a schedule;
  • Rip Blu-ray and other content to mkv containers (or similar); and
  • Grab accurate metadata for that media for use with Plex.

 

Solution?

The solution was fairly easy to put together when I thought about what I actually needed.

TV

I backed a Kickstarter for the HDHomeRun Connect some time ago and hadn’t really used the device very effectively, save for the odd VLC stream on an iPad. It’s a dual-tuner device that sits on your wired network and is addressable by a few different applications over IP. The good news is that Elgato EyeTV runs on macOS, works with IceTV (a local TV guide provider), and supports the HDHomeRun. I haven’t tested multiple HDHomeRun devices with the same EyeTV software, and I’m not 100% convinced that this would work. I had 8 tuners on the HTPC, so this is a bit of a bummer, but as it’s not the primary recording device I can live with it. The EyeTV software supports multiple export options too, so I can push shows into iTunes and have AppleTV pick them up there.

Optical Discs

I bought a USB-based Pioneer (BDR-XS06) Blu-ray drive that works with macOS, and MakeMKV is still my go-to in terms of mkv ripping. Ripping maxes out at 2x. I’m not sure if this is a macOS limitation, firmware crippling, or some other thing. Now that I think about it, it’s likely the USB bus on my iMac. If anyone wants to send me a Thunderbolt dock I’m happy to try that out as an alternative. In any case, a standard movie Blu-ray takes just shy of an hour to rip. It does work though. If I still need to rip HD-DVDs I can use the drive from the Xbox 360 that I have laying about. What? I like to have options.
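MakeMKV also ships with a command-line front end, makemkvcon, which on macOS lives inside the app bundle rather than on the PATH. I haven’t scripted my rips this way myself, and you obviously need a drive and a disc attached, so treat this as a sketch rather than gospel:

```shell
# makemkvcon is bundled inside the MakeMKV app on macOS
MAKEMKVCON="/Applications/MakeMKV.app/Contents/MacOS/makemkvcon"

# List attached drives and any inserted discs
"$MAKEMKVCON" info disc:9999

# Rip every title on the first disc to mkv files in ~/rips
mkdir -p ~/rips
"$MAKEMKVCON" mkv disc:0 all ~/rips
```

Handy if you want to kick off a rip over SSH and walk away for that hour.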

Metadata

For movie metadata I settled on MediaElch for scraping and have found it to be easy to use and reliable. Why bother with scraping metadata? It sometimes saves a bit of effort when Plex gets the match wrong.

Plex

I run Plex as the primary media server for the house (with clients on iPad, RasPlex, or AppleTV), with content being served from a number of NAS devices. It took me a while to get on board with Plex, but the ability to install the app on a 4th generation AppleTV and the portability of media has been useful at times (think plane trips on economy airlines where entertainment is an extra charge). Plex are working on a bunch of cool new features, and I’m going to try and make some time to test out the DVR functionality in the near future.

I’ve also recently started to use the auto_master file as the mechanism for mounting my SMB shares automatically on the iMac. I found the Login Items method a bit flaky: shares would disappear every now and then and confuse Plex. I have three NAS devices, all using a variation of the name “Multimedia” as their main share (no, I didn’t really think this deployment through). As a result my iMac mounts them under /Volumes as “Multimedia-1”, “Multimedia-2”, etc. This is fine, but the order they’re mounted in when the machine reboots can change, which messes things up a bit. The process to use auto_master is fairly simple and can be found here. I’ve included an overview for your enjoyment. Firstly, fire up a terminal session and make a /mnt directory if you don’t have one already.

Last login: Mon Aug 14 05:10:12 on ttys000
imac27:~ dan$ pwd
/Users/dan
imac27:~ dan$ cd /
imac27:/ dan$ sudo mkdir mnt
Password:

You’ll then want to edit the auto_master file so that it references the auto_nas map when it runs.

imac27:/ dan$ sudo nano /etc/auto_master

You should also ensure the NAS directory exists (if it doesn’t already).

imac27:/ dan$ cd /mnt
imac27:mnt dan$ sudo mkdir NAS

You can now create / modify the auto_nas file and include mount points, credentials and the shares.

imac27:/ dan$ sudo nano /etc/auto_nas
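For reference, here’s roughly what ends up in the two files. The host and credentials below are made up, so substitute your own; the share name matches the 831multimedia example further down.

```
# /etc/auto_master -- add this line to the existing entries:
/mnt/NAS    auto_nas

# /etc/auto_nas -- one entry per share (hypothetical host and credentials):
831multimedia    -fstype=smbfs    ://dan:s3cret@nas831/Multimedia
```

If your password contains special characters, URL-encode them (@ becomes %40), or you’ll run into the “Too many users” error covered below.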

Now run your automount command.

imac27:/ dan$ sudo automount -vc
automount: /net updated
automount: /home updated
automount: /mnt/NAS updated
automount: no unmounts

Once this is done you’ll want to test that it works.

imac27:/ dan$ cd /mnt/NAS/
imac27:NAS dan$ ls
831multimedia
imac27:NAS dan$ cd 831multimedia/
-bash: cd: 831multimedia/: Too many users

Note that if you’re having issues with a “Too many users” error, it may be because you’re using a special character (like @) in your password and this is messing up the syntax for auto_master. Check this post for a resolution. Once you’ve sorted that it will look more like this.

imac27:NAS dan$ ls
412multimedia    831multimedia    omvmultimedia
imac27:NAS dan$ cd 412multimedia/
imac27:412multimedia dan$ ls
tv
imac27:412multimedia dan$ cd ..
imac27:NAS dan$ cd omvmultimedia/
imac27:omvmultimedia dan$ ls
basketball    music        music video    skateboarding    star wars    tv

It’s also a good idea to apply some permissions to that auto_nas file because you’re storing credentials in there in plain text.

imac27:/ dan$ sudo chmod 600 /etc/auto_nas
Password:

And now you can point Plex at these mount points and you shouldn’t have any problems with SMB shares disappearing randomly.

There’s just one thing though …

I’ve read a lot of reports of this functionality being broken in more recent versions of macOS. I’ve also witnessed shares disappear and get remounted with root permissions. This is obviously not ideal. There are a number of solutions floating around, including running a bash script to unmount the devices as root and then change directory as a normal user (prompting autofs to remount them as that user). I also found a suggestion in a forum somewhere to use -nosuid in my auto_master file. I did this and rebooted, and the drives seem to have mounted as me rather than root. I’ll keep an eye on this and see if it continues to work or whether autofs remounts the shares as root. This would seem a non-ideal solution in a multi-user environment, but that’s outside my ken at this stage.
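For reference, that just means the auto_master entry ends up looking something like this (using the /mnt/NAS map from the setup above):

```
/mnt/NAS    auto_nas    -nosuid
```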

*Update – 2017/08/20*

I ended up having problems with the auto_master method as well, and decided to create new shares on the various NAS devices with different names. These then mount as /Volumes/nas1data, /Volumes/nas2data, etc. So theoretically even if I lose the connection I can remount them manually and know that they’ll come up with a consistent path name and Plex won’t get too confused. I dragged these mounted volumes into my Login Items so that they mount every time the machine reboots. It still bites that they disappear every now and then though.

 

HTPCs Are So Last Decade

I recently threw out my RealMagic Hollywood+ XP cards. Those things were great at a time when playing DVDs on a PC was a real challenge. I remember being excited to play full screen MPEG-2 movies on my NT 4 Workstation running on a Pentium 133. Fast forward to about 8-10 years ago and it seemed everyone was running some kind of “Media Center” PC at home and consuming everything via that. Nowadays people tell me to “just stream it”. But I don’t live in a terribly bandwidth-rich environment, and downloading 4GB of data from the Internet to watch a movie needs a little more prior planning than I’d like. So I’m still a big fan of physical discs, and still record television shows via a PVR or via the iMac.

I still have a dream that one day I’ll happen upon the perfect user experience where I can consume whatever digital media I want in the fashion I want to. I’ve tried an awful lot of combinations in terms of backend and frontend and Plex is pretty close to doing everything I need. They don’t like ISO files though (which is justifiable, for sure). I rip most things in mkv containers now, and all of my iTunes content (well, the music at least) is pretty easy to consume, but there’re still some things that aren’t as easy to view. I still haven’t found a reliable way to consume UltraViolet content on a big screen (although I think I could do something with AirPlay). I’ve been reading a bit about various cable boxes in the US that can scan both local and streaming data for whatever you’re after and point you in the right direction. I guess this would be the way to go if you had access to reasonable bandwidth and decent content provider choices.

In any case, it’s still possible to run a HTPC the old-fashioned way with macOS.

Faith-based Computing – Just Don’t

I’d like to be very clear up front that this post isn’t intended as a swipe at people with faith. I have faith. Really, it’s a swipe at people who can’t use the tools available to them.

 

The Problem

I get cranky when IT decisions are based on feelings rather than data. As an example, I’ve been talking to someone recently who has outsourced support of their IT to a third party. However, they’re struggling immensely with their inability to trust someone else looking after their infrastructure. I asked them why it was a problem. They told me they didn’t think the other party could do it as well as they did. I asked for evidence of this assertion. There was none forthcoming. Rather, they just didn’t feel that the other party could do the job.

 

The Data

In IT organisations / operations there’s a lot of data available. You can get uptime statistics, performance statistics, measurements of performance against time allocated for case resolution, all kinds of stuff. And you can get it not only from your internal IT department, but also from your vendors, and across most technology in the stack from the backend to the client-facing endpoint. Everyone’s into data nowadays, and everyone wants to show you theirs. So what I don’t understand is why some people insist on ignoring the data at hand, making decisions based solely on “feelings” rather than the empirical evidence laid out in front of them.

 

What’s Happening?

I call this focus on instinct “faith-based computing”. It’s similar to faith-based medicine. While I’m a believer, I’m also a great advocate of going to my doctor when I’m suffering from an ailment. Pray for my speedy recovery by all means, but don’t stop me from talking to people of science. Faith-based computing is the idea that you can make significant decisions regarding IT based on instinct rather than the data in front of you. I’m not suggesting there aren’t times in life when instinct should play a bigger part than scientific data in how you do things, but IT has technology in the name. Technology is a science, not a pseudo-science like numerology. Sure, I agree there’re a bunch of factors that influence our decision-making, including education, cultural background, shiny things, all kinds of stuff.

 

Conclusion

I come across organisations on a daily basis operating without making good use of the data in front of them. This arguably keeps me in business as a consultant, but doesn’t necessarily make it fun for you. Use the metrics at hand. If you must make a decision based on instinct or non-technical data, at least be sure that you’ve evaluated the available data. Don’t just dismiss things out of hand because you don’t feel like it’s right.

Apple – I know too much about iPad recovery after iOS 8

So I now know too much about how to recover old files from iPad backups. I know this isn’t exactly my bread and butter, but I found the process fascinating, and thought it was worth documenting here. It all started when I upgraded my wife’s iPad 2 to iOS 8. Bad idea. Basically, it ran like rubbish and was pretty close to unusable. So I rolled it back, using the instructions here. Okay, so that’s cool, but it turns out I couldn’t restore the data from a backup, because that backup was made with iOS 8 and wasn’t compatible with iOS 7.1.2. Okay, fine, it was probably time to clear out some apps, and all of the photos were saved on the desktop, so no big deal. Fast forward a few days, and we realise that all of her notes were on that device. Now for the fun bit. Note that I’m using a Mac. No idea what you need to do on a Windows machine, but I imagine it’s not too dissimilar.

Step 1. Recover the iPad backup from before the iOS upgrade using Time Machine. Note that you’ll need to be able to see hidden files in Finder, as the backup is stored under HOME/Library/Application Support/MobileSync/Backup and Time Machine uses Finder’s settings for file visibility. I used these instructions. Basically, fire up a terminal and type:

$ defaults write com.apple.finder AppleShowAllFiles TRUE
$ killall Finder

You’ll then see the files you need with Time Machine. When you’re finished, type:

$ defaults write com.apple.finder AppleShowAllFiles FALSE
$ killall Finder

Step 2. Now you can browse to HOME/Library/Application Support/MobileSync/Backup and recover your backup files. If you have more than one iDevice backed up, you might need to dig a bit through the files to recover the correct files. I used these instructions to locate the correct backup files. You’ll want to look for a file called “Info.plist”. In that file, you’ll see something like

<key>Device Name</key>
<string>My iPhone</string>
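With several devices backed up, grepping every backup’s Info.plist beats opening them one by one. Here’s a sketch against a throwaway directory (the folder name below is made up; real backup folders are named with long hexadecimal device identifiers and live under HOME/Library/Application Support/MobileSync/Backup):

```shell
# Build a fake backup folder for illustration
mkdir -p /tmp/MobileSyncDemo/3f2a9c
cat > /tmp/MobileSyncDemo/3f2a9c/Info.plist <<'EOF'
<key>Device Name</key>
<string>My iPad</string>
EOF

# Print the device name recorded in each backup's Info.plist
grep -A1 "Device Name" /tmp/MobileSyncDemo/*/Info.plist
```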

And from there you can restore the correct files.

Step 3. Now you’ll want to go to the normal location of your iPad backups and rename your current backup to something else. Then copy the files that you recovered from Time Machine to this location.

Step 4. At this point, I followed these quite excellent instructions from Chris Taylor and used the pretty neat iPhone Backup Extractor to extract the files I needed. Note that the path of the extracted files is iOS Files/Library/Notes.

Step 5. At this point, fire up MesaSQLite and open up the “notes.sqlite” file as per the instructions on Chris’s post. Fantastic, I’ve got access to the text from the notes. Except they have a bunch of html tags in them and are generally unformatted. Well, I’m pretty lazy, so I used the tool at Web 2.0 Generators to decode the html to formatted text for insertion into Notes.app files. And that’s it.
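If you’d rather not paste your notes into a website, a rough sed one-liner handles simple markup just as well (the note text below is made up, and this relies on GNU sed interpreting \n in replacements):

```shell
# Convert <br> tags to newlines, then strip any remaining tags
echo '<div>Shopping list</div><br><b>milk</b><br><i>bread</i>' \
  | sed -e 's/<br>/\n/g' -e 's/<[^>]*>//g'
```

It won’t cope with anything fancy like entities or nested attributes, but for the flat HTML in a notes database it does the job.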

Conclusion. As it happens, I’ve now set this iPad up with iCloud synchronisation. *Theoretically* I won’t need to do this again. Nor should I have had to do it in the first place. But I’ve never come across an update that was quite so ugly on particular iDevices. Thanks to Apple for the learning opportunity.

EMC – Boot from SAN MSCS Cluster configuration

Disclaimer: I haven’t done a Windows-based CLARiiON host-attach activity in about 4 or 5 years. And it’s been a bit longer than that since I did boot from SAN configurations. So you can make of this what you will. We’ve been building a Windows 2008 R2 boot from SAN cluster lately. We got to the point where we were ready to add the 60+ LUNs that the cluster would use. The initial configuration had 3 hosts in 3 Storage Groups with their respective boot LUNs. I had initially thought that I’d just create another Storage Group for the cluster’s volumes and add the 3 hosts to that. All the time I was trying to remember the rule about multiple hosts or multiple LUNs in a Storage Group (a host can only belong to one Storage Group at a time). And of course I remembered incorrectly.

To get around this issue, I had to add each LUN (there are about 67 of them) to each Storage Group for the cluster nodes. And ensure that they had consistent host IDs across the Storage Groups. Which has worked fine, but isn’t, as Unisphere points out, recommended. There’s also an issue with the number of LUNs I can put in a Consistency Group (32) – but that’s a story for another time.

HTPC – HDMI audio

HTPC audio over HDMI – why does it have to suck so much? I’ve learnt a lot about the limitations of PC-based entertainment systems in the last few weeks. And I hope to deliver a series of scathing posts that outline my failures along the way. A lot of this you can put down to a learning experience (I’ve not built a PC for quite a few years) and my stubbornness regarding the use of Windows XP instead of Windows 7. But I’ll get to that at some stage.

On my HTPC I’m running Windows XP Pro SP3, a dual-core Intel chip, MSI mobo with onboard RealTek audio, 2GB RAM and an AMD/ATI 4350 video card with VGA, DVI and HDMI. I’m using the really quite excellent XBMC as the media player. In and of itself this is fine. I have it hooked up to a Sony STR-DG910 7.1 receiver which then passes the video on to a Sony projector.

So, the cool thing is that you can run audio through the graphics card (one cable to the receiver), if you load up the ATI HDMI audio driver. There’s a few things to note though.

If you find that you’re only getting stereo output even though you’ve adjusted the speaker settings in the XP sound control panel, chances are your receiver is passing audio through to your projector or TV. The display device then tells the receiver that it can only do stereo, the receiver tells your PC the same thing, and the PC adjusts accordingly. You’ll need to dig through the receiver’s setup and stop it from passing audio on to the display device. Because your speakers are hooked up to the receiver, right?

While the nerds are happy that the graphics card outputs 7.1 LPCM, I am not. When I listen to music, unless it’s been mixed for multi-channel, I want it in stereo. Same goes for OTA TV shows I’ve recorded. I don’t want everything coming out as 7.1 because it just doesn’t sound right. I could fiddle around with XBMC’s sound output and adjust it depending on what I’m listening to, but really I want it to work that out for itself. I think one of the main reasons for this is that it’s a limitation of the card, but it seems to be a limitation of XP as well. I’ve done some reading on multi-channel HDMI soundcards and it seems that they can all solve my problem, but not under Windows XP. Even if I go and buy PowerDVD 10 or whatever, I think I’ll still have this issue. What I haven’t tried yet is using the analog outputs on my onboard soundcard to see what kind of decoding is available, although I fear it will be the same story. Any time you configure a PC with 7.1 speakers, it thinks you want to use them all, all of the time.

I know, a lot of this could have been avoided by buying a copy of Windows 7 and using some of the media centre stuff built into that. And I may still go that route. And yes, I tried Mythbuntu, and no, I don’t want to spend 3 hours recompiling my kernel and downloading SVN releases of code from various repositories just to get my TV tuner working. I’ll keep you posted on how things progress.

HTPC – Using the Leadtek Winfast DTV2000DS with GB-PVR

You may or may not have noticed that I’ve added an HTPC category to this blog. The point of this is simply to provide a place for a number of random HTPC- and media streamer-related issues that I’ve encountered recently when I made the insane decision to build a HTPC for the movie room while eschewing Windows 7. That’s right – we can do this the easy way or my way. Over the next few weeks I’m aiming to load up a bunch of articles relating to my frustration with this platform as an entertainment mechanism.

As part of this process I bought a very cheap dual digital TV tuner from PC Case Gear – the aptly named Leadtek Winfast DTV2000 DS. You can find details on this PCI card here. I was testing GB-PVR, but couldn’t find the card in the list of supported cards. I don’t want to give too much away, but, *spoiler alert*, it’s an ini file that needs adjusting. Check out the skinny here.

Add the following to the bda.ini file in your GB-PVR installation directory.

[Leadtek Winfast DTV 2000 DS]
TUNING_TYPE=DVB-T
RECEIVER_ONLY_MODE=1
FILTER_RECEIVER=WinFast DTV 2000 DS
PIN_RECEIVER_IN=Antenna In Pin
PIN_RECEIVER_OUT=MPEG2 Transport

OT: Sometimes a Mac problem needs a Windows solution

I have a Logitech 1000i all-in-one remote control. Good for me you say. Good for me I say too. But recently I wanted to update the remote via the Harmony Remote Software. I’ve recently updated my iMac to OS X 10.6.3, and made the silly error of updating to the latest version of the Harmony software – version 7.7.0. The software works fine, but it just doesn’t update the remote. There’s even a KB article about it. And I tried all of the steps, but nothing would get it working. And do you know how I resolved the issue? I uninstalled and reinstalled the same version of the software. That’s right, I had to do the sort of thing that I had been doing on my PC for years. Ridiculous. And I don’t know who’s to blame, but I don’t like it.

2009 and penguinpunk.net

It was a busy year, and I don’t normally do this type of post, but I thought I’d try to do a year in review type thing so I can look back at the end of 2010 and see what kind of promises I’ve broken. Also, the Exchange Guy will no doubt enjoy the size comparison. You can see what I mean by that here.

In any case, here’re some broad stats on the site. In 2008 the site had 14966 unique visitors according to Advanced Web Statistics 6.5 (build 1.857). But in 2009, it had 15856 unique visitors – according to Advanced Web Statistics 6.5 (build 1.857). That’s an increase of some 890 unique visitors, also known as year-on-year growth of approximately 5.95%. My maths are pretty bad at the best of times, but I normally work with storage arrays, not web statistics. In any case, most of the traffic is no doubt down to me spending time editing posts and uploading articles, but it’s nice to think that it’s been relatively consistent, if not a little lower than I’d hoped. This year (2010 for those of you playing at home) will be the site’s first full year using Google Analytics, so assuming I don’t stuff things up too badly, I’ll have some prettier graphs to present this time next year. That said, MYOB / smartyhost are updating the web backend shortly, so I can’t make any promises that I’ll have solid stats for this year, or even a website :)
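Year-on-year growth is just (new − old) / old × 100, which is easy enough to sanity-check with a one-liner:

```shell
# 2008: 14966 unique visitors; 2009: 15856
awk 'BEGIN { printf "%.2f\n", (15856 - 14966) * 100 / 14966 }'
# prints 5.95
```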

What were the top posts? Couldn’t tell you. I do, however, have some blogging-type goals for the year:

1. Blog with more focus and frequency – although this doesn’t mean I won’t throw in random youtube clips at times.

2. Work more on the promotion of the site. Not that there’s a lot of point promoting something if it lacks content.

3. Revisit the articles section and revise where necessary. Add more articles to the articles page.

On the work front, I’m architecting the move of my current employer from a single data centre to a 2+1 active / active architecture (from a storage and virtualisation perspective). There are more blades, more CLARiiON, more MV/S, some vSphere and SRM stuff, and that blasted Cisco MDS fabric stuff is involved too. Plus a bunch of stuff I’ve probably forgotten. So I think it will be a lot of fun, and a great achievement if we actually get anything done by June this year. I expect there’ll be some moments of sheer boredom as I work my way through hundreds of incremental SAN Copies and sVMotions. But I also expect there will be moments of great excitement when we flick the switch on various things and watch a bunch of Visio illustrations turn into something meaningful.

Or I might just pursue my dream of blogging about the various media streaming devices on the market. Not sure yet. In any case, thanks for reading, keep on reading, tell your friends, and click on the damn Google ads.