macOS Catalina and Frequent QNAP SMB Disconnections

TL;DR – I have no idea why this is happening as frequently as it is, what’s causing it, or how to stop it. So I’m using AFP for the moment.

Background

I run a Plex server on my Mac mini 2018 running macOS Catalina 10.15.6. I have all of my media stored on a QNAP TS-831X NAS running QTS 4.4.3.1400 with volumes connected to macOS over SMB. This has worked reasonably well with Catalina for the last year (?) or so, but with the latest Catalina update I’ve had frequent (as in every few hours) disconnections from the shares. I’ve tried a variety of fixes, and thought I’d document them here. None of them really worked, so what I’m hoping is that someone with solid macOS chops will be able to tell me what I’m doing wrong.


Possible Solutions

QNAP and SMB

I made sure I was running the latest QNAP firmware version. I noticed in the release notes for 4.4.3.1381 that it fixed a problem where “[u]sers could not mount NAS shared folders and external storage devices at the same time on macOS via SMB”. This wasn’t quite the issue I was experiencing, but I was nonetheless hopeful. No such luck. This thread talked about SMB support levels. I was running my shares with support for SMB 2.1 through 3.0. I’ve since changed that to 3.0 only. No dice.
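If you want to confirm which SMB dialect macOS has actually negotiated with the NAS after a change like this, smbutil can report on mounted shares. A quick sketch (the share path below is an example):

```shell
# Report the negotiated SMB version (and signing state) for all mounted shares;
# after restricting the NAS to SMB 3.0 only, SMB_VERSION should read SMB_3.x
smbutil statshares -a

# Or check a single mount point (example path)
smbutil statshares -m /Volumes/Multimedia
```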

macOS

This guy on this thread thinks he’s nailed it. He may have, but it didn’t work for me. I’ve included some of the text for reference below.

Server Performance Mode

https://support.apple.com/en-us/HT202528

The description reads:

Performance mode changes the system parameters of your Mac. These changes take better advantage of your hardware for demanding server applications. A Mac that needs to run high-performance services can turn on performance mode to dedicate additional system resources for server applications.

“Solution:

1 – First check to see if server performance mode is enabled on your machine using this Terminal command. You should see the command return serverperfmode=1 if it is enabled.

nvram boot-args

2 – If you do not see serverperfmode=1 returned, enter this following line of code to enable it. (I recommend rebooting your system afterwards)

sudo nvram boot-args="serverperfmode=1 $(nvram boot-args 2>/dev/null | cut -f 2-)"

I’ve also tried changing the power settings on the Mac mini, and disabled power nap. No luck there either. I’ve also tried using the FQDN of the NAS as opposed to the short name of the device when I map the drives. Nope, nothing.
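For completeness, the power tweaks amount to a couple of pmset commands in Terminal; this is a sketch of the settings I fiddled with, not a recommendation:

```shell
# Show the current power management settings
pmset -g

# Disable Power Nap on all power sources, and stop the system and disks
# from sleeping (values here reflect what I tried, not sensible defaults)
sudo pmset -a powernap 0
sudo pmset -a sleep 0
sudo pmset -a disksleep 0
```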


Solution

My QNAP still supports the Apple Filing Protocol (AFP), and it supports multiple protocols for the same share. So I turned on AFP and mapped the drives that way. I’m pleased to say that I haven’t had the shares disconnect since (and have thus had a much smoother Plex experience), but I’m sad to say that this is the only solution I have to offer for the moment. And if your storage device doesn’t support AFP? Sod knows. I haven’t tried doing it via NFS, but I’ve heard reports that NFS was its own special bin fire in recent versions of Catalina. It’s an underwhelming situation, and maybe one day I’ll happen across the solution. And I can share it here and we can all do a happy dance.
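If you’d rather script the AFP mounts than click through Finder’s “Connect to Server”, mount_afp does the job; the hostname, share, and credentials below are examples:

```shell
# Create a mount point and mount the share over AFP
# (replace user, password, nas.local and Multimedia with your own details)
mkdir -p /Volumes/Multimedia
mount_afp "afp://user:password@nas.local/Multimedia" /Volumes/Multimedia
```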

Random Short Take #40

Welcome to Random Short Take #40. Quite a few players have worn 40 in the NBA, including the flat-top king Frank Brickowski. But my favourite player to wear number 40 was the Reign Man – Shawn Kemp. So let’s get random.

  • Dell EMC PowerProtect Data Manager 19.5 was released in early July and Preston covered it pretty comprehensively here.
  • Speaking of data protection software releases and enhancements, we’ve barely recovered from the excitement of Veeam v10 being released and Anthony is already talking about v11. More on that here.
  • Speaking of Veeam, Rhys posted a very detailed article on setting up a Veeam backup repository on NFS using a Pure Storage FlashBlade environment.
  • Sticking with the data protection theme, I penned a piece over at Gestalt IT for Druva talking about OneDrive protection and why it’s important.
  • OpenDrives has some new gear available – you can read more about that here.
  • The nice folks at Spectro Cloud recently announced that its first product is generally available. You can read the press release here.
  • William Lam put out a great article on passing through the integrated GPU on Apple Mac minis with ESXi 7.
  • Time passes on, and Christian recently celebrated 10 years on his blog, which I think is a worthy achievement.

Happy Friday!

Random Short Take #39

Welcome to Random Short Take #39. Not a huge number of players have worn 39 in the NBA, and I’m not going to pretend I’m any real fan of The Dwightmare. But things are tough all around, so let’s remain optimistic and push through to number 40. Anyway, let’s get random.

  • VeeamON 2020 was online this week, and Anthony Spiteri has done a great job of summarising the major technical session announcements here.
  • I’ve known Howard Marks for a while now, and always relish the opportunity to speak with him when I can. This post is pretty hilarious, and I’m looking forward to reading the followup posts.
  • This is a great article from Alastair Cooke on COVID-19 and what En-Zed has done effectively to stop the spread. It was interesting to hear his thoughts on returning to the US, and I do agree that it’s going to be some time until I make the trip across the Pacific again.
  • Sometimes people get crazy ideas about how they might repurpose some old bits of technology. It’s even better when they write about their experiences in doing so. This article on automating an iPod Hi-Fi’s volume control over at Six Colors was fantastic.
  • Chris M. Evans put out a typically thought-provoking piece on data migration challenges recently that I think is worth checking out. I’ve been talking a lot to customers that are facing these challenges on a daily basis, and it’s interesting to see how, regardless of the industry vertical they operate in, it’s sometimes just a matter of the depth varying, so to speak.
  • I frequently bump into Ray Lucchesi at conferences, and he knows a fair bit about what does and doesn’t work. This article on his experiences recently with a number of virtual and online conferences is the epitome of constructive criticism.
  • Speaking of online conferences, the Australian VMUG UserCon will be virtual this year and will be held on the 30th July. You can find out more and register here.
  • Finally, if you’ve spent any time with me socially, you’ll know I’m a basketball nut. And invariably I’ll tell you that Deftones is my favouritest band ever. So it was great to come across this article about White Pony on one of my favourite sports (and popular culture) websites. If you’re a fan of Deftones, this is one to check out.


OT – Upgrading From macOS Mojave To Catalina (The Hard Way)

This post is really about the boring stuff I do when I have a day off and isn’t terribly exciting. TL;DR I had some problems upgrading to Catalina, and had to start from scratch.


Background

I’ve had an Apple Mac since around 2008. I upgraded from a 24″ iMac to a 27″ iMac and was super impressed with the process of migrating between machines, primarily because of Time Machine’s ability to recover settings, applications, and data in a fairly seamless fashion. I can’t remember what version of macOS I started with (maybe Leopard?), but I’ve moved steadily through the last few versions with a minimal amount of fuss. I was running Mojave on my iMac late last year when I purchased a refurbished 2018 Mac mini. At the time, I decided not to upgrade to Catalina, as I’d had a few issues with my work laptop and didn’t need the aggravation. So I migrated from the iMac to the Mac mini and kept on keeping on with Mojave.

Fast forward to April this year, and the Mac mini gave up the ghost. With Apple shutting down its stores here in response to COVID-19, it was a two-week turnaround at the local repair place to get the machine fixed. In the meantime, I was able to use Time Machine to load everything on a 2012 MacBook Pro that was being used sparingly. It was a bit clunky, but it had an internal SSD and 16GB of RAM, so it could handle the basics pretty comfortably. When the Mac mini was repaired, I used Time Machine once again to move everything back. It’s important to note that this is everything (settings, applications, and data) that had been accumulated since 2008. So there’s a bit of cruft associated with this build: a bunch of 32-bit applications that I’d lost track of, widgets that were no longer really in use, and so on.


The Big Update

I took the day off on Friday last week. I’d been working a lot of hours since COVID-19 restrictions kicked in here, and I’d been filling my commuting time with day job work (sorry blog!). I thought it would be fun to upgrade the Mac mini to Catalina. I felt that things were in a reasonable enough state that I could work with what it had to offer, and I get twitchy when there’s an upgrade notification on the Settings icon. Just sitting there, taunting me.

I downloaded the installer and pressed on. No dice: my system volume wasn’t formatted with APFS. How could this be? Well, even though APFS has been around for a little while now, I’d been moving my installation across various machines. At the time the APFS conversion was part of the macOS upgrade, I was running an iMac with a spinning disk as the system volume, so it never prompted me to do that upgrade. When I moved to the Mac mini, I didn’t do any macOS upgrade, so I guess it just kept working with the HFS+ volume. It seems a bit weird that Catalina doesn’t offer a workaround for this, but I may just have been looking in the wrong place.

Now, there was a lot of chatter in the forums about rebooting into Recovery Mode and converting the drive to an APFS volume. No matter what I tried, I was unable to do this effectively (either using the Recovery Mode console with Mojave or with Catalina booting from USB). I followed articles like this one but just didn’t have the same experience. And when I erased the system drive and attempted to recover from Time Machine backups, it would re-erase the volume as HFS+. So, I don’t know, I guess I’m an idiot.

The solution that finally worked for me was to erase the drive, format it as APFS, install Mojave from scratch, and recover from a Time Machine backup. Unfortunately, though, this only seemed to want to transfer around 800KB of settings data. The normal “wait a few hours while we copy your stuff” just didn’t happen. Sod knows why, but what I did know was that I was really wasting my day off with this stuff.
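For reference, the in-place conversion that all the forum chatter pointed at boils down to a single diskutil command from Recovery Mode. The disk identifier below is an example (check diskutil list first, and have a backup handy), and this is exactly the step that kept failing for me, so treat it as a sketch rather than a fix:

```shell
# Identify the system volume
diskutil list

# Convert the HFS+ volume to APFS in place (example identifier)
diskutil apfs convert disk0s2
```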

I also ran into an issue trying to do the installation from USB. You can read about booting from external devices and the T2 security chip here, here, and here. I lost patience with the process and took a different approach.


Is That So Bad?

Not really. I have my Photos library and iTunes media on a separate volume. I have one email account that we have used POP with over the years, but I installed Thunderbird, recovered the profile from my Time Machine data, and modified profiles.ini to point to that profile (causing some flashbacks to my early days on a help desk supporting a Netscape user base). The other thing I had to do was recover my Plex database. You can read more on that here. It actually went reasonably well. I’d been storing my iPhone backups on a separate volume too, and had to follow this process to relocate those backup files. Otherwise, Microsoft, to its credit, has made the reinstallation process super simple with Microsoft 365. Once I had almost everything set up again, I was able to perform the upgrade to Catalina.


Conclusion

If this process sounds like it was a bit of a pain, it was. I don’t know that Apple has necessarily dropped the ball in terms of usability in the last few years, but sometimes it feels like it. I think I just had really high expectations based on some good fortune I’d enjoyed over the past 12 years. I’m not sure what the term is exactly (sunk cost fallacy, perhaps?), but it’s possible that because I’ve invested this much money in a product, I’m more forgiving of the issues associated with it. Apple has done a great job historically of masking the complexity of technology from the end user. Sometimes, though, you’re going to come across odd situations that push you down an even odder path. That’s what I tell myself anyway as I rue the time I lost on this upgrade. Was anyone else’s upgrade to Catalina this annoying?

Random Short Take #32

Welcome to Random Short Take #32. A lot of good players have worn 32 in the NBA. I’m a big fan of Magic Johnson, but honourable mentions go to Jimmer Fredette and Blake Griffin. It’s a bit of a weird time around the world at the moment, but let’s get to it.

  • Veeam 10 was finally announced a little while ago and is now available for deployment. I work for a service provider, and we use Veeam, so this article from Anthony was just what I was after. There’s a What’s New article from Veeam you can view here too.
  • I like charts, and I like Apple laptops, so this chart was a real treat. The lack of ports is nice to look at, I guess, but carrying a bag of dongles around with me is a bit of a pain.
  • VMware recently made some big announcements around vSphere 7, amongst other things. Ather Beg did a great job of breaking down the important bits. If you like to watch videos, this series from VMware’s recent presentations at Tech Field Day 21 is extremely informative.
  • Speaking of VMware Cloud Foundation, Cormac Hogan recently wrote a great article on getting started with VCF 4.0. If you’re new to VCF – this is a great resource.
  • Leaseweb Global recently announced the availability of 2nd Generation AMD EPYC powered hosts as part of its offering. I had a chance to speak with Mathijs Heikamph about it a little while ago. One of the most interesting things he said, when I questioned him about the market appetite for dedicated servers, was “[t]here’s no beating a dedicated server when you know the workload”. You can read the press release here.
  • This article is just … ugh. I used to feel a little sorry for businesses being disrupted by new technologies. My sympathy is rapidly diminishing though.
  • There’s a whole bunch of misinformation on the Internet about COVID-19 at the moment, but sometimes a useful nugget pops up. This article from Kieren McCarthy over at El Reg delivers some great tips on working from home – something more and more of us (at least in the tech industry) are doing right now. It’s not all about having a great webcam or killer standup desk.
  • Speaking of things to do when you’re working at home, JB posted a handy note on what he’s doing when it comes to lifting weights and getting in some regular exercise. I’ve been using this opportunity to get back into garage weights, but apparently it’s important to lift stuff more than once a month.

Random Short Take #23

Want some news? In a shorter format? And a little bit random? This listicle might be for you.

  • Remember Retrospect? They were acquired by StorCentric recently. I hadn’t thought about them in some time, but they’re still around, and celebrating their 30th anniversary. Read a little more about the history of the brand here.
  • Sometimes size does matter. This article around deduplication and block / segment size from Preston was particularly enlightening.
  • This article from Russ had some great insights into why it’s not wise to entirely rule out doing things the way service providers do just because you’re working in enterprise. I’ve had experience in both SPs and enterprise and I agree that there are things that can be learnt on both sides.
  • This is a great article from Chris Evans about the difficulties associated with managing legacy backup infrastructure.
  • The Pure Storage VM Analytics Collector is now available as an OVA.
  • If you’re thinking of updating your Mac’s operating environment, this is a fairly comprehensive review of what macOS Catalina has to offer, along with some caveats.
  • Anthony has been doing a bunch of cool stuff with Terraform recently, including using variable maps to deploy vSphere VMs. You can read more about that here.
  • Speaking of people who work at Veeam, Hal has put together a great article on orchestrating Veeam recovery activities to Azure.
  • Finally, the Brisbane VMUG meeting originally planned for Tuesday 8th has been moved to the 15th. Details here.

Random Short Take #17

Here are some links to some random news items and other content that I recently found interesting. You might find them interesting too. Episode 17 – am I over-sharing? There’s so much I want you to know about.

  • I seem to always be including a link from the Backblaze blog. That’s mainly because they write about things I’m interested in. In this case, they’ve posted an article discussing the differences between availability and durability that I think is worth your time.
  • Speaking of interesting topics, Preston posted an article on NetWorker Pools with Data Domain that’s worth looking at if you’re into that kind of thing.
  • Maintaining the data protection theme, Alastair wrote an interesting article titled “The Best Automation Is One You Don’t Write” (you know, like the best IO is one you don’t need to do?) as part of his work with Cohesity. It’s a good article, and not just because he mentions my name in it.
  • I recently wanted to change the edition of Microsoft Office I was using on my MacBook Pro and couldn’t really work out how to do it. In the end, the answer is simple. Download a Microsoft utility to remove your Office licenses, and then fire up an Office product and it will prompt you to re-enter your information at that point.
  • This is an old article, but it answered my question about validating MD5 checksums on macOS.
  • Excelero have been doing some cool stuff with Imperial College London – you can read more about that here.
  • Oh hey, Flixster Video is closing down. I received this in my inbox recently: “[f]ollowing the announcement by UltraViolet that it will be discontinuing its service on July 31, 2019, we are writing to provide you notice that Flixster Video is planning to shut down its website, applications and operations on October 31, 2019”. It makes sense, obviously, given UltraViolet’s demise, but it still drives me nuts. The ephemeral nature of digital media is why I still have a house full of various sized discs with various kinds of media stored on them. I think the answer is to give yourself over to the streaming lifestyle, and understand that you’ll never “own” media like you used to think you did. But I can’t help but feel like people outside of the US are getting shafted in that scenario.
  • In keeping up with the “random” theme of these posts, it was only last week that I learned that “Television, the Drug of the Nation” from the very excellent album “Hypocrisy Is the Greatest Luxury” by The Disposable Heroes of Hiphoprisy was originally released by Michael Franti and Rono Tse when they were members of The Beatnigs. If you’re unfamiliar with any of this I recommend you check them out.

Random Short Take #5

So it’s been over six months since I did one of these, and it’s clear that I’m literally rubbish at doing them regularly.

HTPC – Replacing The PC With macOS

The Problem

My HTPC (running Windows 7 Media Center) died a few months ago after around 5 or 6 years of service. It was a shame as I had used it as a backup for my TV recordings and also to rip optical discs to my NAS. At the time it died I was about to depart on a business trip and I couldn’t be bothered trying to work out why it died. So I gave the carcass of the machine to my nephew (who’s learning about PC hardware things) and tried to put something together using other machines in my house (namely an iMac and some other odds and sods). I’m obviously not the first person to use a Mac for these activities, but I thought it would be useful to capture my experiences.


Requirements

Requirements? Isn’t this just for home entertainment? Whatever, relaxation is important, and understanding your users is as well. We record a lot of free-to-air TV with a Fetch Mighty box and sometimes things clash. Or don’t record. “Catch-up” TV services in Australia are improving a lot, but our Netflix catalogue is nowhere near as extensive as the US one. So I like to have a backup option for TV recording. The HTPC provided that. And it had that cool Media Center extender option with the Xbox360 that was actually quite useable and meant I didn’t need a PC sitting in the lounge room.

From a movie consumption perspective, we mostly watch stuff using various clients such as AppleTV or WDTV devices, so the HTPC didn’t really impact anything there, although the way I got data from Blu-ray / DVD / HD-DVD / VCD was impacted as my iMac didn’t have a Blu-ray player attached. Okay, the SuperDrive could deal with VideoCDs, but you get my point.

So, in short, I need something that could:

  • Record free-to-air TV shows via a schedule;
  • Rip Blu-ray and other content to mkv containers (or similar); and
  • Grab accurate metadata for that media for use with Plex.


Solution?

The solution was fairly easy to put together when I thought about what I actually needed.

TV

I backed a Kickstarter for the HDHomeRun Connect some time ago and hadn’t really used the device very effectively save for the odd VLC stream on an iPad. It’s a dual-tuner device that sits on your wired network and is addressable by a few different applications over IP. The good news is that Elgato EyeTV works with both macOS and IceTV (a local TV guide provider) and also supports the HDHomeRun. I haven’t tested multiple HDHomeRun devices with the same EyeTV software and I’m not 100% convinced that this would work. I had 8 tuners on the HTPC so this is a bit of a bummer, but as it’s not the primary recording device I can live with it. The EyeTV software supports multiple export options too, so I can push shows into iTunes and have AppleTV pick them up there.

Optical Discs

I bought a USB-based Pioneer (BDR-XS06) Blu-ray drive that works with macOS, and MakeMKV is still my go-to in terms of mkv ripping. It maxes out at 2x. I’m not sure if this is a macOS limitation or firmware crippling or some other thing. Now that I think about it, it’s likely the USB bus on my iMac. If anyone wants to send me a Thunderbolt dock I’m happy to try that out as an alternative. In any case, a standard movie Blu-ray takes just shy of an hour to rip. It does work though. If I still need to rip HD-DVDs I can use the drive from the Xbox360 that I have laying about. What? I like to have options.
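As an aside, MakeMKV also ships with a command-line client (makemkvcon), which is handy if you want to kick off rips without the GUI; the disc index and output path here are examples:

```shell
# List attached optical drives and discs
makemkvcon -r info disc:9999

# Rip all titles from the first disc into an output folder (example path)
makemkvcon mkv disc:0 all /Volumes/Multimedia/rips
```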

Metadata

For movie metadata I settled on MediaElch for scraping and have found it to be easy to use and reliable. Why bother with scraping metadata? It sometimes saves a bit of effort when Plex gets the match wrong.

Plex

I run Plex as the primary media server for the house (using either iPad, rasplex or AppleTV), with content being served from a number of NAS devices. It took me a while to get on board with Plex, but the ability to install the app on a 4th generation AppleTV and the portability of media has been useful at times (think plane trips on economy airlines where entertainment is an extra charge). Plex are working on a bunch of cool new features, and I’m going to try and make some time to test out the DVR functionality in the near future.

I’ve also recently started to use the auto_master file as the mechanism for mounting my SMB shares automatically on the iMac. I found the user-based Login Items method a bit flaky – shares would disappear every now and then and confuse Plex. I have three NAS devices all using a variation of the name “Multimedia” as their main share (no, I didn’t really think this deployment through). As a result my iMac mounts them under /Volumes as “Multimedia-1”, “Multimedia-2”, etc. This is fine, but the order in which they’re mounted after a reboot can mess things up a bit. The process to use auto_master is fairly simple and can be found here. I’ve included an overview for your enjoyment. Firstly, fire up a terminal session and make a /mnt directory if you don’t have one already.

Last login: Mon Aug 14 05:10:12 on ttys000
imac27:~ dan$ pwd
/Users/dan
imac27:~ dan$ cd /
imac27:/ dan$ sudo mkdir mnt
Password:

You’ll then want to edit the auto_master file so that it references the auto_nas map when it runs.

imac27:/ dan$ sudo nano /etc/auto_master
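For reference, the line I added to /etc/auto_master looks something like this (the mount point and map name follow the convention used in this post):

```shell
# /etc/auto_master – appended line; /etc/auto_nas holds the share definitions
/mnt/NAS    auto_nas
```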

You should also ensure the NAS directory exists (if it doesn’t already).

imac27:/ dan$ cd /mnt
imac27:mnt dan$ sudo mkdir NAS

You can now create / modify the auto_nas file and include mount points, credentials and the shares.

imac27:/ dan$ sudo nano /etc/auto_nas
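The map file itself then gets one line per share, with the SMB credentials inline. Everything below is made up for illustration (and note the caveat about special characters in passwords a little further down):

```shell
# /etc/auto_nas – each entry becomes a directory under /mnt/NAS
412multimedia    -fstype=smbfs    ://user:password@nas412/Multimedia
831multimedia    -fstype=smbfs    ://user:password@nas831/Multimedia
```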

Now run your automount command.

imac27:/ dan$ sudo automount -vc
automount: /net updated
automount: /home updated
automount: /mnt/NAS updated
automount: no unmounts

Once this is done you’ll want to test that it works.

imac27:/ dan$ cd /mnt/NAS/
imac27:NAS dan$ ls
831multimedia
imac27:NAS dan$ cd 831multimedia/
-bash: cd: 831multimedia/: Too many users

Note that if you’re having issues with a “Too many users” error, it may be because you’re using a special character (like @) in your password and this is messing up the syntax for auto_master. Check this post for a resolution. Once you’ve sorted that it will look more like this.

imac27:NAS dan$ ls
412multimedia    831multimedia    omvmultimedia
imac27:NAS dan$ cd 412multimedia/
imac27:412multimedia dan$ ls
tv
imac27:412multimedia dan$ cd ..
imac27:NAS dan$ cd omvmultimedia/
imac27:omvmultimedia dan$ ls
basketball    music        music video    skateboarding    star wars    tv

It’s also a good idea to lock down the permissions on that auto_nas file, because you’re storing credentials in there in plain text.

imac27:/ dan$ sudo chmod 600 /etc/auto_nas
Password:

And now you can point Plex at these mount points and you shouldn’t have any problems with SMB shares disappearing randomly.

There’s just one thing though …

I’ve read a lot of reports of this functionality being broken in more recent versions of macOS. I’ve also witnessed shares disappear and get remounted with root permissions. This is obviously not ideal. There are a number of solutions floating around, including running a bash script to unmount the devices as root and then change directory as a normal user (prompting autofs to remount them as that user). I found a suggestion in a forum somewhere to use -nosuid in my auto_master file. I did this and rebooted, and the drives seem to have mounted as me rather than root. I’ll keep an eye on this and see if it continues to work or whether autofs remounts the shares as root. This would seem a non-ideal solution in a multi-user environment, but that’s outside my ken at this stage.

*Update – 2017/08/20*

I ended up having problems with the auto_master method as well, and decided to create new shares on the various NAS devices with different names. These then mount as /Volumes/nas1data, /Volumes/nas2data, etc. So theoretically even if I lose the connection I can remount them manually and know that they’ll come up with a consistent path name and Plex won’t get too confused. I dragged these mounted volumes into my Login Items so that they mount every time the machine reboots. It still bites that they disappear every now and then though.
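For what it’s worth, the same remounts can also be scripted with a bit of AppleScript rather than relying on Login Items, which brings a share back with the same /Volumes path each time (server and share names below are examples):

```shell
# Remount a share by name; it reappears as /Volumes/nas1data each time
osascript -e 'mount volume "smb://nas1/nas1data"'
```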


HTPCs Are So Last Decade

I recently threw out my RealMagic Hollywood+ XP cards. Those things were great at a time when playing DVDs on a PC was a real challenge. I remember being excited to play full screen MPEG-2 movies on my NT 4 Workstation running on a Pentium 133. Fast forward to about 8-10 years ago and it seemed everyone was running some kind of “Media Center” PC at home and consuming everything via that. Nowadays people tell me to “just stream it”. But I don’t live in a terribly bandwidth rich environment, and downloading 4GB of data from the Internet to watch a movie needs a little more prior planning than I’d like to do. So I’m still a big fan of physical discs, and still recording television shows via a PVR or via the iMac.

I still have a dream that one day I’ll happen upon the perfect user experience where I can consume whatever digital media I want in the fashion I want to. I’ve tried an awful lot of combinations in terms of backend and frontend, and Plex is pretty close to doing everything I need. They don’t like ISO files though (which is justifiable, for sure). I rip most things in mkv containers now, and all of my iTunes content (well, the music at least) is pretty easy to consume, but there are still some things that aren’t as easy to view. I still haven’t found a reliable way to consume UltraViolet content on a big screen (although I think I could do something with AirPlay). I’ve been reading a bit about various cable boxes in the US that can scan both local and streaming data for whatever you’re after and point you in the right direction. I guess this would be the way to go if you had access to reasonable bandwidth and decent content provider choices.

In any case, it’s still possible to run a HTPC the old-fashioned way with macOS.

Apple – I know too much about iPad recovery after iOS 8

So I now know too much about how to recover old files from iPad backups. I know this isn’t exactly my bread and butter, but I found the process fascinating, and thought it was worth documenting the process here. It all started when I upgraded my wife’s iPad 2 to iOS 8. Bad idea. Basically, it ran like rubbish and was pretty close to unusable. So I rolled it back, using the instructions here. Ok, so that’s cool, but it turns out I can’t restore the data from a backup because that was made with iOS 8 and wasn’t compatible with iOS 7.1.2. Okay, fine, it was probably time to clear out some apps, and all of the photos were saved on the desktop, so no big deal. Fast forward a few days, and we realise that all of her notes were on that device. Now for the fun bit. Note that I’m using a Mac. No idea what you need to do on a Windows machine, but I imagine it’s not too dissimilar.

Step 1. Recover the iPad backup from before the iOS upgrade using Time Machine. Note that you’ll need to be able to see hidden files in Finder, as the backup is stored under ~/Library/Application Support/MobileSync/Backup and Time Machine uses Finder’s settings for file visibility. I used these instructions. Basically, fire up a terminal and type:

$ defaults write com.apple.finder AppleShowAllFiles TRUE
$ killall Finder

You’ll then see the files you need with Time Machine. When you’re finished, type:

$ defaults write com.apple.finder AppleShowAllFiles FALSE
$ killall Finder

Step 2. Now you can browse to ~/Library/Application Support/MobileSync/Backup and recover your backup files. If you have more than one iDevice backed up, you might need to dig around a bit to find the correct files. I used these instructions to locate the correct backup files. You’ll want to look for a file called “Info.plist”. In that file, you’ll see something like:

<key>Device Name</key>
<string>My iPhone</string>

And from there you can restore the correct files. It will look something like this when recovered:

[Screenshot: the recovered backup files]

Step 3. Now you’ll want to go to the normal location of your iPad backups and rename your current backup to something else. Then copy the files that you recovered from Time Machine to this location.

[Screenshot: the restored backup in the MobileSync folder]

Step 4. At this point, I followed these quite excellent instructions from Chris Taylor and used the pretty neat iPhone Backup Extractor to extract the files I needed. Once you’ve extracted the files, you’ll have something like this. Note the path of the files is iOS Files/Library/Notes.

[Screenshot: the extracted files under iOS Files/Library/Notes]

Step 5. At this point, fire up MesaSQLite and open up the “notes.sqlite” file as per the instructions on Chris’s post. Fantastic, I’ve got access to the text from the notes. Except they have a bunch of HTML tags in them and are generally unformatted. Well, I’m pretty lazy, so I used the tool at Web 2.0 Generators to decode the HTML to formatted text for insertion into Notes.app files. And that’s it.
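If you’d rather not paste your notes into a random website, a rough local equivalent is a sed one-liner that drops the tags and decodes a couple of common entities; it’s a sketch, not a full HTML parser:

```shell
# Strip HTML tags and decode &nbsp; and &amp; from exported note text
echo '<div>Shopping list:&nbsp;milk &amp; bread</div>' \
  | sed -e 's/<[^>]*>//g' -e 's/&nbsp;/ /g' -e 's/&amp;/\&/g'
# → Shopping list: milk & bread
```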

Conclusion. As it happens, I’ve now set this iPad up with iCloud synchronisation. *Theoretically* I won’t need to do this again. Nor should I have had to do it in the first place. But I’ve never come across an update that was quite so ugly on particular iDevices. Thanks to Apple for the learning opportunity.