Packrats and Browser Wars

In this series, Alex discusses browsers vs. operating systems, the Mysterious Case of the Unheeded Update, and why video streaming is harder than it should be. It all relates to his efforts in a major tuneup of his MacBook Pro, but the lessons apply to all systems.

I’m a packrat. There, I said it. I like having my stuff on my computer, reachable even when the Internet’s down. That’s been the premise of the PC revolution going back 30 years. Others, notably Chromebooks and Salesforce, work fully only when the Internet’s working.

That’s an important distinction. In 1994, when Netscape Navigator was the most popular web browser, Marc Andreessen declared the browser would replace the OS. The claim was clearly aimed at Microsoft, which proved him wrong with Windows 95, but it still was a marker for the future.

Now, Chromebooks and web-based tools like Google Apps routinely put everything (or nearly everything) on the Web. Faster, smaller, much cheaper computers, twenty years of development, plus routine access to broadband, have made “everything on the web” (Or cloud) routine.

But “everything on the web” is quite different from collaborative tools adding to, not replacing, the user’s my-stuff-on-my-device experience. Consider Apple’s Continuity and Handoff: Continuity supports answering a call on whatever device is to hand: Your iPhone rings, answer on your MacBook. This is a great use of Bluetooth Low Energy (Your phone has to be within BT range of the other device), and seems to Just Work.

Handoff does the same for Apple Mail, Messages, Safari, Reminders, Contacts, Maps, Keynote, Pages, Numbers, and Calendar. (There’s a developer API for their own apps, too.) Start a text chat on your phone, it gets heated (This happens). You want your full-size keyboard to type lengthier arguments? So long as both are logged into the same iCloud account, simply pause mid-rant, swap to the other device, and pick right up.

Apple isn’t the only one doing this. Facebook is moving from devices-in-isolation to transparent synchronization of your timeline, new notifications, etc., across multiple platforms, and darned quickly, as anyone who has watched both phone and computer pop up the same message notification within half a second can attest. Presumably these features will extend to WhatsApp as well.

(Related: Facebook recently added encrypted notifications, using OpenPGP; while you might not care whether news that you were poked was spied upon, this is an encouraging step for privacy, especially after they moved to SSL-default access for their webpage.)

Or, consider Google Chrome, which shares browser search history between your devices—I am constantly surprised when I begin searches for “Antique steamrollers” on the PC and it’s an immediate choice when I type “A” on my Samsung Sprint Note 3. In fact, this works so transparently that, when it doesn’t, I know I have no or very limited cell signal. This is the sort of unobtrusive enhancement I quickly learn to rely upon.

Dropbox, Box.com, and a dozen other utilities replicate your files to the cloud; I’m not so Luddite that I dislike automatic, transparent copies to a secure location, but that’s still different from No Connection = No Work, a la the cloud-only collaborative solutions.

In my opinion, these additive tools are so transparent, they don’t replace your existing methods, nor detract from the “My stuff on my computer” model. Arguably, requiring an iCloud account is vendor lock-in, but that’s a subject for a different essay about online storage (Upcoming). My devices magically sharing my search history is a Good Thing, as long as security isn’t on my mind.

Or rather, it would be sharing results: I’ve been using Firefox, with Yahoo as my default search, instead of Google Chrome, and therein lies a tale.

Browser Wars

The last few steps to fixing the problems with mysterious slowdowns involved chasing down browser bugs, or at least browser-versus-plugin bugs. As usual, the culprit seems to be Flash. Windows users, don’t tune out: Most of this applies to you, too.

I used Chrome as my default browser by elimination: Safari (Apple’s own browser) had strange page-display glitches, the cursor would freeze in other apps while Safari was open, and CPU utilization would suddenly go over 100%; Firefox didn’t like playing videos much, and saves within Microsoft Word 2011 could take 30 seconds. Chrome seemed the least-worst of the three. (On the Mac, CPU use is measured per core, not as a percentage of the whole system, so 130% CPU use isn’t as crazy as it seems.)

Then came the long series of fixes previously described; the underlying system problems went away, leaving issues only when I had a browser open. While performance has been blessedly better, it hasn’t been perfect. As before, the issues were sudden CPU spikes, accompanied by 747-on-takeoff level fan activity; cursor freezes; very slow saves within Word or Excel. To finish the investigation, I tried all three browsers again.

Of the three, Chrome now seems to cause the most slowdown. Chrome may also be why AppleSpell.service and AppleSpell grow’d like Topsy; since I stopped using Chrome as my main browser, I’ve seen all of four copies running in the two weeks since my last reboot, instead of the 30 to 40 I used to find. (None of them seemed to use much CPU or RAM, but it was the principle of the thing.) So, the AppleSpell issue seems to link to a particular browser: Chrome.

Safari? Loading it still brings the machine to a crawl for three to eight minutes, depending on the number of tabs to re-open: Neither unobtrusive nor friendly to routine use.

Time to try Firefox again. Kudos to the Mozilla Firefox developers; modern Firefox is much improved. No apparent markup errors, malformed webpages, dropped tags, or screen tearing. Load and re-load times are much faster, even with two dozen tabs open. I haven’t tried Firefox Hello (their Instant Message/video chat client), or Pocket (share and save webpages for viewing elsewhere), but the general browser performance has been great. Still, it wouldn’t be Chaos Manor without something inexplicable cropping up.

I was running version 38.0.5 of Firefox; per Firefox | About Firefox, this was “up to date”. I did get an odd error message about how “Something is trying to trick Firefox into accepting an insecure update. Please contact your network provider and seek help.” Alas, I’m my own network provider—no one to call, and nothing odd in the router log. None of the online help, including the Mozilla support site, had anything useful to say. I found no evidence I actually was facing a man-in-the-middle, pagejack, suborned webpage or any other obvious attack. Avast anti-virus remained mum on any maladies. When the message popped up again, I investigated further.

A trip to the Firefox download site indicated Version 39 was ready for download, despite Version 38.0.5 insisting it was the absolute latest. One upgrade later, I appear to have better compatibility with Flash (More in a minute), faster load times, and greater browser compatibility. The mysterious “insecure update” message did re-appear, so something’s still not right. Further research turned up nothing about Firefox update checks being spoofed, or any other cause. I’ll keep an eye out for answers, though.

Next Time

In the next installment, I’ll talk about browser compatibility (In general, improving), the current and future plumbing of web-delivered video, and how to check your own system out in a little more depth.

We invite your comments below. And if you have an ‘I did this so you don’t have to’ story to tell, consider sharing with us. Details on how to do that are here. – Editor

Weather Stations and USB Servers

Your intrepid CMR editor bought a weather station last year that can update to the Weather Underground network. But to do that, he needed to connect the weather console to the PC. He wanted the console to be downstairs – and the desktop is upstairs. He discovered a USB server that connects USB devices over his LAN, saving him quite a few ‘sawbucks’.

A trip last year to Costco netted me an Acurite Professional Weather Station (model 0232C, which lists for $169.99, available on Amazon for $113); at Costco, the price was under $85. I have always been interested in getting a weather station, but most cost even more than $170. So when I saw the one at the local Costco at that price, I decided to get it. (It may not be available now, but the story is still interesting.)

Setup was fairly easy, although at additional cost. “Siting” a weather station is important to get accurate readings, but I had a good spot in the corner of my yard – no fences, lots of open space around it. In my case, there was a solid tree stump at that corner, so a trip to the local Home Depot netted me a 1” galvanized flange, a 1” to ¾” adapter, and a five foot ¾” galvanized pipe, plus four 2 ½” lag bolts, and a can of beige spray paint (to help the post blend into the neighborhood, since it is visible from the street). The weather station is mounted on the ¾” pipe, although the kit did include a mounting bracket to connect to a wooden post. All of that got the weather station installed in a great spot, and so far, the neighbors haven’t complained about the placement (there is an HOA involved, so don’t tell them).

The battery-operated outdoor unit (with a solar panel to drive an internal fan) communicates with the indoor color display wirelessly. The display has a USB port to connect to a computer, and then you can use the included software (although I found some better software than came with the kit) to send your data to a place like Weather Underground. You can also install an app on your Android or Apple device to display your data. But you do need to connect the display to your computer via the included USB cable.

And that was my problem that makes all of this related to a “doing things so you don’t have to” story.

In my house, the living room is the desired location for the color weather display. But the desktop computer is in the office upstairs. That computer doesn’t get used much, since my wife and I both have laptops that we use to connect to our home wireless network (LAN). But the desktop is connected to the laser printer (via a network printer device) so we can use that printer. The desktop is also used as the backup for our laptop files, which are backed up to the ‘cloud’ with Carbonite. (We use the Microsoft SyncToy to sync files from laptops to the desktop.) And the desktop is connected to our little LAN via an older Netgear Wireless Extender. It all works together quite nicely, in a geeky sort of way.

In order to get the weather data to Weather Underground (WU), the weather station comes with an app that is loaded on a computer. The weather display is then connected to the computer via the USB cable, and then the desktop app sends the data up to WU. So my layout was not going to work, unless I used a 50 foot USB cable draped around the walls. Not a good home decorating idea.

Acurite had a solution, though, called a Network Bridge, at $80 retail (plus shipping). A bit steep, I thought.

A search (with a bit of help from fellow CMR advisor Eric Pobirs) eventually got me to the Monoprice company web site, which had a USB server that allowed USB devices to connect to a computer via IP. The list price at Monoprice is $24, but who pays list? A search on Amazon didn’t find a cheaper price. The listing there was about $31, but there were a couple of used ones for under $10 (at the time; current pricing for used ones now is around $27) including ground shipping. The device specs and features looked good, and losing a ‘sawbuck’ (for you youngsters, see here) was OK if it didn’t work, so I ordered the used one.

It arrived a few days later in a padded package: the device, the power supply, and the mini-CD with the device application. It was a simple installation process: connect the weather station USB cord to the USB server, plug the server into the router (which is in the same downstairs location as the weather display), and connect the AC adapter.

Then upstairs to the desktop computer to install the device software from the mini-CD. A few clicks, and a search via the app to find the USB server on the network, and the weather display was connected. (The device used DHCP to get its IP address, although you can assign an IP address with the app.)

I switched to the Acurite weather application, and it saw the weather display that was connected over my LAN via the little USB server. It all worked.

The USB server device can support a USB hub, so you could connect more than one USB device to your local network. You can also use it to connect a non-network type printer to your local network. If you have multiple computers, the USB server software will allow each computer to connect to any USB device, although not at the same time. The software does have a way to send a message from one computer to another to release control of the USB device.

The USB server might be most useful for sharing non-network-enabled printers at a reasonable price, although wireless-enabled printers are not that expensive. And you can find little USB wireless print servers that will allow you to connect a printer to your wireless network.

In my case, I saved over $60 using the USB server. My weather display is now connected to the desktop computer through my local network, which allows me to share my weather station on Weather Underground.

Do you have a story about computers that you’d like to share? See this page for details. – Editor

Desktop and Laptop Upgrade

CMR Reader Karen Parker decided she wanted to replace an older Windows 7 desktop with some newer hardware. Rather than purchasing a new system, she decided to ‘build her own’, and then transfer her OS and applications and data to the new system.

And then, apparently on a ‘roll’, she decided to upgrade her Windows 8 laptop with a new internal SSD drive, moving its current drive to an external box.

So in the CMR grand tradition of “I did so you don’t have to” (or “you may want to, too”), we share her experience. – Editor

For many years I’ve followed the theory that when you get a new machine you get one that is one step back from the bleeding edge (I may well have learned this from you.) This has the advantage that the machine is good for a long time, and you don’t spend a lot of money upgrading every other year, and the disadvantage that you frequently end up waiting a while before getting new technology. This last point is the hook that caught me with my desktop upgrade.

The old desktop machine was about seven years old, based on a Core i7-920 processor. My theory was that I’d get a new processor, motherboard, and RAM and just install them in my existing case, reusing the video board, disc drives, etc. I decided on a new Core i7-5820 six-core processor, and off to my local Micro Center I went. They pointed me to an ASRock motherboard, suitable memory, and oh, by the way, you need a CPU cooler since the chip doesn’t come with one in the box. I took it all home and hit the first snag — the cooler was too tall to fit in the case. Back to Micro Center the next day for a new case and power supply.

After a day or so of studying all the instructions (it’s been several years since I assembled a computer after all) I gritted my teeth and put it all together. So far, so good.

My plan was to boot the machine into safe mode, install the new motherboard drivers, reboot, and go. So much for plans . . .

The machine would start to boot and then blue-screen. After a bit of research on the web, and a bit of thought, I realized what the problem was. The old machine only implemented one way of talking to SATA drives, the IDE protocol. AHCI wasn’t yet the norm when I built that machine, but in the meantime it has become the default. I powered up the machine, went into the BIOS, and sure enough, the SATA ports were set to AHCI. I reset them to IDE, booted the machine and it came right up. I then installed the motherboard drivers, rebooted again, and the machine was alive and fully functional.

The only fly in the ointment was that it’s still using IDE protocol for the drives, hardly optimal, especially for my SSD boot drive.

Another web search revealed the necessary incantation to force Windows to detect AHCI and install the necessary drivers. You need to run regedit, go into the system registry, and navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\msahci. Select the “Start” value, right-click it, select “Modify”, and set its value to 0.

Exit regedit, reboot, go into the BIOS, change the SATA ports from IDE to AHCI, and then reboot again. Windows will load the drivers, and then reboot once more, and Bob’s Your Uncle. I did have to re-authenticate my copy of Windows, but that was done automatically over the internet, and took only a moment.
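For those who would rather skip clicking around in regedit, the same registry change can be made from an elevated Command Prompt with reg.exe before that first reboot. This is a sketch for Windows 7, where the AHCI driver service is named msahci (Windows 8 and later use a different service, storahci, and the details differ):

rem Set the msahci driver to load at boot (Start = 0); run from an elevated Command Prompt
reg add "HKLM\SYSTEM\CurrentControlSet\Services\msahci" /v Start /t REG_DWORD /d 0 /f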

In the end, I managed to put a whole new machine under my existing copy of Windows, without a reinstall, always a painful proposition that takes at least a week of reinstalling and reconfiguring software.

As an aside of sorts, I have to comment on the design of the new case. This is the first case I’ve used that provides for routing the cables behind the motherboard, and it was an absolute joy to work with. A far cry from the days when you had PATA ribbon cables sprouting out of the middle of the motherboard, and running willy-nilly through the middle of the case. Tool free drive caddies are very nice too.

Upgrading the Laptop

Apparently the planets converged this summer because my laptop came due for an upgrade too. The machine I selected is a fairly ordinary high-end laptop, with a Core i7 processor and 17 inch display. It was equipped with a 1 TB 5400 rpm disc drive, so I decided to buy a 500 GB SSD and external enclosure, planning to put the SSD into the machine and the disc drive in the external enclosure.

The SSD came with a license for Acronis True Image cloning software, so I downloaded it and cloned the hard drive to the SSD. I thought that there might be problems cloning a larger drive into a smaller one, but the software seemed to be ok with it, and besides, that big drive was mostly empty anyway. The clone worked just fine, so I went ahead and swapped the disc drive and the SSD, easy enough but requiring the removal and replacement of 16 or so tiny screws. Then I booted the machine, and all hell broke loose.

The machine would boot, but was very balky about doing anything. Any sort of disc operation, in particular, seemed to take half of forever, if it would complete at all. I spent a couple of days trying to fix the machine, to no avail.

So, undo those 16 tiny screws, swap the drives again, and verify that the machine had regained its sanity. I also plugged the SSD, in the external case, into my desktop machine, which promptly complained that the drive needed repair. I let it run chkdsk on the SSD, and after that it could read the SSD just fine. This was the hint I needed.

I formatted the SSD, cloned the disc onto it again, but this time ran chkdsk on it. Chkdsk found and fixed a bunch of errors; when it was done, I verified that the SSD contained everything it should, and copied several files back and forth between the disc and the SSD. All seemed to be well, so I undid those 16 tiny screws yet again and swapped the drives.

This time it all worked, and I now have a laptop that boots lightning quick and runs great, along with 1 TB of external storage. Bottom line: cloning software works well, but it’s probably a good idea to run chkdsk on the cloned drive before you try to use it.
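If you want to make that sanity check part of your routine, chkdsk can be pointed at the external drive from an elevated Command Prompt; a sketch, assuming the cloned SSD shows up as drive E: (substitute whatever letter yours gets):

rem Read-only pass first: report problems without changing anything
chkdsk E:
rem Then fix any file-system errors it found
chkdsk E: /f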

Finally, a question for [Chaos Manor] advisors. My desktop machine is currently running an Nvidia 630 graphics card, certainly not the most powerful card out there. The rest of the machine is pretty high powered: Core i7-5820 processor, 8 GB of RAM, SSD boot drive, 2 TB of other storage, and two monitors. The primary use is processing photos and writing and laying out books, so basically it needs to run MS Office and Adobe Creative Suite, but no games. Do you (collectively) have any suggestions as to what might be a better video card, or even if I need something better?

The Chaos Manor Advisors responded with some thoughts

From Peter Glaskowsky

The one good consequence of the fact that PCs aren’t improving as rapidly as they used to is that older systems are still useful. Pretty much any desktop from the last six or eight years would be fine for reading and writing things.

From Eric Pobirs

    If gaming is not an issue, the main thing to look for in a video card upgrade is the improvement for Adobe Creative Suite and other software that in recent versions can enlist the GPU for additional processing power and not just display tasks. A lot of the stuff that once drove the purchase of dedicated accelerator boards is now well within the functionality of a modern GPU.

From Alex Pournelle

And the built-in graphics are getting better all the time. Intel’s next generation of graphics claims a 30-45% improvement.

From Peter Glaskowsky

Since the Core i7-920 processor was introduced less than seven years ago, she might have made an exception to the “wait a year” rule, or maybe the machine isn’t as old as she thinks. Might as well clear that up before publishing, if possible, so as to avoid confused readers and unnecessary feedback.

As for her graphics card question, her applications don’t require anything better. A significant upgrade would cost maybe $80-$100, but it’s possible she wouldn’t even notice the difference.

It might be worth checking with Adobe to see if her version of Creative Suite supports GPU acceleration, and if so, what graphics cards can be used. It may be that a workstation-class card will deliver a significant performance enhancement in those applications.

Many workstation cards are available for under $100, but for the same price as a consumer card, a workstation card will usually come with an older, slower GPU. So it would take some work to figure out how much it would cost to deliver a meaningful speed boost.

[If you readers have thoughts, use the comments below. And if you would like to share your experience, we’re looking for more stories. – Editor]

A Cleaner Mac Air

Do Alex’s OS X cleanup recommendations work? Chaos Manor Advisor Brian finds out…

When I previewed Alex’s recent CMR article on cleaning up the cruft on a Mac, I thought that I’d best have a go myself. While I’m nobody’s target demographic for installing all of the latest apps and games, I’ve had this particular Mac Air for 4+ years, and migrated to it from a 2007 MacBook Pro that started at OS X Leopard (10.5). So there are probably a few dust bunnies and rodent droppings in the odd corners of the machine. And even though it already runs fine, I see no reason not to see if it can be made better.

First off, I ensured that my home Time Machine backup was current. I also connected and refreshed my work Time Machine backup earlier in the day. Good to have restore targets, no? I used to use Super Duper with great regularity, but in a recent bit of work, I rebuilt directly from Time Machine and it worked just fine. So I now reserve my Super Duper runs to right before major OS X dot releases.

Keep what works

Now, to work. I checked Login Items under my account in Settings / Users & Groups. Not much there, but there were a couple of helper scripts that I don’t really need running. What’s left is just f.lux, a neat bit of software that “makes the color of your computer’s display adapt to the time of day….” I find that f.lux reduces my eyestrain from evening work on the machine. I can’t recommend this enough.

Next up, I went hunting in /Library/LaunchAgents, as the root user, and listed the plist files I found there:

Agog:~ bilbrey$ sudo su 
Agog:~ root# cd /Library/LaunchAgents/
Agog:LaunchAgents root# ls
com.REDACT.ED.gui.plist com.oracle.java.Java-Updater.plist
com.REDACT.ED.upgrader.plist org.gpgtools.macgpg2.gpg-agent.plist
com.google.keystone.agent.plist org.macosforge.xquartz.startx.plist

I need the {REDACT.ED} agent bits for work, but after a bit of research on the other four, I decided to hang onto them, too. I use GPG for assorted purposes, XQuartz is used by some remote work I do with Linux systems, and I have Java installed for the same reasons. The Google updater? Well, yeah, I’ll keep that, too since Chrome is my primary browser here and everywhere else, and I’d generally like to have it up-to-date.

Under /Library/PreferencePanes, I only had the Java Control Panel, and something for MacFUSE, which I’d uninstalled long ago. So it was out with the latter.

Agog:PreferencePanes root# cd /Library/PreferencePanes/
Agog:PreferencePanes root# ls
JavaControlPanel.prefPane MacFUSE.prefPane
Agog:PreferencePanes root# rm -rf MacFUSE.prefPane

In the personal LaunchAgents space, I had more cruft, none of which I wanted anymore:

Agog:~ bilbrey$ cd ~/Library/LaunchAgents/
Agog:LaunchAgents bilbrey$ ls
com.amazon.music.plist
com.apple.CSConfigDotMacCert-brianbilbrey@me.com-SharedServices.Agent.plist
com.microsoft.LaunchAgent.SyncServicesAgent.plist
org.virtualbox.vboxwebsrv.plist
Agog:LaunchAgents bilbrey$ rm *plist

After cleaning all that up, I restarted the system.

Did it work?

Well, yes, it did. Frankly, though, with on-board SSD, this Mac Air was already fast. But now it feels like it’s nearly back to new again. I had not measured boot times prior to doing this work, but it’s always been about 20 seconds from entering the password to the desktop being alive. Where the additional time is saved is in all of the add-on stuff that no longer shows up in the status bar – there’s less of that now.

I did run EtreCheck before and after. The after run showed that I had missed one of two copies of a particular PreferencePane entry: one was up at the system level, but I forgot to check in /Users/Bilbrey/Library/PreferencePanes as well. That’s cleared up now.

What I’m even happier about is that following Alex’s tips, I gained more visibility into my daily carry system. What I still want to look into are the things that after three days of runtime, EtreCheck reported as “killed due to memory pressure.” Hmmm. I’ll have to keep an eye on that.

Thanks, Alex!

Cleaning Out the MacBook Attic

In this second installment (first of the two parts is here), Alex digs into the leftovers from Migration Assistant on his MacBook Pro, finds code almost old enough to vote, and discovers his computer truly can run faster.

Last time, I discussed MacOS’s Migration Assistant, and how it will enthusiastically move all of your old software and settings. This time, I’ll show how to delete the leftovers manually; it’s worth the effort.

Your Mac’s Attic

After the (much earlier) migration, I looked at my Applications, and deleted a few. Of course, apps don’t run unless started—or not at all, if they’re incompatible. Unless they auto-start, applications (as compared to daemons) are probably not getting in your way. /Applications holds most of your apps (29 Gigs, for me), including the OS X installer app, if you haven’t thrown it in the trash. At 5 GB, this may be worth deleting, if you’re trying to trim disk usage. (I just did—I can re-download if needed.)
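If you’d like hard numbers before throwing anything out, Terminal will happily rank /Applications by size; a minimal sketch (your totals will differ, and some bundles may need sudo to be read):

sudo du -sm /Applications/* | sort -rn | head -n 15
# The 15 largest items in /Applications, in megabytes, biggest first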

But I didn’t look in other folders, ones holding other, important applications and settings, including various daemons which do auto-start, silently. Complementing the /Applications folder is /Library, the system-level repository of launch agents, launch daemons, and preference panes. Some of these are controlled by the Login Items preference pane, but others, especially ones transferred from your older computer, may start anyway, even if not listed.

I took the Application Startup list (System Preferences | Users & Groups | Login Items) on faith; surely MacOS knew about applications it had transferred? Seemingly, no; there are multiple services resident on the Mac, moved by Migration Assistant from the older Mac, which don’t show as Login Items. They simply weren’t there.

Astute readers will recall that the Windows Upgrade Advisor checks for application compatibility, including version numbers, before moving your old apps to your new computer. So too does PCMover. The MacOS Migration Assistant doesn’t appear to do that.

Leftovers and Ancient Code

As I mentioned in a previous installment, EtreSoft’s freeware EtreCheck compiles a good list of all the applications, services (“daemons”, in classic Unix parlance), and other cruft in residence. (Much of this information is also in the System Report utility, but not as nicely formatted.) EtreCheck is how I found the Valve Steam client plumbing, the remnants of LogMeIn (Replaced with TeamViewer), and other stuff.

When EtreCheck still showed at least a dozen undead roaming unfettered amongst the innocent, it was time to look in individual folders.

Closer Looks

Launch agents are started by the launchd process, from a plist file (a property list; essentially, a recipe telling launchd what to run and when). You can stop agents from starting via the command line, but deleting them is more final.
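For the command-line route, launchctl is the tool; here’s a sketch for a per-user agent, with a made-up plist name (on the Yosemite-era MacOS I’m running, unload -w both stops the agent and marks it disabled so it doesn’t come back at the next login):

launchctl unload -w ~/Library/LaunchAgents/com.example.unwanted-agent.plist
# Stops the agent and records it as disabled; the plist name here is hypothetical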

My /Library/LaunchAgents folder contained several old plist scripts, including an ancient (2008!) system-stat app, iStat. After some cleaning, my /Library/LaunchAgents folder now contains:

[Screenshot: contents of /Library/LaunchAgents]

There are Avast’s startup commands, the inevitable Java updater, etc. Note the leftover Logitech preferences, from when I had a Logitech mouse attached; more about that in a moment. There are also TeamViewer startup scripts, despite my having uninstalled TeamViewer and thrown it in the trash. Since TeamViewer hooks pretty deeply into MacOS, for video and sound redirects, I’m going to delete those, too.

/Library/LaunchDaemons contains daemons, or at least the scripts to load them:

[Screenshot: contents of /Library/LaunchDaemons]

Again, Java version-checker, plus the Microsoft Office license-checker, more Avast, Adobe VersionCue (Loaded but never used). Yet more TeamViewer that has to go.

There’s also Google Keystone—Google’s updater, its name a play on Keyhole, the codename for American reconnaissance satellites and the company whose mapping software became Google Earth. Really, that one stays for Google Earth, invaluable for determining microwave look angles, scouting for hikes, and looking for easter eggs.

/Library/PreferencePanes holds the “Preference Panes” themselves, Macspeak for third-party control panels (as distinct from their daemon launchers or preference files). Mine looked like:

[Screenshot: contents of /Library/PreferencePanes]

This is after I removed the Logitech control panel (Ok, “Preference pane”) itself, a leftover from when I had a Logitech mouse. Before deletion, it ran, found no Logitech devices, appeared benign; still, since it was 2008 code, out it went.

Remove an item from System Preferences by control-clicking (Right-clicking, on a two-button mouse) then clicking “remove” on the menu. I’m sure you could also delete it from the PreferencePanes directory itself, but whatever legerdemain the Mac goes through to delete a pane seems complex, so I’d rather not take a chance.

And we’re still not quite done with the zombie-stomping. /System/Library/Extensions contains, as you might imagine, a long list of kernel extensions. EtreCheck showed many of them running on my computer, including ancient Virgin Mobile drivers (For a tethered-mode hotspot), even older Sierra Wireless drivers (Ditto), the inevitable LogMeIn drivers, old RIM/BlackBerry communications drivers, and Logitech drivers. After careful consideration, I’ve deleted these. Also present, but not running: Several dozen H-P drivers, apparently for printers, which I left alone.
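If you want a quick Terminal cross-check of which kernel extensions are actually loaded right now, kextstat will tell you; a sketch:

kextstat | grep -v com.apple
# Lists loaded kernel extensions, filtering out Apple's own; whatever remains is third-party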

As I discussed previously, preferences and programs fall into two types: Global, for all users, in /Library and /System/Library; per-user, in each user’s /Users/<user>/Library folder. Unsurprisingly, I also have personal launch agents, in ~/Library/LaunchAgents:

[Screenshot: contents of ~/Library/LaunchAgents]

That includes Citrix GoToMeeting preferences, personal Avast anti-malware preferences (Vs. systemwide in the earlier folders), ancient Kerio MailServer (Competitor to Exchange) client-side preferences, and Valve Steam client preferences for the few games I keep on my system.

The Google contact sync agent replicates mac-side contact info to my Android phone; H-P printer preferences and personal preferences for Apple folder views finish the list. Staggeringly, no TeamViewer preferences appear here.

Testing For Speed

I then emptied the trash to truly exorcise the zombie code, then rebooted. Reboot times are still in the 3-minute range, but login is much faster.

If you check “Re-open apps when I restart” on the Mac, they appear as pictures—snapshots of what they looked like at reboot—before becoming live. The “picture to live” times, even with a dozen apps restarting, were noticeably faster (Sorry, forgot the stopwatch). Perhaps the zombie apps and daemons have been exorcised?

EtreCheck shows far fewer mysterious entries, too, as does Activity Monitor. I don’t know yet for certain, but the results are promising. Word loads faster, saves faster; Time Machine backups don’t slow the system down appreciably; no strange delays in Firefox loads.

But there’s still one major application that needed reorganizing: Apple Mail, as I’ll discuss more in the next article.

Next time: more adventures with Alex as he digs into Apple Mail.

MacBook Pro Migration Assistant

In this first part of a two-part installment, Alex works on his MacBook Pro to cure erratic performance, find hidden storage, and hunt down Terminator processes and the Precambrian-era apps that cause them. He starts with a discussion of migration.

Migration Assistant: Too helpful?

While I moved to this Mac (MacBook Pro 15”, early 2011, 8 GB RAM) over a year ago, it took this long to find out just how much stuff I’d moved—some programs were from two moves earlier! This was part of my mysterious slowdowns, unresponsive performance, and general low-level annoyance, distractions from Getting Things Done.

This is, I suppose, a curse of the modern age. In previous generations, computer speed was slow enough that I’d save up context-switches, changes from one program to another, or other time-wasters, until I needed a break.

Today, systems are fast enough that we, I, rely on instantaneous swaps from Word to Firefox and back. A two-second delay for Word’s “Insert Hyperlink” feature to open, or more than five seconds for MacOS to paste from the clipboard into the Hyperlink window, pulls me right out of writing.

Even though a few seconds is trivial, when I’m on a roll, it breaks my concentration; the imprecations yelled at an unresponsive computer scare the animals; I pick up the phone to look at the latest e-mails, etc. It takes long enough to get into writing mode; I don’t need computer excuses for not being productive.

Still, I had grown used to these delays, or at least tolerated them. When they stretched into seeming minutes, longer if you include the lost ‘what-was-I-doing-anyway’ productivity, it was time to investigate further.

Short of deleting everything from the computer and starting over, how else could I speed up the Mac? Turns out, plenty—including actions I haven’t seen written up elsewhere. While hunting down old code, I learned considerably more about what’s under the covers, too.

Choosing Your Migration Adventure

How should you move from your old computer to the new? There are many ways: Wi-Fi, Time Machine backup, USB drive, FireWire drive, Ethernet.

Migrating over the wire: As Apple mentions in their notes, many current-generation MacBooks and MacBook Pros don’t come with an Ethernet port; you will need a Thunderbolt or USB 3 Ethernet adapter to upgrade over-the-wire from the older computer. Many people buy the Apple-labeled Thunderbolt Ethernet adapters, or the USB 2 adapter, for just such an occasion as migration. USB 2 tops out at 480 Mbps, far slower than Gigabit Ethernet, but plenty fast for most uses. (This MacBook Pro 15” only has USB 2, not 3.)

The new MacBook (not “Pro”, not “Air”, just MacBook) is even more minimal; its only physical connector is a single USB-C—no Thunderbolt. It even charges via this single connector. USB 3.0 is fast (Nominally, 5 Gbps), USB 3.1 (On the MacBook), double that. The MacBook appears to be a trial balloon: Will consumers buy a computer with only one port, or is more better?

Personally, I miss having three USB connectors on the 17” Mac, plus FireWire, Ethernet and a separate video out, but I’m not the target market. I find an Ethernet adapter is essential; I move far too much data to count on Wi-Fi for everything. Sure, it’s another part to lose, but Ethernet is more reliable than wireless, and usually much faster. Yes, 802.11ac Wave 2 is multi-gigabit, but that’s under ideal conditions, and when the Wi-Fi is working.

USB or Thunderbolt adapter? Thunderbolt adapters take up the Thunderbolt/Displayport connector, meaning no second monitor if you need a wired network. (Alternatively, you could use a Thunderbolt dock, but they’re fairly bulky.) There are very nice USB 3 hubs which also sport an Ethernet port; unless you want the smallest possible set of gear to carry, that might be a good choice.

The lost connector: I mentioned FireWire, but it’s quickly being phased out, appearing on none of the current MacBook models. Not a huge loss; while FireWire 800 was almost twice as fast as USB 2, it’s much slower than USB 3 or Thunderbolt. Its other use, connecting directly to camcorders, disappeared long ago, and it will be relegated to guess-this-connector quizzes in a few years.

Can Migration Be Too Helpful?

When I moved from my finally-dead 17” MacBook Pro, I’d used Migration Assistant, the built-in “move your stuff” program that ships with MacOS (it lives in /Applications/Utilities).

It makes the process simple: Connect the two computers (Wi-Fi, Time Machine backup, USB drive, Ethernet), start the Migration Assistant on both, choose “to another Mac” on the source machine, then, on the target, choose the source machine. You’ll be given an opportunity to choose what to move, about which more in a second. Watch for the message “Your other Mac is ready” on the old one, click continue on the new one, and step back.

If your old computer’s dead, the standard rite of passage is to extract its hard drive, stick it in an external case, and run Migration Assistant from there. The old drive then becomes a backup, or kicks around in your desk drawer for ages. (Not that I’m guilty, but exactly why do I have 40 GB hard drives still gathering dust?)

In-place upgrades (No new computer, just a new operating system) happen as part of the MacOS upgrades; they are similarly helpful. Most of the time, they’re completely automatic and transparent, but there are important gotchas, particularly with Apple Mail, as I’ll discuss next time.

Windows Migration Assistant works well for PC-to-Mac migration, though of course it moves files and settings, not the applications themselves. Mac-to-Mac migration with plain old Migration Assistant does move programs—all programs, if you let it. If you don’t just click past it (I did), there’s a selection pane, where you can choose exactly what to move:

[Screenshot: Migration Assistant selection pane]

Note the “Settings” checkbox and drop-down menu; this will become important in our next article.

In the next installment, Alex looks at leftovers, really old code, and the joyous discovery of a faster computer.

Wi-Fi Sharing in Windows 10: Facts or Hysteria?

With the release of Windows 10, one of the subjects of concern is the new Wi-Fi Sharing process. It looks like there has been a bit of hysteria and/or exaggeration about this issue.

The Chaos Manor Advisors discussed this a few weeks ago, when the first article about this appeared in The Register. The general consensus is that on first look, this may be a ‘bad thing’. But a lot of the hype about this seems to be just that, hype. And some misunderstanding of the whole process. It appears that one might want to ‘read past the headlines’ on this issue.

Chaos Manor Advisor Peter Glaskowsky reports on his testing of Microsoft’s Wi-Fi Sharing process in a late beta release of Windows 10.

I’ve been talking about Wi-Fi Sense without the benefit of having used it, since I have only one Windows 10 machine and that one is a 2U server with no wireless in it.

But yesterday I realized that I could attach one of my USB Wi-Fi dongles. (A blinding flash of the obvious.)

This is as pure a test as I can imagine of the default Wi-Fi Sense settings, since this machine has literally never had a wireless capability before now, and Windows 10 was the first version of Windows ever installed on it.

So, the results:

When I installed the Wi-Fi adapter (a TP-Link TL-WN722N, one of the nicer ones of this type since it has a proper RP-SMA antenna connector), it became available essentially instantly. Windows 10 said nothing about installing a driver.

I went into the new-style Settings (not the old Control panel), then Network & Internet, then Wi-Fi on the left sidebar (which had not been there before), then Manage Wi-Fi settings in the main window. This sequence brings up the main Wi-Fi Sense settings dialog.

The “Connect to suggested open hotspots” option was on, allowing the machine to connect to well-known public hotspot systems like Boingo. I think this is generally fine, but I don’t know whether there is robust protection against someone setting up a bogus hotspot that appears to be part of the Boingo network. Since I don’t need it, at the conclusion of this testing, I turned it off. In the meantime, I left it alone.

The setting of primary concern to everyone is “Connect to networks shared by my contacts”, and that one was OFF by default.

Turning it ON experimentally brought up the three sharing options: Outlook contacts, Skype contacts, and Facebook friends. All three of these were OFF.

I turned on the Skype contacts option.

I then started the process to connect to my home Wi-Fi network by pulling open the network submenu in the task bar and clicking on my SSID.

This brought up the usual “Enter the network security key” field and a new one: “Share networking with my contacts.” That option was OFF even though I had turned on the share-with-contacts and Skype sharing options.

In other words, the defaults for the sharing method of primary concern in these web articles are ALL OFF. As off as off can be.

I abandoned the connection process without entering the security key, then turned off the share-with-contacts option in the Wi-Fi Sense settings and started the connection process again. This time the connection box didn’t even have the “Share networking with my contacts” option.

I re-enabled the share-with-contacts and Skype options, and actually did go through with the connection process, including checking the sharing option.

Interestingly, the system did not give me any choice about which contacts to share it with. I went back into the Wi-Fi Sense settings… and the Manage known networks section said that my network was “Not shared.” How curious, but it saved me a few steps in the procedure I was going through, since my next thing was to share a network that had previously been connected but not shared to see what happens.

I clicked the Share button.

Even though I had already entered the network security key, it asked for the key again. This is exactly the right thing to do. This is how Windows 10 prevents a friend from sharing your security key if you personally type the security key into their device rather than, for example, reading it to them to enter manually.

I completed the sharing process and verified that it “stuck” this time.

Then I disabled the share-with-contacts option in Wi-Fi Sense, and then re-enabled it. When I went back into “Manage known networks,” my network showed as “Not shared.”

So that’s the whole deal, I think. By default, Wi-Fi Sense operates, at least on my machine, as of today, on Build 10162, exactly as Microsoft says it does. Sharing only happens when you click a bunch of extra buttons to enable it, and stops when you deselect any of those options. Every share-with-contacts option defaults to OFF, and it DOES protect against a Wi-Fi security key being shared by someone who doesn’t actually know it.

I hope that is the end of this matter for now, at least until we find someone reliable (that is, not a writer for The Register) who has a machine that works differently.

Or until Microsoft provides additional information on the various security aspects (how is the security key protected, how is local network access prevented, does Microsoft have a way to learn your password, does Microsoft have a way to review your Facebook contacts list, etc.).

Or until Microsoft adds what I think is the essential feature for sharing a Wi-Fi security key securely: sharing it with only one individually specified person at a time, without giving Microsoft a way to see the key.

Comments and questions welcome, of course.

The Chaos Manor Advisors discussed this issue a bit today (29 July 2015), especially after Brian Krebs wrote about this (see here). We shared that link to the article with the Advisors.

Eric said:

    So Krebs went ahead and wrote this without doing even the same brief testing Peter did weeks ago. This is how hysterias grow.

Peter added

In spite of the hysteria, I believe it is already fully opt-in.

The only, only, only thing that defaults to “on” is that the service is enabled. Every time a user adds a new Wi-Fi network, the dialog box specifically asks whether to share it with contacts or not, and which contacts to share it with from the three available options (Outlook/Facebook/Skype). All four of those questions, at least on my machine with a clean install, defaulted to OFF.

If the service itself is turned off, none of those sharing questions will be asked.

Now, if someone has turned on the service and shared a network, maybe it defaults to enable sharing the next time; I didn’t test that.

I think this business Krebs raises (and the Register raised) about how a friend could share your Wi-Fi credentials without your permission is just nonsense. That still takes a deliberate effort. If you have a friend who would do that, you need new friends.

This may be a bit of hysteria, as Peter stated. Although sharing your Wi-Fi password is generally not a good thing (especially for the paranoid?), it would appear to us that the actual risk is quite low, based on some limited testing by the Advisors.

We’d be interested in your opinion on this. You can share in the comments below. If you are inclined, you can send us more detailed information that we might use in a future post here at Chaos Manor Reviews. See this page on the submission guidelines for Chaos Manor Reviews.

Saving Space and Lessons Learned

[In the fourth and final installment of this series, Alex works with his MacBook Pro as he discovers several causes of its slowness. He figures out how to save more space, and concludes with Lessons Learned.

Prior installments of this series: first, second, and third. Comments are welcomed at the end of this post. – Editor]

Space Saving

I discovered another surprising disk-filler: Unused printer drivers. I collect them, like it or not, at client sites, building temporary networks onsite, or troubleshooting balky copier/printers. While I’m not exactly counting the bytes, a gig here, a gig there, and after a while it adds up to real space.

Where do printer drivers live? In /Library/Printers — not the same as /Users/Alex/Library. Installed printer drivers are system files, used by all logins, not installed in each user’s directory. I felt stupid. Of course, there were system files, separate from my personal login, and of course they were in the system’s own directory structure, not the users’.

How much space? /Library/Printers held 7.5 Gbytes of drivers—very surprising, considering I’d only downloaded a gig or so of driver installers over the years. (I guess they get expanded at install.) The install files themselves are in /Users/Alex/Downloads, because the user (Me) had downloaded them. It’s only after installation that they end up in /Library/Printers. (I had already deleted unwanted installers from /Users/Alex/Downloads.) Saving 7.5 Gigs of space is enough to make me care, especially before I migrate to an SSD.
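To see what your own Mac is carrying before deciding whether the cleanup is worth the bother, a quick Terminal check (a sketch; your numbers will vary):

sudo du -sh /Library/Printers/*
# Space used by each vendor's installed driver folder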

Final result: I cleared out another 4.5 GB of unused drivers. It’s not the disk space per se that bothers me (Disk is cheap), so much as the backup “elbow room”.

Time Machine, Apple’s excellent built-in backup software, Just Works if the backup drive isn’t too full, but my secondary backup drive is the same size as the internal drive, 500 GB. (I also have a 1 TB primary backup drive.) Time Machine is supposed to automatically erase old backups to make room for new files. However, it doesn’t always think there’s room for backup on the smaller drive; I’ve had to erase the secondary backup target (After I was sure the primary backup was good!) before a backup would finish.

So, going from 80 GB free space (about a month ago) to 120 GB (now) should make my backup experience more reliable.

What We Have Learned

Here are some ‘lessons learned’:

Precautions: As recommended here, I made sure I had deleted unused printers (Printers, once installed, are shown in System Preferences | Printers & Scanners) first, before I went after /Library/Printers. That’s a good idea anyway: I used to keep ten or more printer drivers installed, in case I visited that client again. I realized my printer-chooser (equivalent of the drop-down choose-a-printer Windows menu) was getting slow and erratic. A few months ago, I weed-whacked out all the ones I seldom use, and selecting a printer got much faster.
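The same pruning can be done from Terminal via CUPS, the printing system under the hood; a sketch, with a made-up queue name:

lpstat -p
# Lists every installed printer queue and its status
sudo lpadmin -x Old_Client_Laser
# Removes the queue named Old_Client_Laser (hypothetical; substitute a name from the lpstat list)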

Printer presets are wonderful: As long as I’m discussing printers, it’s a good time to mention MacOS printer presets, too. (If you only ever print single-sided on one printer, you can skip this paragraph.) A preset is a bundle of settings for a given printer: Say, double-sided on the long edge, black and white, with toner savings on, from tray 3. Instead of navigating four dialog boxes every time you print, save a preset (or several) for each printer, and waste a lot less paper. It also avoids the “Where does the Xerox WorkCentre 355 driver hide &$$^&$!! stapling!” mini-crisis every time you need a seldom-used feature. For complex print jobs with collation, stapling, hole-punching or folding, capturing all those settings (With careful naming) into a preset can save endless paper, click charges, and aggravation. Printer presets are discussed here, here, here and many other places. Experiment early and often before you do any volume printing; the best time to learn is not on deadline. But, in short, if you ever print anything fancy, you should know about printer presets.

Extra networks too: While I was at it, I also deleted all the network locations (System Preferences | Network, then “Location” tab at the top) I wasn’t using. These are the equivalent of Windows’ network profiles, and I had dozens, for each network I’d ever tested. While they take up little space on disk, they do seem to confuse the network Preferences pane. Like selecting a printer, changing networks took longer than it should, until I deleted those extra network locations. Most people will probably never use any but the default “Automatic”, set for DHCP, except perhaps for a second “location” with a static address for talking to a particular router. MacOS makes switching network settings very easy; I learned to appreciate this the first time I needed to switch from a WAN (direct to a microwave link) to LAN (inside the router) more than once. (I think the record was five. Yes, a second laptop would have been invaluable.)
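Those locations can also be listed and pruned from Terminal with Apple’s networksetup tool; a sketch, using a made-up location name:

networksetup -listlocations
# Shows every defined network location
sudo networksetup -deletelocation "Old Client Site"
# Deletes the location named "Old Client Site" (hypothetical; don't delete the one you're currently using)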

Some apps love their versions: I also threw out a dozen older versions of GoToMeeting, which were never deleted when the application updated itself. The GoToMeeting tools blog says this is on purpose, that all attendees must run the same version for compatibility, so this isn’t just a lazy installer. Still, it’s a bit disconcerting to have a dozen different copies of an app, stretching back multiple years, that you know you’ll never use. Or sometimes just days: just in July, I got three different versions within a week! It also suggests Citrix (Owners of GTM) might delete unsupported, older, versions, but I can see where that might be a whole other headache.

Wrapping Up

In all, I deleted another 15 gig of unwanted and old files, which will make the eventual migration to a Solid-State Disk (SSD) faster, along with future backups. (Backups take longer than they should, as my backup drives are all USB, and this model MacBook Pro doesn’t have USB 3.) Partly this was to keep anything inessential from slowing me down; partly this was for working space.

That’s not just work avoidance. For good performance, “Elbow room” on the Mac is fairly important. Time Machine needs temporary space on the source (internal) drive to prepare for backups. This is in addition to the .MobileBackups directory, a local Time Machine duplicate of everything not yet backed up, which also takes up room. (Time Machine does intelligently decide what to keep in local snapshots, trying hard to never make your disk so full you can’t work or back it up.)
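If those local snapshots themselves are crowding a small drive, they can be switched off (and back on) from Terminal; a sketch for the Yosemite-era tmutil, which supports these verbs:

sudo tmutil disablelocal
# Turns off local snapshots; the space held by .MobileBackups is then reclaimed
sudo tmutil enablelocal
# Turns them back on when you have room again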

I also feel a little more in control of my own destiny on this machine. I’ve been using Macs for thirty years, but only in the last five has MacOS been my primary choice. Learning (or, with UNIX commands, revisiting) the mechanics under the hood, if only a bit, has made me more confident.

Still, there’s a lot to explore. I still have unpredictable performance, Terminator processes, and (it appears) Precambrian-era apps causing them. Oh, and several dozen copies of AppleSpell.

More next time.

[Alex will return with those explorations in the next installment – Editor]

True Exorcism Requires Deeper Incantations

[Alex continues with his tuneup of his MacBook Pro. The first installment is here, and continues with the second.

In this third installment, Alex tries to clear 4,000 viruses, finds hidden storage, and searches for OS clutter. Comments are welcomed at the end of this post. – Editor]

Last time, I thought the mysterious slowdowns of my MacBook Pro (MBP), running the latest version of MacOS (10.10.4), had been vanquished. Alas, no; the app-not-quite-hanging-but-not-responding and unkillable-app problems returned, prompting yet more investigation.

I was fairly sure I didn’t have a native MacOS virus problem; I don’t click on bad links, I don’t download doubtful software, I don’t visit sketchy parts of the Internet. Still, there are a lot of threats, as discussed here, and I was seeing glitches, like slow awaken-from-sleep, apps that suddenly didn’t respond, etc.

The Mac has some built-in protection: XProtect scans for certain malware, MacOS won’t run unsigned code without permission, and it’s UNIX under the covers so there are fewer attack surfaces. The Safari browser blocks known-bad plugins (Java and Flash have been particularly vulnerable, lately), too.

The Virus Hunt Continues

Still, it was time to resume the virus hunt, Just In Case. In its first run, Avast had scanned over 4 million files before complaining it couldn’t reach its malware engine and quitting the scan; I had put all 2,700 viruses, Trojans and other malware found in the “Virus Chest”, their name for quarantine.

Before consigning them, I checked as best I could that they were all in e-mails, bore only Windows-malicious code, and were therefore benign to the Mac. Avast’s “Infection Details” list is somewhat clunky:

[Screenshot: Avast’s “Infection Details” list]

As you can see, some lines show actual infection details; on others, you must click on the right-arrow to show them. I never did see any “status” information. You can’t sort the screen report, nor expand all of the “infection details” at once.

Since then, I’ve run Avast once again. Instead of claiming to be “100%” done after two hours, it topped out at “73%” after five, and kept going, through over 4.3 million files in 50 hours, finding over 4,000 viruses while still at “73%”. (The picture is from the results of the second scan.) An annoying ‘feature’: Avast doesn’t have a “pause scan” option, so you can’t park it while you run something else. I finally stopped the second scan, as I needed to reboot.

However, when I attempted to stash this second crop in the virus chest—all e-mail viruses again, and all apparently inoffensive to Macs—Avast skipped thousands of them. I couldn’t tell just how many, as there’s no consolidated reporting.

Last time I had a virus scare, I installed Sophos. So far I’m 0-for-2 on Mac anti-virus I like. I’d welcome recommendations from readers.

During my perusals of the virus record, I also realized that Avast had been scanning the “other” user login’s files, and thereby hangs another tale.

Looking Under the Covers

I inherited the computer from another user. I thought he’d removed all his files before he gave it to me, but it turns out he hadn’t. It was time to investigate user accounts.

User accounts are managed in the Users & Groups panel (System Preferences | Users & Groups). Sure enough, there’s a second Admin-equivalent login, named “Alex Pournelle”, different from the primary one, named “Alexander Pournelle”. I discovered that right-clicking on “Alex Pournelle” (Or Control-clicking, if you don’t have a two-button mouse) brings up the Advanced Options pane:

[Screenshot: the Advanced Options pane for the “Alex Pournelle” account]

Note the dire warning: This is serious and deep voodoo, which should not be toyed with needlessly.

Viewing it, though—I was careful to click cancel when done—confirmed that PeterX (internal account name) and “Alex Pournelle” (displayed full account name and home directory) were one and the same.
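
For the belt-and-suspenders types, the same mapping can be confirmed, read-only, from Terminal via the directory services tool; “PeterX” below is simply the internal account name as it appeared on my machine:

dscl . -list /Users RealName                          # every account, with its displayed full name
dscl . -read /Users/PeterX RealName NFSHomeDirectory  # the record behind “Alex Pournelle”

Nothing is changed by merely reading the records, which makes this a safer way to poke around than the Advanced Options pane.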

Having confirmed that, I had to decide: Do I want to delete the user and everything in the account? I’d rather not; this is a second, admin-level user account which I could use to access the computer if need be. Better to delete the contents and not the user account.

User File Management

Go to Finder and open /Users. Another similarity between the Mac and Windows: most common commands have a keyboard equivalent, for those who prefer typing to clicking. Shift-Command-G brings up the “Go to Folder” dialog box (also available off the “Go” menu). The Finder view of /Users shows the “Alex Pournelle” directory—the red “X” badge in the lower right means it’s not viewable by me.

First, I had to give myself permission to view this directory. Under the covers, MacOS is still a UNIX variant, so I knew I could use the chown (change ownership) or chmod (change permissions) commands if I couldn’t do it another way. I’d rather not use such deep system incantations if I don’t have to—memories of “rm *” deleting way, way more than I wanted still rankle—so I looked for the GUI method.
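
For the record, the Terminal equivalent is short. This is a sketch only, with “alexander” standing in for whatever your own short username happens to be:

sudo chown -R alexander "/Users/Alex Pournelle"    # take ownership of the whole directory tree
# or, leave ownership alone and just open up read access:
sudo chmod -R o+rX "/Users/Alex Pournelle"

Either one is quick, but also quick to regret if you aim it at the wrong directory, which is exactly why I went looking for the slower, clickier route.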

The Get Info panel (Finder, click on the directory, File menu | Get Info—or just Command-I) shows most data about files and folders. It’s also where you set permissions, or in this case, add them. Under Sharing & Permissions, at the bottom of the Info panel, click on the “+”, add myself, click on the gear drop-down menu, then “Apply to enclosed items…” Wait until all the red pluses on the subdirectories in “/Users/Alex Pournelle” disappear. I had to do this twice before it took effect, for obscure reasons, but then I could see the folders and files.

Astute Windows users will be comparing this to the “takeown” and “icacls” CLI commands, or to the Advanced Security Settings tab, and they’d be right—they’re quite similar. Note: If you never use an Active Directory network in Windows, or install multiple users on a single computer, you may never see these settings, but they’re there, and they can still cause havoc.

I also discovered that Apple Finder doesn’t give accurate sizes if you don’t have permission to view a folder—after I had access to the directory, the 15 GB of storage turned out to be more like 8, much of which was in Apple Mail.
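
A quick Terminal cross-check, run as an administrator, sidesteps that problem; the paths below simply match this particular machine:

sudo du -sh "/Users/Alex Pournelle"                 # real total, permissions notwithstanding
sudo du -sh "/Users/Alex Pournelle/Library/Mail"    # how much of it is old mail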

I don’t want those e-mail messages around—they don’t belong to me, anyway—so it was time to delete them. In Finder, it’s off to /Users/Alex Pournelle/Library. Gut check: Do I want to delete the Mail directory? Yep, they’re not my files. Double-check that I’m in the correct user directory. Send the folders “Mail” and “Mail Downloads” to the trash, empty the trash, and another 5 GB of space is available.
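
The Terminal route works too, but rm has no Trash to rescue you from a typo. A sketch, and only after you’ve confirmed you’re looking at the right account’s folders:

ls "/Users/Alex Pournelle/Library"      # eyeball the contents first
sudo rm -rf "/Users/Alex Pournelle/Library/Mail" "/Users/Alex Pournelle/Library/Mail Downloads"

Given my rm scars, the Finder-and-Trash route felt like the better trade.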

[Next time: Alex figures out how to save more space, and concludes with Lessons Learned. – Editor]

A New XBox for Chaos Manor

[Chaos Manor Advisor Eric Pobirs installed a new XBox One for Jerry at Chaos Manor. Here is Eric’s report – Editor]

I picked up an Xbox One with Kinect and the optional Media Controller remote for Jerry and set it up at Chaos Manor today. Some stuff worked fine, as expected. Other stuff, especially in areas I’d not had the opportunity to examine previously at my own home, was a bit troublesome.

Getting the one-year subscription code entered was easy enough, but Xbox.com was taken aback when I tried to claim the current Xbox 360 Games With Gold title for Jerry’s account, in hopes it would eventually gain backward-compatibility support on the Xbox One. (This isn’t shipping yet for most Xbox One owners, but if you’re in the testing program it’s pretty slick.) The site decided it was time for Jerry to prove his identity, possibly because the account had never been used in relation to an Xbox before and I hadn’t set up the machine yet. It wanted to send Jerry a code via email and receive it back as verification, but the email never arrived at the designated address. After five hours there was no sign of it.

This is no tragedy, as Jerry is unlikely ever to become a big player of the Gears of War series, but it might have been something to have around when the grandkids are having a prolonged visit.

Initially I set up the Xbox to take advantage of the TV integration. Unlike most such devices, the Xbox One has an HDMI In port as well as the expected HDMI Out port. Combined with the 2nd-generation Kinect, this allows a variety of whizzy features where the Kinect responds to voice and gesture commands and controls the TV and cable/satellite box via IR. The setup went well enough, with the equipment identified and supported automagically. The problem came when the OneGuide, the grid display for the TV schedule, failed to include the essential channels 2-13. I suspected there were other versions of these same channels in the Time Warner lineup, but I didn’t find them quickly, and it is difficult to fight force of habit, especially for accessing the traditional major broadcasters. People expect channel 2 to be CBS if that is how their TV has worked for decades.

The third entry in this thread is possibly the answer for the missing channels:

http://forums.xbox.com/xbox_forums/xbox_support/xbox_one_support/f/4277/p/1631619/4232493.aspx

Until I can test that, the Xbox will remain on the TV’s HDMI 2 input, which is a bit less convenient and loses the integration feature.

Out of the box, many of the standard hardware features of the Xbox One remain unsupported by software until the appropriate app is installed; Blu-ray movie support, for example. The Blu-ray app is just a few dozen megabytes out of the standard 500 GB hard drive (models with 1 TB drives are now shipping for a $50 price difference), and I suspect the separate download is due to licensing costs. The primary firmware required an immediate 2 GB update to be downloaded and installed the first time the Xbox was booted. This may seem remarkable to someone who hasn’t looked closely at game consoles since the days of ROM cartridges, but nowadays these are fairly sophisticated devices, with both positive and negative aspects. The expectation is that the typical location will offer a broadband connection, and the value of the console is reduced if that isn’t available.

The optional Media Player app is bare-bones but effective for giving the user access to locally stored video, audio, and image files. USB drives and DLNA network volumes are supported. (There is a separate client app available for the far more elaborate Plex server software, which goes far beyond a simple listing of files.) Oddly, the Media Player will not look at an optical disc for files to play. This is a bit inconvenient if you’re set up to produce BD-R discs for sharing large items, and I can only suspect it is a sop to Hollywood’s demands for piracy control. Despite this, the Media Player supports the MKV wrapper format, popular on torrent sites for its features and lack of patent encumbrances. The previous generation of Microsoft and Sony consoles ignored the existence of MKV, but they would play files they did support off a burned disc. Since flash drives and NAS boxes, both with massive capacity, are now remarkably inexpensive, I’ll take this compromise over the previous generation’s.

Although it works fine with my Netgear NAS at home, I was unable to test the Xbox One media player with the D-Link NAS at Chaos Manor. It turned out that at some point the NAS had forgotten much of its configuration and no longer appeared as an accessible volume on the network, nor did it present as a DLNA volume. (The D-Link is a bit elderly for a consumer/SOHO NAS and doesn’t ever mention the term DLNA in its firmware or documentation. Instead, it has a UPnP AV server feature, which is the same thing in all but name.) We were able to reset the unit and restore some of the configuration, but getting it to do DLNA again meant it had to scan several hundred gigabytes of files, a task at which it is glacially slow. We ran out of time while waiting and may have to do it again before the DLNA functionality can be configured. Since there is a need to reduce the number of PCs left running upstairs and jacking up the monthly power bill, a more substantial NAS box may be the solution to the household’s storage and backup needs, preferably one with at least four drive bays, for the greatest data integrity from the RAID.

The 500 GB internal drive can be easily filled if the user likes to keep their whole game library at the ready. Retail games ship on Blu-ray discs and can consume dozens of gigabytes. Adding capacity is as easy as plugging in a USB 3.0 drive and formatting it. The system automatically decides where to put things, and management is minimal.

The range of media apps is quite varied, with a great deal of free content available, along with many subscription-driven options. In between, there are regionally driven apps. If you get your service from one of the major cable TV MSOs, there is a good chance you can get an Xbox One app that gives you access to a large library of on-demand material. I installed the Time Warner app, which also features live TV channels. This wouldn’t be of much use in the same room where the cable box is found, but could be of value in a household with good networking and a desire not to have a rented cable box in every bedroom. Unfortunately, the TWC app wouldn’t work; it just produced a blank screen. Pressing B or the logo button on the controller allowed one to escape, but that is the best that can be said for it so far. I’ll try fiddling with it some more on my next visit. I’m sure if this were a widespread problem, it would have produced more meaningful results when I attempted a search. Most likely the Chaos Manor Murphy Field at work.

Other apps I’ve installed include Netflix and Hulu, which each require their own subscription, but Jerry had expressed interest in them. Since he is an Amazon Prime subscriber, the Amazon video app should give him plenty to work with without incurring added expense. Amazon has been developing a lot of its own exclusive content to stake a claim on cord-cutters’ entertainment budgets.

Another area that has been problematic thus far is also one of the leading features of the package. Having the Kinect accessory allows the Xbox One to be controlled by voice commands. This generally worked fine for me, though there is a bit of a learning curve due to the limited range of things the system understands. The upcoming ‘Windows 10’ update to the Xbox One will bring Cortana with it, so the sophistication should improve a fair bit. Meanwhile, there was a bigger problem: the Xbox couldn’t seem to understand Jerry. This may be a case of humans still having a great advantage, as everybody I’ve observed seems to understand him fine, although those are all people who’ve known him since long before his two big medical issues, so maybe we have an unfair advantage. Even so, it was frustrating when, sitting side by side, it would ignore a command from Jerry and respond to the same command from me. Was it somehow trained to my voice from doing the setup? Maybe, but I didn’t find any documentation suggesting it would become attuned to specific users.

This might help:

http://support.xbox.com/en-US/xbox-one/kinect/kinect-doesnt-respond-to-voice-commands

If these issues can be overcome, the Xbox One offers a pretty powerful set of features. It’s a bit on the pricey side if the gaming aspect doesn’t cater to your tastes, but so many of the lower-cost choices are painfully slow, due to CPU performance and limited RAM. The price range of the Xbox One puts it on par with an HTPC, many of which use a close relative of the AMD APU inside, but for someone looking for what the old Media Center team used to call a ‘ten-foot UI’, the Xbox One is more of a turnkey solution.

A note from Jerry: I had an early Xbox, but that was long ago; I’m busy with Cthulhus and this has to wait a bit, but my early experiences with the Xbox have been quite good.

[Add your comments below; note that there may not be any response, but we welcome your thoughts – Editor]