Bright Ideas

One of the Chaos Manor Advisors saw an article about a new kind of incandescent bulb that is far more efficient than normal bulbs.

Researchers at MIT have shown that by surrounding the filament with a special crystal structure, they can bounce back energy that is usually lost (see article here).

This seemed interesting, and brought forth a comment from Peter Glaskowsky, another CM advisor, who expressed some skepticism:

Some of this story is simply false to fact, but other parts are correct.

For example, LEDs with CRI values over 90 (and up to 97) are widely available, and some of these also provide high-quality red tones (the part of “warm” that is particularly noticeable to some people).

On the other hand, it’s correct that LEDs with high CRI values are only around 14 percent efficient, as the story says, so if the MIT solution can increase this figure to 40%, that could be good.

But MIT has only achieved an efficiency level of 6.6% and these researchers haven’t even identified a theoretical basis for surpassing 20% efficiency—about where the best LEDs stand today—so they’re a long way from claiming any real advantages.

Also, the technology behind this invention looks expensive and has some limitations. First, the reflectors are made using semiconductor-like materials and processes—up to hundreds of stacked layers of exotic materials that have to be made with high precision or the product won’t work right. There’s no precedent for using these processes on highly curved surfaces, either.

Ultimately it isn’t at all clear to me that it will ever be possible for this technology to surpass the combination of cost, efficiency, and color quality offered by LEDs, which is not the conclusion invited by MIT’s press release.

I’ve seen many of these factually questionable and unjustifiably optimistic MIT press releases in the past, suggesting this is either a deliberate strategy or just a quirk of someone in their press office who really ought to find something else to do for a living. MIT does plenty of good work; there’s no need to hype it past all scientific justification.

Your CMR editor has a supply of incandescent bulbs, many bought before some sizes were outlawed by the US government (info on incandescent bulb ban here). There are several places in my house where the lights tend to be on all the time (partly due to need, partly due to laziness). Here are my thoughts:

I have started a slow process of replacing my incandescent bulbs (and CFLs) with LED lights (for A19-base 2700K bulbs, I got these: http://amzn.to/1PsxE8h ; a 6-pack for about $21.00). I have a full set of six on the light fixture above the dining table, just a few feet away from my usual spot in the living room. That light is always on, even during the day. They are 60W-equivalent, yet brighter than both the incandescent bulbs and the CFL I tried in the same fixture. And they are full-bright immediately, rather than needing the warm-up period of CFLs.

I have also started using LED bulbs in various ceiling ‘can’ fixtures (65-watt-equivalent BR30 bulbs that draw only 7 watts), installing them as the old bulbs (incandescent and CFL) fail. They appear to provide more light, and again do not need the warm-up period of CFLs. The 65-watt LED bulbs will eventually replace the older BR30 bulbs in the entire house, especially in the kitchen, where the slow-brightening of CFLs is problematic.

The 65-watt BR30 bulbs were purchased from Amazon http://amzn.to/1Psxrlp , 6 for $35, with free shipping courtesy of Amazon Prime. They are ‘dimmable’. I notice they are out of stock at the moment, but LED bulbs are available in many places. (Some local utilities are also subsidizing LED purchases.)

So far, I am pleased with them. I am assuming that they will have a positive effect on my electricity bill. And running the family-room bulbs at 7 W each instead of 65 W will allow that circuit to be powered by my generator during any power outage. (The lights are on the same circuit as the TV, which I also plan to power during an outage.)
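
For the curious, the arithmetic behind that assumption is easy to run. Here is a back-of-envelope sketch; the bulb counts and wattages come from above, but the electric rate and hours-on figures are assumptions you should replace with your own:

```python
# Rough LED-vs-incandescent savings estimate. Bulb counts and wattages
# are from the article; the rate and hours-on are assumptions.
BULBS = 6
OLD_WATTS = 65       # incandescent/equivalent draw per bulb
NEW_WATTS = 7        # actual LED draw per bulb
HOURS_PER_DAY = 12   # assumed time the fixture is on
RATE_PER_KWH = 0.13  # assumed residential rate, $/kWh

saved_watts = BULBS * (OLD_WATTS - NEW_WATTS)
saved_kwh_year = saved_watts * HOURS_PER_DAY * 365 / 1000
print(f"~{saved_kwh_year:.0f} kWh/year, ~${saved_kwh_year * RATE_PER_KWH:.0f}/year saved")
```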

Advisor David Em is also an LED bulb proponent:

Last year I bought fairly inexpensive dimmable full-spectrum LED spots for the studio. I love them.

Also happen to be reading Oliver Sacks’s “Uncle Tungsten” [Amazon link] at the moment, which has some interesting discussion of the history of light bulbs.

What do you think? Are you moving towards LED bulbs, or are you holding out with incandescent? Or are CFLs your choice? Let us know in the comments.

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 3

Now that Eric has finished the NAS project installation and configuration, it was time to consolidate data from various systems, upgrade others, and consider system retirements. Part 3 of the project continues with Eric’s narrative:

With the NAS/RAID project completed, the next step was to track down the various elder PCs that were serving as network backup locations, back them up to the NAS, then shut them down in hopes of making a dent in the frightening power bill that had been shrugged off in the days when Chaos Manor was like Frankenstein’s Castle for PCs, with a gruesome experiment running in every corner.

Some of these machines were quite long in the tooth and were due to go off to the farm where they could run and play with the other PCs. They weren’t in operation, and thus not adding to the power bill, but it was time to clear some space.

First up were the Satine and Roxanne systems. Satine was a Socket 939 (indicating a long ago AMD CPU generation) system with an Nvidia 6000 series video card and a pair of 500 GB drives that may have been mirrored.

The googles note an early mention of the Roxanne system in February 2008: http://goo.gl/lxT07M : “Roxanne, the Vista system that was the main writing machine before Isobel”.

An early mention of Satine – née Sativa – is around July 2006: http://goo.gl/gy1lP3 , also here http://goo.gl/zwR0e8 . Earlier mentions may exist in the Chaos Manor columns printed in Byte magazine, but web versions of those columns are not available, at least with a quick search.

Beyond that it was hard to say, because Satine was inoperable. It would spin up the fans and do everything preparatory to booting, but never actually boot. Not so much as a POST beep of any kind. The properties of some system files indicated it had been running Windows XP SP2, so there was probably little value there for anyone beyond salvaging the drives.

So the hard drives were backed up and formatted, placed back in the case, and Satine remained as a project for someone with a strange combination of motive and nothing better to do.

Roxanne was more promising. She had last seen life as Mrs. Pournelle’s workstation but had been replaced by a new build after the case vents became clogged, causing poor Roxanne to overheat before System Restore could run long enough to repair the damage from the first overheating.

Even with the vents cleared, the Pentium 4 HT system was rather loud and hot. It isn’t clear to me whether it had always been that way and simply wasn’t a bother to Jerry’s artillery-blasted hearing, or whether the cooling had become compromised at some point.

Certainly it wasn’t worth any significant investment to replace any of the cooling bits. But it was running Windows 7, raising the question of whether it could become a Windows 10 system with a potentially long life ahead. Therein the saga lies.

Updates and Upgrades

On both Vista and Windows 7 there was just the one Service Pack. (Windows 7 may still get a second due to its business footprint, but I’m not holding my breath.) Going back to NT 4, Service Packs were once produced far more frequently. Internet access was far less widespread, and the need to store updates locally for installing on numerous machines was far higher. It helped that Service Packs were cumulative, each replacing its predecessor: install SP4, and SP2 and SP3 were included in that update.

As the internet and live Windows Update downloads became the standard, it became more of a hassle to update a new install or a machine that had been offline for a long period. By the time Windows 8 had launched, bringing a Windows 7 machine up to date had gotten a bit painful.

Businesses of the scale to have their own WSUS setup [Windows Server Update Services, a ‘personal’ Windows Update system to centralize and manage updates across the business environment] or enough identical machines to use an updated image weren’t badly off, but supporting the SOHO market got annoying. I had one experience where several refurb PCs that came with Windows 7 SP1 needed almost 1.5 GB of downloads to be fully updated. This was a very slow process regardless of how good your broadband speed might be.

Well, Roxanne hadn’t been to Windows Update in over two years. On the first attempt it spent nearly two hours figuring out which updates it needed before I noticed the gas gauge animation had stopped. The Update app was frozen. Start again.

This time it was a bit better. It installed about 150 updates before stopping and announcing it had an error it could not work past.

Along the way, the Windows Defender anti-virus scan announced finding some malware. It was deleted, but the scan found several other copies as it worked through the drives. Roxanne had received copies of older machines’ content when originally entering service, and consequently the malware had implanted itself in each copy it found of certain system files. This was one of those packages that breaks Windows Update as part of its activities. Why I was able to get as many updates installed as I did before it kicked in is a mystery.

This meant going off and searching out the ‘fixit’ app to correct the damage. Still more downloads. Then Windows Update failed with a different error.

At this point Roxanne had been updating, more or less, for about 20 hours. (This could have been reduced some if I’d been there to respond every time it needed human input but I have this congenital condition that requires me to spend a certain portion of each day unconscious. It’s very inconvenient. The doctors call it ‘sleep’ and can only offer short-term mitigation.)

A Different Fix Needed

This time a different fix was needed but it was soon found and applied. Finally, almost a day after starting this quest, it was done. Roxanne was as up to date as a Windows 7 machine could be at that moment in early September of 2015.

But where was the Windows 10 upgrade offer in the notification area? It should have been in the last batch of updates. Apparently this issue comes up often enough that there is an app from Microsoft that looks at your machine and determines if it is eligible, and installs the upgrade app if so.

Roxanne wasn’t eligible for the Windows 10 upgrade for two reasons. One was correctable, the other was not. At least, not at reasonable cost, which in this case is not anything over $0.

The Nvidia FX5700 video card had long since fallen out of the range the company supported after Windows 7. This could be fixed by replacing it with a newer card that would still be old enough to be free or of negligible cost. The other problem was the Pentium 4 HT CPU. It was too old to have NX Bit support. https://goo.gl/4dgKB0

Considering how much past malware misery could have been prevented if this NX Bit feature had become common much earlier in microprocessors, it represents a perfectly reasonable place for Microsoft to say “here and no farther” when it comes to antique hardware. The last generation of Pentium 4 did have the NX Bit (Intel calls it the XD bit), but this was added after Roxanne’s CPU came out of the foundry.
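
If you are curious whether a given Windows machine clears the NX/DEP bar, WMI can tell you. A minimal sketch, assuming the standard wmic command-line tool present on Windows of that era; it queries whether hardware Data Execution Prevention (Intel XD / AMD NX) is available:

```python
# Query Windows for NX/DEP availability via WMI. Run on the machine
# in question; prints TRUE if the CPU/OS combination supports
# hardware Data Execution Prevention.
import subprocess

out = subprocess.check_output(
    ["wmic", "OS", "get", "DataExecutionPrevention_Available"],
    text=True,
)
print(out.strip())  # header line, then TRUE or FALSE
```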

So there it ends for Roxanne. She may find a home yet and brighten some life but we are done with her and bid her farewell.

Wrapping Up

The project met the need for centralized, more efficient data storage on the Chaos Manor network. By consolidating data into that central location with high-capacity drives, Chaos Manor gains efficiencies in data storage (data is not in several different places, with the attendant work of synchronizing it amongst several systems). It also centralizes backing up that data – which will be the focus of an upcoming project.

Using a RAID 6 configuration allows for data reliability, although you shouldn’t rely on RAID 6 alone for backup or data recovery; it is more of a reliable centralized storage system. You still need a process in place for off-network storage of your important data.

As for older computer systems: there’s “old” and then there is “really old”. Determining the moving target that divides those is necessary to decide whether a system should be donated or simply sent to e-waste.

And so ends this project. We hope that it has provided you with useful information, and perhaps some thought for a similar project of your own. We’re always looking for guest authors, see here. Let us know what you think in the comments below, and share this article with others.

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 2

We continue with the story of the new NAS system at Chaos Manor.

As the project got closer, Eric did some more research about the initial setup of a RAID 6 system.

From what I’ve read on Netgear’s support forum, the initialization of the RAID 6 setup can take a LONG time but since the unit isn’t in active use yet this will only be a mild annoyance.

Eric noticed a Netgear RN10400 RAID system was advertised by Fry’s, so he decided that would be appropriate for the project. (See http://goo.gl/lYAl2p for product details.)

There wasn’t so much a survey [of system possibilities] as a Fry’s ad offering the unit for approximately $100 off its normal price. At the time I grabbed it there were potentially four different places it might have gone. If I had used it as a media server at home I would have gone for a JBOD configuration to maximize capacity, as nothing on it would have been irreplaceable.

The initial setup [of the RN10400] was very straightforward. A full set of four 4 TB Seagate drives (Model ST4000VN000) was purchased and installed in the carriers. This will be familiar to anyone who has done much PC construction with cases that provide removable carriers. Once the drive is installed in the carrier, it slides into its slot and connects to the SATA interface.

Upon powering up, the NAS defaults to a RAID 5 configuration and immediately sets about verifying the drives and creating the RAID. This would eventually result in slightly over 10 TB of storage capacity. Netgear offers an app called RAIDar that searches the LAN for any of the company’s NAS products and reports their address and status. From there you can log into the NAS itself and see what progress it has made and specify a wide variety of options, such as user accounts, access limits, and backups to other NAS or offsite via cloud services. This is also where you’d break up the default volume to recreate it in a different RAID configuration.

The part that got tricky was making sense of the instructions. I cannot say whether Netgear updated the firmware and the PDF manual hadn’t caught up yet, or whether it had always been wrong. It may also be intended for a different model and put in this version by mistake. The method detailed for destroying the existing volume simply didn’t apply to the interface presented.

The System Setup and Configuration

Eric continued the work on the Netgear ReadyNAS 104 4-bay NAS system.

After much consideration of the choices, I decided to accept safety over capacity and move the Netgear’s four 4 TB drives to a RAID 6 configuration. This effectively means that half of the volume’s capacity is consumed by parity rather than data, to ensure the volume can be reconstructed without loss even if as many as two drives should fail.

Eventually I noted the gear icon that had appeared elsewhere and found it was an active link to bring up an otherwise undocumented menu. From there the task more or less matched that described by the manual. The four drives were applied to a new RAID 6 volume which would offer 7.6 TB after 60 hours of setup. They weren’t kidding. It took that long. A search of the forums on the Netgear site produced a post explaining that RAID 6 setup has far more overhead than RAID 5 or lower, and that this is one of the reasons their higher-capacity NAS products use Intel processors with specialized functions for the task.

Another optional setting was ‘Bit rot protection.’ This tries to head off failures on the drive media before data is lost, but Netgear warns of a potentially dire performance hit. I turned it on and did note that when uploading the backups from the recently retired machines the throughput was frequently awful, often descending low enough to be measured in KB/s. Before the next big upload I’m going to switch this feature off to see if it is the culprit. Once the bulk of initial backups are made it may have little bearing on day-to-day use, but we’ll just have to watch and see. I recall some line about how those who would sacrifice performance for data security may end up with neither.

Because this is a fairly inexpensive model using an ARM-based SoC [ARM processor, System-on-Chip – think Raspberry Pi, which also uses an ARM-based SoC], the process of creating the new volume was quite lengthy, coming in at 60 hours. High-end models most often use Intel processors with substantially better performance in the specific areas where complex RAID configurations place their demands.

SoC is System-on-Chip. ARM processor cores are rarely found all by themselves. The myriad licensees package them with other functions, licensed from other companies or created in-house, to produce a device specific to the intended purpose. This has become increasingly common across the industry, especially in the consumer sector where the other functions are a given.

Thus most Intel processors these days have a GPU included, along with numerous other functions that were formerly found in separate chips. Due to the way ARM’s business model works, an ARM-based device is likely to be far more customized than something out of Intel, as it simply isn’t worth Intel’s time to pursue markets below a certain size. Intel does have a line within the Atom product range that is aimed at NAS builders, but at a significantly greater cost than ARM-based competitors. You’ll find these on higher-end NAS models with a greater range of abilities than were needed here. For example, some high-powered NAS products can act as a self-contained media library with the hardware to drive a TV all by themselves rather than relying on a client device.

As Eric worked on the NAS/RAID setup, he configured the volumes as RAID 6 instead of the default RAID 5.

The Netgear ReadyNAS 104 was populated with 4 TB Seagate NAS drives. This, when it finally finishes, will offer 7.6 TB of space to work with.

I intend to write at greater length about it but was waiting until the change was done and more material might be added. There is a lot of capability in the device, making for a dizzying array of choices for how best to make use of it.

I’ll also note that the 60 hours needed to perform the RAID 6 volume creation would be greatly reduced on higher-end products using the Intel chips that have been targeted at the NAS business. This model, and most like it in price and features, use ARM chips that cannot match the Intel parts on the more computationally intense RAID configurations, according to Netgear reps in their forums.
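
For readers checking the arithmetic behind those capacity figures, the usable-space math for the RAID levels discussed in this series is simple. A rough sketch follows; note that real-world numbers (like the 7.6 TB the ReadyNAS reports for an 8 TB raw RAID 6) come out lower than these decimal-terabyte figures because of TB-versus-TiB accounting and filesystem overhead:

```python
# Usable capacity for common RAID levels, given n identical drives.
# Figures are raw decimal terabytes; formatted capacity is lower.
def usable_tb(level: str, n_drives: int, drive_tb: float) -> float:
    if level == "RAID5":   # one drive's worth of parity
        return (n_drives - 1) * drive_tb
    if level == "RAID6":   # two drives' worth of parity
        return (n_drives - 2) * drive_tb
    if level == "RAID10":  # everything mirrored once
        return n_drives / 2 * drive_tb
    if level == "JBOD":    # no redundancy at all
        return n_drives * drive_tb
    raise ValueError(level)

# The four 4 TB drives used in this project:
for level in ("RAID5", "RAID6", "RAID10", "JBOD"):
    print(level, usable_tb(level, 4, 4.0), "TB raw")
```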

Minor Annoyances Solved

There was some initial difficulty with the NAS alert emails not getting to Dr. Pournelle’s account; they were going to Eric’s email account instead. Eric noted that, along with some additional info on the configuration.

Currently the email alerts are coming to me [Eric]. Why Jerry’s account wouldn’t work, using the same settings displayed in Outlook, remains a mystery.

The NAS has a pair of gigabit Ethernet ports and is administered through an intranet web UI. It supports the installation of apps to add custom features but a brief survey didn’t make any stand out for Jerry’s needs.

Netgear warns that RAID 6 on this product range will significantly reduce write performance, but with so few users on the Chaos Manor network this isn’t likely to be a problem. After the initial backup from an actively used PC, the following updates should be quick, unless a complete image is being made every time.

The Swan system has an eSATA drive docking slot that lends itself better to that level of backup. A USB 3 external drive should suffice for AlienArtifact and the same drive could also serve the Surface Pro 3 since it has so little storage capacity compared to the desktop systems.

As the project continued, with status reports to the Advisors for their reading pleasure, Advisor Peter wondered:

I’m still curious to know how much of that gigabit capacity can be delivered over the Chaos Manor network, but it probably won’t come very close to the theoretical performance of the array itself, which should be well above 1 Gbps for reads, at least.

That’s really saying something, since I think all the computation is just XOR operations and table lookups in normal operations. But cheap ARM chips have few of the data-movement and vector-processing features of Intel’s cheapest processors, and that’s probably the real issue rather than “computation” per se.
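
To make Peter’s point concrete: single-parity recovery really is just XOR. The parity block is the XOR of the data blocks, so any one missing block is the XOR of everything that survives. A toy sketch follows (RAID 6’s second parity generally uses heavier Reed-Solomon arithmetic, which is where cheap ARM chips fall behind):

```python
# Toy RAID 5 parity demonstration: parity = XOR of the data blocks,
# and any single lost block is recoverable by XORing the survivors.
from functools import reduce

def xor_blocks(blocks):
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

data = [b"AAAA", b"BBBB", b"CCCC"]   # three data blocks on three drives
parity = xor_blocks(data)            # stored on the fourth drive

lost = data.pop(1)                   # the "drive" holding BBBB dies
rebuilt = xor_blocks(data + [parity])
assert rebuilt == lost               # recovered without the lost drive
print(rebuilt)                       # b'BBBB'
```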

Advisor Brian Bilbrey chimed in about initial and ongoing performance of the system.

A reduced write performance is primarily an issue on initial seeding of the device. Jerry undoubtedly has a lot of data he’d like to centralize, and getting all of that on the first pass will be the ordeal. Keeping it updated thereafter is less likely to be affected by the performance issues.

Eric continued the discussion:

I won’t be surprised if the network speed is much more of a limiting factor and the difference in write performance impossible to discern without much bigger loads than we’re ever likely to create.

Perhaps [it could be measured] if we tried to save an image from every workstation simultaneously. None of those machines has more than 1.25 TB of local storage (250-ish GB SSD and 1 TB hard drive), and most of that is empty. (There is 192 GB on the Surface if you include the microSD card.)

If we did full backups every night there would be plenty of time to avoid overlap.
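
Rough numbers bear that out. Here is a sketch assuming gigabit Ethernet (125 MB/s theoretical) with a sustained real-world rate of 80 MB/s, which is an assumption; RAID 6 write overhead would push it lower:

```python
# Back-of-envelope: how long to push a full image over gigabit LAN.
# 1 Gbps is 125 MB/s theoretical; assume a lower sustained rate.
def hours_to_copy(size_gb: float, mb_per_s: float = 80.0) -> float:
    return size_gb * 1000 / mb_per_s / 3600

for name, gb in [("workstation, 1.25 TB full", 1250),
                 ("Surface Pro 3, 192 GB", 192)]:
    print(f"{name}: ~{hours_to_copy(gb):.1f} h")
```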

The saga continues in the final installment, as data is moved from systems, and older systems are considered for retirement. Let us know what you think in the comments below, and share this article with others.

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 1

In this three-part series, Chaos Manor gets an upgrade of its data storage with the installation of a new Network Attached Storage system. Along the way, several old systems are deemed retirement candidates. We start with Part 1, where the project and features are discussed amongst the Chaos Manor Advisors:

The Chaos Manor Advisors determined that Dr. Pournelle’s various systems needed a cleanup, consolidation, and upgrade at Chaos Manor. There was a need for a better backup and archiving process, along with some retirement of systems. Consolidating systems and data storage in light of Dr. Pournelle’s mobility problems was another objective, now that the new Chaos Manor wireless network was in place.

One aspect of this consolidation was to create a Network Attached Storage system with RAID capability to serve as centralized data storage. A backup process was also needed. And there was a need for the archive to be protected against any encrypting-type malware. Although Dr. Pournelle practices ‘safe computing’ that reduces that risk, a protected backup of his data, including past and future books, was deemed to be a good objective for this project. We thought this project would be interesting for Chaos Manor Reviews readers.

Chaos Manor Advisor Eric Pobirs, an experienced technician who works with Dr. Pournelle’s son Alex (as well as doing some freelancing), took the lead on this project. A discussion among the Advisors covered the configuration and issues involved in creating this NAS/RAID system.

Eric started out with his general objectives:

Well, the idea was to have capacity wildly in excess of need, to reduce the amount of management concern it generates once it has been configured fully. The difference in cost for the somewhat safer lower-capacity drives is fairly minor, and they’d still be at risk for the Rebuild+URE [more on this below] problem. So doubling up on the 4 TB drives in RAID 6 likely works out better than, say, 2 TB drives in RAID 5, for a difference of around $100 for the set.

Part of this was offering the example of just how amazingly cheap this stuff has gotten over the years and how a bit of research can lead to better results without massive expense. Now, some might regard this investment as massive expense but creating organized bytes is Jerry’s livelihood, so this is just insurance. Also, the use of better qualified equipment should win back some of the expenditure in reduced power costs for the house. A few percent here, a few percent there…

After some thought, Eric came up with the outlines of a plan.

NAS and Backups

The main objective of this project was to determine how to configure a backup system for all of the computers at Chaos Manor. Backups are important, and Dr. Pournelle has lots of data: massive amounts of emails and his equally massive book files.

After a survey of the possibilities, Eric decided on a Network Attached Storage (NAS) system: the Netgear ReadyNAS 104 4-bay NAS (http://goo.gl/lYAl2p ).

Advisor Brian Bilbrey has much experience with large systems, being a senior systems administrator. He discussed the basics of the various types of RAID systems, beginning with an explanation of an “URE”, an ‘Unrecoverable Read Error’:

Magnetic disk storage sizes are now on the same order of magnitude as the quoted bit error rate for reading data from the disk. A typical consumer drive is rated at one Unrecoverable Read Error per 10^14 bits read. A 4 TB disk holds about 3×10^13 bits, and in normal use you only read a fraction of that at any given time, so the chances of hitting a URE are pretty small.

However, if you have an array of five 4 TB disks in a RAID 5 configuration, then you’ve got approximately 4 disks’ worth of data and one disk’s worth of calculated parity spread across all of the disks. If any ONE of those disks fails, then when you put in a new disk to rebuild that array, ALL 16 TB of surviving data will be read for the rebuild. There’s a significant chance that during that process, a read will fail. At that point, the array cannot be rebuilt. Data done and gone; restore from proper backups.

I recommend RAID 6 for 4 or more disks, and 2- or 3-way mirrors for 2- or 3-disk systems. Yes, you’re “throwing away” storage. Or, to put it another way, you’re managing the risk of data loss. With RAID 6, you can lose a disk, suffer a URE during the rebuild, and still have all your data.

Personally, I also buy Enterprise-grade disks, because there’s usually another factor of 10 added to the URE reliability. For more info, use your favorite search engine and the phrase “URE RAID 5” without the quotes.
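
The arithmetic behind Brian’s warning is easy to run. A minimal sketch, using the quoted error rates and treating read errors as independent (a Poisson approximation; real-world failures tend to cluster, so take the exact percentages with a grain of salt):

```python
# Odds of hitting at least one URE while reading an entire array
# during a RAID 5 rebuild, given a bit error rate in errors/bit.
import math

def p_at_least_one_ure(read_tb: float, ber: float = 1e-14) -> float:
    bits = read_tb * 1e12 * 8
    return 1 - math.exp(-bits * ber)   # Poisson approximation

print(f"16 TB rebuild, consumer 1e-14:   {p_at_least_one_ure(16):.0%}")
print(f"16 TB rebuild, enterprise 1e-15: {p_at_least_one_ure(16, 1e-15):.0%}")
```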

With that explanation, Brian continues:

One thing I’m pondering in light of the Rebuild+URE problem is whether a RAID 10 might be safer. This would be a two-drive stripe set mirrored by the second set of two drives. This cuts the raw capacity from 16 TB to a ‘mere’ 8 TB, which is still a vast capacity for storing primarily text files. In this case, recovering from a drive failure is more a matter of copying than a complex rebuild and the NAS should keep working with the intact set until the replacement drive is installed.

The Netgear box will also do RAID 6 with the four drives but as the capacity works out the same I find myself wondering what advantage remains, if any. RAID 10 may have the advantage in continued functionality after the loss of a single drive, whereas I have the impression a RAID 6 would be out of commission until the failed drive is replaced and the volume rebuilt.

In 234 pages the manual has remarkably little to say about drive failures, how to handle them, and how different configurations affect this.

Advisor Peter Glaskowsky agreed with Brian, adding:

To add to Brian’s reply, a RAID 6 array not only keeps working after a single drive failure, it still has redundancy – it becomes the equivalent of a RAID 5 array. Even two drive failures in a RAID 6 array will not stop the array from working.

So if you have an effective RAID 6 option, that’s my recommendation too. I know it’s painful to lose half your capacity, but in the long run, that’s better than losing all your data.

Brian added some additional thoughts about the various RAID types:

RAID 6: Lose one drive, you’re running degraded, and can rebuild at leisure. IF there’s a bit read failure during the rebuild, you have the second parity to fall back on.
Lose two drives (or lose a second drive during the rebuild after the loss of a first drive) and you’re running degraded, with no backstop. If you lose a third drive while rebuilding against a two-disk failure, you’re dead.

RAID 10 (and friends): Lose one drive, you’re running degraded. Rebuild, and hope there’s no bit read failure.
Lose two drives, and if it’s half of one mirror pair and the OTHER half of the other mirror pair, you’re still running degraded. But after one drive failure, you have a one-in-three chance of catastrophic failure during the rebuild, should there be a bit read error.

The point of spinning storage is to have large data available for immediate access. Periodically copying this data to spinning rust that is then stored offsite, with which to rebuild if you lose the RAID 6, is prudent.

One of the considerations of a RAID system is that it is more of a centralized storage area than a full backup/restore solution. As Eric noted:

This article and its links cover the nature of the problem:
http://goo.gl/ungcQI
In short, drive capacity has advanced far faster than reliability and it may not be possible for reliability to ever be high enough to overcome the odds of a crippling failure. This is why RAID cannot be regarded as a backup but merely a way to centralize data to be backed up.

In the next installment, a system is selected, and installation and configuration is begun. Let us know what you think in the comments below, and share this article with others.

Another Backup Strategy

[In a previous article, Drake Christensen described his backup strategy using a network attached storage (NAS) device and software to immediately back up changed documents. Your intrepid editor has a different strategy, since he doesn’t need immediate backups with version control.]

It is a Good Thing to have a way to back up your important files. And since it is also Emergency Preparedness Month, a backup strategy is a timely subject to visit.

There are tons of ways to back up your data, and many reasons to actually Do It. You’ve probably heard about all the reasons ‘why’. My backup strategy has a three-fold focus:

  • keep backup copies in several physical locations; not just at home
  • make the backup process easy and mostly automatic
  • allow backups of multiple devices (there are four computer systems in our house) while reducing costs

Backing Up My Laptops

The first process is to copy important documents/files (pictures, project code, documents, etc.) to a central location in my home. That is a desktop computer sitting upstairs, connected to my LAN. It’s an older system, not used as much anymore, but it has a big hard drive.

I use Microsoft’s SyncToy, a free program that synchronizes files between two locations. In my case, I use it to one-way sync between laptop (source) and desktop (target). SyncToy has the advantage of only copying files that have changed (or removing those that have been deleted), which with thousands of files on my laptop saves a bunch of time.

Since SyncToy is free (thanks, Microsoft!), I have it installed and configured on all three of our laptops. Right now the ‘sync’ is a manual process, but there are ways to set up a ‘batch job’ and schedule the SyncToy task to run on a regular basis. My practice is to do a sync every couple of days. This is OK, since most of my work is with web sites, and the web site code files are also available on the external web sites in case of disaster.
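
For the ‘batch job’ route, SyncToy ships with a command-line runner (SyncToyCmd.exe) that Task Scheduler can call. And for the curious, here is a minimal Python sketch of what a one-way ‘echo’ pass amounts to; the paths are placeholders, and a real tool handles deletions, renames, and errors far more carefully:

```python
# Minimal one-way "echo" sync: copy new/changed files from source to
# target, skipping files whose modification time hasn't advanced.
import shutil
from pathlib import Path

SRC = Path(r"C:\Users\me\Documents")   # placeholder source
DST = Path(r"\\desktop\backup\laptop") # placeholder target share

for f in SRC.rglob("*"):
    if not f.is_file():
        continue
    out = DST / f.relative_to(SRC)
    # Copy only if missing or the source is newer -- the time-saver.
    if not out.exists() or f.stat().st_mtime > out.stat().st_mtime:
        out.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, out)
        print("copied", f)
```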

SyncToy works fairly fast; once you set it up, just run the sync task and let it do its thing. Here’s the results screen from the last time I ran SyncToy. You can see that a bunch of files didn’t need to be copied, because they hadn’t changed. That saves a bunch of time on the backup.

Now, I could back up to an external hard disk, either manually or with SyncToy. But the desktop is available on my LAN, and backing up via wireless is Fast Enough for my purposes.

Backing Up the Desktop

At this point, all my important files are in two places: my laptop(s), and the desktop. But the files are in the same physical location. Any problems with that physical location (theft, fire, earthquake, zombies; take your pick) would result in loss of files – especially all the family pictures, many of them irreplaceable.

So the second part of my backup strategy is to copy important files to the ‘cloud’. For that, I have chosen the Carbonite backup service (http://goo.gl/45wv ). For a flat fee, all my important files are automatically copied to their encrypted servers. There are similar services available from other vendors.

The best part is that ‘automatic’ part. Any file that changes on the desktop is automatically backed up to the Carbonite servers. It happens in the background, so when I use that computer, the backup process doesn’t interfere with my use of the computer.

Carbonite stores multiple copies of my files, so there are some ‘history’ versions of files that are available. You can also access any of your backed up files on other devices or computers – phones, tablets, whatever. This is a great advantage if you travel a lot, since you can access that important file you left on your home system while on the road.

There is another advantage to using a ‘cloud backup’ service like Carbonite. That relates to ‘ransomware’.

I P0wn All Your Filez

Any backup strategy needs to account for damaged files. Files can be damaged by hardware problems, physical damage (fire, etc.) or theft. And then there is ‘logical’ damage – damage done by malware.

There is malware that encrypts your files, requiring a payment to recover your data. This ‘ransomware’ can be a big moneymaker – reports are that up to US$18 million has been paid to recover encrypted files. While there are things you can do to block ransomware – or any malware; think ‘safe computing’ practices – your backup strategy can also help prevent file loss from ransomware.

Ransomware can damage files from any infected computer on your network – even your little home network. If a file is accessible over the network from a ransomware-infected computer, then that file can be encrypted, even if it lives on another computer.

So your backup strategy needs to take that possibility into account.

There are a couple of ways that you can enhance your backup strategy to protect from ransomware:

  • Copy files to an external drive (or even DVDs), then physically disconnect that external drive from the network.
  • Use a ‘cloud backup’ service.

As you may have guessed, my backup strategy to prevent possible ransomware problems is using a cloud backup service from Carbonite.

What about copying files to a Linux-based Network Attached Storage (NAS) system? That configuration likely includes access to the NAS by Windows-based systems, so there is no protection there. Of course, you could run all-Linux systems, but that is less likely for most people.

So the Carbonite-based cloud service is my solution. They keep multiple versions of my backups, so even if ransomware gets to my desktop system and Carbonite backs up those encrypted files, I can work with Carbonite to get prior, non-encrypted versions of my files. I might lose a few recent files, but the majority of my important files would be available to get back – after I rid my systems of the ransomware.

Wrapping Up

So there you have it. My own personal backup strategy. It works pretty well for me. I haven’t had to recover files – mainly because I practice ‘safe computing’. And have been lucky enough to not have any disasters.

But I am prepared. My important pictures, documents, web site files, etc., are available – Just In Case.

What is your backup strategy? What do you do different? Or do you just trust in your ‘karma’ to keep away the possibility of file loss? Let us know in your comments – or write up your own backup strategy for an article here on Chaos Manor Reviews.

After the Storm

[A bit different, but there is a “I did this so you don’t have to” angle on this story from our editor about emergency preparedness. It is, after all Emergency Preparedness month. Originally published on his blog. – Editor]

Our house is on the northeast corner of the Olympic Peninsula in Washington state, not far from the Hood Canal Bridge. Although we get about half the rain of Seattle, there are occasional windy storms that come through here. This weekend was one of those: an extra-windy affair with rain that usually happens once or twice a year around here.

The weather forecasters predicted very windy conditions, with gusts up to 55 mph. Since there are a lot of trees in Washington, there was a real possibility of falling trees damaging electrical lines, resulting in power outages.

There were several days of warnings about the storm, so plenty of time to lay in supplies. If there were electrical outages, they might last 12 to 48 hours or more.

We had a similar storm last year, with a power outage of about 8 hours starting in the evening. This storm, though, was supposed to hit mid-day; I figured any outage would be similarly short.

So I did some minor preparation at home. I knew that I had plenty of flashlights – and batteries. We had some canned food, plus fresh fruit, some energy bars, and four cases of bottled water. We have a small chest freezer with some meat in there, and a good propane BBQ grill with an extra fuel canister. I figured we could handle a short power outage, even if it did happen overnight.

Storm On The Road

The storm came as scheduled on Saturday. We had family in town, so we had planned a trip across the Hood Canal bridge to Silverdale to visit the local marine museum with my daughter’s family (husband, wife, and two- and four-year-old kids). We went across the bridge, and it was a little windy, but not too bad. Enough wind that there were 2-3 foot ‘rollers’ and a bit of whitecaps, but wind gusts were under 20 mph. The bridge will close if the winds get over 45 mph.

We made it to Silverdale OK and had a great visit in the museum (a hands-on place with lots of touching of the sea critters, to the delight of the grandchildren). It started to rain a bit more when we got there, but not really a downpour.

After the marine museum, a trip to a hamburger restaurant. Lots of people there; it was lunch time, but service was good. It was a bit windy outside, maybe 15-20 mph, with some rain. There was a bit of light flickering due to power issues while we were inside, and one 10-second outage, but all was well.

During lunch, I was watching the roads (via Waze and Google Maps), and noticed some slowdowns on the usual route home that appeared to be just traffic-related. The drive back home is about 35 miles: some four-lane divided highway, some two-lane undivided before the bridge.

After lunch, we went over to the local Costco to look for a replacement laptop (didn’t find the right one). But I thought it would be a good idea to get an LED lantern and some extra batteries – extra batteries are usually a good idea. (The Costco Kirkland brand is a good value.)

The Costco was the usual Saturday-busy, but we got out OK, and back into the car for the trip home. The traffic on the four-lane highway wasn’t too bad. But then we got to the ten-mile two-lane highway portion. That was backed up solid and stopped. Traffic seemed to be flowing from the other direction, so I figured there was just more traffic than usual.

While stopped, I was checking out the traffic, seeing if there was another way that might be better. But there are really only two ways to the Hood Canal Bridge. Our usual route was jam-packed.

And my cellphone was no help. No bars, so no traffic help from Waze or Google Maps. The wind had knocked out the power (trees into power lines), so I had no idea which way was the best way home. After sitting in nearly one spot for about 30 minutes (the “this should start moving in a few minutes” kind of wait), I decided to turn around and try the other direction home. That turned out to have less traffic to the bridge, although it was the ‘long way’ around.

We crossed the bridge (more rollers and whitecaps on the water) with some crosswinds. The bridge had been closed for a couple of hours due to the wind, which caused the big backup on the main route; our alternate route wasn’t as busy, and the bridge was open by the time we got there. Then on to the two-lane road to our small town. Along that road, you could see several power lines that had been downed (but off the road) by trees. It didn’t look good for power when we got home.

Back Home and It Is Dark

And that was correct. No automatic garage door when we pushed the button. In the front door to a dark and power-free house. It was about 4:30pm, so there was plenty of light from the big windows in the main room. But it was time to prepare for darkness: find the flashlights (where were they?), check the batteries (several flashlights were dead, but I did have replacement batteries), and set up the LED camp light. The water was still running, though.

A reminder to everyone to stay out of the refrigerator and freezer (the ice cream cake we brought home to celebrate a birthday was a bit soggy due to the long ride home, but still good). There were hard-back books for some, ebooks for others, and a movie on an iPad for the kids. When it got dark outside (and inside), we turned on the LED camp light (a nice amount of light) until it was bedtime for the kids, with flashlights issued as needed.

With that over, some quiet time for the adults, then off to bed about 10pm. I was able to keep up with the local power company’s efforts via social media on my phone; the cell towers were still working.

There were many power lines down in the area; the Olympic Peninsula around our home had about 12,000 customers in the dark, with much larger impacts throughout the region. Crews were (and still are) working on things, but big trees falling on power lines does cause some damage that takes a while to repair.

I use a CPAP machine for my sleep apnea. That didn’t work, of course, and sleep was difficult for me because of that. Power was finally restored around 3:30am for us. Up early for church, where everyone swapped power outage and storm stories. Some people in more rural areas were still powerless that morning and throughout the day. Some still are, as I write this on Sunday night.

Reviewing Things

Now all of that is a rather long preface to ‘I did this so you don’t have to’. I read a few blog sites that talk about ‘prepping’. After thinking about my preparations for the wind storm, how did I do?

Well, I did have some flashlights, although it took a bit to find them all, and get them working. The food in the freezer and refrigerator stayed cold, because the power outage wasn’t very long (and it was a good excuse to eat extra ice cream cake). I didn’t have to worry about a cold night; I do have a propane fireplace, and the propane tank is full, but the nights are mild (around 55-65F) this time of year.

There was food that could be used for an extensive power outage, although not that much. I did have water (the municipal water supply was working through the outage). Lots of toilet paper, so that is covered. There would have been cold showers in the morning, though, since I have an electric water heater.

But my flashlight supply wasn’t really ready; I did have to do some digging around in the garage a bit to find working ones. The new LED lantern was a good purchase; we’ll get another the next trip to Costco. And I have lots of spare batteries, along with two crank-type LED flashlights, one with a radio.

My cell phone was mostly charged, but my backup cell phone battery pack was not (I had used it the weekend before, and hadn’t thought to charge it yet). My CPAP machine only runs on house power, so I didn’t sleep that well – getting one that runs on 12v might be a good idea.

Food supplies were passable, but an extended outage might result in a not-healthy diet. Our personal medicine supply was good. My first aid supply is very basic – bandages and antiseptic cream. I have some antiseptic hand wash stuff, but not enough for an extended period of time.

There were lots of trees down in my area. I had an alder tree, about six inches in circumference, that split and fell, luckily not on my house. A neighbor helped cut the damaged branch – he can use it for his wood stove, but I’ll need to cut the rest of the tree down – so where is my bow saw?

Lessons Learned

Looking back, I probably could have prepared better. There were several days’ warning of the impending wind storm, and I knew the area is prone to power outages during wind storms. More and varied food might be better. I may need to consider a small generator to keep the refrigerator and freezer cold.

Perhaps heading out on the road just before a storm hits is something that is less than ideal.

When the power came back on, I didn’t think to check the status of frozen meats in the freezer; since the outage wasn’t that long, and we kept the doors shut, I think the frozen food is OK.

I was prepared to cook on the propane grill; I had an extra propane tank. But it might be a good idea to get a small two-burner propane stove, which would be more efficient than the grill for some meals. Both cars were full of gas, so I could have charged my cell phone there, but I need to ensure my cell phone ‘battery-brick’ is kept charged, and maybe buy an extra one.

I could use more LED flashlights, and batteries. Maybe even a solar battery charger. (I did order a couple of solar-powered flashlights to try out.) And another LED camp light or two. And I need to organize the emergency supplies into a central place, so I can find them. (I still haven’t found my LED head lamp.)

I need to be aware of alternate routes in the area. Perhaps a paper map would be better for when the cell phone towers are dead because of power outages, or at least an on-line map study before the next emergency.

Perhaps an alternate power supply for my CPAP. Getting enough rest during an emergency is a Good Thing.

So, maybe an overall grade of C+? Good enough for this short outage, but I need to think (and act on) additional things to get ready for the next one. Whatever the emergency is.

What about you? Have you thought about your emergency preparation status? Are you ready for a short-term power outage? Could you survive on what you have in your house right now? Let us know in the comments.

An Evolving Backup System

[Chaos Manor Reviews reader Drake Christensen was reading about the Network Attached Storage (NAS) system your editor set up with the Raspberry Pi, and decided to share his experiences over the years with his NAS and backup system configuration and practices. – Editor]

I’m very happy with my backup system, which has evolved over many years, and I thought I’d share my experiences as I have enhanced and improved it to its current configuration.  It may have been on one of the Chaos Manor Mail pages where someone declared:  “If it’s not in at least three places then it’s not backed up”.

I use Windows at home. I’ve had a Network Attached Storage (NAS) on my home network since 2009. Even though they’re a little more expensive, I’m a big fan of NAS over external USB drives.  I worry that if Windows gets confused and trashes a drive, it may also trash any backup media that’s plugged into it.

Since a NAS is a separate computer, that provides a bit of a buffer to protect against the disk structure getting destroyed.  Also, just as a simple practical matter, the NAS can be accessed by multiple computers on my home network.  And, I don’t have external drives cluttering my work area.  The NAS is off in the corner, out of the way.

I started my home NAS with a two-drive D-Link DNS-321, which is still attached to my network.  This older NAS is limited to 2 TB drives. The D-Link is very hack-able, and I think I could have downloaded a custom build of the OS that would let me use larger drives. But I just don’t have time to add another hobby.

In early 2014, I added a hot new Windows box that I had built by a boutique builder, iBuyPower. My previous, circa-2009 iBuyPower box was relegated to secondary chat/email duty on the same desk.

I needed more backup space, so I added a Synology DS213j (see information here). This two-drive system currently has only one 4 TB drive in it.  The admin page of the Synology is quite a bit slicker than the D-Link’s, and it has a lot more optional features available through their interface.  And it is being actively updated.

With my first NAS, I used a conventional backup program, which did daily backups to the NAS.  But, for some reason, even though the data was growing only slowly over time, the backup was taking much, much longer.  Eventually, it was taking most of a day, even for an incremental backup.  I never did figure out what was causing the performance issues.

Updating over the Years

Somewhere around 2012 I started looking for a program to give me Apple Time Machine-like capabilities on Windows.  I wanted to have a file backed up within a few minutes of when it was modified, with multiple versions stored.  I tried one commercial product, Genie Timeline; it wasn’t horrible. But I found its interface a bit too “cute” for my taste, and I felt that it got in the way.

Eventually, I found AutoVer mentioned in several places.  It’s certainly not a pretty program, but I find it fairly straightforward.  I’ve been running it on a few machines ever since.  I set it to back up my entire \Users\(me)\Roaming directory, plus my data drive.

During the first few weeks, it does require some attention.  Some programs, like Firefox and Evernote, for example, will touch some large files fairly often, which can quickly eat up space on the backup drive.  I was able to break up the backup task into three or four smaller pieces, with custom rules for each task, and greatly reduce the number of versions it keeps of those larger files.

Unfortunately, “Real Life” has encroached on the author of AutoVer, and it is teetering on the verge of abandonware.  He rarely logs in to his own forum, anymore. It is still reliable for me, and I’m still using it on two machines.

More Enhancements to My Backup System

When I purchased my latest machine I decided to find an alternative that appears to have more recent work done on it.  I ended up with Yadis! Backup.  Its interface is a bit more familiar and friendly.  I’ve been running it for about 18 months now.  The only issue I’ve had with Yadis! Backup is that over time the log file will grow huge and crash the program on start.  Every couple of months I’ve had to rename/delete the file, which clears up the problem.  I have contacted their tech support a couple of times and received reasonably prompt responses.

One wrinkle that I’ve recently solved is automatically logging into my network drives.  Apparently, when the “Automatically connect” box is checked in Explorer, Windows attempts to log into the network shares before its network drivers have finished loading during boot, which results in an error and leaves me unconnected.  I had hacked together a quick Powershell script to work around it, but I wasn’t happy with it. A few months ago, I started looking around and found the open source NetDrives, a free utility that I run on startup to connect the network shares when the OS is ready to take them.
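
For anyone fighting the same race condition, the idea that NetDrives (and a throwaway script like Drake’s) implements is simple: keep retrying the mapping until the network stack is ready. A minimal sketch using the standard Windows net use command; the share path and drive letter are placeholders:

```python
# Retry mapping a network share until the network is ready --
# the same retry-until-ready idea NetDrives automates.
import subprocess
import time

SHARE = r"\\nas\backup"   # placeholder share path
LETTER = "Z:"             # placeholder drive letter

for attempt in range(10):
    rc = subprocess.call(["net", "use", LETTER, SHARE])
    if rc == 0:            # mapped successfully
        break
    time.sleep(15)         # network not up yet; wait and retry
else:
    print("gave up mapping", SHARE)
```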

So, that’s one extra backup copy.

Going to the Cloud

A couple of years ago I saw an ad for a cloud backup service, called Backblaze.  That got me started researching.  I found lots of good reports about Backblaze, but it was a little expensive for my use.  (I record some amateur sports videos, which greatly bulk up my data.)

Carbonite is well-known, but at the time I was looking at it, it was also very expensive for large backups. It has been long enough that I don’t remember the specific prices, at the time. I recall that one of my machines had over 600 GB on the data drive, and that Carbonite was in the several hundred dollar range for that much data.

I ended up with Crashplan, which gives me unlimited data on 10 machines for $149/year.  I added my mother’s machine to my account (and I set up a NAS for her, too.)  Crashplan is also Time-Machine-like, in that it backs up continuously, and keeps multiple versions.  I’ve actually made use of Crashplan to restore a couple of files.

I don’t want to sound like a commercial for Crashplan, but there are a couple of other features that are worth mentioning which have been useful to me in my configuration and usage. (As they say, Your Mileage May Vary.)

First, since all my machines are under the same account, if I were on the road, I could conceivably use my laptop to pull a file from the cloud that was saved from one of my desktops. They also have Android and iOS mobile apps to access files backed up in the cloud.

Crashplan can also back up from one Crashplan machine to another, whether local or remote. And, it can back up to physically attached drives. It does not appear capable of replacing AutoVer and Yadis! Backup to back up to a NAS, though, even when the shares are mapped to drive letters.

Cloud Backup Advantages

The prices and packages of all of these cloud systems have changed a lot since I looked at them a couple of years ago. Backblaze is now $50/yr/computer, unlimited. And, they offer a stolen computer locator service. Carbonite is $59/yr for the first computer, unlimited data with the exception of video. Video files and more computers are available for an added cost. All of them provide seeded backups (the option for you to send them a drive with an initial copy of your data.) And, there is an option for them to send you a recovery drive. In any case, do your homework before choosing your cloud backup service to see which best fits your needs.

Cloud systems like this also protect against ransomware.  Since it backs up only through the software service, ransomware has no way to get at that set of backup files to encrypt them. For a while, you might be backing up encrypted files. But, with this kind of versioning, you can get back to a good copy from prior to the infection. The NAS, on the other hand, is still vulnerable, if the virus looks to see what shares the machine is connected to. From a Windows point-of-view, the share is “Just Another Drive”.

An aside:  One thing I found when researching cloud backups is that there is one company that is poisoning Internet searches.  They have released four or five different programs which are nearly identical, but under different names and different pricing schemes.  And then they have paid for a large number of reviews, and commissioned a bunch of Top 5 and Top 10 lists with their programs listed near the top, to make them look a lot better than they are.  Digging a little deeper, there are a lot of complaints about that company – either bait and switch pricing, poor customer service or technical problems.

Wrapping Up

My current backup system comprises Network Attached Storage, Time Machine-like versioning for local backup (is there a more generic term for this sort of thing?), and a commercial cloud versioning backup. With this system in place, I can set up multiple computers on my network for continuous backup.

There are a few things I really like about my backup system.

The first is, after the initial teething period, it is completely automatic.  I don’t have to remember to do anything.  It Just Works.

Also, the multiple versions have come in handy. I’m a bit of a packrat, and I like having multiple versions of stuff I’m actively working on. It’s only a few times a year that I have that breath-sucking “Oh, no” feeling when I saved instead of canceled, and the versioning has saved me a few times. One example: when my mother made a change to her book collection database file and didn’t tell me about it for over a week, I was able to pull a version out of Crashplan from before the change. I chose Crashplan because it happened to be the first time I needed an old version of a file since installing it, and I wanted to try the interface. It worked about as I had expected.

Next, I like the speed of on-site storage, as my first place to restore from.

And, finally, it adds a lot of peace of mind to know that I have off-site storage, in case of fire or theft, or similar disasters at the house. Plus, there is the slim chance of ransomware wiping out everything locally.  And, again, I don’t have to think about it.  I don’t have to discipline myself to rotate storage to have off-site safety.  For practical purposes, it’s built into my computers, now.

My solution is maybe a little pricey. I spent about $300 initially, for the D-Link DNS-321 and a 1 TB drive. The more recent Synology DS213j with a 4 TB drive can be had for about $300, at today’s prices. And the yearly cost for cloud backup is $149.

The NAS is a one-time expense, lasting me for years.  Crashplan is an ongoing expense.  As always seems to be the case, it was all a little more than I’d prefer to spend.  But, given the bulk of data being protected, I think it’s reasonable.

[What do you do for a backup system? Is it extensive, or do you even have a backup system? Let us know in the comments. Or you can submit your own experiences with backup processes on your home computers; see our Submissions page for details. – Editor]

Windows Live Writer – Almost Good Enough

Our intrepid editor maintains Chaos Manor and this site. Dr. Jerry Pournelle mainly writes for Chaos Manor, and he uses Windows Live Writer to do that. A new install of that program resulted in a problem with entering the title of a post. And that resulted in a call to the Chaos Manor Advisors for help. Which, in turn, resulted in an interesting (well, to the editor, since he did it) troubleshooting process to figure out why.

Windows Live Writer’s main function is to provide an easy way to write blog entries and publish them to your blog site. It is a stripped-down version of Word, with basic HTML page editing functions. Once you set it up for your blog, you can write something, insert pictures, format text, add links, spell check, and the other usual things. Then there is a one-button ‘publish’ to your blogging site. The advantage is that it is easy for anyone to write and publish blog entries.

Microsoft Word has some blog publishing capabilities also. So you could write a blog post in Word, then use File, Save & Send to publish it. One advantage of Live Writer over Word is that you can see what your post will look like on your blogging site. The Live Writer editor screen will show your post with all of your blog’s styling and look.

Sort of.

Windows Live Writer

Windows Live Writer (LW) is like a step-child of Microsoft. It’s not very well maintained. It doesn’t work well as a full WYSIWYG editor. Not to mention that if you want to install it, the proper download location (from Microsoft – you don’t want to get it from a non-Microsoft site) is not easy to find.

We use LW to initially create draft posts on this site, and other WordPress sites we have. We paste potential articles from Word (or email) into LW. Then we do the final editing within the WordPress editing screen.

Dr. Pournelle uses it for his posts to the Chaos Manor site. The process works fairly well for us and him. The LW editing screen is clean, with a ribbon bar to do basic formatting. Pictures pasted into LW will get uploaded to the site. The editing screen looks close enough to the final posted page on the web site.

Over in Chaos Manor, Dr. Pournelle has several computers that he uses to write his posts. He’s been rearranging his work areas lately, so he needed to get LW installed on a new system. That was the first problem.

Eric Pobirs, one of the Chaos Manor Advisors, helped get LW installed on the system at Chaos Manor. Dr. Pournelle was having difficulties getting the install process to complete. Eric said:

Essentially, it came down to downloading the correct file to start the install. For reasons that defy my understanding, Microsoft has never done a good job on how they manage the Live suite of apps. My impression is they regarded it more as something for OEMs to bundle with new PCs, like the MS Works suite of yore, and didn’t put the proper effort into presenting it to individuals downloading the product.

There were three major generations, 2009, 2011, and 2012. The earliest does not like post-XP versions of Windows. The middle version was intended for Vista, and the last version for 7 and 8.x. It was odd for a Microsoft program to display such compatibility issues but there it is. The 2011 version never gave me problems on Windows 7 but the only portion I used extensively is the Mail app, which has a long history as Outlook Express.

http://windows.microsoft.com/en-us/windows-live/download-windows-essentials#wetabs=we2012

Microsoft pulled the earlier versions from download availability but they are still offered on numerous sites that are likely to show up in search. They’re hard to distinguish because they always have the same wlsetup.exe file name, rather than carrying some clue to their version up front. Some people are still obsessively attached to the 8.3 file naming convention.

So, I made sure I was downloading the 2012 version and it simply worked. Notably, it showed a different icon than the one downloaded to Swan previously. The .NET 3.5 runtime must have been installed on Swan at some point because it didn’t ask for it as it did on my Windows 10 test machine a few days earlier.

So Eric was successful in getting LW installed on the “Swan” system, making it available to Dr. Pournelle on that system, after he set up the Blog Account in LW for the Chaos Manor site.


The LW Editing Screen

A bit about that. You can have multiple Blog Accounts set up in LW. Each account will ‘connect’ to the appropriate site. You enter the user credentials and the site URL, and LW does some trundling to get things set up. Part of that ‘trundling’ is to download the site’s theme (its ‘look’), which results in templates used by the LW editing screen. That template includes the various HTML and CSS for the site’s theme, which LW uses to present the theme’s look in its editing screen. So the HTML/CSS of the site’s theme is an important part of the template LW uses to display content.

WordPress themes get updated all the time with additional features, and probably new and changed CSS styles. LW has a button to update the theme, so its editing screen will ‘look’ like a published post on the live site.

The LW editing screen looks like this (a partial screenshot of the LW editing area):

You can see the ribbon bar (similar to the one in Word) across the top for basic formatting (there is more that is not shown on this screen shot). There is an area to enter the post’s title, and the area underneath that is the content area. You click on the Post Title area, type in the title, then move to the content area and type in the content. When all is done, you hit the Publish button, and the post is published on your web site.

This first screenshot shows the LW editing screen when we are using the Chaos Manor Reviews blog account. The CMR site uses a theme called “Voyage”. We’ve done some modifications of it, adding some CSS and other changes that we wanted to have.

The Chaos Manor site uses a different theme called ‘Mantra’. I’ve modified it with additional CSS and code. If you look at the two sites, you can see the difference in how they ‘look’. That is because they use different themes, each having its own ‘look and feel’.

Now, let’s take a look at the LW editing screen for the Chaos Manor site:

See the difference? No title area. Just the entry area for the post’s content, and the gray area of the site’s background. (LW doesn’t show the sidebar area, nor the heading/menu area.)

Both screenshots are in the LW “WYSIWYG” mode. On the Chaos Manor site, because it uses a different theme, you can’t enter the title of the post on this LW editing screen. You can get the title area if you toggle off the WYSIWYG mode (with Ctrl + F11). Here’s what the Chaos Manor site looks like in LW with the WYSIWYG mode turned off:

The Post Title area is back, but the WYSIWYG (the look of the post with the site’s theme) is gone.

The missing post title area caused a problem for Dr. Pournelle on the new install of LW. The title area was OK on the other systems he uses, since they were working off of the older version of the Mantra theme.

So it appeared that additional CSS with the latest version of the Mantra theme was causing the Post Title area to disappear in WYSIWYG mode in LW.

Digging into the Problem

That took a bit of digging around to figure out. This next part is a bit more technical, with HTML and CSS code references. But it is interesting, even to the non-web page designer.

LW stores the site’s theme templates in the AppData folder on the computer. Each site is stored in a folder with a GUID-type name. Inside that folder is the template file for the site. Here’s the file list for the Chaos Manor site, shown on the right; other sites that I have installed on my system have a similar file structure.

The LW editing page uses the index.htm template. The older versions of the index.htm file are prior template ‘syncs’, as are, I think, the other GUID-named folders.

If we look at the code inside the index.htm file, we see standard HTML code with CSS styles, etc. Here’s the BODY area of the template code in that file:

[Screenshot: the BODY section of the index.htm template, with line 172 highlighted]

Note the highlighted code at line 172:

<DIV class="comments-link"><SPAN><SPAN class="screen-reader-text">{post-title}</SPAN></SPAN></DIV>

Again, this code is ‘built’ by LW from the theme’s generated code for a page. The {post-title} is used by LW for the input area for the post’s title. Note that it is surrounded by the CSS class called ‘screen-reader-text’. That’s an indication of where our problem of not seeing the title area on the LW screen lies. Compare that to the code in the index.htm file for the Chaos Manor Reviews site, which uses a different theme:

<H2 class="entry-title"><A href="http://www.jerrypournelle.com/chaosmanor/">{post-title}</A></H2>

That gives us a clue as to the problem with the disappearing Post Title area on the Chaos Manor LW screen. The {post-title} is surrounded by the ‘screen-reader-text’ class, while the CMR code uses a different class. So looking at the ‘screen-reader-text’ class is our next step. Here’s the CSS for that class:

.screen-reader-text {
    position: absolute;
    left: -9000px;
}

Digging into our knowledge of CSS, we see that any element using that class is positioned absolutely, 9,000 pixels off the left edge of its container. That puts the text off of the LW screen (and off of the browser screen when the page is viewed). Screen reader applications (for the visually impaired) will still read the text, but a ‘normal’ view of the page in a browser will not show any content wrapped in an element with that class.

Since the LW editing screen uses a browser representation of the post (based on the LW template created from the site’s theme), that CSS class was the cause of the disappearing post title area on the new installation of LW (built from the current, updated theme) on Dr. Pournelle’s ‘Swan’ system. That particular bit of code is not in the LW installs built from the old version of the Mantra theme, which is why Dr. Pournelle was able to see the Post Title when using LW on those systems.

The Fix is In – Sort Of

So, how to fix that? The quick way is to use the Ctrl+F11 toggle to get out of WYSIWYG mode on the LW editing screen. The disadvantage of that is that you can’t see what the post will look like when published. For instance, the ‘block quotes’ we used above to show the code contents will look similar to the web site version, with indentation, a lighter gray background, and a white border around the box. If you toggle off the WYSIWYG mode in LW, that area is just shown as indented text. But that is a Good Enough solution for Dr. Pournelle.

You could modify the Mantra theme to not put in that CSS code. That takes away some of the accessibility of the site for visually impaired visitors.
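If you did go down that road, the override itself is tiny. Here’s a minimal sketch (not something we’ve deployed) of a rule that would neutralize the hiding, assuming it loads after the theme’s own stylesheet; the tradeoff is exactly the accessibility loss just mentioned, since text meant only for screen readers becomes visible to everyone:

.screen-reader-text {
    position: static; /* let the text flow normally again */
    left: auto; /* cancel the -9000px offset */
}

Note that LW works from its cached copy of the theme, so it would only pick up a change like this after you refresh the theme from within LW.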

And there is a risk in modifying theme code, unless you use a ‘child theme’ (as we do on both sites, and on all WordPress sites we make). If you don’t use a child theme, any changes you make to the theme’s code or CSS will get destroyed with a theme update. And if you do use a child theme, you may have to duplicate a lot of the theme code – depending on how the theme is ‘built’. Either way, some PHP skills are needed (among other skills). We’ve done child themes, and recommend them, but there is some effort involved.
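For reference, the skeleton of a child theme is small; it’s duplicating the template files that takes the effort. A minimal style.css for a hypothetical Mantra child theme might look like this (assuming the parent theme lives in a folder named ‘mantra’; newer WordPress guidance prefers enqueueing the parent stylesheet from functions.php instead of using @import):

/*
Theme Name: Mantra Child
Template: mantra
*/
@import url("../mantra/style.css");

/* Your CSS overrides go below, and survive parent theme updates */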

You could change themes, of course (and we may do that at some point on the Chaos Manor site), but that requires a lot of testing and tweaking the new theme; things that you don’t want to do on a live site. And finding just the right theme with all of the features you want to have can be quite a ‘time sink’. You can spend hours finding the right theme (I’ve done it). And still things aren’t quite what you want.

Or you could build your own theme. There are theme building templates to help out, but that is still a bunch of effort requiring PHP, HTML, and CSS skills. (Again, I’ve done it – or at least, started on the process. Many hours/days/weeks of coding and testing are required to build a theme that works well.)

Now, it may be, as the executive editor and web guy of the Chaos Manor and Chaos Manor Reviews sites, that I’ll change the theme of Chaos Manor to be closer to Chaos Manor Reviews. Again, time is involved in that.

But in the meantime using the Ctrl+F11 key to toggle in and out of WYSIWYG mode just so Dr. Pournelle can type in the post title is the best short-term solution.

Whither Live Writer?

There aren’t many good alternatives to LW. You could use the native editor in WordPress, but that requires learning a bit about the WP Admin area. LW is great, since it doesn’t require any access to the WP Admin area.

You could use Word and publish from there. That would work with simple blogs, but once you get into more than basic formatting, Word is not the best solution either; it creates a lot of HTML ‘gunk’ in the page code.

Will Live Writer ever be more than ‘good enough’? Microsoft has announced that they are planning on taking it open-source, which might fix all of the little problems it has (including, hopefully, this one). No announcement of when that will happen, though. One can hope that it will be Real Soon Now.

So, we’re stuck with Live Writer. It is, overall, a great way to easily ‘blog’.  You do have to work around some issues. But it is almost ‘good enough’.

What do you think? What is your favorite blog editor? Let us know in the comments. And if you have a story you’d like to share on Chaos Manor Reviews, let us know here.

A Raspberry Pi Media Server

If you haven’t heard, the Raspberry Pi is a card-deck-size small computer, great for tinkering with on many projects. There are tons of information and projects available for the Pi; just ask your local search engine. (This article is not meant to be a full technical review of the Pi; there are many other sites that have done that. This page from the Raspberry Pi Foundation has the Raspberry Pi specs and capabilities; lots of other info on that site to understand the power and capabilities of the Pi.)

I was intrigued with the concept of using the Pi as a little computer, thinking back to the days when I would build my own PC. My first personal computer was the original IBM PC model 5150. It had 16K of memory, the 8088 processor, and the operating system was Cassette Basic. (Yes, I am that old.)

I spent about $5K on the whole setup, including an RGB monitor and some accounting software for my wife to start an accounting business with (that was the excuse for the computer). I added 256K more memory with an expansion card, added a couple of 360K floppy drives, and later a 10MB hard disk. It was very powerful.

Then. It was very powerful back then.

More Power!

The Raspberry Pi is more powerful than my first computer. And it costs $35 for the computer itself. I decided that I wanted to get one and try to build a media server for my large collection of DVDs.

You do have to spend a bit more than $35 to make it a usable system. I found a kit from Canakit on Amazon that included the Pi Model 2 (quad-core 900 MHz, 1GB RAM), a 2.5 amp power supply, a Wi-Fi dongle (Ralink RT5370 chipset), an 8GB MicroSD memory card (which included the NOOBS operating system installer), a heat sink, and a case. All of that for around $70. I got mine from Amazon here; there are several kits available, so you can get only the parts you need for your project. (That link has all the details and pictures of the parts in the kit.)

I then got a 1TB USB hard drive ($70, such as this one), along with a powered USB hub ($18 here), since the USB port on the Pi doesn’t have quite enough power to run the hard drive. So my total expenditure was around $160. A bit more than the $35 cost for the Pi itself, but much less than my original computer; and more powerful.

SWMBO allowed the purchase, because her hobby (scrapbooking) results in almost daily delivery of supplies for that. So my expenditure, in the grand scheme of ‘things we do around here as a hobby’, was acceptable (she also is the CFO here).

All the pieces arrived in a couple of days.

A Great Tutorial

In the meantime, I poked around the Interwebs for a good tutorial on how to set up the Raspberry Pi as a media server. And I found an excellent set from a guy named Mel Grubb.

Back in the old days (pre-Windows, DOS 1.0 days), I got pretty good at doing computer things with the command line. The Raspberry Pi OS is “Raspbian”, a distribution based on Debian, installed via the “NOOBS” installer. That’s Linux stuff, which I have played around with some over the years, but never on a daily basis. But I knew some of the concepts, so it wasn’t totally unfamiliar.

Using Mel Grubb’s excellent tutorials, I was able to enter the necessary incantations to install and configure all of the software needed to set up the MiniDLNA media server software. I won’t repeat all of that here, but Mr. Grubb’s tutorials are the place to go for a very easy and clear introduction to getting things going. In addition to the MiniDLNA instructions, he tells you how to set up a NAS (Network Attached Storage), BitTorrent, a VPN, and more. All very readable, with clear instructions.
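To give a flavor of those incantations, here’s a minimal sketch of just the MiniDLNA portion. The package name and config file are standard Raspbian/Debian, but the media paths below are examples; Mr. Grubb’s tutorials are the authoritative walk-through:

sudo apt-get update
sudo apt-get install minidlna
# Tell the server where the media lives by editing /etc/minidlna.conf;
# V = video, A = audio, P = pictures. These paths are examples:
#   media_dir=V,/mnt/usbdrive/movies
#   media_dir=A,/mnt/usbdrive/music
#   media_dir=P,/mnt/usbdrive/pictures
sudo nano /etc/minidlna.conf
# Restart so the server rescans and announces itself on the LAN
sudo service minidlna restart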

The result was a full configuration of the Pi as a media server, with remote access to the Pi (via Secure Shell (SSH), plus control via the Webmin interface), so it doesn’t really need a keyboard or monitor for things to work. There were a few false steps along the way, so I needed a bit more googling (plus some questions answered by Mr. Grubb) to figure things out until everything worked as I wished.
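The remote-access part is a one-liner once SSH is enabled; a quick sketch (the .local hostname assumes your network resolves mDNS names; otherwise, use the Pi’s IP address):

ssh pi@raspberrypi.local
# Raspbian's default account is 'pi' with password 'raspberry';
# change that password first thing (run 'passwd' once logged in)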

Ripping and Copying

Now that the Pi Media Server was configured, and visible on my home network, it was time to figure out how to ‘rip’ the DVDs into media files that could be stored on the USB hard drive.

After a few false starts, I settled on the WinxDVD software at about $35. Installing the software is the usual install-wizard process. Operation is just a basic three-step process: insert the DVD, click a few buttons, and the software rips the DVD into your desired format (I chose MP4). The ripping process takes about 45 minutes per DVD, depending on the DVD length and the hardware capabilities of your computer. I ran it on my HP Pavilion DV7 laptop under Windows 7, and all was well after the Windows 10 upgrade. The WinxDVD software runs nicely in the background, so I could use my laptop for other purposes while the ripping was being done.

The Pi is on my local network via either its wired Ethernet port or the Wi-Fi dongle; video playback over wireless is just fine, without any ‘stuttering pixels’. I could transfer the files from my laptop over the wireless LAN, but I found it a bit faster to shut down the Pi and connect the USB hard drive directly to my laptop to transfer the media files. Once the USB hard drive is connected back to the Pi Media Server, and the Pi is restarted, all of the DVDs I have ripped are available on my networked media server.
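If you do copy over the LAN instead of moving the drive, something like this works from any machine with scp (WinSCP or pscp fill the same role on Windows); the destination path is an example and depends on where the USB drive is mounted on the Pi:

scp *.mp4 pi@raspberrypi.local:/mnt/usbdrive/movies/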

Viewing the Movies

We have a Roku box here connected to the main TV (well, there’s another one for another TV). The Roku is connected to our LAN via a Wi-Fi connection. It has an application that will connect to the Pi media server, so all access to the movies on the Pi media server can be done with the Roku remote. The quality of the movies is just as good as the DVD player. (The WinxDVD software will not handle Blu-ray discs, but they have a Blu-ray version available.) We’re not that picky about movie quality; standard DVD resolution, upscaled by the TV, is just fine, and even our older, lower-resolution DVDs play fine through the Pi Media Server. I also copied some digitized home movie files, and they viewed fine. (Now I have even more things to embarrass my children when they visit.)

There are ways to convert your old video tapes to MP4 files; a quick search came up with this article. The process requires an RCA-to-USB capture cable and some software, plus your VCR. I haven’t done that, but it would be a way to get more old home movies onto the media server. There are also services that will do that conversion for you.

As I was reviewing this article with the Chaos Manor Advisors, Eric Pobirs chimed in with this observation on VHS conversion:

    I did that quite a lot for my sister when she had a VHS-C camcorder. The product we used was the Pinnacle Dazzle, which came with their Studio software. (They’ve since been acquired by Avid, so they have heavy-duty connections.) That product now costs a lot less, listing at $70. It should work with anything that outputs to composite or S-Video.

http://www.pinnaclesys.com/publicsite/us/products/dazzle/dvd-recorder-hd/

There are a lot of much cheaper products out there. A few months ago I got an SIIG-branded device that Frys was selling for under $10 on a promo, just in case I ever needed such a thing again, as the Dazzle went missing at some point after my sister switched to a digital video camera.

The Pi MiniDLNA Media Server will also handle pictures and music, so you can access those via your LAN; just copy the files to the appropriate folder on the USB hard drive. The 1TB drive will hold a lot of movies and music, and the MiniDLNA software on the Pi will handle multiple hard drives if needed.

Wrapping Up

The whole project was fun to do. So much so, that I made two more (one for each daughter and their family). I got a simple wooden case from the local craft store, and a can of brown spray paint. I cut a notch in the back of the wooden box for cables, and used some Velcro squares to mount the three pieces in the box. The picture with this article shows the finished project. (It is convenient to have a little surge protector to power up everything, since the Pi doesn’t have a power switch, just an external power supply.)

The Raspberry Pi is an interesting platform for anyone to try out. The Raspbian OS comes with Minecraft (a Pi edition), the Python programming language, and Scratch, a simple programming language for kids to try out.

There are a ton of projects out there for the Pi. I’ve seen home security, robotics, motion-sensing cameras that send text alerts, flashing lights, wireless phones, mini-tablets, and more; any web search will find ideas for your own projects. Home schoolers will be able to find projects that will help any kid ‘get geeky’. If you can think of a project, chances are that someone has already done one that you can try out.

Since the Pi’s operating system lives on the memory card, you can swap it out for your different projects. The ‘HAT’ interface (Hardware Attached on Top) can be used to control just about anything. The possibilities are many and varied.

And it’s much cheaper than my original $$$ computer.

What do you think? Have you made something with the Raspberry Pi? Use the comments below to add your thoughts. Or write up your own story and submit it to Chaos Manor Reviews for publication consideration – details are here.

Browser Compatibility, ‘Software’ vs. Software, and Video Playback

Alex returns with Part 2, wherein he covers improving browser compatibility and why video streaming is harder than it should be.

Part One was about browsers vs. operating systems, a mysterious Firefox upgrade, and changing default browsers. Browser compatibility is next: All of them are improving, and that’s great for both users and developers.

I Do Believe It’s Getting Better

Remember when browser compatibility was a big headache? Html5test.com does, and provides a point score to boot. Firefox 39 scores 467 of 555 points; Chrome 44 scores 526; Safari 9.0, 400 (see results here). A closer look at the compatibility issues for Firefox shows many are with form input types—not a dealbreaker for many—but in general, browsers have improved hugely in the last few years.

Better browsers were supposed to replace operating systems. Instead, they’re slowly replacing local applications, especially for mobile. Today, companies write applications (apps) for iOS and Android, which you download through the App Store or Google Play. Over time, many apps will become HTML5 inside thin native wrappers, using CSS3 and JavaScript for many functions, then most, then all. When they become “all” HTML5 or 5.1 (due next year), then perhaps more complex programs can become a URL instead of an app. Maybe.

That plan relies on stable, known, well-understood browsers on each target platform, browsers which execute code quickly and reliably, without too many versionitis headaches. It will be most prevalent for free (as in free beer) apps, ones that don’t require local hardware access, store their data in the cloud, and never have in-game purchases.

And then? Will the browser-as-app revolution bypass the app stores entirely, even for non-free games and productivity tools? Probably not. Both Apple and Google make quite a bit of money via their app stores, so they aren’t going to look kindly at any bypass movement. Developers, too, know exactly what royalty rates they’ll earn. Users are familiar with the current purchase methods and won’t be persuaded to change easily.

Users also look at app store certification as a mark of quality (Rightly or wrongly). Apple, especially, will look unkindly at any browser-based not-an-app that requests local hardware access, and clamp down on non-App Store plays. There are already self-load and side-load apps for Android (straight apps, not browser based); Apple makes this very difficult without rooting your iOS device.

Summary? Download-and-install-yourself will continue as the dominant model for PCs and Macs—even outliers like Salesforce have downloads for Chatter and local file replication. On mobile, expect the app stores to be the standard purchase method, even if they’re HTML5 in a local-code wrapper, for at least two more years.

Replacing Flash and H.264: A Look Into the Crystal Ball

Under the covers, my remaining computer slowdowns and mysterious stoppages may be an interaction between Flash and Chrome. So far, using Firefox instead, they’re minimal. What caused them?

For nearly everyone, Flash means video playback, particularly for Youtube and Facebook. (Yes, there are Flash games, but ever fewer.) Can you live without the Adobe Flash plugin?

In the future, yes: You’ll click on a link to an HTML5 video, played by the browser itself. No plug-in, and it’ll Just Work. The HTML5 web specification is specifically designed for video playback, natively, within a browser; no plugins to separately update. First introduced by Opera Software in 2007, it’s been The Next Thing to replace Flash video for nearly a decade, but progress has been slow.

Background: HTML5 video (using the <video> tag) is a container, a video element for the web, not a playback format or encoder standard; the most popular formats used with it are MPEG-DASH and H.264. That difference, between container/file format and encoding, confuses many. For instance, if you save a Flash video on your computer, it probably has a .FLV extension, but inside it might be a Sorenson Spark, VP6, or H.264 encoded file. Flash itself opens the file, reads a header, and loads the correct decoder to play back the video. All that works automatically, except when it doesn’t.
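For the curious, the markup side really is that simple. A minimal sketch (the file names are placeholders): the browser walks the source list and plays the first format it can decode, which is exactly where the codec checklists below come in:

<video controls width="640">
  <source src="movie.webm" type="video/webm">
  <source src="movie.mp4" type="video/mp4">
  Your browser does not support HTML5 video.
</video>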

The HTML5test of Firefox gives some hints of the complexity under the covers; here are all the checkboxes for video playback compatibility:

[Screenshot: HTML5test video playback results]

(Results are from Firefox 44 on MacOS 10.10.4; I suspect yours will be similar.)

Notice the Codec list: each is a separate option. Notice also that WebM, an open-source media file format, appears fully supported.

YouTube has been promising universal HTML5 video playback Real Soon Now for several years; time to check on progress. Compare the above to the YouTube HTML5 Video Player detection page for the same system:

[Screenshot: YouTube HTML5 Video Player detection results]

(Note that HTML5test believes WebM VP9 playback is supported, while Youtube doesn’t.)

Your browser probably supports basic HTML5 video playback; try this link.

Do videos play in Flash or HTML5 format? Open YouTube, select a video, let it auto-start, then right-click and see whether it’s the HTML5 or Flash player. Some videos (especially from Vevo) attempt to play as HTML5, fail, and then, upon reload, play automatically as Flash.

Can I force HTML5 playback? Disable Flash in Firefox on the Mac, restart, go to YouTube, and attempt to play a video. It shows… black. Right-clicking confirms the embedded video is indeed HTML5, but it’s both silent and dark. As for Facebook, with Flash disabled, clicking on an embedded video shows a banner suggesting you download the latest version of Flash—no HTML5 playback yet.

These are playbacks from the Mac browser, not mobile clients. Facebook has their own apps for iOS and Android, handling video playback directly. Unsurprisingly, YouTube (Owned by Google) is a native Android app; it’s also a very popular app for iOS.

Back on the Mac, all is not perfect. If a link opens a webpage which is neither Facebook nor Youtube, embedded Youtube links therein do not play. You must copy the Youtube URL, paste it into the browser, load Youtube itself, then play. This took some persistence to learn.

Still, for Youtube and Facebook, I finally have reliable Flash video playback within Firefox 39. Google Chrome plus MacOS is still a problematic combination, so I don’t use it. But it’s progress—I have one browser I don’t have to close, should I wish to actually be productive—and I’ll take it. Arguably, I would have gotten more work done by not updating Firefox to work with Flash in the first place, but that’s a personal problem.

Video Formats and the Future

Many smart people have continuously improved digital video quality—more quality at any given bitrate—for several decades. Anyone who watched Video CDs (VCDs), with their ghastly, blocky, low quality look in MPEG-1 format, remembers how far we’ve come. Yes, we’ve got far more bits-per-second to watch, but encode efficiency has increased, too.

The latest codec is HEVC, High Efficiency Video Coding, alias H.265, 20% to 50% more efficient than the current H.264 standard, and over three times as efficient as MPEG-2. Mostly, HEVC will be used to push 4K and higher resolution video through pipes as small as 8Mbps, mostly for online delivery. On the pro side, I don’t see HEVC used as a production format—as in, for moving realtime, low-latency video from camera to switcher via Ethernet—for at least a year.
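To put those efficiency figures in rough perspective (back-of-the-envelope numbers, not benchmarks): if a given picture quality needs 10 Mbps in H.264, HEVC’s 20% to 50% gain implies roughly 5 to 8 Mbps for the same quality, while MPEG-2, at a third of HEVC’s efficiency, would need well over 20 Mbps. That arithmetic is how 4K video squeezes into an 8 Mbps pipe.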

One of the two formats within HTML5 video, MPEG-DASH (often just “DASH”), dynamically adapts to available bandwidth and to changes in video complexity: if you move from a crummy 3G connection to your home Wi-Fi, picture quality should automatically improve, once the whole chain works in practice. Hulu is moving to DASH, though few others have, yet. Expect DASH to appear in mobile playback products, where adaptive-rate playback will be the most valuable.

HEVC will also be delayed a bit before it hits volume use; there’s no single patent pool yet agreed upon for the rights, and there’s tussling (not yet lawsuits) about royalty rates. As for containers, HTML5 will, presumably, be the delivery method, replacing Flash. Certainly, that’s the fervent wish of the streaming community, if Streaming Media Magazine is any indicator.

But that’s all at least 12, more likely 24, months in the future. Until then, you will see a patchwork of formats: More Flash updates (Currently on version 18), more Flash security vulnerabilities, slow adoption of the HTML5 video standard, MPEG-DASH and WebM appearing, then a slow delaying action by Adobe as Flash fades away.

Coming Up

Next time, it’s on to Apple Mail and Spotlight, then installation of a Solid-State Disk (SSD) in the Mac. Your thoughts for future installments are also welcome.

What do you think? Your comments are welcomed, along with ideas for new subjects for Alex and the other Advisors. And if you have a story to tell, start here.