Detecting Vulnerable “Internet of Things”

So the big news last week was the giant attack by the Mirai malware/bots on Dyn that effectively killed (well, seriously wounded) the Internet for a lot of people. And the “Internet of Things” (IoT) was the source of the attack, because of bad security practices (devices with ‘backdoors’ and default passwords) on those devices.

I’m not going to explain what happened. If you are interested in this subject, you probably already know that the attack was done by the Mirai malware. The rest of you can ask the googles if you need an explanation of what happened.

And I am not going to explain how Mirai works, or that you can get a copy of the Mirai malware source code.

The thing that is not clear to many people: how can you check whether the devices on your network, whether at home or at work, are susceptible to a Mirai attack?

The basic attack comes in through specific ports that are visible from the outside (external to your network) and lead to devices ‘inside’ your network. So to test whether your network is vulnerable, you need to check from the ‘outside’ of your network.

To do this check from the ‘outside’, I recommend the venerable (fancy term for old) “ShieldsUp” check from Gibson Research. This is a free tool that will scan for open ports on your network (this should work on any OS or network).

But, before you do that, make sure you have the permission of the owner of the network. Attacking – or even scanning – a network you do not own can be a felony in the US, and probably in other countries. So, before you proceed, make sure that you have the network owner’s permission.

You can check your own home network, though, since you are the owner. But, again, only do this scan on networks you own, even though the scan is very benign.

You can find the Gibson Research “ShieldsUp” tool at http://bit.ly/2dA9Ubd. Carefully read the information on that page. (For instance, that page will show you your unique identification that every web site can find out. Even the ‘private’ function of your browser will disclose that information. Again, read the page carefully to understand the implications.)

Once you have read the info on that page, click on the “Proceed” button (either one). On the next page, read the information, then click the orange button to check your exposure to UPnP (Universal Plug and Play).

[Screenshot: the orange UPnP exposure test button on the ShieldsUp page]

The test will take under a minute, then the result will be displayed. If your network is OK for that test, you’ll get a nice green message. That’s good. If your network has problems, there will be some explanation of what you should do. We’re not going to go into any of that “What You Should Do” stuff; it’s pretty deep and complicated.

The next step is to check for any open ‘ports’ on your network. Go back to the testing page (the page you saw when you clicked on the “Proceed” button). On that screen, this series of buttons is the next step.

[Screenshot: the ShieldsUp port-test buttons]

Run the “Common Ports” test first. Then run the “All Service Ports” test. As with the first test, you are looking for all ‘green’ results. Any bad results will be listed, along with explanations. Again, we aren’t going to explain things here; if you need more info, look at the site’s explanations, and ask the googles if needed.

On my computer on my home network (which I own, so I have permission to scan my network), I got ‘all green’, as shown in this screen shot:

[Screenshot: an all-green ShieldsUp port scan result]

Hopefully, you will too. If you don’t, then proceed from there.
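As a supplement to the outside-in ShieldsUp scan, here is a minimal PowerShell sketch (Windows 8 or later, which includes the Test-NetConnection cmdlet) that probes a single device on your own LAN for the telnet-style ports Mirai is commonly reported to abuse. The address and the port list are placeholders, not a definitive list; adjust them for your own devices, and again, only run this against hardware and networks you own.

# Probe one device on your own LAN for ports commonly associated with Mirai.
# 192.168.1.50 is a placeholder; substitute the address of your camera, DVR, or router.
$device = "192.168.1.50"
$ports  = 23, 2323, 7547     # telnet, alternate telnet, TR-069 management

foreach ($port in $ports) {
    $test = Test-NetConnection -ComputerName $device -Port $port -WarningAction SilentlyContinue
    if ($test.TcpTestSucceeded) {
        Write-Output "$device port $port is OPEN - check that device's password and firmware"
    } else {
        Write-Output "$device port $port appears closed"
    }
}

An open port here is not proof of infection, and a closed port from the inside does not prove your router isn’t forwarding it; the ShieldsUp scan above remains the better test of what the outside world can see.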

Fixing Broken Windows 10 Apps


Reader Sean Long submitted this tip for fixing broken Windows 10 applications. If you have a tip that will help CMR readers, let us know. And add your comments after the article.

I have another Windows 10 tip that seems to be a hot topic in help forums but doesn’t have a consistent fix.

The problem I had was that I tried to bring up the default Windows calculator, and it wouldn’t run. Since I had fiddled with the default Windows 10 apps before, I figured I just needed to re-install the calculator app. When that failed, I tried to brute-force reinstall all the Windows 10 default apps, and that resulted in ALL of the Windows 10 apps becoming unusable.

The issue is that some of the Windows 10 apps are super annoying, so many people have been trying to uninstall one or more of the default apps. Unfortunately, under the current build of Windows 10 the installer appears to be badly broken, so both uninstalling and attempting to reinstall the apps can make all of the Windows 10 default apps unusable. They can’t be uninstalled, they can’t be reinstalled, they don’t work, the Windows Store breaks, and Microsoft considers them core components, so they don’t even show up in the Programs and Features control panel or the Settings applets. You simply can’t fix them yourself.

For an example of a badly behaved Windows default app, the new Windows 10 Photos app will continuously attempt to scan, index, and enhance all images in all libraries. That’s fine if the library is only on your local drive, but if the library is located on a networked computer, it will saturate your network and thrash the remote library’s hard drive endlessly.

One unsatisfactory workaround is to go to your libraries and remove all libraries on networked drives, but you wouldn’t have to do that if the Windows default apps didn’t have these horrible and destructive behaviors set by default. So instead of removing networked libraries, you can fix the problem by removing whatever Windows app is causing it (Photos was the worst for me).

Of course, many people realize after the fact that they really did want that app back. So the “magic” re-installation command that you could enter into the PowerShell program (run as administrator), as found on a dozen websites and help forums, is:

Get-AppXPackage | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"}

Detailed instructions can be found on many Windows help forums, so I won’t go into more detail than that.

Unfortunately, under the current mainstream Windows 10 build (as of 16 Jan 2016), that will wreck all Windows default apps and make them unusable. Oops. There are a handful of other approaches to get around this, including some registry tweaks and resetting permissions, but the bottom line is that for almost all users, attempting to uninstall or reinstall the default Windows 10 apps will likely break all of them without any way to repair or restore any of them, including the Windows Store. Thankfully, there is one solution, although it reminds me of buying a new car every time you need an oil change.

The solution for now is to go to the Microsoft Windows 10 installer site here: http://bit.ly/1ZC6vVG and re-run the Microsoft Windows 10 installer. It will do just what the original upgrade did, leaving your current apps and files alone and restoring any lost functionality. It can take an hour or more depending on computer and network speed, but I’ve had to do it on 2 separate computers now without any failures, using the online installer.

Did this help you? Any other ideas? Let us know in the comments. – Editor

Bright Ideas

One of the Chaos Manor Advisors saw an article about a new kind of incandescent bulb that is more efficient than normal bulbs.

Researchers at MIT have shown that by surrounding the filament with a special crystal structure they can bounce back the energy which is usually lost (see article here).

This seemed interesting, and brought forth a comment from Peter Glaskowsky, another CM advisor, who expressed some skepticism:

Some of this story is simply false to fact, but other parts are correct.

For example, LEDs with CRI values over 90 (and up to 97) are widely available, and some of these also provide high-quality red tones (the part of “warm” that is particularly noticeable to some people).

On the other hand, it’s correct that LEDs with high CRI values are only around 14 percent efficient, as the story says, so if the MIT solution can increase this figure to 40%, that could be good.

But MIT has only achieved an efficiency level of 6.6% and these researchers haven’t even identified a theoretical basis for surpassing 20% efficiency—about where the best LEDs stand today—so they’re a long way from claiming any real advantages.

Also, the technology behind this invention looks expensive and has some limitations. First, the reflectors are made using semiconductor-like materials and processes—up to hundreds of stacked layers of exotic materials that have to be made with high precision or the product won’t work right. There’s no precedent for using these processes on highly curved surfaces, either.

Ultimately it isn’t at all clear to me that it will ever be possible for this technology to surpass the combination of cost, efficiency, and color quality offered by LEDs, which is not the conclusion invited by MIT’s press release.

I’ve seen many of these factually questionable and unjustifiably optimistic MIT press releases in the past, suggesting this is either a deliberate strategy or just a quirk of someone in their press office who really ought to find something else to do for a living. MIT does plenty of good work; there’s no need to hype it past all scientific justification.

Your CMR editor has a supply of incandescent bulbs, many bought before some sizes were outlawed by the US government (info on the incandescent bulb ban here). There are several places in my house where the lights tend to be on all the time (partly due to need, partly due to laziness). Here are my thoughts:

I have started a slow process of replacing my incandescent bulbs (and CFLs) with LED lights (for A19-base 2700K bulbs, I got these http://amzn.to/1PsxE8h ; a 6-pack for about $21.00). I have a full set of six on the light fixture above the dining table, just a few feet away from my usual spot in the living room. That light is always on, even during the day. They are 60W-equivalent, and are brighter than the incandescent bulbs and the CFL that I tried in the same fixture. And they are full-bright immediately, rather than needing the warm-up period of CFLs.

I have also started using LED bulbs in various ceiling ‘can’ fixtures (65-watt-equivalent BR30 bulbs that use 7 watts), using them as the old bulbs (incandescent and CFL) have failed. They appear to provide more light, and again do not need the warm-up period of CFLs. The 65-watt-equivalent LED bulbs will eventually replace the older BR30 bulbs in the entire house, especially in the kitchen, where the slow-brightening time of CFLs is problematic.

The 65-watt BR30 bulbs were purchased from Amazon http://amzn.to/1Psxrlp , 6 for $35, with free shipping courtesy of Amazon Prime. They are ‘dimmable’.  I notice they are out of stock at the moment, but LED bulbs are available in many places. (Some local utilities are also subsidizing LED purchase.)

So far, I’m pleased with them. I am assuming that they will have a positive effect on my electricity bill. And using the bulbs in the family room, at 7 W each instead of 65 W, will allow that circuit to be powered by my generator during any power outage. (That lighting is on the same circuit as the TV, which I also plan to keep powered from the generator during an outage.)
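For the curious, here is a rough back-of-envelope of what that one fixture’s swap is worth, as a short PowerShell sketch. The twelve hours of daily use and the $0.15-per-kWh rate are assumptions; plug in your own numbers.

# Savings from six 65 W incandescents replaced by six 7 W LEDs.
$bulbs       = 6
$wattsSaved  = 65 - 7        # per-bulb savings
$hoursPerDay = 12            # assumed; that fixture is on most of the day
$costPerKWh  = 0.15          # assumed utility rate; check your bill
$kWhPerYear  = $bulbs * $wattsSaved * $hoursPerDay * 365 / 1000
$savings     = [math]::Round($kWhPerYear * $costPerKWh, 2)
Write-Output "About $([math]::Round($kWhPerYear)) kWh per year, roughly `$$savings saved"

At those assumed rates the dining-table fixture alone works out to something over $200 a year, which pays for the $21 six-pack many times over.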

Advisor David Em is also an LED bulb proponent:

Last year I bought fairly inexpensive dimmable full-spectrum LED spots for the studio. I love them.

Also happen to be reading Oliver Sacks’s “Uncle Tungsten” [Amazon link ] at the moment, which has some interesting discussion of the history of light bulbs.

What do you think? Are you moving towards LED bulbs, or are you holding out with incandescent? Or are CFL’s your choice? Let us know in the comments.

Access Pre-Windows 7 File Shares on Windows 10


Sean Long, a Chaos Manor Reviews reader, had difficulties with the Windows 10 upgrade making his pre-Windows 7 file shares inaccessible. After much trial and error, and a lot of incorrect information on the googles, he figured out how to fix the problem. His solution is below. The usual precautions apply.

Thanks to Mr. Long for sharing his problem and solution. CMR is interested in similar problem-solving from CMR readers. This page has the details on how to share your solutions to computer issues. – Editor

First, if your Windows 10 machine can’t access pre-Windows 7 file shares (Windows XP, Windows Home Server, some Linux or NAS versions), go here http://bit.ly/1ZiNLdz

The original response doesn’t seem to be a complete answer, but down in the comments is the actual solution:

There is a setting in the Windows Local Security Policy which is incorrectly set by default for talking to a NAS that uses the older authentication protocol.

To access said setting go to the Control Panel in Windows 10 (or 7), in Category view click on the text “System and Security”, then click on the text “Administrative Tools”. Now double click and open “Local Security Policy”.

In the Local Security Policy screen, in the navigation tree on the left, expand “Local Policies –> Security Options”; about two-thirds of the way down the list you’ll see a policy called “Network Security: LAN Manager authentication level”. Double-click it and change the setting to “Send LM & NTLM – use NTLMv2 session security if negotiated.”
Then just press OK and close all of the open windows and then try again.

In the case of Windows 10 Home, Local Security Policy does not exist (thanks Microsoft); therefore make the change in the registry (use the REGEDIT program). Find the indicated entry, then add a new entry as detailed below:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\control\LSA
Add:
LMCompatibilityLevel
Value Type: REG_DWORD – Number (32 bit, hexadecimal)
Valid Range 0-5
Default: 0, Set to 1 (Use NTLMv2 session security if negotiated)
Description: This parameter specifies the type of authentication to be used.

Basically, Microsoft failed to set a critical security setting (it is set to null by default), and it needs to be set to something in order to connect to Windows XP or Windows Home Server file shares.  Easy fix, stupidly hard to find though.
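If you would rather not edit the registry by hand in REGEDIT, the same change can be made from an elevated PowerShell prompt. This is just a sketch of the edit described above, not an official Microsoft procedure; as with any registry change, export the key first so you can back it out.

# Run from an elevated (administrator) PowerShell prompt.
# Creates or updates LMCompatibilityLevel = 1 under the Lsa key described above.
$key = "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa"
Set-ItemProperty -Path $key -Name "LMCompatibilityLevel" -Value 1 -Type DWord
# Confirm the value took:
Get-ItemProperty -Path $key -Name "LMCompatibilityLevel"

A log off and back on (or a reboot) is a reasonable precaution before retrying the file share.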

Second, if you have a laptop or tablet that can’t get through the Windows 10 version 1511 update, it may be because you have an SD card installed.  To get the Windows 10 1511 update to install correctly, you must remove the SD card before starting the update process.

If the update has failed and no longer shows as an available update, you will need to go to the Windows 10 upgrade page and re-run the Windows 10 installer.  It will recognize that you already have Windows 10 installed, and will patch it to version 1511.

The critical point is to have the SD card removed throughout the entire update process.  If you are like me and have moved user files to the SD card, don’t worry, it worked ok for me when I popped the SD card out right before running the installer, and put it back in before logging in after the update was complete.

Thanks to Mr. Long for his report. Enter your comments or questions below. And share your computer stories with CMR readers; start here. – Editor

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 3

Now that Eric has finished the NAS project installation and configuration, it was time to consolidate data from various systems, upgrade others, and consider system retirements. Part 3 of the project continues with Eric’s narrative:

With the NAS/RAID project completed, the next step was to track down the various elder PCs that were serving as network backup locations, back them up to the NAS, then shut them down in hopes of making a dent in the frightening power bill that had been shrugged off in the days when Chaos Manor was like Frankenstein’s Castle for PCs, with a gruesome experiment running in every corner.

Some of these machines were quite long in the tooth and were due to go off to the farm where they could run and play with the other PCs. They weren’t in operation, so they weren’t adding to the power bill, but it was time to clear some space.

First up were the Satine and Roxanne systems. Satine was a Socket 939 (indicating a long ago AMD CPU generation) system with an Nvidia 6000 series video card and a pair of 500 GB drives that may have been mirrored.

The googles note an early mention of the Roxanne system in February 2008: http://goo.gl/lxT07M : “Roxanne, the Vista system that was the main writing machine before Isobel”.

An early mention of Satine – née Sativa – is around July 2006: http://goo.gl/gy1lP3 , also here http://goo.gl/zwR0e8 . Earlier mentions could be in the Chaos Manor columns printed in Byte magazine, but web versions of those columns are not available, at least with a quick search.

Beyond that it was hard to say, because Satine was inoperable. It would spin up the fans and do everything preparatory to booting, but never actually booted. Not so much as a POST beep of any kind. The properties of some system files indicated it was running Windows XP SP2, so there was probably little value there for anyone beyond salvaging the drives.

So the hard drives were backed up and formatted, placed back in the case, and Satine remained as a project for someone with a strange combination of motive and nothing better to do.

Roxanne was more promising. She had last seen life as Mrs. Pournelle’s workstation but had been replaced by a new build when the case vents had become clogged, causing poor Roxanne to overheat before System Restore could run long enough to repair the damage from the first overheating.

Even with the vents cleared the Pentium 4 HT system was rather loud and hot. It isn’t clear to me whether this was always the case and not a bother to Jerry’s artillery-blasted hearing or if it had become compromised at some point.

Certainly it wasn’t worth any significant investment to replace any of the cooling bits. But it was running Windows 7, raising the question of whether it could become a Windows 10 system with a potentially long life ahead. Therein the saga lies.

Updates and Upgrades

On both Vista and Windows 7 there was just the one Service Pack. (Windows 7 may still get a second due to its business footprint but I’m not holding my breath.) Going back to NT 4, Service Packs were once produced far more frequently. Internet access was far less widespread and the need to store updates locally for installing on numerous machines was far higher. It was especially helpful if a Service Pack replaced its predecessor. Install SP4, and SP2 and SP3 were included in that update.

As the internet and live Windows Update downloads became more the standard, it became more of a hassle to update a new install or a machine that had been offline for a long period. By the late days of Windows 7, after Windows 8 had launched, this had gotten a bit painful.

Businesses of the scale to have their own WSUS setup [Windows Server Update Services, a ‘personal’ Windows Update system to centralize and manage updates across the business environment] or enough identical machines to use an updated image weren’t badly off, but supporting the SOHO market got annoying. I had one experience where several refurb PCs that came with Windows 7 SP1 needed almost 1.5 GB of downloads to be fully updated. This was a very slow process regardless of how good your broadband speed might be.

Well, Roxanne hadn’t been to Windows Update in over two years. On the first attempt it spent nearly two hours figuring out which updates it needed before I noticed the gas gauge animation had stopped. The Update app was frozen. Start again.

This time it was a bit better. It installed about 150 updates before stopping and announcing it had an error it could not work past.

Along the way, the Windows Defender anti-virus scan announced finding some malware. It was deleted, but the scan found several other copies as it worked through the drives. Roxanne had received copies of older machines’ content in the process of originally entering service, and consequently the malware had implanted itself in each copy it found of certain system files. This was one of those packages that would break Windows Update as part of its activities. Why I was able to get as many updates installed as I did before it kicked in is a mystery.

This meant going off and searching out the ‘fixit’ app to correct the damage. Still more downloads. Then Windows Update failed with a different error.

At this point Roxanne had been updating, more or less, for about 20 hours. (This could have been reduced some if I’d been there to respond every time it needed human input but I have this congenital condition that requires me to spend a certain portion of each day unconscious. It’s very inconvenient. The doctors call it ‘sleep’ and can only offer short-term mitigation.)

A Different Fix Needed

This time a different fix was needed but it was soon found and applied. Finally, almost a day after starting this quest, it was done. Roxanne was as up to date as a Windows 7 machine could be at that moment in early September of 2015.

But where was the Windows 10 upgrade offer in the notification area? It should have been in the last batch of updates. Apparently this issue comes up often enough that there is an app from Microsoft that looks at your machine and determines if it is eligible, and installs the upgrade app if so.

Roxanne wasn’t eligible for the Windows 10 upgrade for two reasons. One was correctable, the other was not. At least, not at reasonable cost, which in this case is not anything over $0.

The Nvidia FX5700 video card had long since fallen out of the range of cards the company supported with drivers after Windows 7. This could be fixed by replacing it with a newer card that would still be old enough to be free or of negligible cost. The other problem was the Pentium 4 HT CPU. It was too old to have NX Bit support.  https://goo.gl/4dgKB0

Considering how much past malware misery could have been prevented if this NX Bit feature had become common much earlier in microprocessors, it represents a perfectly reasonable place for Microsoft to say “here and no farther” when it comes to antique hardware. The last generation of Pentium 4 did have the NX Bit (Intel calls it XD bit) added, but this was after Roxanne’s CPU came out of the foundry.

So there it ends for Roxanne. She may find a home yet and brighten some life but we are done with her and bid her farewell.

Wrapping Up

The project met the need of providing a centralized and more efficient data storage for the Chaos Manor network. By consolidating data into that central location with high capacity drives, Chaos Manor gains efficiencies in data storage (data is not in several different places, with the attendant work of synchronizing data amongst several systems). It also makes backing up that data more centralized – which will be the focus of an upcoming project.

Using a RAID 6 configuration allows for data reliability, although you shouldn’t rely on RAID 6 alone for backup or data recovery; it is more of a reliable centralized storage system. You really need to have a process in place for off-network storage of your important data.

As for older computer systems: there’s “old” and then there is “really old”. Determining the moving target that divides those is necessary to decide whether a system should be donated or simply sent to e-waste.

And so ends this project. We hope that it has provided you with useful information, and perhaps some thought for a similar project of your own. We’re always looking for guest authors, see here. Let us know what you think in the comments below, and share this article with others.

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 2

We continue with the story of the new NAS system at Chaos Manor.

As the project got closer, Eric did some more research about the initial setup of a RAID 6 system.

From what I’ve read on Netgear’s support forum, the initialization of the RAID 6 setup can take a LONG time but since the unit isn’t in active use yet this will only be a mild annoyance.

Eric noticed a Netgear RN10400 RAID system was advertised by Fry’s, so he decided that would be appropriate for the project. (See http://goo.gl/lYAl2p for product details.)

There wasn’t so much a survey [of system possibilities] as a Fry’s ad offering the unit for approximately $100 off its normal price. At the time I grabbed it, there were potentially four different places it might have gone. If I had used it as a media server at home, I would have gone for a JBOD configuration to maximize capacity, as nothing on it would have been irreplaceable.

The initial setup [of the RN10400] was very straightforward. A full set of four 4 TB drives from Seagate (Model ST4000VN000) were purchased and installed in the carriers. This is familiar to anyone who has done much PC construction with cases that provide removable carriers. Once the drive is installed in the carrier, it slides into its slot and connects to the SATA interface.

Upon powering up, the NAS defaults to a RAID 5 configuration and immediately sets about verifying the drives and creating the RAID. This would eventually result in slightly over 10 TB of storage capacity. Netgear offers an app called RAIDar that searches the LAN for any of the company’s NAS products and reports their address and status. From there you can log into the NAS itself and see what progress it has made and specify a wide variety of options, such as user accounts, access limits, and backups to other NAS units or offsite via cloud services. This is also where you’d break up the default volume to recreate it in a different RAID configuration.
The part that got tricky was making sense of the instructions. I cannot say whether Netgear has updated the firmware and the PDF manual hadn’t caught up yet, or whether it had always been wrong. It also may be intended for a different model and put in this version by mistake. The method detailed for destroying the existing volume simply didn’t apply to the interface presented.

The System Setup and Configuration

Eric continued the work on the Netgear ReadyNAS 104 4-bay NAS system.

After much consideration of the choices, I decided to accept safety over capacity and move the Netgear’s four 4 TB drives to a RAID 6 configuration. This effectively means that half of the volume’s capacity is consumed by parity rather than data, to ensure the volume can be reconstructed without loss if any one drive should fail.

Eventually I noted the gear icon that had appeared elsewhere and found it was an active link that brings up an otherwise undocumented menu. From there the task more or less matched that described by the manual. The four drives were applied to a new RAID 6 volume which would offer 7.6 TB after 60 hours of setup. They weren’t kidding. It took that long. A search of the forums on the Netgear site produced a post explaining that the RAID 6 setup has far more overhead than RAID 5 or lower, and that this is one of the reasons their higher-capacity NAS products use Intel processors with specialized functions for the task.

Another optional setting was ‘Bit rot protection.’ This tries to head off failures on the drive media before data is lost, but Netgear warns of a potentially dire performance hit. I turned it on and did note that when uploading the backups from the recently retired machines the throughput was frequently awful, often descending low enough to be measured in KB/s. Before the next big upload I’m going to switch this feature off to see if it is the culprit. Once the bulk of initial backups are made it may have little bearing on day-to-day use, but we’ll just have to watch and see. I recall some line about how those who would sacrifice performance for data security may end up with neither.

Because this is a fairly inexpensive model using an ARM-based SoC [ARM processor, System-on-Chip – think Raspberry PI, which also uses an ARM-based SoC], the process of creating the new volume was quite lengthy, coming in at 60 hours. High-end models most often use Intel processors with substantially better performance in the specific areas complex RAIDs place their demand.

SOC is System On Chip. ARM processors are rarely found all by themselves. The myriad licensees package them with other functions, licensed from other companies or created by themselves, to produce the device specific to the intended purpose. This has become increasingly common across the industry, especially in the consumer sector where the other functions are a given.

Thus most Intel processors these days have a GPU included, and numerous other functions that were formerly found in separate chips. Due to the way ARM’s business model works, an ARM-based device is likely to be far more customized than something out of Intel, as it simply isn’t worth Intel’s time to pursue markets below a certain size. Intel does have a line within the Atom product range that is aimed at NAS builders, but at a significantly greater cost than ARM-based competitors. You’ll find these on higher-end NAS models with a greater range of abilities than were needed here. For example, some high-powered NAS products can act as a self-contained media library with the hardware to drive a TV all by themselves, rather than relying on a client device.

As Eric worked on the NAS/RAID setup, he set up the volumes as RAID 6 instead of the default RAID 5.

The Netgear ReadyNAS 104 was populated with 4 TB Seagate NAS drives. This, when it finally finishes, will offer 7.6 TB of space to work with.

I intend to write at greater length about it but was waiting until the change was done and more material might be added. There is a lot of capability in the device, making for a dizzying array of choices for how best to make use of it.

I’ll also make note that the 60 hours needed to perform the RAID 6 volume creation would be greatly reduced on higher-end products using the Intel parts that have been targeted at the NAS business. This model, and most like it in price and features, use ARM chips that cannot match the Intel products on the more computationally intense RAID configurations, according to Netgear reps in their forums.

Minor Annoyances Solved

There was some initial difficulty with the NAS alerts emails not getting to Dr. Pournelle’s account; they were going to Eric’s email account. Eric noted that, along with some additional info on the configuration.

Currently the email alerts are coming to me [Eric]. The mystery of why Jerry’s account wouldn’t work, using the same settings displayed in Outlook, remains unsolved.

The NAS has a pair of gigabit Ethernet ports and is administered through an intranet web UI. It supports the installation of apps to add custom features but a brief survey didn’t make any stand out for Jerry’s needs.
Netgear warns that RAID 6 on this product range will significantly reduce write performance but with so few users on the Chaos Manor network this isn’t likely to be a problem. After the initial backup from an actively used PC the following updates should be quick, unless a complete image is being made every time.

The Swan system has an eSATA drive docking slot that lends itself better to that level of backup. A USB 3 external drive should suffice for AlienArtifact and the same drive could also serve the Surface Pro 3 since it has so little storage capacity compared to the desktop systems.

As the project continued, with status reports to the Advisors for their reading pleasure, Advisor Peter wondered:

I’m still curious to know how much of that gigabit capacity can be delivered over the Chaos Manor network, but it probably won’t come very close to the theoretical performance of the array itself, which should be well above 1 Gbps for reads, at least.

That’s really saying something, since I think all the computation is just XOR operations and table lookups in normal operations. But cheap ARM chips have few of the data-movement and vector-processing features of Intel’s cheapest processors, and that’s probably the real issue rather than “computation” per se.

Advisor Brian Bilbrey chimed in about initial and ongoing performance of the system.

A reduced write performance is primarily an issue on initial seeding of the device. Jerry undoubtedly has a lot of data he’d like to centralize, and getting all of that on the first pass will be the ordeal. Keeping it updated thereafter is less likely to be affected by the performance issues.

Eric continued the discussion:

I won’t be surprised if the network speed is much more of a limiting factor and the difference in write performance impossible to discern without much bigger loads than we’re ever likely to create.

Perhaps [it could be measured] if we tried to save an image from every workstation simultaneously. None of those machines has more than 1.25 TB of local storage (250-ish GB SSD and 1 TB hard drive) and most of that empty. (There is 192 GB on the Surface if you include the microSD card.)

If we did full backups every night there would be plenty of time to avoid overlap.
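As a rough sanity check on that, here is a quick PowerShell estimate of how long the worst case – a full 1.25 TB image from one workstation – would take over gigabit Ethernet, assuming about 100 MB/s of sustained real-world throughput (the theoretical ceiling is roughly 125 MB/s).

# Worst case: a full 1.25 TB image over gigabit Ethernet at ~100 MB/s sustained.
$dataMB       = 1.25 * 1000 * 1000   # 1.25 TB expressed in decimal megabytes
$throughputMB = 100                  # assumed sustained MB/s on a gigabit link
$hours        = [math]::Round($dataMB / $throughputMB / 3600, 1)
Write-Output "Roughly $hours hours for a full 1.25 TB image"

So even the largest single machine fits comfortably in an overnight window, before the NAS’s RAID 6 write penalty even enters into it.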

The saga continues in the final installment, as data is moved from systems, and older systems are considered for retirement. Let us know what you think in the comments below, and share this article with others.

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 1

In this three-part series, Chaos Manor gets an upgrade of its data storage with the installation of a new Network Attached Storage system. Along the way, several old systems are deemed retirement candidates. We start with Part 1, where the project and features are discussed amongst the Chaos Manor Advisors:

The Chaos Manor Advisors determined that Dr. Pournelle’s various systems needed a cleanup, consolidation, and upgrade at Chaos Manor. There was a need for a better backup and archiving process, along with some retirement of systems. Consolidating systems and data storage in light of Dr. Pournelle’s mobility problems was another objective, now that the new Chaos Manor wireless network was in place.

One aspect of this consolidation was to create a Network Attached Storage system with RAID capability to serve as centralized data storage. A backup process was also needed. And there was a need for the archive to be protected against any encrypting-type malware. Although Dr. Pournelle practices ‘safe computing’ that reduces that risk, a protected backup of his data, including past and future books, was deemed to be a good objective for this project. We thought that this project would be interesting for Chaos Manor Reviews readers.

Chaos Manor Advisor Eric Pobirs, an experienced technician who works with Dr. Pournelle’s son Alex (as well as doing some freelancing), took the lead on this project. A discussion among the Advisors covered the configuration and issues involved in creating this NAS/RAID system.

Eric started out with his general objectives:

Well, the idea was to have capacity wildly in excess of need to reduce the amount of management concern it generates once it has been configured fully. The difference in cost for the somewhat safer lower capacity drives is fairly minor and they’d still be at risk for the Rebuild+URE [more on this below] problem. So doubling up on the 4 TB drives in RAID 6 likely works out better than say, 2 TB drives in RAID 5, for a difference of around $100 for the set.

Part of this was offering the example of just how amazingly cheap this stuff has gotten over the years and how a bit of research can lead to better results without massive expense. Now, some might regard this investment as massive expense but creating organized bytes is Jerry’s livelihood, so this is just insurance. Also, the use of better qualified equipment should win back some of the expenditure in reduced power costs for the house. A few percent here, a few percent there…

After some thought, Eric came up with the outlines of a plan.

NAS and Backups

The main objective of this project was to determine how to configure a backup system for all of the computers at Chaos Manor. Backups are important, and Dr. Pournelle has lots of data: massive amounts of emails and his equally massive book files.

After a survey of the possibilities, Eric decided on a Network Attached Storage (NAS) system: a Netgear ReadyNAS 104 4-bay NAS (http://goo.gl/lYAl2p ).

Advisor Brian Bilbrey has much experience with large systems, being a senior systems administrator. He discussed the basics of the various types of RAID systems, beginning with an explanation of an “URE”, an ‘Unrecoverable Read Error’:

Magnetic disk storage sizes are now on the same order of magnitude as the quoted bit error rate for reading data from the disk. That is, even with 4 TB on a disk, the chance of hitting a 1-in-10^14 Unrecoverable Read Error in everyday use is pretty small, because you don’t read a lot from your drive at any given time.

However, if you have an array of five 4 TB disks in a RAID 5 configuration, then you’ve got approximately four disks’ worth of data and one disk’s worth of calculated parity spread across all of the disks. If any ONE of those disks fails, then when you put in a new disk to rebuild that array, ALL 16 TB of remaining data will be read to rebuild. There’s a significant chance that during that process, a read will fail. At that point, the array cannot be rebuilt. Data done and gone; restore from proper backups.
I recommend RAID 6 for 4 or more disks, and 2- or 3-way mirrors for 2- or 3-disk systems. Yes, you’re “throwing away” storage. Or, to put it another way, you’re managing the risk of data loss. With RAID 6, during the rebuild, you can lose a disk, suffer a URE during the rebuild, and still have all your data.

Personally, I also buy Enterprise-grade disks, because there’s usually another factor of 10 added to the URE reliability. For more info, use your favorite search engine and the phrase “URE RAID 5” without the quotes.
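To put Brian’s numbers in concrete form, here is a back-of-envelope PowerShell sketch of the odds of hitting at least one URE while reading 16 TB during a RAID 5 rebuild, using the commonly quoted consumer-drive rating of one error per 10^14 bits. It treats errors as independent and uses a simple Poisson approximation, so it illustrates the scale of the risk rather than giving a precise prediction.

# Odds of at least one Unrecoverable Read Error while reading 16 TB,
# assuming a 1-in-10^14-bits error rate (a typical consumer drive spec).
$bytesRead    = 16e12                 # ~16 TB read to rebuild the array
$bitsRead     = $bytesRead * 8
$bitErrorRate = 1e-14
$pAtLeastOne  = 1 - [math]::Exp(-$bitsRead * $bitErrorRate)   # Poisson approximation
Write-Output ("Chance of at least one URE during the rebuild: {0:P1}" -f $pAtLeastOne)

With those numbers the answer comes out around 70 percent; rerun it with a 1e-15 rate for enterprise-class drives and it drops to roughly 12 percent, which is Brian’s point about paying for the better disks.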

With that explanation, Brian continues:

One thing I’m pondering in light of the Rebuild+URE problem is whether a RAID 10 might be safer. This would be a two-drive stripe set mirrored by the second set of two drives. This cuts the raw capacity from 16 TB to a ‘mere’ 8 TB, which is still a vast capacity for storing primarily text files. In this case, recovering from a drive failure is more a matter of copying than a complex rebuild and the NAS should keep working with the intact set until the replacement drive is installed.

The Netgear box will also do RAID 6 with the four drives but as the capacity works out the same I find myself wondering what advantage remains, if any. RAID 10 may have the advantage in continued functionality after the loss of a single drive, whereas I have the impression a RAID 6 would be out of commission until the failed drive is replaced and the volume rebuilt.

In 234 pages the manual has remarkably little to say about drive failures, how to handle them, and how different configurations affect this.

Advisor Peter Glaskowsky agreed with Brian, adding:

To add to Brian’s reply, a RAID 6 array not only keeps working after a single drive failure, it still has redundancy– it becomes the equivalent of a RAID 5 array. Even two drive failures in a RAID 6 array will not stop the array from working.

So if you have an effective RAID 6 option, that’s my recommendation too. I know it’s painful to lose half your capacity, but in the long run, that’s better than losing all your data.

Brian added some additional thoughts about the various RAID types:

RAID 6: Lose one drive, you’re running degraded, and can rebuild at leisure. IF there’s a bit read failure during the rebuild, you have the second parity to fall back on.
Lose two drives (or lose a second drive during the rebuild after the loss of a first drive) and you’re running degraded, with no backstop. If you lose a third drive while rebuilding against a two-disk failure, you’re dead.

RAID 10 (and friends): Lose one drive, you’re running degraded. Rebuild, and hope there’s no bit read failure.
Lose two drives, and if it’s half of one mirror pair and the OTHER half of the other mirror pair, you’re still running degraded. But after one drive failure, you have a one-in-three chance of catastrophic failure during the rebuild, should there be a bit read error.
The point of spinning storage is to have large data available for immediate access. Periodic copies of this data to other spinning rust, stored offsite so you can rebuild if you lose the RAID 6, are prudent.
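For reference, here is a small PowerShell sketch of how the four 4 TB drives in this particular unit work out under each of the layouts discussed above (raw decimal terabytes; the NAS reports somewhat less after formatting and overhead).

# Usable space from four 4 TB drives under the RAID levels discussed above.
$driveTB = 4
$drives  = 4
$raid5   = ($drives - 1) * $driveTB    # one drive's worth of parity
$raid6   = ($drives - 2) * $driveTB    # two drives' worth of parity
$raid10  = ($drives / 2) * $driveTB    # two mirrored pairs, striped
Write-Output "RAID 5 : $raid5 TB usable, survives any single drive failure"
Write-Output "RAID 6 : $raid6 TB usable, survives any two drive failures"
Write-Output "RAID 10: $raid10 TB usable, survives one failure for certain, a second only if it hits the other mirror"

RAID 6 and RAID 10 give the same usable space here, which is exactly why the discussion above turns on rebuild behavior rather than capacity.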

One of the considerations of a RAID system is that it is more of a centralized storage area than a full backup/restore solution. As Eric noted:

This article and its links cover the nature of the problem:
http://goo.gl/ungcQI
In short, drive capacity has advanced far faster than reliability and it may not be possible for reliability to ever be high enough to overcome the odds of a crippling failure. This is why RAID cannot be regarded as a backup but merely a way to centralize data to be backed up.

In the next installment, a system is selected, and installation and configuration is begun. Let us know what you think in the comments below, and share this article with others.

XBox as a DVR

Eric Pobirs passed this information along to the Chaos Manor Advisors, and we thought you might be interested. – Editor

Microsoft announced today during their Gamescom presentation their intent to offer DVR capability on the Xbox One.

http://www.engadget.com/2015/08/04/xbox-one-dvr/

If this would let me retire my TiVo and its monthly service fee, I’d apply the savings to a very high capacity external drive. I currently have a 3 TB drive set aside for the purpose, but the additional load from storing video would merit some added TBs. Initially this will be for the USB OTA antennas offered for the Xbox One in Europe and the US. (Which are completely useless in my location due to the amount of geology between here and Mt. Wilson.)

Unless Microsoft is planning an external CableCard box, I’d expect that this would use a cable/satellite box connected to the HDMI In and encode the stream to a saved file. (I know the AMD-designed APU has some encoding hardware functionality, but I’ve seen far less info on it than on the Intel QuickSync tech.) This may mean somewhat lesser quality than a modern DVR that records the compressed stream straight from the cable. Still, hard to beat for free if you already own an Xbox One.

The Amazon Prime service is on-demand, 24/7/365. You can watch anything you want when you want. Because this is a rapidly growing method for accessing video entertainment, scheduled broadcasting is starting to feel some pain. I believe that most of it will be gone within 20 years, possibly much sooner. Thus the DVR itself is a product category with a shrinking future. TiVo is working to reposition itself as the center of video access for the home, which puts them up against game consoles in a way they hadn’t needed to deal with previously.

One example is the Season Pass feature. It will now enlist any streaming service you subscribe to, and have told the TiVo how to log in to, in order to gather up all the requested episodes sooner rather than later. It’s a nice innovation if you have the subscription, but really only applicable if you’re watching a current season. A completed season on the streaming service would simply be accessed entirely from there.

So TiVo is in an existential struggle because console makers will always have games as the primary function but the hardware has enough spare horsepower that encoding video isn’t the major task it once was. Microsoft, and potentially Sony, can just add this as a software feature rather than rely on it as their core product.

What was announced for release next year is solely for use with an OTA antenna, but MS execs have hinted that it won’t end there.

There are alternatives available right now.

http://cetoncorp.com/products/infinitv-6-ethernet/

This page is somewhat dated, since Windows Media Center is dead as far as ongoing development goes. (It is not supported in Windows 10 at all.) Storing recordings on a shared volume presented by a DLNA server works fine for a networked DVR. A CableCard from Time Warner to decode their feed is $2.50 a month on my bill.

https://www.silicondust.com/ is another player in this area.

Gadgetry abounds but I think you’ll find you could spend vast amounts of time just exploring the library Amazon Prime gives you, with the added virtue of it being already paid for as part of your Prime subscription.

What do you think? Let us know in the comments. And share your stories with us; check out the details here.

Weather Stations and USB Servers

Your intrepid CMR editor bought a weather station last year that can update to the Weather Underground network. But to do that, he needed to connect the weather console to the PC. He wanted the console to be downstairs – and the desktop is upstairs. He discovered a USB server that connects USB devices over his LAN, and that saved quite a few ‘sawbucks’.

A trip last year to Costco netted me an Acurite Professional Weather Station (model 0232C, which lists for $169.99, available on Amazon for $113); at Costco, the price was under $85. I have always been interested in getting a weather station, but most cost even more than $170. So when I saw the one at the local Costco at that price, I decided to get it. (It may not be available now, but the story is still interesting.)

Setup was fairly easy, although at additional cost. “Siting” a weather station is important to get accurate readings, but I had a good spot in the corner of my yard – no fences, lots of open space around it. In my case, there was a solid tree stump at that corner, so a trip to the local Home Depot netted me a 1” galvanized flange, a 1” to ¾” adapter, and a five foot ¾” galvanized pipe, plus four 2 ½” lag bolts, and a can of beige spray paint (to help the post blend into the neighborhood, since it is visible from the street). The weather station is mounted on the ¾” pipe, although the kit did include a mounting bracket to connect to a wooden post. All of that got the weather station installed in a great spot, and so far, the neighbors haven’t complained about the placement (there is an HOA involved, so don’t tell them).

The battery-operated outdoor unit (with a solar panel to drive an internal fan) communicates with the indoor color display wirelessly. The display has a USB port to connect to a computer, and then you can use the included software (although I found better software than came with the kit) to send your data to a place like Weather Underground. You can also install an app on your Android or Apple device to display your data. But you do need to connect the display to your computer via the included USB cable.

And that was my problem that makes all of this related to a “doing things so you don’t have to” story.

In my house, the living room is the desired location for the color weather display. But the desktop computer is in the office upstairs. That computer doesn’t get used much, since my wife and I both have laptops that we use to connect to our home wireless network (LAN). But the desktop is connected to the laser printer (via a network printer device) so we can use that printer. The desktop is also used as the backup for our laptop files, which are backed up to the ‘cloud’ with Carbonite. (We use the Microsoft SyncToy to sync files from laptops to the desktop.) And the desktop is connected to our little LAN via an older Netgear Wireless Extender. It all works together quite nicely, in a geeky sort of way.

In order to get the weather data to Weather Underground (WU), the weather station comes with an app that is loaded on a computer. The weather display is then connected to the computer via the USB cable, and then the desktop app sends the data up to WU. So my layout was not going to work, unless I used a 50 foot USB cable draped around the walls. Not a good home decorating idea.

Acurite had a solution, though, called a Network Bridge, at $80 retail (plus shipping). A bit steep, I thought.

A search (with a bit of help from fellow CMR advisor Eric Pobirs) eventually got me to the Monoprice company web site, which had a USB server that allows USB devices to connect to a computer over IP. The list price at Monoprice is $24, but who pays list? A search on Amazon didn’t find a cheaper price. The listing there was about $31, but there were a couple of used ones for under $10 (at the time; current pricing for used ones is around $27) including ground shipping. The device specs and features looked good, and losing a ‘sawbuck’ (for you youngsters, see here) was OK if it didn’t work, so I ordered the used one.

It arrived a few days later in a padded package: the device, the power supply, and the mini-CD with the device application. It was a simple installation process: connect the weather station USB cord to the USB server, plug the server into the router (which is in the same downstairs location as the weather display), and connect the AC adapter.

Then upstairs to the desktop computer to install the device software from the mini-CD. A few clicks, and a search via the app to find the USB server on the network, and the weather display was connected. (The device used DHCP to get its IP address, although you can assign an IP address with the app.)

I switched to the Acurite weather application, and it saw the weather display that was connected over my LAN via the little USB server. It all worked.

The USB server device can support a USB hub, so you could connect more than one USB device to your local network. You can also use it to connect a non-network type printer to your local network. If you have multiple computers, the USB server software will allow each computer to connect to any USB device, although not at the same time. The software does have a way to send a message from one computer to another to release control of the USB device.

The USB server might be most useful for sharing non-network-enabled printers at a reasonable price, although wireless-enabled printers are not that expensive. And you can find little USB wireless print servers that will allow you to connect a printer to your wireless network.

In my case, I saved over $60 using the USB server. My weather display is now connected to the desktop computer through my local network, which allows me to share my weather station on Weather Underground.

Do you have a story about computers that you’d like to share? See this page for details. – Editor

Wi-Fi Sharing in Windows 10 – Facts or Hysteria?

With the release of Windows 10, one of the subjects of concern is the new Wi-Fi Sharing process. It looks like there has been a bit of hysteria and/or exaggeration about this issue.

The Chaos Manor Advisors discussed this a few weeks ago, when the first article about this appeared in The Register. The general consensus is that on first look, this may be a ‘bad thing’. But a lot of the hype about this seems to be just that, hype. And some misunderstanding of the whole process. It appears that one might want to ‘read past the headlines’ on this issue.

Chaos Manor Advisor Peter Glaskowsky reports on his testing of Microsoft’s Wi-Fi Sharing process in a late beta release of Windows 10.

I’ve been talking about Wi-Fi Sense without the benefit of having used it, since I have only one Windows 10 machine and that one is a 2U server with no wireless in it.

But yesterday I realized that I could attach one of my USB Wi-Fi dongles. (A blinding flash of the obvious.)

This is as pure a test as I can imagine of the default Wi-Fi Sense settings, since this machine has literally never had a wireless capability before now, and Windows 10 was the first version of Windows ever installed on it.

So, the results:

When I installed the Wi-Fi adapter (a TP-Link TL-WN722N, one of the nicer ones of this type since it has a proper RP-SMA antenna connector), it became available essentially instantly. Windows 10 said nothing about installing a driver.

I went into the new-style Settings (not the old Control panel), then Network & Internet, then Wi-Fi on the left sidebar (which had not been there before), then Manage Wi-Fi settings in the main window. This sequence brings up the main Wi-Fi Sense settings dialog.

The “Connect to suggested open hotspots” option was on, allowing the machine to connect to well-known public hotspot systems like Boingo. I think this is generally fine, but I don’t know whether there is robust protection against someone setting up a bogus hotspot that appears to be part of the Boingo network. Since I don’t need it, at the conclusion of this testing, I turned it off. In the meantime, I left it alone.

The setting of primary concern to everyone is “Connect to networks shared by my contacts”, and that one was OFF by default.

Turning it ON experimentally brought up the three sharing options: Outlook contacts, Skype contacts, and Facebook friends. All three of these were OFF.

I turned on the Skype contacts option.

I then started the process to connect to my home Wi-Fi network by pulling open the network submenu in the task bar and clicking on my SSID.

This brought up the usual “Enter the network security key” field and a new one: “Share networking with my contacts.” That option was OFF even though I had turned on the share-with-contacts and Skype sharing options.

In other words, the defaults for the sharing method of primary concern in these web articles are ALL OFF. As off as off can be.

I abandoned the connection process without entering the security key, then turned off the share-with-contacts option in the Wi-Fi Sense settings and started the connection process again.
This time the connection box didn’t even have the “Share networking with my contacts” option.
I re-enabled the share-with-contacts and Skype options, and actually did go through with the connection process, including checking the sharing option.

Interestingly, the system did not give me any choice about which contacts to share it with. I went back into the Wi-Fi Sense settings… and the Manage known networks section said that my network was “Not shared.” How curious, but it saved me a few steps in the procedure I was going through, since my next thing was to share a network that had previously been connected but not shared to see what happens.

I clicked the Share button.

Even though I had already entered the network security key, it asked for the key again. This is exactly the right thing to do. This is how Windows 10 prevents a friend from sharing your security key if you personally type the security key into their device rather than, for example, reading it to them to enter manually.

I completed the sharing process and verified that it “stuck” this time.

Then I disabled the share-with-contacts option in Wi-Fi Sense, and then re-enabled it.
When I went back into “Manage known networks,” my network showed as “Not shared.”

So that’s the whole deal, I think. By default, Wi-Fi Sense operates, at least on my machine, as of today, on Build 10162, exactly as Microsoft says it does. Sharing only happens when you click a bunch of extra buttons to enable it, and stops when you deselect any of those options.
Every share-with-contacts option defaults to OFF, and it DOES protect against a Wi-Fi security key being shared by someone who doesn’t actually know it.

I hope that is the end of this matter for now, at least until we find someone reliable (that is, not a writer for The Register) who has a machine that works differently.

Or until Microsoft provides additional information on the various security aspects (how is the security key protected, how is local network access prevented, does Microsoft have a way to learn your password, does Microsoft have a way to review your Facebook contacts list, etc.).

Or until Microsoft adds what I think is the essential feature for sharing a Wi-Fi security key securely: sharing it with only one individually specified person at a time, without giving Microsoft a way to see the key.

Comments and questions welcome, of course.

The Chaos Manor Advisors discussed this issue a bit today (29 July 2015), especially after Brian Krebs wrote about this (see here). We shared that link to the article with the Advisors.

Eric said:

So Krebs went ahead and wrote this without doing even the same brief testing Peter did weeks ago. This is how hysterias grow.

Peter added

In spite of the hysteria, I believe it is already fully opt-in.

The only, only, only thing that defaults to “on” is that the service is enabled. Every time a user adds a new Wi-Fi network, the dialog box specifically asks whether to share it with contacts or not, and which contacts to share it with from the three available options (Outlook/Facebook/Skype). All four of those questions, at least on my machine with a clean install, defaulted to OFF.

If the service itself is turned off, none of those sharing questions will be asked.

Now, if someone has turned on the service and shared a network, maybe it defaults to enable sharing the next time; I didn’t test that.

I think this business Krebs raises (and the Register raised) about how a friend could share your Wi-Fi credentials without your permission is just nonsense. That still takes a deliberate effort. If you have a friend who would do that, you need new friends.

This may be a bit of hysteria, as Peter stated. Although sharing your Wi-Fi password is generally not a good thing (especially for the paranoid?), it would appear to us that the actual risk is quite low, based on some limited testing by the Advisors.

We’d be interested in your opinion on this. You can share in the comments below. If you are inclined, you can send us more detailed information that we might use in a future post here at Chaos Manor Reviews. See this page on the submission guidelines for Chaos Manor Reviews.