Detecting Vulnerable “Internet of Things”

So the big news last week was the giant attack by the Mirai malware/bots on Dyn that effectively killed (well, seriously wounded) the Internet for a lot of people. And that the “Internet of Things” (IoT) was the source of the attack, because of bad security practices (devices with ‘backdoors’ and default passwords) on those devices.

I’m not going to explain what happened. If you are interested in this subject, you probably already know that the attack was done by the Mirai malware. The rest of you can ask the googles if you need an explanation of what happened.

And I am not going to explain how Mirai works, or where you can get a copy of the Mirai malware source code.

The thing that is not clear to many people: how can you check whether the devices on your network, whether home or work, are susceptible to attack by Mirai?

The basic attack comes in through specific ports that are visible from the outside (external to your network) to devices ‘inside’ your network. So to test whether your network is vulnerable, you need to check from the ‘outside’ of your network.

To do this check from the ‘outside’, I recommend the venerable (fancy term for old) “ShieldsUp” check from Gibson Research. This is a free tool that will scan for open ports on your network (this should work on any OS or network).

But, before you do that, make sure you have the permission of the owner of the network. Attacking – or even scanning – a network you do not own can be a felony in the US, and probably in other countries. So, before you proceed, make sure that you have the network owner’s permission.

You can check your own home network, though, since you are the owner. But, again, only do this scan on networks you own, even though the scan is very benign.

You can find the Gibson Research “ShieldsUp” tool at http://bit.ly/2dA9Ubd. Carefully read the information on that page. (For instance, that page will show you the unique identification that every web site you visit can see. Even the ‘private’ function of your browser will disclose that information. Again, read the page carefully to understand the implications.)

Once you have read the info on that page, click on the “Proceed” button (either one). On the next page, read the information, then click the orange button to check your exposure to UPnP (Universal Plug and Play).

image

The test will take under a minute, and then the result will be displayed. If your network is OK for that test, you’ll get a nice green message. That’s good. If your network has problems, there will be some explanation of what you should do. We’re not going to go into any of that “What You Should Do” stuff; it’s pretty deep and complicated.

The next step is to check for any open ‘ports’ on your network. Go back to the testing page (the page you saw when you clicked on the “Proceed” button). On that screen, this series of buttons is the next step.

image

Run the “Common Ports” test first. Then run the “All Service Ports” test. As with the first test, you are looking for all ‘green’ results. Any bad results will be listed, along with explanations. Again, we aren’t going to explain things here; if you need more info, look at the site’s explanations, and ask the googles if needed.

On my computer on my home network (which I own, so I have permission to scan my network), I got ‘all green’, as shown in this screen shot:

image

Hopefully, you will too. If you don’t, then proceed from there.
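As a supplemental check from inside your own network (again, only on a network you own), you can ask a few of your devices whether they answer on the telnet ports that Mirai is reported to abuse. The PowerShell sketch below is an example built on assumptions, not a definitive test: the IP addresses are placeholders for your own router, cameras, DVRs, and so on, and a port that looks closed from inside your LAN is not proof that it is closed from the outside – the ShieldsUp scan is still the way to check that.

# A rough sketch, run from a Windows 8.1/10 PC on a network you own.
# The addresses below are placeholders -- substitute your own router, cameras, DVRs, etc.
$devices = '192.168.1.1', '192.168.1.20'
foreach ($ip in $devices) {
    foreach ($port in 23, 2323) {      # telnet ports commonly probed by Mirai
        $result = Test-NetConnection -ComputerName $ip -Port $port -WarningAction SilentlyContinue
        '{0} port {1} open: {2}' -f $ip, $port, $result.TcpTestSucceeded
    }
}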

Fixing Broken Windows 10 Apps


Reader Sean Long submitted this tip for fixing broken Windows 10 applications. If you have a tip that will help CMR readers, let us know. And add your comments after the article.

I have another Windows 10 tip that seems to be a hot topic in help forums but doesn’t have a consistent fix.

The problem I had was that I tried to bring up the default Windows calculator, and it wouldn’t run. Since I had fiddled with the default Windows 10 apps before, I figured I just needed to re-install the calculator app. When that failed, I tried to brute-force reinstall all Windows 10 default apps, and that resulted in ALL of the Windows 10 apps becoming unusable.

The issue is that some of the Windows 10 apps are super annoying, so many people have been trying to uninstall one or more of the default apps. Unfortunately, under the current build of Windows 10, the installer appears to be badly broken, so both uninstalling and attempting to reinstall the apps can make all of the Windows 10 default apps unusable. They can’t be uninstalled, they can’t be reinstalled, they don’t work, and the Windows Store breaks. Because Microsoft considers them core components, they don’t even show up in the Programs and Features control panel or the Settings applets, so you simply can’t fix them yourself.

For an example of a badly behaved Windows default app, the new Windows 10 Photos app will continuously attempt to scan, index, and enhance all images in all libraries. That’s great if the library is only on your local drive, but if the library is located on a networked computer, it will saturate your network and thrash the remote library’s hard drive endlessly.

One unsatisfactory workaround is to go to your libraries and remove all libraries on networked drives, but you wouldn’t have to do that if the Windows default apps didn’t have these horrible and destructive behaviors set by default. So instead of removing networked libraries, you can fix the problem by removing whichever Windows app (Photos was the worst for me) is causing the problem.

Of course, many people realize after the fact that they really did want that app back. So the “magic” re-installation command that you could enter into the PowerShell program (run as administrator), as found on a dozen websites and help forums, is:

Get-AppXPackage | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"}

Detailed instructions can be found on many Windows help forums, so I won’t go into more detail than that.
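If you only need one app back, a narrower variant of the same command re-registers a single package instead of all of them. This is only a sketch, under the assumption that the per-app approach behaves on your build; as the next paragraph explains, the broken installer may still bite you. The wildcard here targets the Calculator package; change it for other apps.

# A narrower sketch: re-register only the Calculator package (run PowerShell as administrator).
# The same caveats as the bulk command apply.
Get-AppxPackage *WindowsCalculator* | Foreach {
    Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"
}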

Unfortunately, under the current mainstream Windows 10 build (as of 16 Jan 2016), that re-registration command will wreck all Windows default apps and make them unusable. Oops. There are a handful of other approaches to get around this, including some registry tweaks and resetting permissions, but the bottom line is that for almost all users, attempting to uninstall or reinstall the default Windows 10 apps will likely break all of them without any way to repair or restore any of them, including the Windows Store. Thankfully, there is one solution, although it reminds me of buying a new car every time you need an oil change.

The solution for now is to go to the Microsoft Windows 10 installer site here: http://bit.ly/1ZC6vVG and re-run the Microsoft Windows 10 installer. It will do just what the original upgrade did, leaving your current apps and files alone and restoring any lost functionality. It can take an hour or more depending on computer and network speed, but I’ve had to do it on 2 separate computers now without any failures, using the online installer.

Did this help you? Any other ideas? Let us know in the comments. – Editor

Access Pre-Windows 7 File Shares on Windows 10


Sean Long, a Chaos Manor Reviews reader, had difficulties with the Windows 10 upgrade making his pre-Windows 7 file shares inaccessible. After much trial and error, and a lot of incorrect information on the Googles, he figured out how to fix the problem. His solution is below. The usual precautions apply.

Thanks to Mr. Long for sharing his problem and solution. CMR is interested in similar problem-solving from CMR readers. This page has the details on how to share your solutions to computer issues. – Editor

First, if your Windows 10 machine can’t access pre-Windows 7 file shares (Windows XP, Windows Home Server, some Linux or NAS versions), go here: http://bit.ly/1ZiNLdz

The original response doesn’t seem to be a complete answer, but down in the comments is the actual solution:

There is a setting in the Windows Local Security Policy that is incorrectly set by default for accessing a NAS that uses an older communication protocol.

To access that setting, go to the Control Panel in Windows 10 (or 7); in Category view, click on “System and Security”, then click on “Administrative Tools”. Now double-click and open “Local Security Policy”.

In the Local Security Policy screen, in the navigation tree on the left, expand “Local Policies –> Security Options”, then about two-thirds of the way down the list you’ll see a policy called “Network Security: LAN Manager authentication level”. Double-click it and change the setting to “Send LM & NTLM – use NTLMv2 session security if negotiated.”
Then just press OK, close all of the open windows, and try again.

In the case of Windows 10 Home, Local Security Policy does not exist (thanks Microsoft); therefore make the change in the registry (use the REGEDIT program). Find the indicated entry, then add a new entry as detailed below:

HKEY_LOCAL_MACHINE\System\CurrentControlSet\control\LSA
Add:
LMCompatibilityLevel
Value Type: REG_DWORD – Number (32 bit, hexadecimal)
Valid Range 0-5
Default: 0, Set to 1 (Use NTLMv2 session security if negotiated)
Description: This parameter specifies the type of authentication to be used.

Basically, Microsoft failed to set a critical security setting (it is set to null by default), and it needs to be set to something in order to connect to Windows XP or Windows Home Server file shares.  Easy fix, stupidly hard to find though.
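If you would rather script the change than click through REGEDIT, a minimal PowerShell sketch (same key and value as described above) looks like this:

# A minimal sketch: set LMCompatibilityLevel to 1 ("Send LM & NTLM -
# use NTLMv2 session security if negotiated"). Run PowerShell as administrator;
# a reboot afterward is a safe bet.
$key = 'HKLM:\SYSTEM\CurrentControlSet\Control\Lsa'
New-ItemProperty -Path $key -Name 'LMCompatibilityLevel' -PropertyType DWord -Value 1 -Force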

Second, if you have a laptop or tablet that can’t get through the Windows 10 version 1511 update, it may be because you have an SD card installed.  To get the Windows 10 1511 update to install correctly, you must remove the SD card before starting the update process.

If the update has failed and no longer shows as an available update, you will need to go to the Windows 10 upgrade page and re-run the Windows 10 installer.  It will recognize that you already have Windows 10 installed, and will patch it to version 1511.

The critical point is to have the SD card removed throughout the entire update process.  If you are like me and have moved user files to the SD card, don’t worry: it worked OK for me when I popped the SD card out right before running the installer, and put it back in before logging in after the update was complete.

Thanks to Mr. Long for his report. Enter your comments or questions below. And share your computer stories with CMR readers; start here. – Editor

Chaos Manor Network Attached Storage Upgrade And Retirements – Part 3

Now that Eric has finished the NAS project installation and configuration, it was time to consolidate data from various systems, upgrade others, and consider system retirements. Part 3 of the project continues with Eric’s narrative:

With the NAS/RAID project completed, the next step was to track down the various elder PCs that were serving as network backup locations, back them up to the NAS, then shut them down in hopes of making a dent in the frightening power bill that had been shrugged off in the days when Chaos Manor was like Frankenstein’s Castle for PCs, with a gruesome experiment running in every corner.

Some of these machines were quite long in the tooth and were due to go off to the farm where they could run and play with the other PCs. They weren’t in operation, so they weren’t adding to the power bill, but it was time to clear some space.

First up were the Satine and Roxanne systems. Satine was a Socket 939 (indicating a long ago AMD CPU generation) system with an Nvidia 6000 series video card and a pair of 500 GB drives that may have been mirrored.

The googles note an early mention of the Roxanne system in February 2008: http://goo.gl/lxT07M : “Roxanne, the Vista system that was the main writing machine before Isobel”.

An early mention of Satine – née Sativa – is around July 2006: http://goo.gl/gy1lP3 , also here: http://goo.gl/zwR0e8 . Earlier mentions may be in the Chaos Manor columns printed in Byte magazine, but web versions of those columns are not available, at least with a quick search.

Beyond that it was hard to say, because Satine was inoperable. It would spin up the fans and do everything preparatory to booting, but never actually boot. Not so much as a POST beep of any kind. The properties of some system files indicated it had been running Windows XP SP2, so there was probably little value there for anyone beyond salvaging the drives.

So the hard drives were backed up and formatted, placed back in the case, and Satine remained as a project for someone with a strange combination of motive and nothing better to do.

Roxanne was more promising. She had last seen life as Mrs. Pournelle’s workstation but had been replaced by a new build when the case vents had become clogged, causing poor Roxanne to overheat before System Restore could run long enough to repair the damage from the first overheating.

Even with the vents cleared, the Pentium 4 HT system was rather loud and hot. It isn’t clear to me whether this was always the case and simply not a bother to Jerry’s artillery-blasted hearing, or if it had become compromised at some point.

Certainly it wasn’t worth any significant investment to replace any of the cooling bits. But it was running Windows 7, raising the question of whether it could become a Windows 10 system with a potentially long life ahead. Therein the saga lies.

Updates and Upgrades

On both Vista and Windows 7 there was just the one Service Pack. (Windows 7 may still get a second due to its business footprint, but I’m not holding my breath.) Going back to NT 4, Service Packs were once produced far more frequently. Internet access was far less widespread, and the need to store updates locally for installing on numerous machines was far higher. It was especially helpful when a Service Pack superseded its predecessors: install SP4, and the contents of SP2 and SP3 were included in that update.

As the internet and live Windows Update downloads became more the standard, it became more of a hassle to update a new install or a machine that had been offline for a long period. By the time of Windows 7 in the days after Windows 8 had launched, this had gotten a bit painful.

Businesses of the scale to have their own WSUS setup [Windows Server Update Services, a ‘personal’ Windows Update system to centralize and manage updates across the business environment] or enough identical machines to use an updated image weren’t badly off, but supporting the SOHO market got annoying. I had one experience where several refurb PCs that came with Windows 7 SP1 needed almost 1.5 GB of downloads to be fully updated. This was a very slow process regardless of how good your broadband speed might be.

Well, Roxanne hadn’t been to Windows Update in over two years. On the first attempt it spent nearly two hours figuring out which updates it needed before I noticed the gas gauge animation had stopped. The Update app was frozen. Start again.

This time it was a bit better. It installed about 150 updates before stopping and announcing it had an error it could not work past.

Along the way, the Windows Defender anti-virus scan announced that it had found some malware. The malware was deleted, but the scan found several other copies as it worked through the drives. Roxanne had received copies of older machines’ content when she originally entered service, and consequently the malware had implanted itself in each copy it found of certain system files. This was one of those packages that breaks Windows Update as part of its activities. Why I was able to get as many updates installed as I did before it kicked in is a mystery.

This meant going off and searching out the ‘fixit’ app to correct the damage. Still more downloads. Then Windows Update failed with a different error.

At this point Roxanne had been updating, more or less, for about 20 hours. (This could have been reduced some if I’d been there to respond every time it needed human input but I have this congenital condition that requires me to spend a certain portion of each day unconscious. It’s very inconvenient. The doctors call it ‘sleep’ and can only offer short-term mitigation.)

A Different Fix Needed

This time a different fix was needed but it was soon found and applied. Finally, almost a day after starting this quest, it was done. Roxanne was as up to date as a Windows 7 machine could be at that moment in early September of 2015.

But where was the Windows 10 upgrade offer in the notification area? It should have been in the last batch of updates. Apparently this issue comes up often enough that there is an app from Microsoft that looks at your machine and determines if it is eligible, and installs the upgrade app if so.

Roxanne wasn’t eligible for the Windows 10 upgrade for two reasons. One was correctable, the other was not. At least, not at reasonable cost, which in this case is not anything over $0.

The Nvidia FX5700 video card had long since fallen out of the range of cards the company supported after Windows 7. This could be fixed by replacing it with a newer card that would still be old enough to be free or of negligible cost. The other problem was the Pentium 4 HT CPU. It was too old to have NX Bit support.  https://goo.gl/4dgKB0
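As an aside, if you want to check whether a particular machine reports NX/DEP support before bothering with the upgrade app, a quick sketch using the standard Win32_OperatingSystem WMI class (this works in the PowerShell that ships with Windows 7) is:

# Returns True if Windows reports that hardware Data Execution Prevention (the NX/XD bit) is available.
(Get-WmiObject -Class Win32_OperatingSystem).DataExecutionPrevention_Available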

Considering how much past malware misery could have been prevented if this NX Bit feature had become common much earlier in microprocessors, it represents a perfectly reasonable place for Microsoft to say “here and no farther” when it comes to antique hardware. The last generation of Pentium 4 did have the NX Bit (Intel calls it XD bit) added but this was after Roxanne’s CPU came out of the foundry.

So there it ends for Roxanne. She may find a home yet and brighten some life but we are done with her and bid her farewell.

Wrapping Up

The project met the need of providing centralized and more efficient data storage for the Chaos Manor network. By consolidating data into that central location with high-capacity drives, Chaos Manor gains efficiencies in data storage (data is not in several different places, with the attendant work of synchronizing data amongst several systems). It also makes backing up that data more centralized – which will be the focus of an upcoming project.

Using a RAID 6 configuration allows for data reliability, although you shouldn’t rely on RAID 6 alone for backup or data recovery; it is more of a reliable centralized storage system. You really need to have a process in place for off-network storage of your important data.

As for older computer systems: there’s “old” and then there is “really old”. Determining the moving target that divides those is necessary to decide whether a system should be donated or simply sent to e-waste.

And so ends this project. We hope that it has provided you with useful information, and perhaps some thought for a similar project of your own. We’re always looking for guest authors, see here. Let us know what you think in the comments below, and share this article with others.

Another Backup Strategy

[In a previous article, Drake Christensen described his backup strategy using a network attached storage (NAS) and software to immediately backup changed documents. Your intrepid editor has a different strategy, since he doesn’t need immediate backups with version control.]

It is a Good Thing to have a way to back up your important files. And since it is also Emergency Preparedness Month, having a good backup strategy is a good subject to visit.

There are tons of ways to back up your data, and many reasons to actually Do It. You’ve probably heard about all the reasons ‘why’. My backup strategy has a three-fold focus:

  • keep backup copies in several physical locations; not just at home
  • make the backup process easy and mostly automatic
  • allow backups of multiple devices (there are four computer systems in our house) while reducing costs

Backing Up My Laptops

The first process is to copy important documents/files (pictures, project code, documents, etc.) to a central location in my home. That is a desktop computer sitting upstairs, connected to my LAN. It’s an older system, not used as much anymore, but there is a big hard drive in it.

I use Microsoft’s SyncToy, a free program that synchronizes files between two locations. In my case, I use it to do a one-way sync from the laptop (source) to the desktop (target). SyncToy has the advantage of only acting on files that have changed or been deleted; only those files are copied from the laptop to the desktop. With thousands of files on my laptop, that saves a bunch of time.

Since SyncToy is free (thanks, Microsoft!), I have it installed and configured on all three of our laptops. Right now, it is a manual process to do the ‘sync’, but there are ways to set up a ‘batch job’ and schedule the SyncToy task on a regular basis. My practice is to do a sync every couple of days. This is OK, since most of my work is with web sites, and the web site code files are also available on the external web sites in case of disaster.
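For readers who want the automatic version, here is a rough sketch of that ‘batch job’ approach. It assumes SyncToy 2.1 in its default install location (SyncToyCmd.exe with -R runs all of your configured folder pairs) and Windows 8 or later for the scheduled-task cmdlets; older systems can do the same thing with schtasks.exe.

# A sketch: run all SyncToy folder pairs nightly at 9 PM (run PowerShell as administrator).
# Assumes SyncToy 2.1 in its default install location.
$action  = New-ScheduledTaskAction -Execute 'C:\Program Files\SyncToy 2.1\SyncToyCmd.exe' -Argument '-R'
$trigger = New-ScheduledTaskTrigger -Daily -At 9pm
Register-ScheduledTask -TaskName 'SyncToy nightly sync' -Action $action -Trigger $trigger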

SyncToy works fairly fast; once you set it up, just run the sync task and let it do its thing. Here’s the results screen from the last time I ran SyncToy. You can see that a bunch of files didn’t need to be copied, because they hadn’t changed. That saves a bunch of time on the backup.

Now, I could back up to an external hard disk, either manually or with SyncToy. But the desktop is available on my LAN, and backing up via wireless is Fast Enough for my purposes.

Backing Up the Desktop

At this point, all my important files are in two places: my laptop(s), and the desktop. But the files are in the same physical location. Any problems with that physical location (theft, fire, earthquake, zombies; take your pick) would result in loss of files – especially all the family pictures, many of them irreplaceable.

So the second part of my backup strategy is to copy important files to the ‘cloud’. For that, I have chosen the Carbonite backup service (http://goo.gl/45wv ). For a flat fee, all my important files are automatically copied to their encrypted servers. There are similar services available from other vendors.

The best part is that ‘automatic’ part. Any file that changes on the desktop is automatically backed up to the Carbonite servers. It happens in the background, so when I use that computer, the backup process doesn’t interfere with my use of the computer.

Carbonite stores multiple copies of my files, so there are some ‘history’ versions of files that are available. You can also access any of your backed up files on other devices or computers – phones, tablets, whatever. This is a great advantage if you travel a lot, since you can access that important file you left on your home system while on the road.

There is another advantage to using a ‘cloud backup’ service like Carbonite. That relates to ‘ransomware’.

I P0wn All Your Filez

Any backup strategy needs to account for damaged files. Files can be damaged by hardware problems, physical damage (fire, etc.) or theft. And then there is ‘logical’ damage – damage done by malware.

There is malware that encrypts your files, requiring a payment to recover your data. This ‘ransomware’ can be a big moneymaker – reports are that up to US$18 million has been paid to recover encrypted files. While there are things you can do to block ransomware – or any malware – by practicing ‘safe computing’, your backup strategy can also help prevent file loss from ransomware.

Ransomware can damage files from any infected computer on your network – even your little home network. If a file is reachable over the network from a ransomware-infected computer, then that file can be encrypted, even if it is on another computer.

So your backup strategy needs to take that possibility into account.

There are a couple of ways that you can enhance your backup strategy to protect from ransomware:

  • Copy files to an external drive (or even DVDs), then physically disconnect that external drive from the network (see the sketch just after this list).
  • Use a ‘cloud backup’ service.
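Here is a rough sketch of the first option, using Windows’ built-in robocopy. The paths are placeholders; point the source at whatever folders hold your important files, and the destination at your external drive.

# A rough sketch of the external-drive option. /MIR mirrors the source tree
# (including deletions); /R:1 /W:1 keep robocopy from retrying failures forever.
robocopy "C:\Users\Me\Documents" "D:\Backup\Documents" /MIR /R:1 /W:1
# ...then safely eject and physically unplug the external drive so ransomware
# on the network cannot reach it.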

As you may have guessed, my backup strategy to prevent possible ransomware problems is using a cloud backup service from Carbonite.

What about copying files to a Linux-based Network Attached Storage (NAS) system? That configuration most likely includes access to the NAS from Windows-based systems, so there is no protection there. Of course, you could run all-Linux systems, but that is less likely for most people.

So the Carbonite-based cloud service is my solution. They keep multiple versions of my backed-up files, so even if ransomware gets to my desktop system and Carbonite backs up the encrypted files, I can work with Carbonite to get prior, non-encrypted versions of my files. I might lose a few recent files, but the majority of my important files would be available to get back – after I rid my systems of the ransomware.

Wrapping Up

So there you have it. My own personal backup strategy. It works pretty well for me. I haven’t had to recover files – mainly because I practice ‘safe computing’. And have been lucky enough to not have any disasters.

But I am prepared. My important pictures, documents, web site files, etc., are available – Just In Case.

What is your backup strategy? What do you do differently? Or do you just trust your ‘karma’ to keep away the possibility of file loss? Let us know in your comments – or write up your own backup strategy for an article here on Chaos Manor Reviews.

An Evolving Backup System

[Chaos Manor Reviews reader Drake Christensen was reading about the Network Attached Storage (NAS) system your editor set up with the Raspberry Pi, and decided to share his experiences over the years with his NAS and backup system configuration and practices. – Editor]

I’m very happy with my backup system, which has evolved over many years, and I thought I’d share my experiences as I have enhanced and improved it to its current configuration.  It may have been on one of the Chaos Manor Mail pages where someone declared:  “If it’s not in at least three places then it’s not backed up”.

I use Windows at home. I’ve had a Network Attached Storage (NAS) on my home network since 2009. Even though they’re a little more expensive, I’m a big fan of NAS over external USB drives.  I worry that if Windows gets confused and trashes a drive, it may also trash any backup media that’s plugged into it.

Since a NAS is a separate computer, that provides a bit of a buffer to protect against the disk structure getting destroyed.  Also, just as a simple practical matter, the NAS can be accessed by multiple computers on my home network.  And, I don’t have external drives cluttering my work area.  The NAS is off in the corner, out of the way.

I started my home NAS with a two-drive D-Link DNS-321, which is still attached to my network.  This older NAS is limited to 2 TB drives. The D-Link is very hack-able, and I think I could have downloaded a custom build of the OS that would let me use larger drives. But I just don’t have time to add another hobby.

In early 2014, I added a new hot Windows box that I had built by a boutique builder, iBuyPower. My previous, circa 2009 iBuyPower box was relegated to be a secondary chat/email machine that I use on the same desk.

I needed more backup space, so I added a Synology DS213j. This two-drive system currently has only one 4 TB drive in it.  The admin page of the Synology is quite a bit slicker than the D-Link, and it has a lot more optional features available through their interface.  And it is being actively updated.

With my first NAS, I used a conventional backup program, which did daily backups to the NAS.  But, for some reason, even though the data was growing only slowly over time, the backup was taking much, much longer.  Eventually, it was taking most of a day, even for an incremental backup.  I never did figure out what was causing the performance issues.

Updating over the Years

Somewhere around 2012 I started looking for a program to give me Apple Time Machine-like capabilities on Windows.  I wanted to have a file backed up within a few minutes of when it was modified, with multiple versions stored.  I tried one commercial product, Genie Timeline; it wasn’t horrible. But I found its interface to be a bit too “cute” for my taste, and I felt that it got in the way.

Eventually, I found AutoVer mentioned in several places.  It’s certainly not a pretty program, but I find it fairly straightforward.  I’ve been running that on a few machines ever since.  I set it to back up my entire \Users\(me)\Roaming directory, plus my data drive.

During the first few weeks, it does require some attention.  Some programs, like Firefox and Evernote, for example, will touch some large files fairly often, which can quickly eat up space on the backup drive.  I was able to break up the backup task into three or four smaller pieces, with custom rules for each task, and greatly reduce the number of versions it keeps of those larger files.

Unfortunately, “Real Life” has encroached on the author of AutoVer, and it is teetering on the verge of abandonware.  He rarely logs in to his own forum, anymore. It is still reliable for me, and I’m still using it on two machines.

More Enhancements to My Backup System

When I purchased my latest machine I decided to find an alternative that appears to have more recent work done on it.  I ended up with Yadis! Backup.  Its interface is a bit more familiar and friendly.  I’ve been running it for about 18 months now.  The only issue I’ve had with Yadis! Backup is that over time the log file will grow huge and crash the program on start.  Every couple of months I’ve had to rename/delete the file, which clears up the problem.  I have contacted their tech support a couple of times and received reasonably prompt responses.

One wrinkle that I’ve recently solved is automatically logging into my network drives.  Apparently, when the “Automatically connect” box is checked in Explorer, Windows can try to log into the network shares before its network drivers have finished loading during boot, which results in an error and leaves me unconnected.  I had hacked together a quick PowerShell script to deal with that, but I wasn’t happy with it. A few months ago, I started looking around and found the open-source NetDrives, a free utility that I can run on startup to connect the network shares when the OS is ready to take them.
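For readers who would rather roll their own than use NetDrives, a quick-and-dirty script of the kind described above might look like the sketch below. The share path and drive letter are placeholders, and this is an assumption about the general approach, not Mr. Christensen’s actual script.

# A rough sketch of a "wait for the network, then map the share" startup script.
# \\NAS\backup and Z: are placeholders -- adjust for your own NAS.
$share = '\\NAS\backup'
while (-not (Test-Path $share)) {
    Start-Sleep -Seconds 5          # loop until the network stack and the NAS are reachable
}
if (-not (Test-Path 'Z:')) {
    New-PSDrive -Name 'Z' -PSProvider FileSystem -Root $share -Persist -Scope Global | Out-Null
}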

So, that’s one extra backup copy.

Going to the Cloud

A couple of years ago I saw an ad for a cloud backup service, called Backblaze.  That got me started researching.  I found lots of good reports about Backblaze, but it was a little expensive for my use.  (I record some amateur sports videos, which greatly bulk up my data.)

Carbonite is well-known, but at the time I was looking at it, it was also very expensive for large backups. It has been long enough that I don’t remember the specific prices, at the time. I recall that one of my machines had over 600 GB on the data drive, and that Carbonite was in the several hundred dollar range for that much data.

I ended up with Crashplan, which gives me unlimited data on 10 machines for $149/year.  I added my mother’s machine to my account (and I set up a NAS for her, too.)  Crashplan is also Time-Machine-like, in that it backs up continuously, and keeps multiple versions.  I’ve actually made use of Crashplan to restore a couple of files.

I don’t want to sound like a commercial for Crashplan, but there are a couple of other features that are worth mentioning which have been useful to me in my configuration and usage. (As they say, Your Mileage May Vary.)

First, since all my machines are under the same account, if I were on the road, I could conceivably use my laptop to pull a file from the cloud that was saved from one of my desktops. They also have Android and iOS mobile apps to access files backed up in the cloud.

Crashplan can also back up from one Crashplan machine to another, whether local or remote. And, it can back up to physically attached drives. It does not appear capable of replacing AutoVer and Yadis! Backup to back up to a NAS, though, even when the shares are mapped to drive letters.

Cloud Backup Advantages

The prices and packages of all of these cloud systems have changed a lot since I looked at them a couple of years ago. Backblaze is now $50/yr/computer, unlimited. And, they offer a stolen computer locator service. Carbonite is $59/yr for the first computer, unlimited data with the exception of video. Video files and more computers are available for an added cost. All of them provide seeded backups (the option for you to send them a drive with an initial copy of your data.) And, there is an option for them to send you a recovery drive. In any case, do your homework before choosing your cloud backup service to see which best fits your needs.

Cloud systems like this also protect against ransomware.  Since it backs up only through the software service, ransomware has no way to get at that set of backup files to encrypt them. For a while, you might be backing up encrypted files. But, with this kind of versioning, you can get back to a good copy from prior to the infection. The NAS, on the other hand, is still vulnerable, if the virus looks to see what shares the machine is connected to. From a Windows point-of-view, the share is “Just Another Drive”.

An aside:  One thing I found when researching cloud backups is that there is one company that is poisoning Internet searches.  They have released four or five different programs which are nearly identical, but under different names and different pricing schemes.  And then they have paid for a large number of reviews, and commissioned a bunch of Top 5 and Top 10 lists with their programs listed near the top, to make them look a lot better than they are.  Digging a little deeper, there are a lot of complaints about that company – either bait and switch pricing, poor customer service or technical problems.

Wrapping Up

My current backup system consists of Network Attached Storage, Time Machine-like versioning for local backup (is there a more generic term for this sort of thing?), and a commercial cloud versioning backup. With this system in place, I can set up multiple computers on my network for continuous backup.

There are a few things I really like about my backup system.

The first is, after the initial teething period, it is completely automatic.  I don’t have to remember to do anything.  It Just Works.

Also, the multiple versions have come in handy. I’m a bit of a packrat, and I like having multiple versions of stuff I’m actively working on. It’s only a few times a year that I have that breath-sucking “Oh, no” feeling when I saved instead of canceled. The versioning has saved me a few times. One example: when my mother made a change to her book collection database file and didn’t tell me about it for over a week, I was able to pull a version out of Crashplan from before the change. I chose to pull from Crashplan because it happened to be the first time I needed to get an old version of a file since installing it, and I wanted to try the interface. It worked about as I had expected.

Next, I like the speed of on-site storage, as my first place to restore from.

And, finally, it adds a lot of peace of mind to know that I have off-site storage, in case of fire or theft, or similar disasters at the house. Plus, there is the slim chance of ransomware wiping out everything locally.  And, again, I don’t have to think about it.  I don’t have to discipline myself to rotate storage to have off-site safety.  For practical purposes, it’s built into my computers, now.

My solution is maybe a little pricey. I spent about $300 initially, for the D-Link DNS-321 and a 1 TB drive. The more recent Synology DS213j with a 4 TB drive can be had for about $300, at today’s prices. And the yearly cost for cloud backup is $149.

The NAS is a one-time expense, lasting me for years.  Crashplan is an ongoing expense.  As always seems to be the case, it was all a little more than I’d prefer to spend.  But, given the bulk, I think it’s reasonable.

[What do you do for a backup system? It is extensive, or do you even have a backup system? Let us know in the comments. Or you can submit your own experiences with backup processes on your home computers; see our Submissions page for details. – Editor]

Wi-Fi Sharing in Windows 10–Facts or Hysteria?

With the release of Windows 10, one of the subjects of concern is the new Wi-Fi Sharing process. It looks like there has been a bit of hysteria and/or exaggeration about this issue.

The Chaos Manor Advisors discussed this a few weeks ago, when the first article about this appeared in The Register. The general consensus is that on first look, this may be a ‘bad thing’. But a lot of the hype about this seems to be just that, hype. And some misunderstanding of the whole process. It appears that one might want to ‘read past the headlines’ on this issue.

Chaos Manor Advisor Peter Glaskowsky reports on his testing of Microsoft’s Wi-Fi Sharing process in a late beta release of Windows 10.

I’ve been talking about Wi-Fi Sense without the benefit of having used it, since I have only one Windows 10 machine and that one is a 2U server with no wireless in it.

But yesterday I realized that I could attach one of my USB Wi-Fi dongles. (A blinding flash of the obvious.)

This is as pure a test as I can imagine of the default Wi-Fi Sense settings, since this machine has literally never had a wireless capability before now, and Windows 10 was the first version of Windows ever installed on it.

So, the results:

When I installed the Wi-Fi adapter (a TP-Link TL-WN722N, one of the nicer ones of this type since it has a proper RP-SMA antenna connector), it became available essentially instantly. Windows 10 said nothing about installing a driver.

I went into the new-style Settings (not the old Control panel), then Network & Internet, then Wi-Fi on the left sidebar (which had not been there before), then Manage Wi-Fi settings in the main window. This sequence brings up the main Wi-Fi Sense settings dialog.

The “Connect to suggested open hotspots” option was on, allowing the machine to connect to well-known public hotspot systems like Boingo. I think this is generally fine, but I don’t know whether there is robust protection against someone setting up a bogus hotspot that appears to be part of the Boingo network. Since I don’t need it, at the conclusion of this testing, I turned it off. In the meantime, I left it alone.

The setting of primary concern to everyone is “Connect to networks shared by my contacts”, and that one was OFF by default.

Turning it ON experimentally brought up the three sharing options: Outlook contacts, Skype contacts, and Facebook friends. All three of these were OFF.

I turned on the Skype contacts option.

I then started the process to connect to my home Wi-Fi network by pulling open the network submenu in the task bar and clicking on my SSID.

This brought up the usual “Enter the network security key” field and a new one: “Share networking with my contacts.” That option was OFF even though I had turned on the share-with-contacts and Skype sharing options.

In other words, the defaults for the sharing method of primary concern in these web articles are ALL OFF. As off as off can be.

I abandoned the connection process without entering the security key, then turned off the share-with-contacts option in the Wi-Fi Sense settings and started the connection process again.
This time the connection box didn’t even have the “Share networking with my contacts” option.
I re-enabled the share-with-contacts and Skype options, and actually did go through with the connection process, including checking the sharing option.

Interestingly, the system did not give me any choice about which contacts to share it with. I went back into the Wi-Fi Sense settings… and the Manage known networks section said that my network was “Not shared.” How curious, but it saved me a few steps in the procedure I was going through, since my next thing was to share a network that had previously been connected but not shared to see what happens.

I clicked the Share button.

Even though I had already entered the network security key, it asked for the key again. This is exactly the right thing to do. This is how Windows 10 prevents a friend from sharing your security key if you personally type the security key into their device rather than, for example, reading it to them to enter manually.

I completed the sharing process and verified that it “stuck” this time.

Then I disabled the share-with-contacts option in Wi-Fi Sense, and then re-enabled it.
When I went back into “Manage known networks,” my network showed as “Not shared.”

So that’s the whole deal, I think. By default, Wi-Fi Sense operates, at least on my machine, as of today, on Build 10162, exactly as Microsoft says it does. Sharing only happens when you click a bunch of extra buttons to enable it, and stops when you deselect any of those options.
Every share-with-contacts option defaults to OFF, and it DOES protect against a Wi-Fi security key being shared by someone who doesn’t actually know it.

I hope that is the end of this matter for now, at least until we find someone reliable (that is, not a writer for The Register) who has a machine that works differently.

Or until Microsoft provides additional information on the various security aspects (how is the security key protected, how is local network access prevented, does Microsoft have a way to learn your password, does Microsoft have a way to review your Facebook contacts list, etc.).

Or until Microsoft adds what I think is the essential feature for sharing a Wi-Fi security key securely: sharing it with only one individually specified person at a time, without giving Microsoft a way to see the key.

Comments and questions welcome, of course.

The Chaos Manor Advisors discussed this issue a bit today (29 July 2015), especially after Brian Krebs wrote about this (see here). We shared that link to the article with the Advisors.

Eric said:

    So Krebs went ahead and wrote this without doing even the same brief testing Peter did weeks ago. This is how hysterias grow.

Peter added:

In spite of the hysteria, I believe it is already fully opt-in.

The only, only, only thing that defaults to “on” is that the service is enabled. Every time a user adds a new Wi-Fi network, the dialog box specifically asks whether to share it with contacts or not, and which contacts to share it with from the three available options (Outlook/Facebook/Skype). All four of those questions, at least on my machine with a clean install, defaulted to OFF.

If the service itself is turned off, none of those sharing questions will be asked.

Now, if someone has turned on the service and shared a network, maybe it defaults to enable sharing the next time; I didn’t test that.

I think this business Krebs raises (and the Register raised) about how a friend could share your Wi-Fi credentials without your permission is just nonsense. That still takes a deliberate effort. If you have a friend who would do that, you need new friends.

This may be a bit of hysteria, as Peter stated. Although sharing your Wi-Fi password is generally not a good thing (especially for the paranoid?), it would appear to us that the actual risk is quite low, based on some limited testing by the Advisors.

We’d be interested in your opinion on this. You can share in the comments below. If you are inclined, you can send us more detailed information that we might use in a future post here at Chaos Manor Reviews. See this page on the submission guidelines for Chaos Manor Reviews.