-
Backup for Windows Home Server
Now that my WHS installation is running properly I’m a bit happier because it means I now have a current backup of all of our other computers. That’s a good start, but it doesn’t solve the problem of having an off-site backup.
One option is to buy one (or more) external drives – back up the WHS data to an external drive and then transport that drive to a trusted off-site location. That’s fine, but it relies on me being disciplined enough to update it at regular intervals – and I’m not sure that I trust myself to remember to do that frequently enough!
The other option is to use the ‘cloud’ – subscribe to an online backup service. Googling “WHS Backup” doesn’t return many useful results. The top result is a relevant question on SuperUser. Scanning the answers reveals two products that apparently DO work with WHS, and a number of products to avoid because they don’t.
KeepVault
KeepVault provide online backup for Windows desktops and Windows Home Server. Their WHS product also includes a ‘client connector’, so you can back up files from client PCs too.
Pricing starts at $US48/year for 40GB. A range of larger plans is also available, including 80, 130 and 200GB, then 300-900GB and 1TB-5TB. They also offer a 15% discount if you pay via PayPal.
humyo
Humyo don’t specifically mention WHS, but the SuperUser answer indicates that it installs and functions correctly.
Their pricing starts at $US8.21/month or $US82.24/year for 100GB. Additional 100GB blocks can be added for $US11.74/month each.
Comparison
So how do the numbers stack up? The comparison is simpler once you get to 200GB and beyond, where both providers offer comparable amounts. To keep things consistent, I’ve used US dollars and excluded KeepVault’s PayPal discount.
Yearly price (US$):

GB          40     80     100     130     200      500      1000
Humyo       -      -      82.24   -       223.12   645.76   1350.16
KeepVault   48     89     -       139     199      480      930

Throwing the numbers into a graph illustrates this nicely. For amounts of data below 200GB, Humyo looks ok, but once you pass that mark KeepVault appears to be the best value.
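For the curious, here’s a rough sketch of where the Humyo column comes from – I’m assuming you take the base 100GB yearly plan and stack 100GB add-ons on top, with each add-on billed monthly for the whole year; the KeepVault figures are simply their published plan prices:

```python
import math

# Humyo: base 100GB plan at $82.24/year, plus as many 100GB add-ons as
# needed at $11.74/month each (assuming an add-on is billed for all
# twelve months of the year).
def humyo_yearly(gb):
    addons = math.ceil(max(gb - 100, 0) / 100)
    return 82.24 + addons * 11.74 * 12

# KeepVault: published yearly plan prices (PayPal discount excluded).
keepvault_yearly = {40: 48, 80: 89, 130: 139, 200: 199, 500: 480, 1000: 930}

for gb in (200, 500, 1000):
    print(f"{gb}GB: Humyo ${humyo_yearly(gb):.2f} vs KeepVault ${keepvault_yearly[gb]}")
```

That reproduces the 200GB-and-up figures in the table, which is where the two providers can be compared directly.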
I can only see our backup requirements increasing, so at this stage I’m planning to sign up with KeepVault.
-
Happy Hyper-V
I think my Hyper-V server is finally behaving itself. In searching for a resolution to the intermittent BSOD, I eventually found something that seemed to match my particular combination of hardware and software. I’ve installed the hotfix for Windows Server 2008 R2 that works around this “erratum” in Intel’s Core i7 processors, and so far, so good.
The other reason Hyper-V has to be happy is that I also purchased a proper case to house the hardware in. I ended up getting an Antec Three Hundred case from MATS Systems. It’s a nice, smart, functional case. While there are cheaper cases around, Mark from MATS recommended the Antec models in particular because of their cooling ability. The Three Hundred (the model, not the price!) comes with two fans, and has decent capacity for mounting a few hard disks too.
Let’s see what a difference a proper case with extra fans makes:
Component          °C (DIY Case)   °C (Antec Case)
CPU Core #0        41              32
CPU Core #1        36              29
CPU Core #2        43              34
CPU Core #3        38              28
HDD ST314003 #1    53              36
HDD ST314003 #2    55              35

That’s quite a significant drop. Those temperatures seem much more reasonable too.
-
NDepend
I received an email a couple of days ago from Patrick Smacchia, the lead developer of NDepend – probably the best-known .NET dependency analysis tool – letting me know that version 3.0 is now available.
It also pricked my conscience: a fair while back Patrick had given me a license for NDepend v2.0 and only asked that I post about my experiences using it. It is now way overdue for me to return the favour… so what are my thoughts?
First of all, what is it? As I understand it, NDepend is a static analysis tool that looks at the dependencies between classes and assemblies. Whereas FxCop might look at individual lines of code to detect sub-optimal patterns, NDepend takes a step back and looks at the big picture of how your application is architected – especially how loosely coupled (or not) your classes are, loose coupling normally being a desirable aim.
Probably the main thing I like about NDepend is the dependency graph it can generate. I’ve found this useful both for getting a thumbnail sketch of your current solution (especially if it has evolved a bit over time), and for getting a rough idea of the architecture of a legacy application that you’ve just been handed.
One other feature that appeals to my sense of keeping your code in good order is how it can highlight potential version mismatches between assemblies, and between assemblies and their symbol files. When your build process becomes slightly more complicated, this can be handy to ensure all your ducks are in the right row and version :-) The version differences may not be a problem for compiling and linking, but if your PDB file is wrong then you’re going to get misleading or minimal stack trace information – something better avoided in the first place.
The final feature that can be quite illuminating for me is the dependencies matrix. This is a table (matrix) that shows the interdependencies between all your assemblies. Because it adds up the totals, you can quickly identify where an assembly may have only one (or even no) hard dependency on another assembly, and consider refactoring to reduce your coupling if appropriate.
There are also features of NDepend that, to be honest, I’m happy to know are there but haven’t had a chance to make use of in a real way yet. It has a very comprehensive inbuilt query language (CQL), and could add some real value to a continuous integration build, where you’re keen to enforce some code quality benchmarks.
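To give a flavour, CQL rules read a lot like SQL over your code base. Something along these lines – written from memory, so treat the exact metric names as indicative rather than definitive – could warn during a CI build when methods grow too complex:

```
// Warn at build time if any method has become too big or too complex
WARN IF Count > 0 IN SELECT METHODS WHERE
    NbLinesOfCode > 40 OR
    CyclomaticComplexity > 20
ORDER BY CyclomaticComplexity DESC
```

The appeal is that the quality bar is expressed as data rather than buried in a build script, so the team can tune the thresholds as the code base matures.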
If you’re still not convinced, there’s a whole bunch of demo and tutorial videos on the NDepend site that can explain the range of features a lot better than I can.
Version 3.0 adds new features including:
- Better integration with Visual Studio, including support for VS 2010
- NDepend projects can now be a part of a Visual Studio solution
- Improved performance
- Integration with Reflector
In summary I have found NDepend to be a really useful tool to have in my toolkit, and I’d like to thank Patrick and his colleagues for the opportunity to have access to it.