
Is development still active?

Apr 3, 2013 at 3:20 PM
Does anyone know whether development on this project is still active? I find it a very useful tool, and I'm looking to make a number of improvements. Before I do that, I thought I'd check whether anyone is still working on it and, if so, maybe sync up on the roadmap.

Some things I am looking to implement:
  • Options for disabling compression
  • Support for "file copy" (possibly support delta/diff copies only)
  • Support for 7zip (in addition to zip)
Apr 3, 2013 at 3:53 PM
The project is still in development. We're always willing to accept contributions! :-)

Please send me an email to discuss development guidelines, etc.

Thanks,

Alessandro
Apr 3, 2013 at 4:42 PM
I just added support for a custom compression level. The value must be between 0 (no compression) and 9 (max compression).

Would you like to test it before publishing the binaries? :-)

Here's a build with the latest code: https://dl.dropbox.com/u/9060190/HyperVBackup_beta_20130403.zip

Example for disabling compression:

HVBackup -l VM1,VM2 -compressionlevel 0 -o \\yourserver\backup
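The 0-9 scale matches the Deflate levels used by most zip tools (0 = store, 9 = best). As a rough illustration of the size/speed tradeoff, here is a sketch using gzip as a stand-in for the zip compressor, since it uses the same 1-9 Deflate scale (gzip has no level 0; this is only an analogy, not HVBackup itself):

```shell
# Generate some compressible sample data (stand-in for a VHD).
seq 1 200000 > sample.txt

# Level 1: fastest, larger output; level 9: slowest, smallest output.
gzip -1 -c sample.txt > sample.l1.gz
gzip -9 -c sample.txt > sample.l9.gz

# Compare the resulting sizes.
ls -l sample.txt sample.l1.gz sample.l9.gz
```

On highly compressible data like this, level 9 produces a noticeably smaller file than level 1 at the cost of CPU time; on already-compressed VHD contents the gap shrinks, which is why level 0 can be worth it.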


Thanks,

Alessandro
Apr 3, 2013 at 4:53 PM

Happy to help; I'll start playing with it now.

Apr 3, 2013 at 8:35 PM

Ok, not bad. Backup time went from 4 hours to 1 hour 40 minutes (still much longer than a file copy, but much better).

One thing to note: for some reason, in this build -o didn't work; I needed to use -output.

Apr 3, 2013 at 9:08 PM
Cool, thanks!

IMO we could get some big performance benefits by replacing DotNetZip with 7z.dll.
There are quite a few .NET bindings available, e.g.: http://sevenzipsharp.codeplex.com/
Apr 12, 2013 at 7:03 PM
Hi ravensorb and alexp,

Did you already start integrating HVBackup with 7-zip? Once it works, if you need someone to help test it, just drop me a note.

Markus
Apr 18, 2013 at 10:38 AM
Edited Apr 18, 2013 at 11:02 AM
Hello!
I don't understand how to "replace DotNetZip with 7z.dll". I can't find DotNetZip.
I hope this helps load the CPU beyond 8% and makes backups faster.
Apr 19, 2013 at 3:59 PM
Edited Apr 19, 2013 at 4:03 PM
If I might suggest another thing too...

I just found a freaking fast compressor/decompressor, open source etc.

Source is available in different languages (C, C# etc.).

Also, the command-line tool can work with streams/pipes (same with 7z, btw). So if there were an option to "reroute" the output, like "HVBackup -l VM1,VM2 -extcomp "lz4.exe input d:\bla.lz4"" ... work done without coding :)
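The "reroute" idea above is just a pipeline: the backup tool writes the uncompressed stream to stdout and an external tool compresses from stdin. A minimal sketch of that pattern with POSIX tools (gzip standing in for lz4.exe, a plain file standing in for the VM export; the -extcomp option itself is only proposed here, not implemented):

```shell
# Stand-in for the data HVBackup would produce.
seq 1 100000 > vmexport.raw

# Stream through an external compressor: no temp files, one pass.
# (On Windows the same pattern works with lz4.exe on stdin/stdout,
#  or with 7z's -si switch for reading from stdin.)
cat vmexport.raw | gzip -c > vmexport.raw.gz

# Round-trip check: the piped output decompresses back to the original.
gunzip -c vmexport.raw.gz | cmp - vmexport.raw && echo "round-trip OK"
```

The appeal is that the backup tool never needs to know which compressor is on the other end of the pipe.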

Main page (with link to Google Code): http://fastcompression.blogspot.de/p/lz4.html

EDIT: Forgot some benchmarks :3

From a SATA RAID 5 to a NAS over 1 Gb/s (ok, I did a copy to the NAS from another PC at the same time):
Compressing D:\hyperv\Virtual Hard Disks\Datenlaufwerk.vhdx using 8 threads (compression level = 0)
Compression completed : 62.00GB --> 49.07GB  (79.14%) (52689940013 Bytes)
Total Time : 918.72s ==> 72.5MB/s
(CPU : 78.39s = 9%)
And from the same RAID 5 to the local SSD:
*** LZ4 v1.3.3, by Yann Collet (Feb  8 2013) ***
Detected : 8 cores
Compressing D:\hyperv\Virtual Hard Disks\sbs.vhdx using 8 threads (compression level = 0)
Compression completed : 135.38GB --> 84.60GB  (62.49%) (90836179573 Bytes)
Total Time : 712.78s ==> 203.9MB/s
(CPU : 354.86s = 50%)
And that's with compression level 0 ... The zips right now aren't much smaller.
Apr 23, 2013 at 1:40 AM
I agree with Maerad to some degree, i.e.
  1. It's a good idea to run some external command to do the backup. It could be a simple xcopy or some compression tool.
  2. I wouldn't use a compression format that most people don't know (LZ4) as the only available one. That could become a problem when the server crashes: you have to restore to a new Hyper-V machine, and first you have to get hold of the external compression tool. The good thing about well-known formats like ZIP, RAR or 7zip is that anyone immediately knows how to decompress them (and where to get the decompressor).
Having said that, I vote for a command-line interface that's open to any compressor, or even a batch file that simply copies every file to the backup storage.
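A wrapper along those lines can be sketched even without changes to HVBackup: back up uncompressed (level 0), then hand the results to whatever compressor or copy command you like. A hedged sketch of the post-processing step (the directory layout and file names are made up for illustration; POSIX commands shown for brevity, a Windows batch file would do the same with 7z.exe or xcopy):

```shell
# Hypothetical post-backup step: compress (or just copy) everything the
# backup tool left in its output directory, using any external tool.
backup_dir=backup_out
mkdir -p "$backup_dir"

# Stand-ins for files a level-0 (uncompressed) backup run would produce.
echo "vm1 disk image" > "$backup_dir/VM1.vhd"
echo "vm2 disk image" > "$backup_dir/VM2.vhd"

# Any compressor works here -- gzip, 7z, lz4 -- or plain cp for a file copy.
for f in "$backup_dir"/*.vhd; do
    gzip -f "$f"            # replaces VM1.vhd with VM1.vhd.gz, etc.
done

ls "$backup_dir"
```

Because the compressor is chosen in the script, restoring only requires a tool everyone already has, which addresses the concern about exotic formats.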

@AlexP: Is this possible from a technical point of view?

Markus
Apr 25, 2013 at 7:24 AM
Edited Apr 25, 2013 at 9:02 AM
alexp wrote:
I just added support for a custom compression level. The value must be between 0 (no compression) and 9 (max compression).

Here's a build with the latest code: https://dl.dropbox.com/u/9060190/HyperVBackup_beta_20130403.zip

Example for disabling compression:

HVBackup -l VM1,VM2 -compressionlevel 0 -o \\yourserver\backup
Strangely enough it doesn't work ;( Using --compresionlevel (with or without =), the syntax isn't recognized. Without this option everything works just fine ;(
I used the binaries provided by alexp.

//Edit - found the solution provided by ravensorb - use -output, not -o
//Edit 2 - I was too hasty:

D:\share\HVBackup>hvbackup -a -output d:\backup -compressionlevel 1

Cloudbase HyperVBackup 1.0 beta1
Copyright (C) 2012 Cloudbase Solutions Srl
http://www.cloudbasesolutions.com
Error: Exception has been thrown by the target of an invocation.
   at System.RuntimeMethodHandle._InvokeMethodFast(Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeTypeHandle typeOwner)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
   at CommandLine.HelpOptionAttribute.InvokeMethod(Object target, Pair`2 pair, String& text)
   at CommandLine.CommandLineParser.ParseArguments(String[] args, Object options, TextWriter helpWriter)
   at Cloudbase.Titan.HyperV.Backup.CLI.Program.Main(String[] args)