Additional function

Nov 15, 2012 at 12:47 PM
Edited Nov 15, 2012 at 12:48 PM

Hi,

First, I would like to tell you: GREAT JOB!

Next, I would kindly like to ask for an additional function:

Can I back up a VM without using zip?

Regards,

Gregory

Coordinator
Nov 15, 2012 at 6:52 PM

Hi Gregory,

For the moment a zipped archive is the only option, but you can easily unzip the resulting archives on the command line with e.g. unzip.exe, 7-zip's command-line tool or PowerShell.
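As an illustration, a short script using Python's standard zipfile module could unpack every backup archive in a folder into its own subfolder. This is just a sketch: the real archives come from HVBackup, so the demo below creates a stand-in zip first purely so the example is self-contained and runnable.

```python
import tempfile
import zipfile
from pathlib import Path

# Demo setup: a folder with one small zip standing in for an HVBackup archive.
root = Path(tempfile.mkdtemp())
with zipfile.ZipFile(root / "vm1.zip", "w") as zf:
    zf.writestr("disk.vhd", b"fake vhd contents")

# Unpack every archive in the folder into a subfolder named after it.
for archive in root.glob("*.zip"):
    dest = root / archive.stem
    dest.mkdir(exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)
```

In practice you would point `root` at the HVBackup target directory instead of a temp folder.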

Tx!

Alessandro

Nov 15, 2012 at 8:17 PM
alexp wrote:

Hi Gregory,

For the moment a zipped archive is the only option, but you can easily unzip the resulting archives on the command line with e.g. unzip.exe, 7-zip's command-line tool or PowerShell.

Tx!

Alessandro

Hi

Thanks for the answer, but...

There is no problem with unzipping.

A backup of a 600 GB VM currently takes more or less 10 hours, and this is a problem for me :(

Copying this VM from a manually attached snapshot takes less than 1 hour :)

 

Thx

Gregory

Coordinator
Nov 15, 2012 at 9:17 PM

There's a parallel zip feature in DotNetZip that is unfortunately buggy; otherwise it would be way faster.

One option could be to specify the compression level, including 0 (no compression), and another to simply specify a target path and create subdirectories for each backup.

We'll probably do both :-)
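The speed/size trade-off behind a compression level of 0 can be illustrated with Python's zipfile module (a sketch, not DotNetZip itself): ZIP_STORED writes entries uncompressed, so archiving is essentially a copy, while ZIP_DEFLATED compresses but costs CPU time.

```python
import io
import zipfile

# Compressible stand-in for a VHD: one megabyte of zero bytes.
data = b"\x00" * 1_000_000

def archive_size(method):
    """Write the payload into an in-memory zip and return the archive size."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=method) as zf:
        zf.writestr("disk.vhd", data)
    return buf.getbuffer().nbytes

stored = archive_size(zipfile.ZIP_STORED)      # no compression: fast, full size
deflated = archive_size(zipfile.ZIP_DEFLATED)  # compressed: slower, much smaller
```

With a stored archive the zip is slightly larger than the payload (headers only), while the deflated one shrinks dramatically on repetitive data; for already-fast copies of huge VHDs, skipping compression is where the time goes.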

 

Dec 17, 2012 at 2:45 AM

It would be great to allow compression to be turned off in my scenario.  I was hoping to use a Server 2012 file share as a target and then run deduplication on that volume.  Without compression the backup should be much faster, and the file share can then run the dedup process, keeping storage to a minimum.  This would also allow very fast restores.

My two cents anyway.  Thanks for a great utility!

Dec 17, 2012 at 1:23 PM

I ran a simple analysis comparing the zip results versus dedup.  I am using 2008 R2, but I evaluated the dedup savings using the ddpeval tool.

For my test, I backed up only a single host running two guests. Each guest is a virtual domain controller running 2008 R2.  They use about 25 GB of storage on the host machine.

HVBackup generated two zip files about 10 GB in size.  That is a great reduction!  Running ddpeval against these shows deduplication under 2012 would not help much, which is expected since the files are compressed.

Evaluated folder size: 19.39 GB
Files in evaluated folder: 3

Processed files: 3
Processed files size: 19.39 GB
Optimized files size: 18.49 GB
Space savings: 119.80 KB
Space savings percent: 0

Optimized files size (no compression): 18.49 GB
Space savings (no compression): 119.80 KB
Space savings percent (no compression): 0

I unzipped the files into a separate directory tree and ran ddpeval on those directories.  The results are pretty good:

Evaluated folder size: 53.66 GB
Files in evaluated folder: 8

Processed files: 6
Processed files size: 53.66 GB
Optimized files size: 13.31 GB
Space savings: 40.35 GB
Space savings percent: 75

Optimized files size (no compression): 31.86 GB
Space savings (no compression): 21.80 GB
Space savings percent (no compression): 40

The space savings would be much higher as VM density increases, if the guests are all running the same Windows OS.  Most of our VM hosts run 10-25 guests, all using 2008 R2.  I mention this only to suggest that the ability to turn off zip compression could be valuable with the new release of 2012, using it as a storage point for the backups.  I suspect the overall time for backup plus deduplication would be less than with the zip compression currently used (just a guess).
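As a sanity check, the space-savings percentages ddpeval prints follow directly from the reported sizes. A quick Python check of the figures above (the helper function is mine, not part of ddpeval):

```python
def savings_percent(total_gb, saved_gb):
    """Space savings percent = saved space / evaluated size, truncated."""
    return int(saved_gb / total_gb * 100)

# Zipped archives: 119.80 KB saved out of 19.39 GB -> effectively 0%.
zipped = savings_percent(19.39, 119.80 / (1024 * 1024))

# Unzipped VHDs: 40.35 GB saved out of 53.66 GB -> 75%,
# and 21.80 GB saved with dedup's own compression off -> 40%.
unzipped = savings_percent(53.66, 40.35)
unzipped_nocomp = savings_percent(53.66, 21.80)
```

The computed values match ddpeval's 0%, 75%, and 40% lines, so the reported percentages are internally consistent.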

Jan 1, 2013 at 7:08 PM
iprob wrote:

It would be great to allow compression to be turned off in my scenario.  I was hoping to use a Server 2012 file share as a target and then run deduplication on that volume.  Without compression the backup should be much faster, and the file share can then run the dedup process, keeping storage to a minimum.  This would also allow very fast restores.

My two cents anyway.  Thanks for a great utility!


I want to second this suggestion.  I'd love to shuffle the backups to a deduplicated share on my 2012 file server.  Dedup with 7zip compression yields nothing meaningful.

 

Mark Ringo

Mar 25, 2013 at 10:27 PM
Edited Mar 25, 2013 at 10:46 PM
Hello Alex,

first of all, thank you for your great work. But storing without compression would be a huge benefit for us too. Compressing hundreds of gigabytes takes too much time. What about this project? Is it still being developed? (When) will this feature be implemented?

Feb 11, 2016 at 11:01 PM
I also echo PrismaComputer's comments:
Fantastic software Alex. Well done.
I agree that an option to disable the ZIP archive would be very helpful. My reasoning is as follows: even with no compression, archiving in a ZIP file requires Windows to extract a file before mounting it. This means if I want to restore a 50 KB Word doc from a 1 TB file server, I will need a spare 1 TB of storage to temporarily extract the ZIP file before mounting. In summary, if ZIP archiving could be disabled, an administrator could simply double-click a backup VHD file to browse it and then restore a file in minutes with native Windows tools.

Keep up the great work!

Kind regards,
Duncan