More tests

pull/10/head
Dennis Schwerdel 2017-07-04 12:36:39 +02:00
parent 263339077e
commit 1674f19309
4 changed files with 57 additions and 7 deletions

@@ -12,7 +12,7 @@ and leave most chunks unchanged. Multiple backups of the same data set will only
take up the space of one copy.
The deduplication in zVault is able to reuse existing data no matter whether a
-file is modified, stored again under a different name, renamed or moved to
+file is modified, stored again under a different name, renamed or moved to a
different folder.
That makes it possible to store daily backups without much overhead as backups
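
A sketch of the mechanism behind this (a toy, not zVault's actual chunker; the window size and cut mask here are illustrative): chunk boundaries are derived from the bytes alone, so identical content produces identical chunks no matter which name or folder it is stored under.

```rust
// Toy content-defined chunker. Boundaries depend only on the data,
// never on the file's name or path -- which is why renamed or moved
// files deduplicate perfectly.
const WINDOW: usize = 16;

fn chunk_boundaries(data: &[u8], mask: u32) -> Vec<usize> {
    let mut hash: u32 = 0;
    let mut cuts = Vec::new();
    for i in 0..data.len() {
        hash = hash.wrapping_add(data[i] as u32);
        if i >= WINDOW {
            // Slide the window: drop the byte that just left it.
            hash = hash.wrapping_sub(data[i - WINDOW] as u32);
        }
        if hash & mask == 0 {
            cuts.push(i + 1); // cut after this byte
        }
    }
    cuts.push(data.len());
    cuts
}

fn main() {
    // Deterministic pseudo-random "file contents" via a small LCG.
    let mut x: u32 = 1;
    let data: Vec<u8> = (0..100_000)
        .map(|_| {
            x = x.wrapping_mul(1_664_525).wrapping_add(1_013_904_223);
            (x >> 24) as u8
        })
        .collect();
    // Same bytes, same chunk list -- the path never enters the hash.
    let a = chunk_boundaries(&data, 0x3ff);
    assert_eq!(a, chunk_boundaries(&data, 0x3ff));
    println!("{} chunks", a.len());
}
```
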
@@ -88,7 +88,7 @@ to work on the backup data and find the needed files.
I am using zVault on several of my computers. Here are some numbers from my
desktop PC. On this computer I am running daily backups of both the system `/`
-(exclusing some folder like `/home`) with 12.9 GiB and the home folder `/home`
+(excluding some folders like `/home`) with 12.9 GiB and the home folder `/home`
with 53.6 GiB.
$> zvault config ::
@@ -100,7 +100,7 @@ with 53.6 GiB.
The backup repository uses the default configuration with encryption enabled.
The repository currently contains 12 backup versions of each folder. Both
-folders combined currently contain over 66.5 GiB not counting changed between
+folders combined currently contain over 66.5 GiB not counting changes between
the different versions.
$> zvault info ::
@@ -115,7 +115,7 @@ the different versions.
The repository info reveals that the data stored in the repository is only
58.1 GiB, so 8.4 GiB / 12.5% has been saved by deduplication. Another 20.2 GiB /
34.7% have been saved by compression. In total, 28.6 out of 66.5 GiB / 43% have
-been saved in total.
+been saved.
The data is stored in over 5 million chunks of an average size of 10.9 KiB. The
average chunk is smaller than configured because of files smaller than the chunk
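
The percentages follow from the quoted sizes; a quick sanity check (using the rounded figures from the text, so the first two results land a tenth of a percent off the quoted 12.5% and 34.7%):

```rust
fn main() {
    let total = 66.5_f64;            // GiB across all backup versions
    let after_dedup = total - 8.4;   // 58.1 GiB of unique data
    let stored = after_dedup - 20.2; // 37.9 GiB actually stored
    println!("dedup saved:       {:.1}%", 8.4 / total * 100.0);              // ~12.6%
    println!("compression saved: {:.1}%", 20.2 / after_dedup * 100.0);       // ~34.8%
    println!("overall saved:     {:.1}%", (total - stored) / total * 100.0); // 43.0%
    // Implied chunk count: 58.1 GiB of unique data at ~10.9 KiB per chunk,
    // consistent with the "over 5 million chunks" above.
    println!("chunks: ~{:.1} million", after_dedup * 1024.0 * 1024.0 / 10.9 / 1e6); // ~5.6
}
```
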
@@ -137,9 +137,9 @@ original data.
This is the information on the last backup run for `/home`. The total data in
that backup is 53.6 GiB of which 2.4 GiB have been detected to have changed by
-comparing file dates to the last backup. Of those changed files, deduplication
-reduced the data to 50.8 MiB and compression reduced this to 8.9 MiB. The whole
-backup run took less than 2 minutes.
+comparing file dates and sizes to the last backup. Of those changed files,
+deduplication reduced the data to 50.8 MiB and compression reduced this to
+8.9 MiB. The whole backup run took less than 2 minutes.
$> zvault info ::system/2017-06-19
Date: Mon, 19 Jun 2017 00:00:01 +0200
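
The "changed" set is found without reading file contents; a sketch of that fast-path check, assuming the previous backup recorded each file's modification time and size (zVault's real index types differ):

```rust
use std::fs;
use std::io;
use std::time::SystemTime;

// Returns true if the file must be re-read and re-chunked; unchanged
// files reuse the chunk list recorded in the previous backup.
fn changed_since_backup(path: &str, old_mtime: SystemTime, old_size: u64) -> io::Result<bool> {
    let meta = fs::metadata(path)?;
    Ok(meta.modified()? != old_mtime || meta.len() != old_size)
}
```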

@@ -19,3 +19,5 @@ mod linux {
}
pub use self::linux::*;
+
+// Not testing since this requires root

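For tests that need privileges like this, one Rust-native alternative (a sketch, not part of this commit; assumes the `libc` crate) is to keep the test but mark it `#[ignore]`, so `cargo test` skips it by default and `cargo test -- --ignored` runs it, e.g. under sudo:

```rust
#[test]
#[ignore] // requires root
fn test_only_as_root() {
    // Illustrative check standing in for the real root-only code.
    assert_eq!(unsafe { libc::geteuid() }, 0);
}
```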

@@ -29,3 +29,33 @@ pub fn parse_hex(hex: &str) -> Result<Vec<u8>, ()> {
        _ => Err(()),
    }
}
+
+mod tests {
+
+    #[allow(unused_imports)]
+    use super::*;
+
+    #[test]
+    fn test_to_hex() {
+        assert_eq!(to_hex(&[0]), "00");
+        assert_eq!(to_hex(&[1]), "01");
+        assert_eq!(to_hex(&[15]), "0f");
+        assert_eq!(to_hex(&[16]), "10");
+        assert_eq!(to_hex(&[255]), "ff");
+        assert_eq!(to_hex(&[5,255]), "05ff");
+    }
+
+    #[test]
+    fn test_parse_hex() {
+        assert_eq!(parse_hex("00"), Ok(vec![0]));
+        assert_eq!(parse_hex("01"), Ok(vec![1]));
+        assert_eq!(parse_hex("0f"), Ok(vec![15]));
+        assert_eq!(parse_hex("0fff"), Ok(vec![15,255]));
+        assert_eq!(parse_hex("0F"), Ok(vec![15]));
+        assert_eq!(parse_hex("01 02\n03\t04"), Ok(vec![1,2,3,4]));
+    }
+
+}
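
The diff shows only the tail of `parse_hex`, but the tests pin down its contract: hex digits in either case, with whitespace between bytes ignored. A standalone sketch consistent with those tests (not necessarily the file's actual body):

```rust
fn parse_hex(hex: &str) -> Result<Vec<u8>, ()> {
    let mut out = Vec::new();
    let mut high: Option<u8> = None; // pending high nibble of the current byte
    for c in hex.chars() {
        if c.is_whitespace() {
            continue; // covers "01 02\n03\t04" from the test above
        }
        let nibble = c.to_digit(16).ok_or(())? as u8; // accepts 0-9, a-f, A-F
        high = match high {
            None => Some(nibble),
            Some(h) => {
                out.push(h << 4 | nibble);
                None
            }
        };
    }
    if high.is_some() { Err(()) } else { Ok(out) } // odd digit count is an error
}
```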

@@ -16,3 +16,21 @@ pub fn get_hostname() -> Result<String, ()> {
        Err(())
    }
}
+
+mod tests {
+
+    #[allow(unused_imports)]
+    use super::*;
+
+    #[test]
+    fn test_gethostname() {
+        let res = get_hostname();
+        assert!(res.is_ok());
+        let name = res.unwrap();
+        assert!(name.len() >= 1);
+    }
+
+}
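
The body of `get_hostname` is elided by the diff; a hypothetical implementation that would satisfy this test (assuming the `libc` crate) asks the OS and converts the C string:

```rust
use std::ffi::CStr;

pub fn get_hostname() -> Result<String, ()> {
    let mut buf = [0u8; 256];
    let res = unsafe { libc::gethostname(buf.as_mut_ptr() as *mut libc::c_char, buf.len()) };
    if res != 0 {
        return Err(()); // syscall failed
    }
    // Safe here because gethostname null-terminates names that fit the buffer.
    let name = unsafe { CStr::from_ptr(buf.as_ptr() as *const libc::c_char) };
    name.to_str().map(|s| s.to_string()).map_err(|_| ())
}
```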