Multithreaded Encryption and Compression

Submitted on November 24, 2014 at 9:13 am

One problem with encryption is that it is a slow, resource-intensive process. While most encryption software lacks multithreading support, you can use GNU Parallel to take full advantage of modern multi-core CPUs and greatly speed up encryption and decryption. Below are a few examples showing how to use parallel with gpg. I also included some examples of using tar and pigz (a multithreaded gzip implementation).

I suggest you create a directory with a bunch of small files and use it as your sandbox to test encryption and decryption before you move on to the real data. Here is a script that will generate a dummy folder structure with a few thousand small, randomly generated files. You can read more about this script here.

Note: for the sake of simplicity I am using gpg password (symmetric) encryption. For better security, in real life you should use gpg with an encryption key.

Install tools:
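On most Linux distributions all three tools are available from the standard repositories. The package names below are the common ones and may differ on your system; EPEL is assumed for RHEL-family distributions:

```shell
# Debian/Ubuntu:
sudo apt-get install -y parallel pigz gnupg

# RHEL/CentOS (parallel and pigz live in the EPEL repository):
sudo yum install -y epel-release
sudo yum install -y parallel pigz gnupg2
```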

To install parallel from source:
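If your distribution does not package GNU Parallel, the usual GNU source build works; the URL below is the official GNU mirror redirector:

```shell
wget https://ftpmirror.gnu.org/parallel/parallel-latest.tar.bz2
tar xjf parallel-latest.tar.bz2
cd parallel-*/
./configure --prefix=/usr/local
make && sudo make install
```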

On Solaris 10/11:
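On Solaris the easiest route is usually the OpenCSW catalog. The package names here are assumptions, so verify them with `pkgutil -a` first:

```shell
# Bootstrap OpenCSW's pkgutil, then install from its catalog:
pkgadd -d http://get.opencsw.org/now
/opt/csw/bin/pkgutil -y -i parallel pigz gnupg
```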

Set dir and number of cores:
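A minimal sketch, with /mnt/data standing in as a placeholder for your real data directory:

```shell
DIR=/mnt/data      # directory to process (placeholder path)
CORES=$(nproc)     # number of logical CPUs on Linux
echo "Using $CORES parallel jobs on $DIR"
```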

On Solaris 10/11:
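Solaris has no nproc, but psrinfo prints one line per virtual processor, so a line count gives the same number (Solaris-only, so this will not run on a Linux box):

```shell
DIR=/mnt/data              # placeholder path
CORES=$(psrinfo | wc -l)   # one line per virtual processor
echo "Using $CORES parallel jobs on $DIR"
```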

Encrypt all files in a directory:

Delete original non-encrypted files:
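find's -delete flag handles the cleanup; the pattern keeps anything that already ends in .gpg. A self-contained demo with dummy files:

```shell
DIR=$(mktemp -d)                      # stand-in for your data directory
touch "$DIR/a.txt" "$DIR/a.txt.gpg"   # pretend a.txt was already encrypted

# Remove every plaintext original, leaving only the .gpg copies:
find "$DIR" -type f ! -name '*.gpg' -delete
ls "$DIR"
```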

Decrypt all encrypted files in a directory:

Delete original encrypted files:
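The mirror image of the earlier cleanup; again a self-contained demo with dummy files:

```shell
DIR=$(mktemp -d)
touch "$DIR/a.txt" "$DIR/a.txt.gpg"   # a.txt stands in for a decrypted copy

# After verifying the decrypted files, drop the encrypted ones:
find "$DIR" -type f -name '*.gpg' -delete
ls "$DIR"
```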

Create compressed tarball with pigz:

Create compressed encrypted tarball with pigz and gpg:
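Extending the same pipeline with gpg means the archive never touches disk unencrypted; passphrase and paths are placeholders as before:

```shell
DIR=$(mktemp -d)
CORES=$(nproc)
echo "payload" > "$DIR/file.txt"      # dummy content

# Compress with all cores, then encrypt the stream:
tar -cf - -C "$DIR" . | pigz -p "$CORES" | \
  gpg --batch --yes --pinentry-mode loopback --passphrase MySecret \
      --symmetric -o /tmp/archive.tar.gz.gpg
```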

Decrypt compressed tarball:

 


5 Comments »

  • OleTange says:

    GNU Parallel defaults to -j number_of_cores, so you do not need that. Also you do not need " around {}.

    • igor444 says:

      GNU parallel defaults to 9 threads if -j is not specified. See here: http://www.admin-magazine.com/HPC/Articles/GNU-Parallel-Multicore-at-the-Command-Line-with-GNU-Parallel

      An alternative is to specify "-j +0", allowing parallel to automatically determine the number of cores. However, this doesn't always work on Linux and never works on SPARC Solaris.

      It is a good practice to enclose variables in double quotes, thus saving yourself a lot of trouble when the variable’s value happens to contain a space or a special character.

      • OleTange says:

        Being the author of GNU Parallel I see myself as a more authoritative source than admin-magazine. GNU Parallel _used_ to default to 9 processes. It was changed in 20110122 after a user vote.

        While it is good practice to enclose VARIABLES in double quotes, that is not the case with {} (which is not a variable but a replacement string).

        GNU Parallel deals with the spacing, so putting " around {} is always unneeded and can cause harm in some situations. Here for example: parallel echo 'a "{}"' ::: 'a b'

        • igor444 says:

          Point taken on the double-quotes. Thanks.

          I noticed on SPARC T4-2 parallel launches 16 threads with or without the --use-cpus-instead-of-cores. I would have expected 128, the number of vcores.

          • OleTange says:

            Core detection is not tested very well on SPARC: I do not have access to a T4-2, so I rely on users to provide improvements. Feel free to do so.


