
Uses the entropy package to compute the Kullback-Leibler (KL) divergence. The function first converts each vector's reads to a pseudo-number of transcripts by normalizing the vector so that its total equals total_reads; the normalized read count for each gene is then rounded to serve as the pseudo-number of transcripts. entropy::KL.shrink is called to compute the KL divergence between the two vectors, with the divergence capped at max_KL. Finally, a linear transform converts the KL divergence, which lies between 0 and max_KL, into a similarity score between -1 and 1.
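A minimal sketch of the steps described above, in R. This is not the package's exact implementation: the expm1() back-transform for log-scaled input and the particular linear map (1 - 2 * KL / max_KL) are assumptions; only the use of entropy::KL.shrink is stated in the description.

# Sketch of the described algorithm; an illustrative assumption, not the package source.
kl_divergence_sketch <- function(vec1, vec2, if_log = FALSE,
                                 total_reads = 1000, max_KL = 1) {
  if (if_log) {
    # assumed: recover raw counts from log1p-transformed values
    vec1 <- expm1(vec1)
    vec2 <- expm1(vec2)
  }
  # normalize each vector to a pseudo-library of total_reads and round,
  # giving a pseudo-number of transcripts per gene
  count1 <- round(vec1 / sum(vec1) * total_reads)
  count2 <- round(vec2 / sum(vec2) * total_reads)
  # shrinkage estimate of the KL divergence between the two count vectors,
  # capped at max_KL
  kl <- min(entropy::KL.shrink(count1, count2, unit = "log"), max_KL)
  # assumed linear map: KL = 0 gives 1, KL = max_KL gives -1
  1 - 2 * kl / max_KL
}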

Usage

kl_divergence(vec1, vec2, if_log = FALSE, total_reads = 1000, max_KL = 1)

Arguments

vec1

Test vector

vec2

Reference vector

if_log

Whether the vectors are log-transformed. If TRUE, the raw counts are recovered before the KL divergence is computed.

total_reads

Pseudo-library size to which each vector is normalized.

max_KL

Maximal allowed value of the KL divergence.

Value

Numeric value, with additional attributes, of the KL divergence-based similarity score between the vectors.
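
Examples

A hypothetical call on two simulated raw count vectors; the rpois() data is purely illustrative.

set.seed(42)
vec1 <- rpois(200, lambda = 5)  # test vector of raw counts
vec2 <- rpois(200, lambda = 5)  # reference vector of raw counts
kl_divergence(vec1, vec2, if_log = FALSE, total_reads = 1000, max_KL = 1)
# similar profiles yield a score near 1; dissimilar profiles approach -1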