A fundamental problem in biomedical research is the low number of observations, usually caused by a lack of available biosamples, prohibitive costs, or ethical concerns. By augmenting a few real observations with artificially generated samples, analyses can become more robust and more reproducible. One possible solution is the use of generative models: statistical models of data that attempt to capture the entire probability distribution underlying the observations. Using the variational autoencoder (VAE), a well-known deep generative model, this package aims to generate samples of gene expression data, especially single-cell RNA-seq data. Furthermore, the VAE can be conditioned to produce specific cell types or subpopulations. This conditional VAE (CVAE) allows us to create targeted samples rather than completely random ones.
Autoencoders are unsupervised neural networks that compress data from its original dimensionality to a preferred lower dimensionality and then reconstruct the input from that compressed representation. The basic idea is to have an output layer with the same dimensionality as the input and to try to reconstruct each dimension exactly by passing the data through the network. It is common, but not necessary, for an autoencoder to have an architecture that is symmetric between the encoder and decoder, and the number of units in the middle layer is typically smaller than in the input or output layers. After training, it is not necessary to use both the encoder and the decoder. For example, for dimensionality reduction one can use only the encoder to create the reduced representations of the data; the reconstructions of the decoder may not be required at all. In this sense, an autoencoder performs dimension reduction. The objective function of this network is the reconstruction loss: the sum of squared differences between the input and the output forces the output to be as similar as possible to the input. Alternatively, the cross-entropy, which quantifies the difference between two probability distributions, can be used as the loss function.
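As a minimal numeric sketch of the two reconstruction losses just mentioned (plain R, independent of this package; the vectors are made up for illustration):

```r
# toy input scaled to [0, 1] and an imperfect reconstruction
x     <- c(0.0, 0.2, 0.9, 1.0)
x_hat <- c(0.1, 0.3, 0.8, 0.9)

# sum of squared differences between input and output
sse <- sum((x - x_hat)^2)

# binary cross-entropy, clipping the reconstruction to avoid log(0)
eps <- 1e-7
p   <- pmin(pmax(x_hat, eps), 1 - eps)
bce <- -sum(x * log(p) + (1 - x) * log(1 - p))

sse  # 0.04
bce
```

Minimizing either quantity over the network weights drives the output toward the input.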
Another interesting application of the autoencoder is one in which only the decoder portion of the network is used. Variational autoencoders are based on Bayesian inference, in which the compressed representation is constrained to follow a probability distribution. This constraint differentiates the VAE from a standard autoencoder and enables it to generate new data, which conventional autoencoders cannot do. For example, one can add a term to the loss function to enforce that the hidden variables are drawn from a Gaussian distribution, then repeatedly draw samples from this Gaussian and use only the decoder to generate samples resembling the original data. In the VAE, the bottleneck (latent) vector is replaced by two vectors: a mean vector and a standard deviation vector. The overall loss function \(J = L + \lambda R\) of the VAE is a weighted sum of the reconstruction loss \(L\) and the regularization loss \(R\), where \(\lambda > 0\) is the regularization parameter. The term “variational” comes from the close relationship between this regularization and the variational inference method in statistics. Various choices are possible for the reconstruction error; here we use the binary cross-entropy loss between the input and output. The regularization loss is the Kullback-Leibler divergence of the conditional distribution of the hidden representation of each point with respect to the standard multivariate Gaussian distribution. Small values of \(\lambda\) favor exact reconstruction, and the approach then behaves like a traditional autoencoder.
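When the encoder outputs a diagonal Gaussian, the regularization loss has a closed form: for a mean vector \(\mu\) and standard deviation vector \(\sigma\), \(\mathrm{KL}\bigl(N(\mu, \sigma^2)\,\|\,N(0, I)\bigr) = -\tfrac{1}{2}\sum_i \bigl(1 + \log \sigma_i^2 - \mu_i^2 - \sigma_i^2\bigr)\). A plain-R sketch of this formula (illustrative, not the package's internal code):

```r
# KL divergence between a diagonal Gaussian N(mu, sigma^2) and N(0, I)
kl_gaussian <- function(mu, sigma) {
  -0.5 * sum(1 + log(sigma^2) - mu^2 - sigma^2)
}

kl_gaussian(mu = c(0, 0), sigma = c(1, 1))    # 0: already the standard Gaussian
kl_gaussian(mu = c(1, -1), sigma = c(0.5, 2)) # 2.125: penalized for drifting away
```

The further the latent distribution drifts from the standard Gaussian, the larger this penalty, which is what keeps the latent space suitable for sampling.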
Conditioning can be applied to variational autoencoders to obtain some interesting results. The basic idea of the conditional variational autoencoder is to add a conditional input: from an implementation perspective, category information can be encoded as a one-hot representation that tells the model which class each input belongs to. An autoencoder can likewise be used to embed multimodal data, i.e., data whose input features are heterogeneous, in a joint latent space. In addition, by separating the samples into different classes, the data points within the same category become more similar, enhancing the modeling capacity and sample quality of the CVAE.
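One-hot encoding of class labels is a small matrix operation in base R. A sketch with made-up labels (not the package's internal preprocessing):

```r
# one-hot encode a factor of cell-type labels
group  <- factor(c("group1", "group2", "group1", "group3"))
onehot <- diag(nlevels(group))[as.integer(group), ]
colnames(onehot) <- levels(group)
onehot
# each row has a single 1 marking the sample's class;
# concatenating it to the expression vector, e.g. cbind(x, onehot),
# forms the conditional input of a CVAE
```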
Consider an artificial data set consisting of 1000 genes and three groups of 100 samples each, where each group has 100 differentially expressed genes.
if (keras::is_keras_available() & reticulate::py_available()) {
  library(VAExprs)

  ### simulate differentially expressed genes
  set.seed(1)
  g <- 3
  n <- 100
  m <- 1000
  mu <- 5
  sigma <- 5
  mat <- matrix(rnorm(n * m * g, mu, sigma), m, n * g)
  rownames(mat) <- paste0("gene", seq_len(m))
  colnames(mat) <- paste0("cell", seq_len(n * g))
  group <- factor(sapply(seq_len(g), function(x) {
    rep(paste0("group", x), n)
  }))
  names(group) <- colnames(mat)

  mu_upreg <- 6
  sigma_upreg <- 10
  deg <- 100
  for (i in seq_len(g)) {
    mat[(deg * (i - 1) + 1):(deg * i), group == paste0("group", i)] <-
      mat[1:deg, group == paste0("group", i)] + rnorm(deg, mu_upreg, sigma_upreg)
  }

  # positive expression only
  mat[mat < 0] <- 0
  x_train <- as.matrix(t(mat))

  # heatmap
  heatmap(mat, Rowv = NA, Colv = NA,
          col = colorRampPalette(c("green", "red"))(100),
          scale = "none")
}
## Loading required package: keras
## Loading required package: mclust
## Package 'mclust' version 6.0.0
## Type 'citation("mclust")' for citing this R package in publications.
The VAE model can be built with the function “fit_vae”, given gene expression data such as the matrix “x_train” above. The overall loss function of the VAE is a weighted sum of the reconstruction loss and the regularization loss: the reconstruction loss is the binary cross-entropy between the input and the output, and the regularization loss is the Kullback-Leibler divergence. Note that, for simplicity, the same dataset is used here for both training and validation.
if (keras::is_keras_available() & reticulate::py_available()) {
  # model parameters
  batch_size <- 32
  original_dim <- 1000
  intermediate_dim <- 512
  epochs <- 100

  # VAE
  vae_result <- fit_vae(x_train = x_train, x_val = x_train,
                        encoder_layers = list(layer_input(shape = c(original_dim)),
                                              layer_dense(units = intermediate_dim,
                                                          activation = "relu")),
                        decoder_layers = list(layer_dense(units = intermediate_dim,
                                                          activation = "relu"),
                                              layer_dense(units = original_dim,
                                                          activation = "sigmoid")),
                        epochs = epochs, batch_size = batch_size,
                        use_generator = FALSE,
                        callbacks = keras::callback_early_stopping(
                          monitor = "val_loss",
                          patience = 10,
                          restore_best_weights = TRUE))
}
## normalizing...
## training...
## Train on 300 samples, validate on 300 samples
## Epoch 1/100
## 300/300 [==============================] - 0s 2ms/sample - loss: 664.0866 - val_loss: 623.7853
## Epoch 2/100
## 300/300 [==============================] - 0s 563us/sample - loss: 613.3309 - val_loss: 605.1447
## Epoch 3/100
## 300/300 [==============================] - 0s 450us/sample - loss: 604.1434 - val_loss: 602.3071
## ...
## Epoch 57/100
## 300/300 [==============================] - 0s 409us/sample - loss: 578.5684 - val_loss: 578.3268
## Epoch 58/100
## 300/300 [==============================] - 0s 542us/sample - loss: 579.5206 - val_loss: 578.6842
(per-batch progress lines and epochs 4–56 omitted for brevity)
The function “plot_vae” draws a diagram of the model architecture.
if (keras::is_keras_available() & reticulate::py_available()) {
  # model architecture
  plot_vae(vae_result$model)
}
The function “gen_exprs” generates new samples of expression data using the trained model.
if (keras::is_keras_available() & reticulate::py_available()) {
  # sample generation
  set.seed(1)
  gen_sample_result <- gen_exprs(vae_result, num_samples = 100)

  # heatmap of real and generated samples side by side
  heatmap(cbind(t(x_train), t(gen_sample_result$x_gen)),
          col = colorRampPalette(c("green", "red"))(100),
          Rowv = NA)
}
## generating...
## post-processing...
The function “plot_aug” visualizes the augmented data in a reduced-dimension plot.
if (keras::is_keras_available() & reticulate::py_available()) {
  # plot for augmented data
  plot_aug(gen_sample_result, "PCA")
}
The “yan” dataset contains single-cell RNA sequencing data with 20214 genes and 90 cells from human preimplantation embryos and embryonic stem cells at different passages. The rows of the dataset correspond to genes and the columns to cells. The “SingleCellExperiment” class can be used to store and manipulate single-cell genomics data; it extends the “RangedSummarizedExperiment” class and follows similar conventions. The object “sce” can be created from the data “yan” with the cell type annotation “ann”.
if (keras::is_keras_available() & reticulate::py_available()) {
  library(VAExprs)
  library(SC3)
  library(SingleCellExperiment)

  # create a SingleCellExperiment object
  sce <- SingleCellExperiment::SingleCellExperiment(
    assays = list(counts = as.matrix(yan)),
    colData = ann
  )

  # define feature names in feature_symbol column
  rowData(sce)$feature_symbol <- rownames(sce)
  # remove features with duplicated names
  sce <- sce[!duplicated(rowData(sce)$feature_symbol), ]
  # remove genes that are not expressed in any samples
  sce <- sce[which(rowMeans(assay(sce)) > 0), ]
  dim(assay(sce))

  # model parameters
  batch_size <- 32
  original_dim <- 19595
  intermediate_dim <- 256
  epochs <- 100

  # model
  cvae_result <- fit_vae(object = sce,
                         encoder_layers = list(layer_input(shape = c(original_dim)),
                                               layer_dense(units = intermediate_dim,
                                                           activation = "relu")),
                         decoder_layers = list(layer_dense(units = intermediate_dim,
                                                           activation = "relu"),
                                               layer_dense(units = original_dim,
                                                           activation = "sigmoid")),
                         epochs = epochs, batch_size = batch_size,
                         use_generator = TRUE,
                         callbacks = keras::callback_early_stopping(
                           monitor = "loss",
                           patience = 20,
                           restore_best_weights = TRUE))

  # model architecture
  plot_vae(cvae_result$model)
}
## pre-processing...
## normalizing...
## training...
## Epoch 1/100
## 3/3 [==============================] - 1s 101ms/step - batch: 1.0000 - size: 32.0000 - loss: 13368.6504
## Epoch 2/100
## 3/3 [==============================] - 1s 74ms/step - batch: 1.0000 - size: 32.0000 - loss: 29714943786096.4414
## Epoch 3/100
## 3/3 [==============================] - 0s 68ms/step - batch: 1.0000 - size: 32.0000 - loss: 12671.4541
## ...
## Epoch 24/100
## 3/3 [==============================] - 0s 74ms/step - batch: 1.0000 - size: 32.0000 - loss: 9117.3161
(per-batch progress lines and epochs 4–23 omitted for brevity)
## Epoch 25/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8570.8730
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8716.9834
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8777.1839
3/3 [==============================] - 0s 71ms/step - batch: 1.0000 - size: 32.0000 - loss: 8777.1839
## Epoch 26/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8571.4648
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8816.5713
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8846.5199
3/3 [==============================] - 0s 60ms/step - batch: 1.0000 - size: 32.0000 - loss: 8846.5199
## Epoch 27/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8988.2480
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8872.1143
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8756.6113
3/3 [==============================] - 0s 62ms/step - batch: 1.0000 - size: 32.0000 - loss: 8756.6113
## Epoch 28/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8635.0400
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8577.8706
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8615.5212
3/3 [==============================] - 0s 72ms/step - batch: 1.0000 - size: 32.0000 - loss: 8615.5212
## Epoch 29/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8557.5762
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8734.5947
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8807.6582
3/3 [==============================] - 0s 93ms/step - batch: 1.0000 - size: 32.0000 - loss: 8807.6582
## Epoch 30/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8775.3066
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8794.4688
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8763.3415
3/3 [==============================] - 0s 110ms/step - batch: 1.0000 - size: 32.0000 - loss: 8763.3415
## Epoch 31/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8670.2578
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8716.6592
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8715.8132
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 8715.8132
## Epoch 32/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8368.9150
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8240.9866
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8246.7853
3/3 [==============================] - 0s 63ms/step - batch: 1.0000 - size: 32.0000 - loss: 8246.7853
## Epoch 33/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8213.0518
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8377.5366
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8343.6286
3/3 [==============================] - 0s 61ms/step - batch: 1.0000 - size: 32.0000 - loss: 8343.6286
## Epoch 34/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8415.8281
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8509.7500
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8491.1198
3/3 [==============================] - 0s 63ms/step - batch: 1.0000 - size: 32.0000 - loss: 8491.1198
## Epoch 35/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8393.0293
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8270.6760
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8258.2707
3/3 [==============================] - 0s 76ms/step - batch: 1.0000 - size: 32.0000 - loss: 8258.2707
## Epoch 36/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8345.1943
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8384.6021
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8325.1943
3/3 [==============================] - 0s 103ms/step - batch: 1.0000 - size: 32.0000 - loss: 8325.1943
## Epoch 37/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8296.2539
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8378.4854
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8317.7246
3/3 [==============================] - 0s 82ms/step - batch: 1.0000 - size: 32.0000 - loss: 8317.7246
## Epoch 38/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8178.2539
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8173.1143
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8201.0319
3/3 [==============================] - 0s 76ms/step - batch: 1.0000 - size: 32.0000 - loss: 8201.0319
## Epoch 39/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8113.5962
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8165.4465
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8196.7127
3/3 [==============================] - 0s 78ms/step - batch: 1.0000 - size: 32.0000 - loss: 8196.7127
## Epoch 40/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8183.6328
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8253.5400
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8161.9033
3/3 [==============================] - 0s 75ms/step - batch: 1.0000 - size: 32.0000 - loss: 8161.9033
## Epoch 41/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8237.2109
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8210.9463
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8276.3770
3/3 [==============================] - 0s 88ms/step - batch: 1.0000 - size: 32.0000 - loss: 8276.3770
## Epoch 42/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8199.5410
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8098.0427
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8105.9093
3/3 [==============================] - 0s 59ms/step - batch: 1.0000 - size: 32.0000 - loss: 8105.9093
## Epoch 43/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8182.6187
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8264.1487
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8259.6733
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 8259.6733
## Epoch 44/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8520.6270
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8517.4043
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8313.1152
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 8313.1152
## Epoch 45/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8039.8013
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7986.6653
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8048.9624
3/3 [==============================] - 0s 77ms/step - batch: 1.0000 - size: 32.0000 - loss: 8048.9624
## Epoch 46/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8082.1514
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8148.4624
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8198.2227
3/3 [==============================] - 0s 93ms/step - batch: 1.0000 - size: 32.0000 - loss: 8198.2227
## Epoch 47/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8232.4219
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8170.9250
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8127.2174
3/3 [==============================] - 0s 70ms/step - batch: 1.0000 - size: 32.0000 - loss: 8127.2174
## Epoch 48/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8117.5098
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8164.3716
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8181.0641
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 8181.0641
## Epoch 49/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8260.5117
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8124.3745
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8240.8447
3/3 [==============================] - 0s 76ms/step - batch: 1.0000 - size: 32.0000 - loss: 8240.8447
## Epoch 50/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8085.1475
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8116.3120
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8131.9697
3/3 [==============================] - 0s 107ms/step - batch: 1.0000 - size: 32.0000 - loss: 8131.9697
## Epoch 51/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7960.6016
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8113.5298
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8011.3936
3/3 [==============================] - 0s 73ms/step - batch: 1.0000 - size: 32.0000 - loss: 8011.3936
## Epoch 52/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8255.8672
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8086.1948
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8022.6501
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 8022.6501
## Epoch 53/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8213.5254
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8085.6204
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8117.7972
3/3 [==============================] - 0s 60ms/step - batch: 1.0000 - size: 32.0000 - loss: 8117.7972
## Epoch 54/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8134.1523
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8128.4153
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8115.3187
3/3 [==============================] - 1s 495ms/step - batch: 1.0000 - size: 32.0000 - loss: 8115.3187
## Epoch 55/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8074.1792
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8140.2878
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8189.1662
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 8189.1662
## Epoch 56/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8258.6348
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8164.2437
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8105.1226
3/3 [==============================] - 0s 61ms/step - batch: 1.0000 - size: 32.0000 - loss: 8105.1226
## Epoch 57/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8183.5361
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8166.8145
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8112.4170
3/3 [==============================] - 0s 65ms/step - batch: 1.0000 - size: 32.0000 - loss: 8112.4170
## Epoch 58/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8190.3223
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8077.6646
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8146.9326
3/3 [==============================] - 0s 71ms/step - batch: 1.0000 - size: 32.0000 - loss: 8146.9326
## Epoch 59/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8509.5352
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8188.0981
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8204.0680
3/3 [==============================] - 0s 70ms/step - batch: 1.0000 - size: 32.0000 - loss: 8204.0680
## Epoch 60/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8120.4736
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8070.5859
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8054.1520
3/3 [==============================] - 0s 73ms/step - batch: 1.0000 - size: 32.0000 - loss: 8054.1520
## Epoch 61/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8050.2891
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8085.1743
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8023.8112
3/3 [==============================] - 0s 106ms/step - batch: 1.0000 - size: 32.0000 - loss: 8023.8112
## Epoch 62/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7983.6787
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8112.3110
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8143.1748
3/3 [==============================] - 0s 66ms/step - batch: 1.0000 - size: 32.0000 - loss: 8143.1748
## Epoch 63/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8281.8213
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8250.4321
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8157.6257
3/3 [==============================] - 0s 68ms/step - batch: 1.0000 - size: 32.0000 - loss: 8157.6257
## Epoch 64/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7939.7539
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7914.3789
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8035.0755
3/3 [==============================] - 0s 63ms/step - batch: 1.0000 - size: 32.0000 - loss: 8035.0755
## Epoch 65/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7892.5835
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7950.3635
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7934.5210
3/3 [==============================] - 0s 77ms/step - batch: 1.0000 - size: 32.0000 - loss: 7934.5210
## Epoch 66/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7973.7637
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7971.7749
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7986.4849
3/3 [==============================] - 0s 85ms/step - batch: 1.0000 - size: 32.0000 - loss: 7986.4849
## Epoch 67/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7831.5293
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7972.7659
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7984.7622
3/3 [==============================] - 0s 109ms/step - batch: 1.0000 - size: 32.0000 - loss: 7984.7622
## Epoch 68/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7964.8613
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7923.0315
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7953.5270
3/3 [==============================] - 0s 58ms/step - batch: 1.0000 - size: 32.0000 - loss: 7953.5270
## Epoch 69/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7979.0859
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8146.8965
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8205.8385
3/3 [==============================] - 0s 70ms/step - batch: 1.0000 - size: 32.0000 - loss: 8205.8385
## Epoch 70/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7849.0044
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7984.5459
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7965.7686
3/3 [==============================] - 0s 56ms/step - batch: 1.0000 - size: 32.0000 - loss: 7965.7686
## Epoch 71/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7793.7373
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7847.9365
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7910.7751
3/3 [==============================] - 0s 65ms/step - batch: 1.0000 - size: 32.0000 - loss: 7910.7751
## Epoch 72/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8100.2095
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8053.0959
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8034.0352
3/3 [==============================] - 0s 119ms/step - batch: 1.0000 - size: 32.0000 - loss: 8034.0352
## Epoch 73/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7771.1514
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7943.7300
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7930.5768
3/3 [==============================] - 0s 67ms/step - batch: 1.0000 - size: 32.0000 - loss: 7930.5768
## Epoch 74/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7898.5244
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7834.5542
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7839.4074
3/3 [==============================] - 0s 62ms/step - batch: 1.0000 - size: 32.0000 - loss: 7839.4074
## Epoch 75/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7852.5654
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7899.0132
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7993.6491
3/3 [==============================] - 0s 63ms/step - batch: 1.0000 - size: 32.0000 - loss: 7993.6491
## Epoch 76/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8155.8027
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7986.3008
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7901.7288
3/3 [==============================] - 0s 79ms/step - batch: 1.0000 - size: 32.0000 - loss: 7901.7288
## Epoch 77/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8108.4399
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8067.1384
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7991.4276
3/3 [==============================] - 0s 94ms/step - batch: 1.0000 - size: 32.0000 - loss: 7991.4276
## Epoch 78/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7773.4941
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7715.5605
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7680.5213
3/3 [==============================] - 0s 79ms/step - batch: 1.0000 - size: 32.0000 - loss: 7680.5213
## Epoch 79/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8056.9883
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8150.1201
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 8056.2480
3/3 [==============================] - 0s 67ms/step - batch: 1.0000 - size: 32.0000 - loss: 8056.2480
## Epoch 80/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7689.2490
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7883.0508
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7863.1074
3/3 [==============================] - 0s 70ms/step - batch: 1.0000 - size: 32.0000 - loss: 7863.1074
## Epoch 81/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7568.4087
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7643.3213
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7686.5283
3/3 [==============================] - 0s 112ms/step - batch: 1.0000 - size: 32.0000 - loss: 7686.5283
## Epoch 82/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7883.8857
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7811.8467
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7743.4111
3/3 [==============================] - 0s 60ms/step - batch: 1.0000 - size: 32.0000 - loss: 7743.4111
## Epoch 83/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8076.0137
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7971.2705
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7977.8148
3/3 [==============================] - 0s 66ms/step - batch: 1.0000 - size: 32.0000 - loss: 7977.8148
## Epoch 84/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7848.1333
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7842.6389
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7813.9404
3/3 [==============================] - 0s 61ms/step - batch: 1.0000 - size: 32.0000 - loss: 7813.9404
## Epoch 85/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8020.9971
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7999.6714
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7957.0330
3/3 [==============================] - 1s 495ms/step - batch: 1.0000 - size: 32.0000 - loss: 7957.0330
## Epoch 86/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7958.0156
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7895.0015
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7934.5723
3/3 [==============================] - 0s 59ms/step - batch: 1.0000 - size: 32.0000 - loss: 7934.5723
## Epoch 87/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8094.9883
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 8039.0781
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7994.8151
3/3 [==============================] - 0s 61ms/step - batch: 1.0000 - size: 32.0000 - loss: 7994.8151
## Epoch 88/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 8158.8184
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7994.6484
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7947.4269
3/3 [==============================] - 0s 78ms/step - batch: 1.0000 - size: 32.0000 - loss: 7947.4269
## Epoch 89/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7878.3809
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7720.8579
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7669.2228
3/3 [==============================] - 0s 63ms/step - batch: 1.0000 - size: 32.0000 - loss: 7669.2228
## Epoch 90/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7794.6230
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7900.4597
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7813.1479
3/3 [==============================] - 0s 68ms/step - batch: 1.0000 - size: 32.0000 - loss: 7813.1479
## Epoch 91/100
##
1/3 [=========>....................] - ETA: 0s - batch: 0.0000e+00 - size: 32.0000 - loss: 7994.8584
2/3 [===================>..........] - ETA: 0s - batch: 0.5000 - size: 32.0000 - loss: 7925.8425
3/3 [==============================] - ETA: 0s - batch: 1.0000 - size: 32.0000 - loss: 7880.6730
3/3 [==============================] - 0s 89ms/step - batch: 1.0000 - size: 32.0000 - loss: 7880.6730
## Epoch 92/100
##
3/3 [==============================] - 1s 463ms/step - batch: 1.0000 - size: 32.0000 - loss: 7840.6584
## Epoch 93/100
##
3/3 [==============================] - 0s 61ms/step - batch: 1.0000 - size: 32.0000 - loss: 7794.1924
## Epoch 94/100
##
3/3 [==============================] - 0s 72ms/step - batch: 1.0000 - size: 32.0000 - loss: 7696.7140
## Epoch 95/100
##
3/3 [==============================] - 0s 81ms/step - batch: 1.0000 - size: 32.0000 - loss: 7925.4836
## Epoch 96/100
##
3/3 [==============================] - 0s 67ms/step - batch: 1.0000 - size: 32.0000 - loss: 7828.0576
## Epoch 97/100
##
3/3 [==============================] - 0s 70ms/step - batch: 1.0000 - size: 32.0000 - loss: 7995.9229
## Epoch 98/100
##
3/3 [==============================] - 0s 100ms/step - batch: 1.0000 - size: 32.0000 - loss: 7809.5212
## Epoch 99/100
##
3/3 [==============================] - 0s 74ms/step - batch: 1.0000 - size: 32.0000 - loss: 7964.3571
## Epoch 100/100
##
3/3 [==============================] - 0s 69ms/step - batch: 1.0000 - size: 32.0000 - loss: 7748.6608
if (keras::is_keras_available() && reticulate::py_available()) {
    # generate 100 new samples from the fitted CVAE
    set.seed(1)
    gen_sample_result <- gen_exprs(cvae_result, 100,
                                   batch_size, use_generator = TRUE)
    # plot real and augmented data together in a reduced-dimensional space
    plot_aug(gen_sample_result, "PCA")
}
## generating...
## post-processing...
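The PCA plot above overlays real and generated cells so that one can judge whether the synthetic cells occupy the same region of expression space as the real ones. The following self-contained sketch illustrates that idea with base R only; the stand-in count matrices, labels, and `prcomp` projection are illustrative assumptions, not the package's internals:

```r
set.seed(1)
# stand-in expression matrices: 50 "real" and 100 "generated" cells x 20 genes
real_mat <- matrix(rpois(50 * 20, lambda = 5), nrow = 50)
gen_mat  <- matrix(rpois(100 * 20, lambda = 5), nrow = 100)

combined <- rbind(real_mat, gen_mat)
labels   <- c(rep("real", nrow(real_mat)), rep("generated", nrow(gen_mat)))

# project all cells onto the first two principal components
pca    <- prcomp(combined, scale. = TRUE)
coords <- pca$x[, 1:2]

# overlapping clouds of the two groups suggest the generated cells
# resemble the real expression profiles
plot(coords, col = factor(labels), pch = 16,
     xlab = "PC1", ylab = "PC2")
```

If the generated points form a separate cluster rather than mixing with the real ones, the generative model has likely not captured the data distribution well.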
## R version 4.3.1 Patched (2023-06-17 r84564)
## Platform: x86_64-apple-darwin20 (64-bit)
## Running under: macOS Monterey 12.6.5
##
## Matrix products: default
## BLAS: /Library/Frameworks/R.framework/Versions/4.3-x86_64/Resources/lib/libRblas.0.dylib
## LAPACK: /Library/Frameworks/R.framework/Versions/4.3-x86_64/Resources/lib/libRlapack.dylib; LAPACK version 3.11.0
##
## locale:
## [1] C/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
##
## time zone: America/New_York
## tzcode source: internal
##
## attached base packages:
## [1] stats graphics grDevices utils datasets methods base
##
## other attached packages:
## [1] SC3_1.30.0 GenomicRanges_1.54.0
## [3] SummarizedExperiment_1.32.0 SingleCellExperiment_1.24.0
## [5] IRanges_2.36.0 S4Vectors_0.40.0
## [7] VAExprs_1.8.0 mclust_6.0.0
## [9] keras_2.13.0
##
## loaded via a namespace (and not attached):
## [1] sylly.en_0.1-3 sylly_0.1-6
## [3] later_1.3.1 bitops_1.0-7
## [5] textstem_0.1.4 tibble_3.2.1
## [7] matlab_1.0.4 lifecycle_1.0.3
## [9] doParallel_1.0.17 NLP_0.2-1
## [11] lattice_0.22-5 SnowballC_0.7.1
## [13] magrittr_2.0.3 sass_0.4.7
## [15] rmarkdown_2.25 jquerylib_0.1.4
## [17] yaml_2.3.7 httpuv_1.6.12
## [19] text2vec_0.6.3 doRNG_1.8.6
## [21] reticulate_1.34.0 cowplot_1.1.1
## [23] RColorBrewer_1.1-3 abind_1.4-5
## [25] zlibbioc_1.48.0 rvest_1.0.3
## [27] ttgsea_1.10.0 purrr_1.0.2
## [29] BiocGenerics_0.48.0 RCurl_1.98-1.12
## [31] WriteXLS_6.4.0 float_0.3-1
## [33] CatEncoders_0.1.1 GenomeInfoDbData_1.2.11
## [35] data.tree_1.0.0 tm_0.7-11
## [37] ggrepel_0.9.4 irlba_2.3.5.1
## [39] tokenizers_0.3.0 pheatmap_1.0.12
## [41] DelayedMatrixStats_1.24.0 codetools_0.2-19
## [43] DelayedArray_0.28.0 scuttle_1.12.0
## [45] xml2_1.3.5 tidyselect_1.2.0
## [47] PRROC_1.3.1 farver_2.1.1
## [49] ScaledMatrix_1.10.0 viridis_0.6.4
## [51] matrixStats_1.0.0 stats4_4.3.1
## [53] base64enc_0.1-3 jsonlite_1.8.7
## [55] rsparse_0.5.1 BiocNeighbors_1.20.0
## [57] e1071_1.7-13 ellipsis_0.3.2
## [59] scater_1.30.0 iterators_1.0.14
## [61] foreach_1.5.2 tools_4.3.1
## [63] stringdist_0.9.10 Rcpp_1.0.11
## [65] glue_1.6.2 gridExtra_2.3
## [67] tfruns_1.5.1 SparseArray_1.2.0
## [69] xfun_0.40 MatrixGenerics_1.14.0
## [71] GenomeInfoDb_1.38.0 dplyr_1.1.3
## [73] withr_2.5.1 fastmap_1.1.1
## [75] fansi_1.0.5 digest_0.6.33
## [77] rsvd_1.0.5 mime_0.12
## [79] R6_2.5.1 colorspace_2.1-0
## [81] koRpus_0.13-8 koRpus.lang.en_0.1-4
## [83] RhpcBLASctl_0.23-42 DiagrammeR_1.0.10
## [85] utf8_1.2.4 generics_0.1.3
## [87] data.table_1.14.8 robustbase_0.99-0
## [89] class_7.3-22 stopwords_2.3
## [91] httr_1.4.7 htmlwidgets_1.6.2
## [93] S4Arrays_1.2.0 whisker_0.4.1
## [95] pkgconfig_2.0.3 gtable_0.3.4
## [97] tensorflow_2.14.0 XVector_0.42.0
## [99] pcaPP_2.0-3 htmltools_0.5.6.1
## [101] scales_1.2.1 Biobase_2.62.0
## [103] png_0.1-8 lgr_0.4.4
## [105] knitr_1.44 rstudioapi_0.15.0
## [107] visNetwork_2.1.2 proxy_0.4-27
## [109] cachem_1.0.8 stringr_1.5.0
## [111] parallel_4.3.1 vipor_0.4.5
## [113] mlapi_0.1.1 pillar_1.9.0
## [115] grid_4.3.1 vctrs_0.6.4
## [117] promises_1.2.1 slam_0.1-50
## [119] BiocSingular_1.18.0 beachmat_2.18.0
## [121] xtable_1.8-4 cluster_2.1.4
## [123] beeswarm_0.4.0 evaluate_0.22
## [125] zeallot_0.1.0 mvtnorm_1.2-3
## [127] cli_3.6.1 compiler_4.3.1
## [129] rlang_1.1.1 crayon_1.5.2
## [131] rngtools_1.5.2 webchem_1.3.0
## [133] rrcov_1.7-4 labeling_0.4.3
## [135] ggbeeswarm_0.7.2 stringi_1.7.12
## [137] viridisLite_0.4.2 BiocParallel_1.36.0
## [139] DeepPINCS_1.10.0 munsell_0.5.0
## [141] Matrix_1.6-1.1 sparseMatrixStats_1.14.0
## [143] ggplot2_3.4.4 shiny_1.7.5.1
## [145] ROCR_1.0-11 bslib_0.5.1
## [147] DEoptimR_1.1-3
Aggarwal, C. C. (2018). Neural Networks and Deep Learning. Springer.
Al-Jabery, K., Obafemi-Ajayi, T., Olbricht, G., & Wunsch, D. (2019). Computational Learning Approaches to Data Analytics in Biomedical Applications. Academic Press.
Cinelli, L. P., Marins, M. A., da Silva, E. A. B., & Netto, S. L. (2021). Variational Methods for Machine Learning with Applications to Deep Networks. Springer.
Das, H., Pradhan, C., & Dey, N. (2020). Deep Learning for Data Analytics: Foundations, Biomedical Applications, and Challenges. Academic Press.
Marouf, M., Machart, P., Bansal, V., Kilian, C., Magruder, D. S., Krebs, C. F., & Bonn, S. (2020). Realistic in silico generation and augmentation of single-cell RNA-seq data using generative adversarial networks. Nature Communications, 11(1), 1-12.
Pedrycz, W., & Chen, S. M. (Eds.). (2020). Deep Learning: Concepts and Architectures. Springer.
Yan, W. Q. (2020). Computational Methods for Deep Learning: Theoretic, Practice and Applications. Springer.