Although the book is in Polish, the source code for all procedures used in it is available on my website and can be used without the book. Here is simplified code from exercise 4.5, which presents bagging of neural networks: many networks are trained on bootstrap resamples of the data and their predictions are averaged.
library(nnet)
set.seed(1)
SAMPLE_SIZE <- 256
X <- seq(-2, 2, length.out = SAMPLE_SIZE)
TRUE_Y <- X ^ 2 / 2 + sin(4 * X)
y <- TRUE_Y + 2 * rnorm(SAMPLE_SIZE)
# Train one network on a bootstrap resample of the data and
# return its predictions on the full grid X.
GetBootstrapPrediction <- function() {
  bootstrap.indices <- sample(SAMPLE_SIZE, replace = TRUE)
  bootstrap.sample.y <- y[bootstrap.indices]
  bootstrap.sample.x <- X[bootstrap.indices]
  bootstrap.model <- nnet(bootstrap.sample.y ~ bootstrap.sample.x,
                          linout = TRUE, size = 4,
                          trace = FALSE, maxit = 10 ^ 6)
  return(predict(bootstrap.model, data.frame(bootstrap.sample.x = X)))
}
# winProgressBar is available only on Windows;
# elsewhere txtProgressBar can be used instead.
progress.bar <- winProgressBar("Progress in %", "0% done", 0, 1, 0)
BOOTSTRAP_REPLICATIONS <- 1024
bootstrap.predictions <- rep(0, SAMPLE_SIZE)
for (i in 1:BOOTSTRAP_REPLICATIONS) {
  # Accumulate predictions; the average is taken after the loop.
  bootstrap.predictions <- bootstrap.predictions + GetBootstrapPrediction()
  percentage <- i / BOOTSTRAP_REPLICATIONS
  setWinProgressBar(progress.bar, percentage, "Progress in %",
                    sprintf("%d%% done", round(100 * percentage)))
}
close(progress.bar)
plot(X, y, xlim = c(-2, 2), ylim = c(-5, 6))
lines(X, TRUE_Y, lwd = 4)
lines(X, bootstrap.predictions / BOOTSTRAP_REPLICATIONS, lwd = 3, col = 3)
It produces the graph below: circles represent the training data, the black line is the true relationship, and the green line is the prediction from the bagging procedure.
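For comparison, one can also overlay the fit of a single network trained once on the whole sample; it is usually visibly rougher than the bagged average, which is exactly what the exercise illustrates. A minimal sketch reusing the objects defined above (this overlay is my addition, not part of the book's code):
# Hypothetical extension: one un-bagged network for comparison (red line).
single.model <- nnet(y ~ X, linout = TRUE, size = 4, trace = FALSE, maxit = 10 ^ 6)
lines(X, predict(single.model, data.frame(X = X)), lwd = 2, col = 2)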
Hello, greetings from Lithuania for writing an R-related book.
Thanks for the examples; of course, more comments in the R code would be great. One could try to translate it, or maybe some Polish R enthusiast will translate it to English. I will try to work with it as it is.
Anyway, good work!
Maybe in the future we will translate it. In the meantime, if you have any questions concerning the code, please write me an e-mail.