
I have trouble with a for loop: my code runs very slowly. What I want to do is use a function from the apply family (instead of for and while loops) to make my code run faster. Here is an example with my loop:

require(data.table)
require(zoo)    # rollapply()
require(plyr)   # ddply(), join()

K<-seq(1,1000, by=1)
b<-c(rep(2,250), rep(3, 250), rep(4, 250), rep(5,250))
a<-c(rep(6,250), rep(7,250), rep(8,250), rep(9,250))
rf<-rep(0.05, 1000)
L<-rep(10,1000)
cap<-rep(20,1000)
df<-data.frame(K, rf, L, cap, a,b)
blackscholes <- function(S, X, rf, h, sigma) {
    # d1 term used below; note that operator precedence makes this
    # (...)/sigma * sqrt(h) rather than the usual (...)/(sigma*sqrt(h))
    d1 <- (log(S/X)+(rf+sigma^2/2)*h)/sigma*sqrt(h)
    d1
}
df$logiterK<-log(df$K)
df<-as.data.table(df)
df[,rollsd:=rollapply(logret, 250, sd, fill = NA, align='right')*sqrt(250), by=c("a", "b")]
df[,assetreturn:=c(NA,diff(logiterK)),by=c("a", "b")] 
df[,rollsdasset:=rollapply(assetreturn, 249, sd, fill=NA, align='right')*sqrt(250), by=c("a", "b")]
df[,K1:=(cap+L*exp(-rf)*pnorm(blackscholes(K,L,rf, 1,rollsdasset[250]))-rollsdasset[250])/pnorm(blackscholes(K,L,rf, 1,rollsdasset[250])),by=c("a","b")] 

errors<-ddply( df, .(a,b), function(x) sum((x$K-x$K1)^2))
df<-as.data.frame(df)
df<-join(df, errors, by=c("a", "b"))
for ( i in 1:nrow(errors)){
    while(errors$V1[i] >= 10^(-10)) {
        df<-as.data.table(df)
        df[,K:= K1,by=c("a", "b")] 
        df[,assetreturn:=c(NA,diff(log(K))),by=c("a", "b")] 
        df[,rollsdasset:=rollapply(assetreturn, 249, sd, fill=NA, align='right')*sqrt(250), by=c("a", "b")]
        df[,iterK1:=(cap+L*exp(-rf)*pnorm(blackscholes(K,L,rf, 1,rollsdasset[250]))-rollsdasset[250])/pnorm(blackscholes(K,L,rf, 1,rollsdasset[250])) ,by=c("a", "b")]
        df<-as.data.frame(df)
        errors$V1[i]<-sum((df[df$V1 %in% errors$V1[i],"K"]-df[df$V1 %in% errors$V1[i],"K1"])^2)
    }
}

Any help would be appreciated.

I am unable to run the code because the object logret is not found. Is it a function? Could you please specify the packages at the beginning? I have added the loading of data.table and zoo. –  djhurio May 9 '14 at 5:15
The apply family of functions all implement a for loop. They are not faster; they just give you more concise ways to write for loops. –  flodel May 10 '14 at 11:27
The constant switching from data.frame to data.table might be where you waste a lot of time. Try converting to a data.table once and for all and stick to it. –  flodel May 10 '14 at 11:39
Thanks a lot for the suggestion, you are right. –  user3618375 May 10 '14 at 12:41

1 Answer


You could replace the for loop with a function + sapply like this:

reduce.errors <- function(err) {
  # note: df here is a copy local to the function environment, so the global
  # df is never actually updated; the results would need to be returned
  while (err >= 10^(-10)) {
    df<-as.data.table(df)
    df[,K:= K1,by=c("a", "b")] 
    df[,assetreturn:=c(NA,diff(log(K))),by=c("a", "b")] 
    df[,rollsdasset:=rollapply(assetreturn, 249, sd, fill=NA, align='right')*sqrt(250), by=c("a", "b")]
    df[,iterK1:=(cap+L*exp(-rf)*pnorm(blackscholes(K,L,rf, 1,rollsdasset[250]))-rollsdasset[250])/pnorm(blackscholes(K,L,rf, 1,rollsdasset[250])) ,by=c("a", "b")]
    df<-as.data.frame(df)
    err <- sum((df[df$V1 %in% err,"K"]-df[df$V1 %in% err,"K1"])^2)
  }  
}
sapply(errors$V1, reduce.errors)

But I don't think this will make it faster at all. If I understand correctly, you need the while loop to reduce the error below a threshold, so you need the iteration, and that cannot easily be replaced with the "apply" functions.
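To illustrate that, here is a small benchmark of my own (not part of the original code): a for loop and an sapply doing the same per-element work take roughly the same time, because sapply is itself just a loop written in R.

x <- runif(1e5)

# plain for loop
system.time({
  out1 <- numeric(length(x))
  for (i in seq_along(x)) out1[i] <- sqrt(x[i]) + log1p(x[i])
})

# the same work via sapply; typically no faster than the loop above
system.time({
  out2 <- sapply(x, function(v) sqrt(v) + log1p(v))
})

all.equal(out1, out2)  # TRUE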

If you want to improve the speed, I think you'll need to rethink the problem and come up with a different approach, if that's even possible.
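One concrete direction, following flodel's comments above, would be to convert df to a data.table once and keep it that way, running the convergence loop separately for each (a, b) group instead of switching between data.frame and data.table on every pass. A rough sketch of that shape (update_group and tol are illustrative names of mine, and it assumes df, blackscholes and the columns are set up as in the question):

library(data.table)
library(zoo)

tol <- 1e-10
setDT(df)   # convert once, in place

# iterate a single (a, b) group until its squared error drops below tol
update_group <- function(g) {
  err <- sum((g$K - g$K1)^2, na.rm = TRUE)
  while (err >= tol) {
    g[, K := K1]
    g[, assetreturn := c(NA, diff(log(K)))]
    g[, rollsdasset := rollapply(assetreturn, 249, sd, fill = NA, align = "right") * sqrt(250)]
    g[, K1 := (cap + L * exp(-rf) * pnorm(blackscholes(K, L, rf, 1, rollsdasset[250])) - rollsdasset[250]) /
              pnorm(blackscholes(K, L, rf, 1, rollsdasset[250]))]
    err <- sum((g$K - g$K1)^2, na.rm = TRUE)
  }
  g
}

# copy(.SD) because .SD itself cannot be modified by reference
df <- df[, update_group(copy(.SD)), by = .(a, b)]

The numerical iteration itself is unchanged; the only saving here is removing the repeated conversions and the row subsetting on V1, which is where the comments above suspect most of the time goes.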

Thanks a lot for the help. –  user3618375 May 10 '14 at 12:41
