> sink("gev.txt")
> cat("
+ model {
+ # Likelihood
+ for (i in 1:n) {
+ }
+ # Prior
+ # dnorm(mean, precision)
+ mu ~ dnorm(986,0.0003)
+ sigma ~ dnorm(294,0.0004)
+ eta ~ dnorm(0.29,48)
+ yp3<-mu+((sigma/eta)*(pow(-log(1-(1/3)),-eta)-1))
+ yp6<-mu+((sigma/eta)*(pow(-log(1-(1/6)),-eta)-1))
+ yp9<-mu+((sigma/eta)*(pow(-log(1-(1/9)),-eta)-1))
+ yp18<-mu+((sigma/eta)*(pow(-log(1-(1/18)),-eta)-1))
+ yp24<-mu+((sigma/eta)*(pow(-log(1-(1/24)),-eta)-1))
+ }
+ ",fill=TRUE)
> sink()
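The yp* nodes above compute T-observation return levels from the GEV quantile function, y_T = mu + (sigma/eta) * ((-log(1 - 1/T))^(-eta) - 1). As a cross-check outside BUGS, a minimal Python sketch of the same formula (using the prior means 986, 294 and 0.29 as illustrative inputs, not posterior estimates):

```python
import math

def gev_return_level(mu, sigma, eta, T):
    """T-observation GEV return level:
    mu + (sigma/eta) * ((-log(1 - 1/T))**(-eta) - 1)."""
    y = -math.log(1.0 - 1.0 / T)
    return mu + (sigma / eta) * (y ** (-eta) - 1.0)

# Illustrative inputs: the prior means used above, not posterior estimates
mu, sigma, eta = 986.0, 294.0, 0.29
levels = [gev_return_level(mu, sigma, eta, T) for T in (3, 6, 9, 18, 24)]
print([round(v, 1) for v in levels])
```

As T grows, 1 - 1/T approaches 1 and the return level increases, which is the expected ordering of yp3 through yp24.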
> dados_bug
$y
[1] 1946 1372 1049 731 807 945 899 1222 807 937 661 1963 811 1131 1171
[16] 1302 1589 1245 3328 1124 1020 1121 1021 849 670 745 1752 1560 922
$n
[1] 29
>
> nt = 10
>
>
>
+ parameters =c(params,"yp3","yp6","yp9","yp18","yp24"),
+ model = "gev.txt",
try again with 'debug=TRUE' to figure out what went wrong within OpenBUGS.
>
>
>
+ parameters =c(params,"yp3","yp6","yp9","yp18","yp24"),
+ model = "gev.txt",
>
>
>
>
> geweke.diag(post_g.inf)
deviance
-0.5801
> raftery.diag(post_g.inf)
> heidel.diag(post_g.inf)
   Stationarity start     p-value
   test         iteration
mu passed      1          0.763

   Halfwidth Mean Halfwidth
   test
>
> HPD.inf=HPDinterval(post_g.inf)
> HPD.inf
lower upper
mu 8.752e+02 1083.0000
attr(,"Probability")
[1] 0.95
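HPDinterval reports the shortest interval containing 95% of the posterior draws. The same quantity can be computed directly from a vector of MCMC draws; a sketch in Python (the normal draws below are only a stand-in for the actual mu chain, which is not reproduced in this transcript, centered roughly where the reported interval sits):

```python
import numpy as np

def hpd_interval(samples, prob=0.95):
    """Shortest interval containing `prob` of the draws (the same idea as
    coda's HPDinterval, assuming a unimodal posterior)."""
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    m = int(np.ceil(prob * n))           # draws the interval must cover
    widths = x[m - 1:] - x[:n - m + 1]   # width of every candidate interval
    i = int(np.argmin(widths))
    return float(x[i]), float(x[i + m - 1])

# Stand-in draws for the mu chain (not the actual posterior from the transcript)
rng = np.random.default_rng(1)
draws = rng.normal(979.0, 53.0, size=20000)
lo, hi = hpd_interval(draws)
print(round(lo, 1), round(hi, 1))
```

For a symmetric unimodal posterior the HPD interval coincides with the central credible interval; they diverge for skewed posteriors such as those of the return levels.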
>
Current: 3 chains, each with 120000 iterations (first 30000 discarded), n.thin = 10
n.eff
mu 9
sigma 11
eta 650
yp3 9
yp6 12
yp9 16
yp18 25
yp24 31
deviance 35
For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
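The very low n.eff for mu, sigma and the yp* nodes (9 to 31 effective draws out of 27,000 retained) signals severe autocorrelation and poor mixing. Rhat compares between- and within-chain variance; a minimal Python sketch of the classic Gelman-Rubin computation:

```python
import numpy as np

def rhat(chains):
    """Potential scale reduction factor (classic Gelman-Rubin) for an
    (m chains x n draws) array: compares between- and within-chain variance."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    B = n * chains.mean(axis=1).var(ddof=1)    # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()      # within-chain variance
    var_hat = (n - 1) / n * W + B / n          # pooled variance estimate
    return float(np.sqrt(var_hat / W))

# Well-mixed chains give Rhat near 1; chains stuck in different regions do not
rng = np.random.default_rng(0)
mixed = rng.normal(0.0, 1.0, size=(3, 2000))
print(round(rhat(mixed), 3))
```

Rhat near 1 is necessary but not sufficient for convergence, which is why it is read together with n.eff and the coda diagnostics above.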
>
> #resumo1
> #Preditiva
>
>
>
>
> maxf<-dados$Max[30:53]
>
> obs1_1<-max(maxf[1:3])
> obs1_1
[1] 1896
> obs1_2<-max(maxf[1:6])
> obs1_2
[1] 1896
> obs1_3<-max(maxf[1:9])
> obs1_3
[1] 1896
> obs1_4<-max(maxf[1:18])
> obs1_4
[1] 2239
> obs1_5<-max(maxf[1:24])
> obs1_5
[1] 2239
>
> obs<-c(obs1_1,obs1_2,obs1_3,obs1_4,obs1_5)
> obs
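Each obs1_k above is the running maximum of the hold-out series over its first 3, 6, 9, 18 and 24 observations. The same step in Python (maxf below is a hypothetical placeholder, since dados$Max[30:53] is not reproduced in the transcript; its values are chosen only so the running maxima match the printed results):

```python
# Hypothetical placeholder for dados$Max[30:53]; values chosen only so the
# running maxima reproduce the obs1_* outputs shown in the transcript
maxf = [1896, 1500, 1700, 1200, 1800, 1600, 1400, 1100, 1300,
        2239, 1250, 1350, 1450, 1550, 1650, 1750, 1850, 1950,
        1050, 1150, 1280, 1380, 1480, 1580]

# Maximum over the first 3, 6, 9, 18 and 24 hold-out observations
obs = [max(maxf[:k]) for k in (3, 6, 9, 18, 24)]
print(obs)
```

These observed block maxima are then compared against the posterior predictive return levels yp3 through yp24.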
>
> #round(EpGEVT*100,2)
[1] 14.78
>
> sink("gumbel.txt")
> cat("
+ model {
+ # Likelihood
+ for (i in 1:n) {
+ }
+
+ # Prior
+ # dnorm(mean, precision)
+ mu ~ dnorm(1046,0.0002)
+ sigma ~ dnorm(363,0.0003)
+ }
+ ",fill=TRUE)
> sink()
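The Gumbel model is the eta -> 0 limit of the GEV, so its return levels reduce to y_T = mu - sigma * log(-log(1 - 1/T)). Assuming the truncated yp* nodes in this model follow that form, a Python sketch (with the prior means 1046 and 363 as illustrative inputs, not posterior estimates):

```python
import math

def gumbel_return_level(mu, sigma, T):
    """Gumbel T-observation return level, the eta -> 0 limit of the GEV:
    mu - sigma * log(-log(1 - 1/T))."""
    return mu - sigma * math.log(-math.log(1.0 - 1.0 / T))

# Illustrative inputs: the prior means used above, not posterior estimates
levels = [gumbel_return_level(1046.0, 363.0, T) for T in (3, 6, 9, 18, 24)]
print([round(v, 1) for v in levels])
```

Because the Gumbel has no shape parameter, its return levels grow without bound as log(T); the GEV with eta > 0 grows faster, which is one reason the two fits are compared.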
>
>
> dados_bug
$y
[1] 1946 1372 1049 731 807 945 899 1222 807 937 661 1963 811 1131 1171
[16] 1302 1589 1245 3328 1124 1020 1121 1021 849 670 745 1752 1560 922
$n
[1] 29
>
>
> nt = 10
>
>
>
+ parameters =c(params,"yp3","yp6","yp9","yp18","yp24"),
+ model = "gumbel.txt",
>
>
>
>
>
> geweke.diag(post_g.inf)
> raftery.diag(post_g.inf)
> heidel.diag(post_g.inf)
   Stationarity start     p-value
   test         iteration
mu passed      1          0.215

   Halfwidth Mean Halfwidth
   test
>
> HPD.inf=HPDinterval(post_g.inf)
> HPD.inf
lower upper
mu 924.2 1110.0
attr(,"Probability")
[1] 0.95
>
Current: 2 chains, each with 60000 iterations (first 15000 discarded), n.thin = 10
             mean      sd   2.5%    25%    50%    75%  97.5%  Rhat n.eff
sigma     348.930  40.111  275.5  320.8  347.4  375.3  432.2 1.001  6900
yp3 1332.893 65.210 1211.0 1287.0 1331.0 1376.0 1464.0 1.003 910
yp6 1611.782 90.584 1442.0 1548.0 1609.0 1672.0 1797.0 1.002 1300
yp9 1764.238 105.999 1566.0 1690.0 1761.0 1834.0 1981.0 1.002 1500
yp18 2016.519 132.662 1769.0 1924.0 2012.0 2104.0 2288.0 1.002 1800
yp24 2119.428 143.797 1852.0 2019.0 2115.0 2214.0 2414.0 1.002 1900
deviance 431.407 1.314 430.0 430.4 431.1 432.0 434.8 1.001 3500
For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
>
> #resumo1
> #Preditiva
>
>
>
>
> maxf<-dados$Max[30:53]
>
> obs1_1<-max(maxf[1:3])
> obs1_1
[1] 1896
> obs1_2<-max(maxf[1:6])
> obs1_2
[1] 1896
> obs1_3<-max(maxf[1:9])
> obs1_3
[1] 1896
> obs1_4<-max(maxf[1:18])
> obs1_4
[1] 2239
> obs1_5<-max(maxf[1:24])
> obs1_5
[1] 2239
>
> obs<-c(obs1_1,obs1_2,obs1_3,obs1_4,obs1_5)
> obs
>
> #round(EpGEVT*100,2)
[1] 13.38