I think the GCV function can stay, but I agree it is necessary to add other scoring functions to fit non-normal GAMs. Wood outlines "penalized likelihood maximization" and penalized iteratively re-weighted least squares (PIRLS) on pages 180–181 of his text. Both build on the GCV function already implemented here, but involve additional steps. When I have some time I will look into converting his functions into Julia and reply here.
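In the meantime, here is a rough sketch of what the PIRLS inner loop might look like for a Poisson family with a log link. This is not Wood's code: `pirls_poisson`, the design matrix `X`, and the penalty matrix `S` are placeholder names, and the penalty defaults to a simple ridge (identity) purely for illustration.

```julia
using LinearAlgebra, Statistics

# Sketch of a PIRLS loop for a Poisson GAM (log link) with penalty λS.
# `X` stands in for the spline design matrix; `S` defaults to an
# identity (ridge) penalty for simplicity.
function pirls_poisson(X, y, λ; S = I(size(X, 2)), maxiter = 50, tol = 1e-8)
    n, p = size(X)
    β = zeros(p)
    β[1] = log(mean(y) + eps())        # start near the intercept-only fit
    for _ in 1:maxiter
        η = X * β                      # linear predictor
        μ = exp.(η)                    # inverse log link
        z = η .+ (y .- μ) ./ μ         # working response
        W = Diagonal(μ)                # IRLS weights for Poisson/log link
        β_new = (X' * W * X + λ * S) \ (X' * W * z)  # penalized WLS step
        if norm(β_new - β) < tol * (norm(β) + tol)
            β = β_new
            break
        end
        β = β_new
    end
    return β
end
```

With λ = 0 and noise-free data this reduces to ordinary IRLS and recovers the generating coefficients, which is a handy sanity check.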
Currently the code is as below, but should we really be using RSS for, say, a Poisson problem?
```julia
using LinearAlgebra, GLM, BSplines  # assumes BSplines.jl and GLM.jl; PenaltyMatrix is defined elsewhere in this package

function GCV(param::AbstractVector, Basis::BSplineBasis{Vector{Float64}}, x::AbstractVector, y::AbstractVector)
    n = length(Basis.breakpoints)
    Xp, yp = PenaltyMatrix(Basis, param[1], x, y)
    β = coef(lm(Xp, yp))
    H = Xp * inv(Xp' * Xp) * Xp'        # hat (influence) matrix
    trF = sum(diag(H)[1:n])             # effective degrees of freedom
    y_hat = Xp * β                      # fitted values
    rss = sum((yp .- y_hat)[1:n] .^ 2)  # residual sum of squares
    gcv = n * rss / (n - trF)^2         # generalized cross-validation score
    return gcv
end
```
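On the RSS question: for a non-normal family the usual move is to replace the RSS in the GCV score with the model deviance, i.e. GCV = n·D/(n − tr(H))². A hedged sketch of what that swap might look like for a Poisson model; `poisson_deviance` and `gcv_deviance` are illustrative helpers, not part of any package:

```julia
# Poisson deviance: D = 2 Σ [y log(y/μ) − (y − μ)], with the log term
# taken as 0 when y = 0. Zero when the fit is exact (μ = y).
function poisson_deviance(y, μ)
    2 * sum(yi > 0 ? yi * log(yi / μi) - (yi - μi) : μi
            for (yi, μi) in zip(y, μ))
end

# Deviance-based GCV: same form as the RSS version, with D in place of RSS.
gcv_deviance(n, D, trH) = n * D / (n - trH)^2
```

For the Gaussian family the deviance reduces to the RSS, so the existing score would fall out as a special case.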