
lasso is not able to handle matrices instead of vectors #5

ingenieroariel opened this issue Feb 21, 2014 · 6 comments
@ingenieroariel

Here is an example that could be added to the test suite; for my problem I need to send matrices instead of vectors to the lasso solver:

using ParallelSparseRegression

dimensions, components, samples = 180, 100, 1500

dictionary = sprandn(dimensions, components, 0.1)
original_code = sprandn(components, samples, 0.1)
data = dictionary * original_code

# minimize \|data - dictionary*code\|_2^2 + \lambda\|code\|_1
# for variable code.
code = lasso(data, dictionary, 1)

# FIXME(Ariel): Is this the best way to check it found the real solution?
@assert(code == original_code)
@madeleineudell
Owner

Generically, @assert(code == original_code) will NOT be true for a
solution to the lasso problem. The lasso is a biased estimator; the
recovered code will have a smaller norm than the original code, even if it
has the same sparsity pattern. To test this solver, I would suggest
computing a solution to a particular instance using a known and trusted
solver (for example, CVX or TFOCS in MATLAB, cvxpy or cvxopt in Python) and
comparing the solution returned by this package to that one. The
solutions should be close, up to the tolerance specified by the algorithm,
e.g. TOL = 1e-4; @assert(norm(recovered_code - cvxpy_recovered_code) < TOL)
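The bias described here is easiest to see in the orthonormal case, where the lasso solution is exactly the soft-thresholded least-squares solution. A minimal numpy sketch (Python used here only so the arithmetic is checkable; the names and the toy data are mine, not part of this package):

```python
import numpy as np

def soft_threshold(z, t):
    # prox of t*||x||_1: shrink every entry toward zero by t
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# With an orthonormal dictionary A (A'A = I), the lasso solution of
#   argmin_x 0.5*||b - A x||^2 + lam*||x||_1
# is soft_threshold(A.T @ b, lam): same sparsity pattern as the
# least-squares coefficients, but every nonzero is smaller in magnitude.
A = np.eye(3)                        # trivially orthonormal
x_true = np.array([2.0, 0.0, -0.5])
b = A @ x_true
lam = 0.3
x_lasso = soft_threshold(A.T @ b, lam)
# x_lasso is [1.7, 0.0, -0.2]: shrunk toward zero, never equal to x_true
```

So an exact-equality assertion against the original code fails even for a perfect solver; a norm comparison against a trusted solver's answer, with a tolerance, is the right test.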


Madeleine Udell
PhD Candidate in Computational and Mathematical Engineering
Stanford University
www.stanford.edu/~udell

@ingenieroariel
Author

I used CVX in MATLAB and saved the matrices to a file:
http://ingenieroariel.com/static/admm.mat

The sizes of A, B_cvx and X are (100,100), (100,80) and (100,80):

using MAT
using ParallelSparseRegression

# Get sample data from a .mat file
file = matopen("admm.mat")
A = read(file, "A")
B_cvx = read(file, "B")
X = read(file, "X")
cost = read(file, "cost")
close(file)

# minimize \|X - A*B\|_2^2 + \lambda\|B\|_1
# for variable B.
B = lasso(X, A, 1)

TOL = 1e-4; @assert(norm(B - B_cvx) < TOL)

I still get "ERROR: BoundsError()" when I call lasso, most likely because it is receiving matrices instead of vectors. Is there a good way to reformulate my problem before sending it to the lasso? I remember reading something about the Kronecker product; I will look it up.

@ingenieroariel
Author

I also checked that the following lasso ADMM implementation by Simon Lucey produces the same results as those in the file:

% Function to perform LASSO regression using the Alternating Direction
% Method of Multipliers.
%
% arg min_{B} 0.5*||X - A*B||_{2}^{2} + gamma*||B||_{1}
%
% Usage:- [B,cost] = lasso_admm(X, A, gamma)
%
% where:- <in>
%         X = data matrix
%         A = dictionary matrix
%         gamma = weighting on the l1 penalty
%         <out>
%         B = solution
%         cost = objective value at each iteration
%
% Written by Simon Lucey 2012

function [B,cost] = lasso_admm(X, A, gamma)

% Get dimensions of B
c = size(X,2);
r = size(A,2); 

L = zeros(r,c); % Initialize the Lagrange multiplier to zero (same shape as B; seems to work well)
rho = 1e-4; % Set rho to be quite low to start with 
maxIter = 500; % Set the maximum number of iterations (make really big to ensure convergence)
I = speye(r); % Set the sparse identity matrix
maxRho = 5; % Set the maximum rho
C = randn(r,c); % Initialize C randomly

% Set the fast soft thresholding function
fast_sthresh = @(x,th) sign(x).*max(abs(x) - th,0);

% Set the norm functions
norm2 = @(x) x(:)'*x(:); 
norm1 = @(x) sum(abs(x(:))); 

cost = [];
for n = 1:maxIter
    F = (A'*A+rho*I);
    G = (A'*X + rho*C - L);
    % Solve sub-problem to solve B
    B = F \ G; 

    % Solve sub-problem to solve C
    C = fast_sthresh(B + L/rho, gamma/rho); 

    % Update the Lagrangian
    L = L + rho*(B - C);  

    %pause; 

    % Section 3.3 in Boyd's book describes strategies for adapting rho;
    % the main goal is to keep the primal and dual residuals balanced
    rho = min(maxRho, rho*1.1); 

    % get the current cost
    cost(n) = 0.5*norm2(X - A*B) + gamma*norm1(B);

end
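The MATLAB loop above ports directly to numpy. Here is a sketch of the same iteration (my port, not part of either package; the function name, defaults, and test dimensions are assumptions, and the multiplier is shaped r-by-c so non-square dictionaries also work):

```python
import numpy as np

def soft_threshold(x, th):
    # elementwise soft-thresholding (prox of the l1 norm)
    return np.sign(x) * np.maximum(np.abs(x) - th, 0.0)

def lasso_admm(X, A, gamma, rho=1e-4, max_rho=5.0, max_iter=500, seed=0):
    # ADMM for  min_B 0.5*||X - A*B||_2^2 + gamma*||B||_1
    r, c = A.shape[1], X.shape[1]
    rng = np.random.default_rng(seed)
    C = rng.standard_normal((r, c))   # initialize C randomly, as in the MATLAB code
    L = np.zeros((r, c))              # Lagrange multiplier, same shape as B
    I = np.eye(r)
    cost = []
    for _ in range(max_iter):
        # B-subproblem: ridge-like linear solve
        B = np.linalg.solve(A.T @ A + rho * I, A.T @ X + rho * C - L)
        # C-subproblem: soft thresholding
        C = soft_threshold(B + L / rho, gamma / rho)
        # dual update and increasing penalty schedule
        L = L + rho * (B - C)
        rho = min(max_rho, rho * 1.1)
        cost.append(0.5 * np.sum((X - A @ B) ** 2) + gamma * np.sum(np.abs(B)))
    return B, cost
```

On a small synthetic problem (X = A @ B_true with sparse B_true), the objective settles below its initial value within the default 500 iterations.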

@madeleineudell
Owner

This package is not designed to do regression on matrices. But if you want to solve minimize |AX - B|_2^2 + |X|_1, where X and B are matrices, you can reformulate the problem as a vector problem in the following way. Let b = B[:] be the column vector that stacks the columns of B, and let AA = kron(speye(size(B,2)), A) be the block-diagonal matrix with one copy of A per column of B. Then lasso(AA, b, lambda) gives a vector which vertically concatenates the columns of the matrix X you're looking for.
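The reformulation rests on the identity vec(A*X) = (I ⊗ A) vec(X), where vec stacks columns. A quick numpy sanity check (dimensions chosen arbitrarily; nothing here is specific to this package):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))   # dictionary
X = rng.standard_normal((4, 3))   # one code per column

# vec(A @ X): stack the columns of A@X (order="F" is column-major)
lhs = (A @ X).flatten(order="F")

# block-diagonal matrix with one copy of A per column of X
AA = np.kron(np.eye(3), A)
rhs = AA @ X.flatten(order="F")

assert np.allclose(lhs, rhs)      # vec(AX) == (I kron A) vec(X)
```

So the matrix problem and the stacked vector problem have identical objectives, and a vector-only lasso solver can handle the matrix case after this rewrite.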

@ingenieroariel
Author

Okay, I have been able to test this module against an implementation in CVX and a simple lasso. It seems the result I get is a solution to the l2-norm problem, but it is not sparse:

x@arielo ~/thesis/notes/admm $ julia example.jl 
iter :         r     eps_pri           s    eps_dual
   1 :  1.07e+01    1.16e-01    5.33e+00    8.94e-03
  10 :  7.32e+00    1.31e-01    8.57e-01    7.99e-01
  20 :  3.71e+00    1.71e-01    6.30e-01    1.18e+00
  30 :  2.22e+00    1.73e-01    3.54e-01    1.31e+00
  40 :  1.48e+00    1.73e-01    2.24e-01    1.37e+00
  50 :  1.10e+00    1.71e-01    1.55e-01    1.41e+00
  60 :  8.57e-01    1.70e-01    1.16e-01    1.44e+00
  70 :  7.04e-01    1.69e-01    8.01e-02    1.45e+00
  80 :  5.90e-01    1.68e-01    6.79e-02    1.46e+00
  90 :  5.06e-01    1.68e-01    5.12e-02    1.47e+00
 100 :  4.40e-01    1.67e-01    4.43e-02    1.48e+00
 110 :  3.85e-01    1.67e-01    3.78e-02    1.49e+00
 120 :  3.39e-01    1.66e-01    3.20e-02    1.49e+00
 130 :  2.99e-01    1.66e-01    2.87e-02    1.50e+00
 140 :  2.69e-01    1.66e-01    2.47e-02    1.50e+00
 150 :  2.46e-01    1.66e-01    2.02e-02    1.51e+00
 160 :  2.29e-01    1.65e-01    1.73e-02    1.51e+00
 170 :  2.16e-01    1.65e-01    1.49e-02    1.51e+00
 180 :  2.03e-01    1.65e-01    1.50e-02    1.51e+00
 190 :  1.90e-01    1.65e-01    1.41e-02    1.52e+00
 200 :  1.80e-01    1.65e-01    1.15e-02    1.52e+00
 210 :  1.71e-01    1.65e-01    1.09e-02    1.52e+00
total iterations: 217
elapsed time: 42.691435953 seconds
B_cvx[1, 1:5]
1.2698429068239566e-16   0.013939616755278827   -7.353072051576453e-5   0.06373070099702301   -1.157651480454043e-15

B_julia[1, 1:5]
0.0003370479204108641    0.012555947526811096   -0.019597468707497928   0.022700579905230008   -0.0013147338574289631

ERROR: assertion failed
 in include_from_node1 at loading.jl:120
while loading /home/x/thesis/notes/admm/example.jl, in expression starting on line 41

The code is here:

using MAT
using ParallelSparseRegression


function matrixlasso(dictionary, data, lambda=1)
    dimensions, components = size(dictionary)
    dimensions, samples = size(data)

    # [b1 b2]
    b = sparse(reshape(data, dimensions * samples))

    # [D 0]
    # [0 D]
    AA = kron(speye(samples), sparse(dictionary))

    # [c1 c2]
    x = lasso(AA, b, lambda)

    # [C]
    code = reshape(x, components, samples)
    return code
end


file = matopen("admm.mat")
A = read(file,"A")
B_cvx = read(file, "B")
X = read(file, "X")
cost = read(file, "cost")
close(file)

# ||AB - X||_2^2 + ||B||_1
B_julia = matrixlasso(A, X)

print("B_cvx[1, 1:5]","\n")
print(B_cvx[1, 1:5], "\n")
print("B_julia[1, 1:5]","\n")
print(B_julia[1,1:5], "\n")

assert(size(B_julia) == size(B_cvx))
assert(B_julia == B_cvx)

Data can be downloaded from:
http://ingenieroariel.com/static/admm.mat

@ingenieroariel
Author

Also did the following from Julia:

using PyCall
@pyimport sklearn.decomposition as decomposition
B_scipy = decomposition.sparse_encode(X', A', alpha=1)'

And got the same result as MATLAB with CVX (best viewed directly on GitHub):

julia> B_scipy = decomposition.sparse_encode(X', A', alpha=1)'
100x80 Array{Float64,2}:
  0.0          0.0139396   -7.35307e-5   0.0637307   0.0          0.197123      0.105221   …  -0.0853114   0.125546    -0.215341     0.00451901   0.16689      0.0         -0.162344 
  0.00671331  -0.271385     0.0         -0.0865927  -0.239533     0.0           0.130555      -0.247439   -0.192975     0.108291     0.129311    -0.00745488   0.151057     0.134188 
  0.092018    -0.0466571    0.0          0.0        -0.173526     0.0583904     0.0           -0.239976    0.0222884    0.0547041   -0.00838742  -0.0430508    0.0          0.0108664
  0.0          0.0406571    0.0710886   -0.0841806   0.149463    -0.213898     -0.155298       0.238002    0.0         -0.00231269   0.0654717   -0.30452      0.0429364   -0.181418 
 -0.222417     0.0         -0.0565775    0.0         0.0         -0.035914      0.0            0.302307    0.0205838    0.0845521   -0.21952     -0.383232     0.0         -0.107403 
 -0.143232     0.0736967   -0.0184351   -0.064996    0.0364668    0.0          -0.319453   …  -0.191406    0.0959937   -0.03323     -0.0321607    0.0          0.433653     0.138869 
 -0.0417249   -0.0537671   -0.129682     0.0         0.0         -0.0148316    -0.099432      -0.0530144  -0.202929    -0.131838     0.0424001   -0.152744     0.0525337    0.0      
  0.127261     0.11239      0.0          0.111226    0.0          0.026315      0.139523       0.101185   -0.229387     0.107155     0.0162594    0.189774    -0.205809     0.0790809
 -0.0288046    0.0          0.130798     0.0        -0.29608     -0.226412      0.0790443     -0.0115126   0.0         -0.368609     0.0         -0.0175567    0.104483    -0.0215514
  0.159154     0.187072     0.0         -0.0313609   0.0          0.0          -0.121287       0.0         0.0         -0.0390657    0.0          0.024583     0.228062     0.0      
  0.112463    -0.0586232   -0.0702961   -0.131088    0.11252      0.0          -0.125692   …  -0.0243822  -0.0109358    0.10058     -0.19238     -0.0525926    0.186771     0.0      
  0.0544893    0.0534809   -0.225345     0.151859   -0.00926664  -0.0113341     0.0           -0.0436159  -0.0338527    0.0229145    0.0108512    0.0          0.00151912   0.253073 
 -0.410077     0.0          0.0          0.0         0.0674603    0.0830968     0.14277        0.24455    -0.0582725   -0.421776    -0.0662308    0.0290147   -0.0463231   -0.169209 
  0.103649     0.0          0.0679544   -0.0282118   0.0834912    0.000702372   0.178511       0.254805    0.00543017   0.334662    -0.125654    -0.00758221   0.0915403    0.117483 
  0.17975     -0.147339     0.0          0.04693     0.184891    -0.0240954     0.184799       0.0397556   0.117944     0.172326     0.0          0.0121911    0.173312    -0.168928 
 -0.089977    -0.0446637    0.0715952    0.232478   -0.139833     0.0          -0.173907   …  -0.0382983  -0.0593752    0.125548    -0.00998831   0.0354989    0.0148703    0.0062226
  0.140243    -0.250244     0.288005    -0.168106   -0.31282     -0.179343      0.0855086      0.0911132   0.0975621    0.0         -0.111709     0.164241     0.111485     0.0      
  0.0849159    0.0         -0.0963193   -0.0744175   0.174541     0.0           0.136773       0.0        -0.0189869    0.0142278    0.0279454    0.0          0.0          0.0      
  0.0552466   -0.0134877   -0.14529      0.0521893   0.31103     -0.027733      0.0            0.0         0.301843     0.0609932    0.114953    -0.255255     0.306983     0.0      
 -0.0482881    0.00641886  -0.0462691    0.0337346   0.0211458    0.0736829     0.155617       0.088703    0.336031     0.28972     -0.015167     0.0308042    0.0412673   -0.152172 
  0.0          0.0225403    0.0          0.136902    0.10653     -0.202327      0.130218   …   0.0120922   0.0273708   -0.111827    -0.125273     0.118236     0.126329    -0.0746866
  0.0403421    0.0258216   -0.0884775    0.20906    -0.065059     0.022254      0.0870001      0.226462    0.23719      0.127652     0.0         -0.0496721    0.0          0.0      
  0.0          0.0          0.0850217   -0.171317    0.183628     0.101963      0.231549       0.0        -0.301888     0.115841    -0.167362     0.03563      0.124953     0.188337 
  0.0         -0.0394901    0.27371      0.0173336  -0.257816     0.161648      0.0            0.244729   -0.0502436    0.00320964  -0.0719819    0.0          0.0          0.0      
  0.0162171    0.251252     0.0729622    0.228595   -0.309659    -0.256088     -0.154805       0.256831    0.342689    -0.0034437    0.0732845   -0.04361      0.0         -0.0423222
  ⋮                                                               ⋮                        ⋱                            ⋮                                                            
  0.0          0.0         -0.135709     0.0         0.0155266    0.0          -0.119532      -0.016565   -0.211007     0.247943     0.0          0.0446025    0.0          0.0      
 -0.0542337    0.265554     0.0921252   -0.0476038   0.27764     -0.0682732    -0.0272853      0.234726   -0.0178333    0.0         -0.0248664    0.0         -0.325215    -0.086565 
  0.124329    -0.0025759    0.221306     0.0799693  -0.151647     0.0          -0.392836      -0.111459   -0.0358537    0.0780863    0.0          0.0352136    0.105638    -0.0119163
  0.100427     0.179975     0.0         -0.0844956   0.0142206    0.0           0.0628009      0.0         0.0453503   -0.0187895    0.0272987    0.0415041    0.0477133    0.0      
  0.101916     0.0         -0.0697573    0.0         0.107866     0.0303107     0.119486   …  -0.068872   -0.0398203    0.0952906    0.0          0.0          0.0         -0.265723 
  0.0163071    0.0         -0.133172     0.208391    0.154222     0.0           0.290136      -0.0216161  -0.0531057    0.0         -0.00708974  -0.0156749    0.0655515    0.0      
  0.0          0.0         -0.212975     0.0         0.0102621    0.0533782     0.0            0.0958267   0.0          0.0          0.0         -0.175486     0.0          0.0148493
 -0.291424     0.0837155    0.0          0.0857853   0.0          0.0307569     0.122385       0.0623261  -0.0420877    0.0          0.0153309    0.240045     0.127116     0.0      
  0.040017    -0.127659     0.0854545    0.0        -0.202186    -0.0432097     0.0327421      0.0        -0.0442597    0.0          0.0         -0.19467     -0.236476     0.0636748
  0.0286221   -0.0366417   -0.0990827   -0.0741034  -0.196447    -0.00559864   -0.0157355  …  -0.0164241   0.100856    -0.0513738   -0.197535     0.0689478    0.0          0.0      
 -0.0457259    0.303425     0.0          0.0        -0.0292353    0.0           0.229368       0.199846   -0.0529167    0.00215494  -0.290643     0.0         -0.03808     -0.0372205
 -0.206679    -0.15615     -0.0797169    0.120087   -0.215291     0.305089     -0.0241901     -0.233644   -0.0871398   -0.222093     0.20695      0.0559714   -0.0646555    0.19889  
 -0.148835     0.111808    -0.124896    -0.0817988   0.19326     -0.0489159    -0.0984961     -0.189682    0.0783978   -0.0290623   -0.00779019   0.217742    -0.118851     0.0729628
 -0.053811     0.0         -0.0211201    0.0         0.137885     0.0433281     0.114878       0.17886     0.0272867   -0.081256     0.0374359    0.113293     0.151196     0.0      
  0.0          0.231325    -0.0731003   -0.209764   -0.048146     0.0818697     0.0        …  -0.0569606   0.148913    -0.0860469    0.0         -0.0172812   -0.133877    -0.22088  
  0.15216      0.0          0.0489168    0.053746    0.0         -0.0235847     0.138471       0.0788577  -0.108394    -0.224967     0.072282     0.287115     0.0          0.0      
 -0.0901374   -0.185604     0.246231    -0.0941761  -0.269384    -0.316862     -0.341035      -0.537795   -0.129642    -0.0189861    0.0          0.0          0.0         -0.0427636
 -0.148731     0.0          0.00612961   0.0         0.0777175    0.0197545     0.0638228      0.132194   -0.01997     -0.0771284    0.125271    -0.147775     0.0919206    0.0929849
  0.176009    -0.027776     0.0442896    0.0         0.086653     0.0752426     0.220084       0.0380418  -0.0935554    0.0          0.0219196   -0.161433     0.106393     0.0      
  0.0          0.0          0.0476136   -0.003474   -0.161337     0.256054      0.0472292  …  -0.126704    0.10149      0.05294      0.159351     0.0         -0.316103     0.0      
 -0.0228349    0.0342999   -0.140112     0.0        -0.0733659    0.0           0.0802205      0.0732138   0.100556    -0.0304118    0.0         -0.090497     0.232528    -0.136202 
  0.0         -0.146676     0.0          0.0326132   0.173679    -0.0668674    -0.158378       0.0756563  -0.319589     0.0          0.135652     0.00611796  -0.00236722  -0.0476401
  0.00616579  -0.0550297    0.0145877   -0.11284     0.0         -0.0595453    -0.161158      -0.0929086  -0.470292     0.0146403    0.0         -0.194445     0.154948    -0.0403172
 -0.139795    -0.116678    -0.0975358    0.250031    0.0          0.0          -0.290872       0.256622    0.0         -0.39564     -0.0348308   -0.0226514    0.0          0.0  

julia> B_cvx
100x80 Array{Float64,2}:
  1.26984e-16   0.0139396    -7.35307e-5    0.0637307    -1.15765e-15   0.197123      0.105221     …   0.125546     -0.215341      0.00451901    0.16689      -4.34295e-14  -0.162344   
  0.00671331   -0.271385      1.60135e-12  -0.0865927    -0.239533      9.94123e-14   0.130555        -0.192975      0.108291      0.129311     -0.00745488    0.151057      0.134188   
  0.092018     -0.0466571    -5.25166e-13  -4.64765e-11  -0.173526      0.0583904    -8.4439e-17       0.0222884     0.0547041    -0.00838742   -0.0430508     5.31588e-14   0.0108664  
  1.07977e-14   0.0406571     0.0710886    -0.0841806     0.149463     -0.213898     -0.155298         2.46853e-16  -0.00231269    0.0654717    -0.30452       0.0429364    -0.181418   
 -0.222417      3.42479e-12  -0.0565775     5.22998e-12   1.86998e-16  -0.035914     -1.37868e-16      0.0205838     0.0845521    -0.21952      -0.383232     -1.62462e-14  -0.107403   
 -0.143232      0.0736967    -0.0184351    -0.064996      0.0364668    -7.70142e-14  -0.319453     …   0.0959937    -0.03323      -0.0321607    -2.64668e-12   0.433653      0.138869   
 -0.0417249    -0.0537671    -0.129682      4.31252e-12  -4.689e-16    -0.0148316    -0.099432        -0.202929     -0.131838      0.0424001    -0.152744      0.0525337    -3.80182e-12
  0.127261      0.11239      -1.06557e-12   0.111226      1.61246e-15   0.026315      0.139523        -0.229387      0.107155      0.0162594     0.189774     -0.205809      0.0790809  
 -0.0288046    -2.44397e-11   0.130798      3.83147e-11  -0.29608      -0.226412      0.0790443       -1.22876e-16  -0.368609     -1.45545e-13  -0.0175567     0.104483     -0.0215514  
  0.159154      0.187072     -1.51861e-12  -0.0313609     1.50293e-16   5.54167e-14  -0.121287         4.01235e-17  -0.0390657     4.3033e-13    0.024583      0.228062      7.16578e-12
  0.112463     -0.0586232    -0.0702961    -0.131088      0.11252      -2.56007e-13  -0.125692     …  -0.0109358     0.10058      -0.19238      -0.0525926     0.186771      2.6878e-11 
  0.0544893     0.0534809    -0.225345      0.151859     -0.00926664   -0.0113341    -5.02133e-17     -0.0338527     0.0229145     0.0108512    -1.06462e-13   0.00151912    0.253073   
 -0.410077      1.11298e-11  -4.43362e-12  -6.75429e-11   0.0674603     0.0830968     0.14277         -0.0582725    -0.421776     -0.0662308     0.0290147    -0.0463231    -0.169209   
  0.103649      1.08989e-11   0.0679544    -0.0282118     0.0834912     0.000702372   0.178511         0.00543017    0.334662     -0.125654     -0.00758221    0.0915403     0.117483   
  0.17975      -0.147339      1.75041e-12   0.04693       0.184891     -0.0240954     0.184799         0.117944      0.172326      6.13239e-13   0.0121911     0.173312     -0.168928   
 -0.089977     -0.0446637     0.0715952     0.232478     -0.139833      1.84145e-14  -0.173907     …  -0.0593752     0.125548     -0.00998831    0.0354989     0.0148703     0.0062226  
  0.140243     -0.250244      0.288005     -0.168106     -0.31282      -0.179343      0.0855086        0.0975621    -1.06792e-16  -0.111709      0.164241      0.111485     -2.81678e-11
  0.0849159     2.44908e-11  -0.0963193    -0.0744175     0.174541     -1.07807e-13   0.136773        -0.0189869     0.0142278     0.0279454    -2.51349e-12  -2.56534e-14   1.49469e-11
  0.0552466    -0.0134877    -0.14529       0.0521893     0.31103      -0.027733      1.36437e-16      0.301843      0.0609932     0.114953     -0.255255      0.306983     -2.8078e-11 
 -0.0482881     0.00641886   -0.0462691     0.0337346     0.0211458     0.0736829     0.155617         0.336031      0.28972      -0.015167      0.0308042     0.0412673    -0.152172   
 -1.97109e-15   0.0225403    -7.01816e-13   0.136902      0.10653      -0.202327      0.130218     …   0.0273708    -0.111827     -0.125273      0.118236      0.126329     -0.0746866  
  0.0403421     0.0258216    -0.0884775     0.20906      -0.065059      0.022254      0.0870001        0.23719       0.127652     -4.13464e-13  -0.0496721     6.55799e-14  -1.02061e-11
  2.15158e-14  -2.59867e-11   0.0850217    -0.171317      0.183628      0.101963      0.231549        -0.301888      0.115841     -0.167362      0.03563       0.124953      0.188337   
 -2.12543e-15  -0.0394901     0.27371       0.0173336    -0.257816      0.161648     -2.90586e-16     -0.0502436     0.00320964   -0.0719819    -3.28179e-13  -3.18788e-14  -5.27976e-13
  0.0162171     0.251252      0.0729622     0.228595     -0.309659     -0.256088     -0.154805         0.342689     -0.0034437     0.0732845    -0.04361      -4.16791e-14  -0.0423222  
  ⋮                                                                     ⋮                          ⋱                 ⋮                                                                  
  3.02952e-14   4.48974e-12  -0.135709     -4.65853e-12   0.0155266     1.4682e-13   -0.119532        -0.211007      0.247943      6.57317e-14   0.0446025     6.37175e-14  -2.72854e-11
 -0.0542337     0.265554      0.0921252    -0.0476038     0.27764      -0.0682732    -0.0272853       -0.0178333     1.13075e-17  -0.0248664    -1.9957e-12   -0.325215     -0.086565   
  0.124329     -0.0025759     0.221306      0.0799693    -0.151647      1.22538e-13  -0.392836        -0.0358537     0.0780863    -6.14396e-13   0.0352136     0.105638     -0.0119163  
  0.100427      0.179975     -2.39445e-12  -0.0844956     0.0142206    -2.65506e-13   0.0628009        0.0453503    -0.0187895     0.0272987     0.0415041     0.0477133     3.81477e-11
  0.101916     -4.92003e-12  -0.0697573     5.75295e-11   0.107866      0.0303107     0.119486     …  -0.0398203     0.0952906    -8.66129e-13   3.70512e-12  -4.65128e-14  -0.265723   
  0.0163071    -2.78817e-11  -0.133172      0.208391      0.154222      1.99108e-13   0.290136        -0.0531057    -1.50316e-16  -0.00708974   -0.0156749     0.0655515    -6.48394e-12
  8.16982e-15  -6.81406e-12  -0.212975      4.73686e-12   0.0102621     0.0533782    -1.17497e-16      2.2312e-16    1.78399e-16   2.89511e-13  -0.175486      1.92221e-14   0.0148493  
 -0.291424      0.0837155     2.77985e-12   0.0857853    -1.70895e-16   0.0307569     0.122385        -0.0420877     1.82848e-17   0.0153309     0.240045      0.127116      5.63598e-12
  0.040017     -0.127659      0.0854545     1.85898e-11  -0.202186     -0.0432097     0.0327421       -0.0442597     7.47656e-17   8.09253e-15  -0.19467      -0.236476      0.0636748  
  0.0286221    -0.0366417    -0.0990827    -0.0741034    -0.196447     -0.00559864   -0.0157355    …   0.100856     -0.0513738    -0.197535      0.0689478     2.54892e-14  -3.57924e-11
 -0.0457259     0.303425      3.06233e-12   2.76056e-11  -0.0292353     8.33311e-14   0.229368        -0.0529167     0.00215494   -0.290643     -5.10446e-14  -0.03808      -0.0372205  
 -0.206679     -0.15615      -0.0797169     0.120087     -0.215291      0.305089     -0.0241901       -0.0871398    -0.222093      0.20695       0.0559714    -0.0646555     0.19889    
 -0.148835      0.111808     -0.124896     -0.0817988     0.19326      -0.0489159    -0.0984961        0.0783978    -0.0290623    -0.00779019    0.217742     -0.118851      0.0729628  
 -0.053811      1.86443e-11  -0.0211201     2.89916e-11   0.137885      0.0433281     0.114878         0.0272867    -0.081256      0.0374359     0.113293      0.151196     -9.44555e-12
  1.45399e-14   0.231325     -0.0731003    -0.209764     -0.048146      0.0818697     3.04947e-16  …   0.148913     -0.0860469     4.31955e-13  -0.0172812    -0.133877     -0.22088    
  0.15216       3.70605e-12   0.0489168     0.053746      1.94918e-16  -0.0235847     0.138471        -0.108394     -0.224967      0.072282      0.287115      1.72256e-14   7.90182e-12
 -0.0901374    -0.185604      0.246231     -0.0941761    -0.269384     -0.316862     -0.341035        -0.129642     -0.0189861     9.34785e-14   2.55009e-12  -3.30011e-14  -0.0427636  
 -0.148731     -7.95238e-12   0.00612961    1.63064e-11   0.0777175     0.0197545     0.0638228       -0.01997      -0.0771284     0.125271     -0.147775      0.0919206     0.0929849  
  0.176009     -0.027776      0.0442896    -3.98776e-11   0.086653      0.0752426     0.220084        -0.0935554     7.99783e-17   0.0219196    -0.161433      0.106393     -1.16811e-11
  4.05117e-14  -9.22803e-12   0.0476136    -0.003474     -0.161337      0.256054      0.0472292    …   0.10149       0.05294       0.159351     -2.02335e-12  -0.316103     -1.74537e-11
 -0.0228349     0.0342999    -0.140112      2.21944e-11  -0.0733659    -1.06882e-13   0.0802205        0.100556     -0.0304118     4.69221e-13  -0.090497      0.232528     -0.136202   
 -4.10374e-15  -0.146676     -3.98025e-13   0.0326132     0.173679     -0.0668674    -0.158378        -0.319589     -9.85169e-17   0.135652      0.00611796   -0.00236722   -0.0476401  
  0.00616579   -0.0550297     0.0145877    -0.11284      -8.54843e-16  -0.0595453    -0.161158        -0.470292      0.0146403     2.48209e-13  -0.194445      0.154948     -0.0403172  
 -0.139795     -0.116678     -0.0975358     0.250031      6.22888e-16  -3.04969e-14  -0.290872        -1.08957e-16  -0.39564      -0.0348308    -0.0226514    -1.97114e-15  -3.90435e-12

julia> B_julia
100x80 Array{Float64,2}:
  0.000337048   0.0125559    -0.0195975     0.0227006    -0.00131473    0.158092      0.000663977  …   0.00132733   -0.0556719     0.0859936     0.0901456     0.0274917    -0.156614   
 -0.000999775  -0.137709     -0.00189461   -0.108788     -0.133136      0.10675       0.0133637       -0.286436     -0.0329385    -0.00322267    0.000318102  -0.135325      0.00885051 
  0.0737673    -0.00030499   -0.00249059    0.0261302     0.124452      0.0320024    -0.000321017      0.000889766  -0.000529372   0.119507     -0.00173593    0.000114643  -0.00306437 
 -0.079319      0.000324243  -0.00339591   -0.00036351    0.000527439   0.00130374   -0.0467694        0.000461938  -0.0609708     0.0690949    -0.0704039    -0.00167842   -0.00103464 
 -0.102341     -0.0749729     0.00371976    0.0016458     0.00128979   -0.000929159   0.000943008      0.00044275    0.0779099    -0.000608869  -0.000409666   0.0311462    -0.000302858
 -0.120621     -0.000311348  -0.0655874    -0.000266848   0.145086      0.0383281    -0.125069     …  -0.00120831   -0.108513      0.00230788    0.0651062     0.241779      0.075382   
  0.0758896    -0.00591877   -0.0606568    -0.000956742   0.0374417     0.0186736    -0.232604        -7.72066e-5    0.0481777     0.0366206    -0.139991     -0.000495568  -0.00219826 
  2.88976e-6    0.0758953    -0.003234      0.146643     -0.00052971    0.00120368   -0.00080923       0.0969681     0.159552      0.0429008     0.182256     -0.0667012    -3.46449e-5 
 -0.0019605    -0.000235125  -0.000844206   0.0010696    -8.86654e-5   -0.000233001  -0.092073        -0.000446137  -0.108565      0.00243809   -0.0174647     0.079925     -0.0865091  
  0.0781275     0.19871      -0.00191063   -0.000804741   0.199612     -0.104539     -0.0670915       -0.000663136  -0.000714714   0.0028652     0.0371377     0.0430611     0.000523397
  0.0158877    -0.000120918  -0.067413     -0.00103618    0.0422939     0.0367469    -0.184133     …   0.0103255    -0.00105483   -0.227575     -0.000630592   0.0127628     0.000649823
  0.0259366    -0.0523148    -0.213944      0.00967588   -0.000993193  -0.00093327    0.0600554       -0.000914976   0.0999781     0.010863      0.000491431   0.0744603     0.141033   
 -0.234498      4.58081e-5   -0.00417361   -0.00153287    0.0121918     0.0353958     0.0706351       -0.0849251    -0.337175     -0.056515      0.0388506     0.0619645    -0.14783    
 -0.000448261  -0.000955057   0.00174637   -0.113326      0.000128093  -0.00111729    5.39709e-5       0.000604747   8.90301e-5    0.00125488    0.000862802   0.000424115   0.00045246 
[Truncated Julia REPL printout of a dense matrix of small floating-point values; full numeric dump omitted.]
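Following the maintainer's suggestion, a tolerance-based check might look like the sketch below. All names are hypothetical: `reference_code` is assumed to come from a trusted solver (e.g. cvxpy or CVX) run on the same problem instance, and `code` stands in for the output of this package's `lasso`.

```julia
using LinearAlgebra  # for norm (part of Base in the Julia of 2014)

# Hypothetical solver output and a reference solution from a trusted
# solver on the same problem instance.
code           = [0.1000, 0.0, -0.2000]
reference_code = [0.1000, 0.0, -0.2001]

# Accept the solution if it matches the reference up to the algorithm's
# tolerance, rather than demanding exact recovery of the original code
# (the lasso is biased, so exact recovery is not expected).
TOL = 1e-3
@assert norm(code - reference_code) < TOL
println("lasso solution matches reference within TOL")
```

The point of the check is that it compares solver-to-solver, not solver-to-ground-truth: even a correct lasso implementation shrinks the recovered coefficients, so `@assert(code == original_code)` would fail on correct output.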
