Windows wheel package (.whl) on Pypi #5479

Closed
mcarans opened this issue Jan 22, 2015 · 267 comments

@mcarans

mcarans commented Jan 22, 2015

Please make Windows wheel packages and put them on Pypi.

Currently it is possible to download Windows wheel packages for numpy here: http://www.lfd.uci.edu/~gohlke/pythonlibs/#numpy

It would be great if the wheels were directly available on the Pypi server https://pypi.python.org/pypi/ so that they can be installed with pip.

@matthew-brett
Contributor

Well said - and indeed there is a lot of work by @carlkl going on behind the scenes to make this happen. I believe we are nearly there now - @carlkl - when will you go public, do you think?

@njsmith
Member

njsmith commented Jan 22, 2015

For context: the reason this isn't trivial is that the binaries you linked to depend on Intel's proprietary runtime and math library, which complicates redistributing them.

@carlkl
Member

carlkl commented Jan 22, 2015

I deployed the recent OpenBLAS based numpy and scipy wheels on binstar. You can install them with:

pip install -i https://pypi.binstar.org/carlkl/simple numpy
pip install -i https://pypi.binstar.org/carlkl/simple scipy

This works for python-2.7 and for python-3.4. The wheels are marked as 'experimental'. Feedback is welcome.
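
A quick way to sanity-check one of these experimental wheels after installing, as a minimal sketch (it assumes nose is available, since numpy.test() of this era is nose-based):

import numpy as np

np.show_config()   # the BLAS/LAPACK sections should mention OpenBLAS
np.test()          # run the bundled test suite; failures can be reported back here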

@njsmith
Member

njsmith commented Jan 22, 2015

If you want widespread testing then you should send this to the list :-)

@juliantaylor
Contributor

FWIW I personally would like to change the size of the default integer on win64 before we actually provide official binaries, though there was some resistance to it when I last proposed it; also, possibly with Anaconda and other third-party binaries it's probably already too late :(

Also, speaking of OpenBLAS: does someone fancy some debugging? I'm tired of it (looks like the same failure that breaks scipy with OpenBLAS):

test_einsum_sums_float64 (test_einsum.TestEinSum) ... ==31931== Invalid read of size 16
==31931==    at 0x7B28EB9: ddot_k_NEHALEM (in /usr/lib/libopenblasp-r0.2.10.so)
==31931==    by 0x6DBDA90: DOUBLE_dot (arraytypes.c.src:3127)
==31931==    by 0x6E93DEC: cblas_matrixproduct (cblasfuncs.c:528)
==31931==    by 0x6E6B7B3: PyArray_MatrixProduct2 (multiarraymodule.c:994)
==31931==    by 0x6E6E29B: array_matrixproduct (multiarraymodule.c:2276)

@carlkl
Member

carlkl commented Jan 22, 2015

The OpenBLAS version used is 0.2.12. I haven't experienced significant problems with this version yet.

The scipy failures are copied to https://gist.github.com/carlkl/b05dc6055fd42eba8cc7.

The 32-bit-only numpy failures are due to http://sourceforge.net/p/mingw-w64/bugs/367

======================================================================
FAIL: test_nan_outputs2 (test_umath.TestHypotSpecialValues)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\core\tests\test_umath.py", line 411, in test_nan_outputs2
    assert_hypot_isinf(np.nan, np.inf)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\core\tests\test_umath.py", line 402, in assert_hypot_isinf
    "hypot(%s, %s) is %s, not inf" % (x, y, ncu.hypot(x, y)))
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 53, in assert_
    raise AssertionError(smsg)
AssertionError: hypot(nan, inf) is nan, not inf

======================================================================
FAIL: test_umath_complex.TestCabs.test_cabs_inf_nan(<ufunc 'absolute'>, inf, nan, inf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\nose\case.py", line 197, in runTest
    self.test(*self.arg)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\core\tests\test_umath_complex.py", line 523, in check_real_value
    assert_equal(f(z1), x)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 275, in assert_equal
    return assert_array_equal(actual, desired, err_msg, verbose)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 739, in assert_array_equal
    verbose=verbose, header='Arrays are not equal')
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 628, in assert_array_compare
    chk_same_position(x_isnan, y_isnan, hasval='nan')
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 608, in chk_same_position
    raise AssertionError(msg)
AssertionError: 
Arrays are not equal

x and y nan location mismatch:
 x: array([ nan])
 y: array(inf)

======================================================================
FAIL: test_umath_complex.TestCabs.test_cabs_inf_nan(<ufunc 'absolute'>, -inf, nan, inf)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\nose\case.py", line 197, in runTest
    self.test(*self.arg)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\core\tests\test_umath_complex.py", line 523, in check_real_value
    assert_equal(f(z1), x)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 275, in assert_equal
    return assert_array_equal(actual, desired, err_msg, verbose)
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 739, in assert_array_equal
    verbose=verbose, header='Arrays are not equal')
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 628, in assert_array_compare
    chk_same_position(x_isnan, y_isnan, hasval='nan')
  File "D:\tools\wp_279\python-2.7.9rc1\lib\site-packages\numpy\testing\utils.py", line 608, in chk_same_position
    raise AssertionError(msg)
AssertionError: 
Arrays are not equal

x and y nan location mismatch:
 x: array([ nan])
 y: array(inf)
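
For reference, the IEEE-754 special-value behaviour these failures (and the hypot one above) check can be reproduced in two lines; on a correct build both expressions print inf:

import numpy as np

print(np.hypot(np.nan, np.inf))          # expected: inf (the buggy 32-bit build returns nan)
print(np.abs(complex(np.inf, np.nan)))   # expected: inf, per the C99 cabs special-value rules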

@njsmith
Member

njsmith commented Jan 22, 2015

I don't disagree about changing the win64 integer size, but I think that's a separate issue that should be decoupled from the wheels. If this were the first time win64 numpy builds were becoming widely available, then it'd make sense to link them, but at this point there have already been tons of users for years; they're just using cgohlke's builds or Anaconda or whatever. So let's treat that as an independent discussion?

(Strictly speaking it's a backcompat break, but even so it seems reasonable that we might be able to pull it off, since it actually reduces incompatibility between platforms -- all portable code has to handle 64-bit dtype=int already.)
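
For anyone following along, the platform difference under discussion is easy to see (a small sketch; the exact output depends on the build, since the default integer maps to the platform's C long):

import numpy as np

print(np.dtype(np.int_).itemsize * 8)   # 32 on win64 builds, 64 on most other 64-bit platforms
print(np.array([1, 2, 3]).dtype)        # the default integer dtype used for array creation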

@TFenby

TFenby commented Feb 12, 2015

I'm also interested in this. Is there some way of assisting with the process?

@carlkl
Member

carlkl commented Feb 12, 2015

OpenBLAS can be compiled with INTERFACE64=1 and numpy can be compiled with -fdefault-integer-8 for a first try.

@tkelman
Contributor

tkelman commented Mar 16, 2015

Just a heads up: using 64-bit integers in BLAS is a terrible idea. Stop before you get too far down that road. Matlab, and Julia before I went and fixed it, did this, and it breaks any third-party library that assumes conventional 32-bit integers in BLAS.

What we've been doing in Julia for the past ~5 months is actually renaming all the symbols in openblas to add a _64 suffix to them for the 64-bit-int version, that way you can do linear algebra on really huge arrays if you want, but loading external libraries into the same process won't segfault by name shadowing and trying to call dgemm with the wrong ABI.
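
A rough sketch of what that suffix scheme looks like from the consumer side (library and symbol names are illustrative of the Julia convention described above, not something numpy ships; ctypes is used only to make the point about non-colliding symbols):

import ctypes

# An LP64 (32-bit integer) BLAS and a suffix-renamed ILP64 (64-bit integer)
# build can live in the same process because their symbols no longer collide.
lp64 = ctypes.CDLL("libopenblas.so")        # conventional 32-bit-int interface
ilp64 = ctypes.CDLL("libopenblas64_.so")    # hypothetical suffixed 64-bit-int build

ddot_lp64 = lp64.ddot_                      # plain Fortran symbol
ddot_ilp64 = ilp64.ddot_64_                 # suffixed twin with the 64-bit-int ABI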

@guyverthree

Hey guys, is there any update on the wheel files being made available for NumPy?

@njsmith
Member

njsmith commented Jun 25, 2015

Not that I'm aware of right now.

@mikofski

@guyverthree Christoph Gohlke has been releasing NumPy wheels built with Intel's MKL for a while now.

Also, see my blog post on NumPy wheels. I made some NumPy wheels in my Dropbox using Carl Kleffner's modified mingw-w64 toolchain and Zhang Xianyi's OpenBLAS port of GotoBLAS. Olivier Grisel was looking for help modifying the NumPy buildbot to repeat the same steps used in the OpenBLAS Google Groups thread I posted to.

@carlkl
Member

carlkl commented Jun 26, 2015

My latest version is available on binstar.org, though I'm not sure if anaconda.org is the new preferred name now.
The wheels for py-2.6 .. 3.4 (32/64bit) are about 2 months old:

  • numpy-1.9.2
  • scipy-0.15.1
  • scikit-image-0.11.2

built with my https://bitbucket.org/carlkl/mingw-w64-for-python and a more or less recent OpenBLAS.
Install with pip:

  • pip install -i https://pypi.binstar.org/carlkl/simple numpy
  • pip install -i https://pypi.binstar.org/carlkl/simple scipy

@mikofski

+1 @carlkl, and I wish these could be added to the NumPy builds at the Cheese Factory (PyPI) as well.

@matysek

matysek commented Oct 23, 2015

+1 I would love to see this happen too.

@carlkl
Member

carlkl commented Oct 23, 2015

IMHO there are at least three problems to be solved before these builds are going to be accepted:

  • the mingwpy patches for the numpy repository have to be recreated
  • there is no build mechanism yet aside from manual builds
  • many third-party Windows packages (deployed by C. Gohlke) explicitly depend on numpy-MKL, because the binaries are hard-linked against MKL DLLs. This may change in the future, as scipy now provides a mechanism for an implicit dependency on the scipy BLAS/LAPACK implementation. So installing (numpy-MKL & scipy-MKL) OR (numpy-OpenBLAS & scipy-OpenBLAS) should be sufficient for all other packages in the future.

@njsmith
Member

njsmith commented Oct 24, 2015

@carlkl: FWIW, I'm not really worried about @cgohlke's packages -- that will sort itself out (just like there aren't major problems now due to people trying to combine scipy-MKL with anaconda numpy). And I'm not even really worried about there being some fancy build mechanism -- a manual build is fine so long as there's a text file documenting the steps.

The main issue that I'm worried about is sustainability: if we can't get this stuff upstream, then we'll have to re-validate and re-do patches every time a new version of gcc / mingw-w64 / msvc comes out, and it probably won't happen. We don't want to get caught in the trap where we start providing builds, but then this becomes more and more onerous over time as we have to deal with a cranky old compiler to do it.

Which is why I've been trying to round up funding to support doing this upstream... +1's are great and all, but if anyone wants to donate some money or knows a company who might be interested in making gcc generally usable for python extensions on windows, then send me an email :-) (njs@pobox.com)

If you don't have $$ but still want to help, then one way to do that would be to send patches to mingw-w64 improving their support for transcendental functions like sin and cos. (It turns out that the MSVC ABI disagrees with everyone else about how the x87 FPU unit should be configured, so most of the free software mathematical functions don't work quite right.) Fortunately there are good, license-compatible implementations in Android's "bionic" libc, so this doesn't require any mathematical wizardry or deep insight into ABI issues -- it's just a mostly mechanical matter of finding and extracting the relevant source files and then dropping them into the mingw-w64 tree at the right place. We can provide more details on this too if anyone's interested.

@matthew-brett
Contributor

Isn't this the kind of thing that NumFOCUS should be funding? If not, then perhaps we can go back and revisit applying to the PSF.

How much money are we talking?

@hickford

hickford commented Dec 1, 2015

+1 please publish wheels for Windows to PyPI https://pypi.python.org/pypi/numpy

If you try pip install numpy on an out-the-box Python Windows installation, you get the infamously unhelpful error message "Unable to find vcvarsall.bat".

@johnthagen
Contributor

+1 would really help Windows users.

@techtonik
Contributor

Can't play with https://github.com/glumpy/glumpy because of this. What are the manual build steps to get NumPy working on Windows? It looks like an AppVeyor job is there, so it should be no problem to upload artifacts to GitHub.

@njsmith
Member

njsmith commented Dec 26, 2015

Right now it is literally impossible to build a fast, BSD-licensed version of numpy on Windows. We're working on fixing that, but it's a technical limitation; +1's aren't going to have any effect either way. (The AppVeyor job does build on Windows, but it uses a fallback unoptimized linear algebra library that isn't really suitable for real work.) Until we get this sorted, I'd recommend downloading wheels from Christoph Gohlke's website, or using Anaconda or another scientific Python distribution.

@techtonik
Contributor

@njsmith can you be more specific? Preferably with exact commands that don't work. Right now this stuff is not actionable.

@matthew-brett
Contributor

I think 'impossible' is too strong - but there's certainly not yet an obvious general way forward. I put up a wiki page on the current status here: https://github.com/numpy/numpy/wiki/Whats-with-Windows-builds . Please feel free to edit / amend all ye who care to.

jaimefrio pushed a commit to jaimefrio/numpy that referenced this issue Mar 22, 2016
Add note about wheels on pypi, and Windows wheels in particular. See discussion at: numpy#5479
@fgregg

fgregg commented Mar 29, 2016

Thanks for all the great work! Will numpy 1.11.0 Windows wheels be added to PyPI soon? https://pypi.python.org/pypi/numpy

@njsmith
Member

njsmith commented Mar 29, 2016

Oh yeah, we possibly need to figure out how to update our release procedures here... IIUC the user experience right now is that as soon as the 1.11 source release was uploaded, all the Windows machines out there suddenly switched from downloading wheels (yay) to trying to download and build the source (boo). I guess the "right" way to do this is that once the final release is tagged, we build and upload all the binary wheels before uploading the sdist. As annoying as that is...

@fgregg

fgregg commented Mar 30, 2016

@njsmith that would be nice, but a few minutes' (or even a few hours') lag would be fine with me.

@jjhelmus
Contributor

Just to clarify: are the current Windows whl files on PyPI for the 1.11.0 release built against ATLAS? Is there a build script that can be shared?

@matthew-brett
Contributor

Yes, the wheels are built against ATLAS, but we're thinking of moving to OpenBLAS when we're confident of the results.

Build is automated via Appveyor: https://github.com/numpy/windows-wheel-builder

@techtonik
Contributor

23735 downloads in the last day. =)

It might be possible to create a hidden release - at least there is an option on the PyPI form https://pypi.python.org/pypi?%3Aaction=submit_form - and unhide it when all the files are ready.

@matthew-brett
Contributor

Sadly, the hidden release feature doesn't stop people getting that release via the command line; it only stops them seeing the release via the PyPI GUI:

https://sourceforge.net/p/pypi/support-requests/428/

@rowleya

rowleya commented Apr 28, 2016

I have tried the 64-bit Windows install of numpy and that works great, so thanks to all who have put in work on this.

What I am wondering is whether there is still a plan to do the same thing with scipy wheels. Is this awaiting the decision to move to OpenBLAS?

@carlkl
Member

carlkl commented Apr 28, 2016

On https://bitbucket.org/carlkl/mingw-w64-for-python/downloads there are some test wheels of scipy-0.17.0. These wheels have been built with mingwpy against @matthew-brett's builds of numpy: https://pypi.python.org/pypi/numpy/1.10.4

@matthew-brett
Contributor

Sorry if you said already, and I missed it - but do you get any test failures for these wheels?

Are you linking to the ATLAS shipped inside the numpy wheels?

@carlkl
Member

carlkl commented Apr 28, 2016

@matthew-brett, I announced these builds a month ago, but I don't remember where. Anyway, these builds link against numpy-atlas supplied by your numpy wheels.

The scipy-0.17.0-cp35-cp35m-win##.whl wheels are linked against the wrong C runtime, msvcrt.dll. For scipy this seems to be OK. Test logs are here: https://gist.github.com/carlkl/9e9aa45f49fedb1a1ef7

@matthew-brett
Contributor

Is that the right log? It has "NumPy is installed in D:\devel\py\python-3.4.4\lib\site-packages\numpy" at the end.

I was wondering if we are close to being able to provide a scipy wheel, even if it dangerously links against the wrong MSVC runtime, but it looks as if there are far too many errors for this build.

Do you get fewer errors for the 64-bit build? For the current best build against OpenBLAS 0.2.18?

@carlkl
Member

carlkl commented Apr 28, 2016

The 64-bit build has only 6 failures, all with:

FAIL: test_continuous_basic.test_cont_basic(<scipy.stats._continuous_distns.nct_gen object ...

I know: this cries for comparison with OpenBLAS. However, I've been stuck for the last 4 weeks for several reasons, as you may have noticed. Hopefully the situation will continue to improve.

@matthew-brett, I would appreciate using numpy MSVC builds with OpenBLAS. My latest builds are here:

@mikofski

As if mingwpy, conda-forge, Anaconda and Canopy weren't enough, here comes the Intel Distribution for Python, and it's free to download. It includes just the numerical tools (SciPy, NumPy, Numba, Scikit-Learn) plus some extras (the mpi4py Intel MPI interface and pyDAAL data analytics) and uses conda.

@mrslezak

No worries, the license expires 10/29/16, so these Intel builds are just a beta test, followed by probably an MKL+ etc. license fee. OpenBLAS builds will remain the open source solution, so thank you for providing these builds.

@giumas

giumas commented Jul 18, 2016

For 1.11.1, it looks like there is a missing Windows wheel on PyPI for Python 3.5 amd64.

Is there a particular reason for that? If I go to 1.11.0 (https://pypi.python.org/pypi/numpy/1.11.0), the wheel is there.

@matthew-brett
Contributor

Thanks for the report - I think we must have uploaded too soon, and therefore before all the wheels were built. I've uploaded the missing wheel. It looks like we need a test to make sure this doesn't happen again.

@giumas

giumas commented Jul 18, 2016

I've uploaded the missing wheel.

I have just tested it, and it works great!

Thank you so much for all the work done to make the Windows wheels available.

@pv
Member

pv commented Oct 24, 2016

Closing the issue -- wheels have been available for the last few releases.

@pv pv closed this as completed Oct 24, 2016
@waynenilsen

I understand that this issue is closed but I believe we should consider re-opening it.

This remains an issue for Windows users trying to get their scientific stack running without having to resort to conda. I still need to use the @cgohlke 'MKL' builds; see this related scipy issue, which remains open. Although wheels are being created, without scipy compatibility they are not usable for many.

@astrojuanlu
Contributor

@waynenilsen you have the instructions for installing the new wheels in the mailing list thread that is linked in the issue you just mentioned:

scipy/scipy#5461 (comment)

So if you do

pip install -f https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com/ --pre scipy

it should work for you.
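
A quick smoke test after installing from that index, sketched below; scipy.show_config() reports which BLAS/LAPACK the wheel was built against:

import scipy

print(scipy.__version__)   # should show the --pre build pulled from the extra index
scipy.show_config()        # reports the BLAS/LAPACK configuration the wheel links against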

@pv
Member

pv commented Sep 6, 2017 via email

@waynenilsen

This works great for me, @Juanlu001. I am really looking forward to when this is on PyPI!
