Releases: arogozhnikov/einops

v0.8.0: tinygrad, small fixes and updates

28 Apr 04:07

TLDR

  • tinygrad backend added (a minimal usage sketch follows this list)
  • resolved a warning in Python 3.11 related to docstrings
  • removed a graph break for unpack
  • breaking: TF layers were updated to follow the new instructions; the new layers are compatible with TF 2.16 and not compatible with old TF (they certainly do not work with TF 2.13)
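
A minimal usage sketch of the new backend, assuming tinygrad is installed (shapes and names are illustrative):

```python
# Minimal sketch: einops operations now accept tinygrad tensors directly.
from tinygrad import Tensor
from einops import rearrange

x = Tensor.rand(2, 3, 4)                # a tinygrad tensor
y = rearrange(x, 'b c d -> b (c d)')    # flatten the last two axes
print(y.shape)                          # (2, 12)
```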

Full Changelog: v0.7.0...v0.8.0

v0.7.0: torch.compile, preserve axis identity, array API

01 Oct 01:13

Major changes:

  • torch.compile just works: registration of operations happens automatically (see the sketch after this list)
  • JAX's distributed arrays can use ellipses, and in general ellipsis processing now preserves axis identity. This required changing the internal gears of einops.
  • Array API: einops operations can be used with any framework that follows the standard (see einops.array_api)
  • Python 3.7 is dead. Goodbye, you were great at the time
  • Gluon is dropped, as previously announced
  • reduce/repeat/rearrange all accept lists now
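
A minimal sketch of the torch.compile point, assuming torch >= 2.0 (the function and shapes are illustrative):

```python
# Minimal sketch: einops ops work inside torch.compile without manual registration.
import torch
from einops import rearrange

@torch.compile
def flatten_channels(x):
    return rearrange(x, 'b c h w -> b (c h w)')

print(flatten_channels(torch.randn(2, 3, 4, 5)).shape)   # torch.Size([2, 60])
```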

Full Changelog: v0.6.1...v0.7.0

v0.7.0rc2: allow dynamic shapes in torch.compile

14 Aug 05:41
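
A minimal sketch of what this enables, assuming torch >= 2.0; torch.compile(dynamic=True) is used purely as an illustration:

```python
# Minimal sketch: an einops op inside a graph compiled with dynamic shapes.
import torch
from einops import rearrange

@torch.compile(dynamic=True)
def flatten(x):
    return rearrange(x, 'b c h w -> b (c h w)')

flatten(torch.randn(2, 3, 4, 5))   # first call compiles
flatten(torch.randn(7, 3, 4, 5))   # a different batch size can reuse the compiled graph
```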

Full Changelog: v0.7.0rc1...v0.7.0rc2

v0.7.0rc1: torch.compile, preserve axis identity, array API

08 Jul 02:54

Major changes:

  • torch.compile just works: registration of operations happens automatically
  • JAX's distributed arrays can use ellipses, and in general ellipsis processing now preserves axis identity. This required changing the internal gears of einops.
  • Array API: einops operations can be used with any framework that follows the standard (see einops.array_api)
  • Python 3.7 is dead. Goodbye, you were great at the time
  • Gluon is dropped, as previously announced
  • reduce/repeat/rearrange all accept lists now (see the sketch after this list)
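
A minimal sketch of the list-input point, using numpy arrays for illustration (the list is treated as an extra leading axis):

```python
# Minimal sketch: reduce/repeat/rearrange accept a plain list of arrays.
import numpy as np
from einops import reduce, rearrange

frames = [np.random.rand(3, 4) for _ in range(5)]      # python list of 5 arrays

mean_frame = reduce(frames, 'b h w -> h w', 'mean')    # average over the list axis
tiled = rearrange(frames, 'b h w -> h (b w)')          # lay frames out side by side
print(mean_frame.shape, tiled.shape)                   # (3, 4) (3, 20)
```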

Full Changelog: v0.6.1...v0.7.0rc1

v0.6.2rc0: drop Python 3.7 + preserve axis identity

05 Jul 08:17

This pre-release is published to allow public testing of the new caching logic: pattern analysis now depends on input dimensionality, so that axis identity is preserved.
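
For context, the same pattern applied to inputs of different rank now gets a separate cached analysis per rank; a minimal numpy sketch:

```python
# Minimal sketch: one '...' pattern, inputs of different rank; pattern analysis
# is keyed by input dimensionality, so each axis keeps its identity.
import numpy as np
from einops import rearrange

pattern = 'b ... c -> b c ...'
print(rearrange(np.zeros((2, 5, 3)), pattern).shape)      # (2, 3, 5)
print(rearrange(np.zeros((2, 5, 7, 3)), pattern).shape)   # (2, 3, 5, 7)
```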

Full Changelog: v0.6.1...v0.6.2rc0

v0.6.1: support paddle, support torch.compile

19 Apr 04:13

  • einops layers interplay perfectly with torch.compile
  • einops operations need registration: run einops._torch_specific.allow_ops_in_compiled_graph() before torch.compile (see the sketch after this list)
  • paddle is now supported (thanks to @zhouwei25)
  • as previously announced, support of mxnet is dropped
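
A minimal sketch of the registration step, assuming torch >= 2.0 (this manual step applies to 0.6.1 and is not needed from 0.7.0 onwards):

```python
# Minimal sketch for einops 0.6.1: register einops ops once before compiling.
import torch
from einops import rearrange
from einops._torch_specific import allow_ops_in_compiled_graph

allow_ops_in_compiled_graph()   # one-time registration for torch.compile

@torch.compile
def flatten(x):
    return rearrange(x, 'b c h w -> b (c h w)')

print(flatten(torch.randn(2, 3, 4, 5)).shape)   # torch.Size([2, 60])
```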

New Contributors

  • @zhouwei25 made their first contribution in #242

Full Changelog: v0.6.0...v0.6.1

v0.6.0: pack and unpack

09 Nov 20:32
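
A minimal numpy sketch of the headline feature, pack and unpack (shapes are illustrative):

```python
# Minimal sketch: pack joins tensors along one '*' axis, unpack splits them back.
import numpy as np
from einops import pack, unpack

image = np.zeros((32, 32, 3))
depth = np.zeros((32, 32))

packed, ps = pack([image, depth], 'h w *')     # ps records how '*' was composed
print(packed.shape)                            # (32, 32, 4)

image2, depth2 = unpack(packed, ps, 'h w *')   # exact inverse of the pack call
print(image2.shape, depth2.shape)              # (32, 32, 3) (32, 32)
```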

Announcement

Sunsetting experimental mxnet support: there is no demand, and the package is outdated, with numerous deprecations and poor support of corner cases. 0.6.0 will be the last release with the mxnet backend.

Full Changelog: v0.5.0...v0.6.0

Einops v0.5.0: einsum + support of flax and oneflow

03 Oct 06:38
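
A minimal sketch of the new einops.einsum (tensors first, pattern last, multi-letter axis names allowed):

```python
# Minimal sketch of einops.einsum with readable axis names.
import numpy as np
from einops import einsum

x = np.random.rand(4, 8)
y = np.random.rand(8, 3)

z = einsum(x, y, 'batch hidden, hidden out -> batch out')
print(z.shape)   # (4, 3)
```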

Full Changelog: v0.4.1...v0.5.0

Einops v0.4.1: Avoid importing numpy if it is not required

04 Mar 09:31

Full Changelog: v0.4.0...v0.4.1

Einops v0.4.0: EinMix and torch.jit.script

18 Jan 07:30

Main Changes

  • torch.jit.script is supported (in addition to the previously supported torch.jit.trace)
  • EinMix (a Swiss-army knife for next-gen MLPs) is added: a much-improved einsum/linear layer (see the sketch after this list)
  • einops.repeat in torch does not create a copy when possible
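
A minimal sketch of EinMix as a drop-in linear layer (torch version; axis names and sizes are illustrative):

```python
# Minimal sketch: EinMix expresses y[b, o] = sum_i x[b, i] * W[i, o] + bias[o].
import torch
from einops.layers.torch import EinMix

layer = EinMix('b i -> b o', weight_shape='i o', bias_shape='o', i=32, o=64)
x = torch.randn(8, 32)
print(layer(x).shape)   # torch.Size([8, 64])
```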

Full Changelog: v0.3.2...v0.4.0