Debian Patches

Status for pytorch/1.13.1+dfsg-4

Patch Description Author Forwarded Bugs Origin Last update
dirtyhack.patch
  Description: The elegant patching work is based on the master branch
  (https://github.com/pytorch/pytorch/issues/14699), and we will be able to
  use that solution in the next upstream release. I don't want to rebase my
  patches back to this version, so let's go with a fast, yet dirty hack.
  Author: Mo Zhou
  Forwarded: no

mkldnn.patch
  Forwarded: no

zstd.patch
  Forwarded: no

cmake-strip-3rdparty.patch
  Forwarded: no

shebang.patch
  Description: Change shebangs from "#!/usr/bin/env ..." to absolute paths.
  The patch was generated with:
    find . -type f -name '*.py' -exec sed -i -e 's@#!/usr/bin/env @#!/usr/bin/@g' '{}' \;
    find . -type f -name '*.sh' -exec sed -i -e 's@#!/usr/bin/env @#!/usr/bin/@g' '{}' \;
    git diff > x.shebang
    cat x.shebang >> debian/patches/shebang.patch
  Author: Mo Zhou
  Forwarded: no
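The substitution that the find/sed pipeline above applies to each script can be sketched in Python (the sample shebang line here is hypothetical, just to show the rewrite):

```python
import re

# The sed expression 's@#!/usr/bin/env @#!/usr/bin/@g' rewrites env-based
# shebangs to absolute interpreter paths, e.g. for Debian policy compliance.
line = "#!/usr/bin/env python3"
fixed = re.sub(r"^#!/usr/bin/env ", "#!/usr/bin/", line)
print(fixed)  # prints "#!/usr/bin/python3"
```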
pytorch_glog_update.patch
  Description: Move IsGoogleLoggingInitialized() to the public API. It was an
  internal function, and the project used hacks to reach it. Now it is part
  of the public API.
  Author: Laszlo Boszormenyi (GCS) <gcs@debian.org>
  Forwarded: no
  Last update: 2022-03-08

flatbuffers-v2.0.8.patch
  Description: Fix flatbuffers API breakage. In debian/rules, you can find a
  line invoking `flatc` to generate some C++ code from a flatbuffers schema
  file. The pytorch upstream code is based on an early version of
  flatbuffers, while our flatbuffers version (v2.0.8) is much newer than the
  one supported by pytorch upstream. As a result, there are some API
  mismatches. This patch fixes these API mismatches and hence the FTBFS.
  However, upstream (seemingly) does not intend to bump their dependency.
  Author: Mo Zhou
  Forwarded: yes

fix-wrong-shebang.patch
  Description: Fix wrong shebang.
  Forwarded: no

fmtlib-revert.patch
  Description: Revert the string formatting change; basically a partial
  revert of https://github.com/pytorch/pytorch/pull/76977. We encountered a
  strange FTBFS issue when compiling against libfmt, so we revert back to
  the C++ standard library and try again. This patch is not yet verified.
  (Touches torch/csrc/Exceptions.cpp.)
  Forwarded: no

0009-Fix-corner-cases-with-permute-88226.patch
  Description: Fix corner cases with permute (#88226). Previously the
  permute function was extended to behave like the `order` function for
  first-class dimensions. However, unlike `permute`, `order` doesn't have a
  keyword argument `dims`, and there is no way to add it in a way that lets
  both permute and order keep the same behavior. So this change just removes
  the extra functionality of permute, which wasn't documented anyway.
  Fixes #88187.
  Pull Request resolved: https://github.com/pytorch/pytorch/pull/88226
  Approved by: https://github.com/zou3519
  Author: Zachary DeVito <zdevito@meta.com>
  Forwarded: no
  Last update: 2022-11-01

0010-fix-possible-overflow-83389.patch
  Description: Fix possible overflow (#83389). Fix some errors detected by
  static analysis.
  Pull Request resolved: https://github.com/pytorch/pytorch/pull/83389
  Approved by: https://github.com/zou3519
  Author: cyy <cyyever@outlook.com>
  Forwarded: no
  Last update: 2022-11-29

0011-Use-python-compat-from-python-pythoncapi_compat-9116.patch
  Description: Use python compat from python/pythoncapi_compat (#91163).
  Pull Request resolved: https://github.com/pytorch/pytorch/pull/91163
  Approved by: https://github.com/ezyang
  Author: albanD <desmaison.alban@gmail.com>
  Forwarded: no
  Last update: 2022-12-21

0013-Fix-test_math_ops-for-python-3.11-91774.patch
  Description: Fix `test_math_ops` for python-3.11 (#91774). From the
  [math.pow](https://docs.python.org/3/library/math.html#math.pow)
  documentation:
  > Changed in version 3.11: The special cases `pow(0.0, -inf)` and
  > `pow(-0.0, -inf)` were changed to return `inf` instead of raising
  > [`ValueError`](https://docs.python.org/3/library/exceptions.html#ValueError),
  > for consistency with IEEE 754.
  Pull Request resolved: https://github.com/pytorch/pytorch/pull/91774
  Approved by: https://github.com/ngimel
  Author: Nikita Shulga <nshulga@meta.com>
  Forwarded: no
  Last update: 2023-01-06
0015-Add-missing-gc-untrack-for-cpp-autograd-Nodes-92351.patch
  Description: Add missing gc untrack for cpp autograd Nodes (#92351). Fixes
  https://github.com/pytorch/pytorch/issues/91161; the assertion after the
  warning seems to be linked to the fact that we didn't untrack this
  properly. Python 3.11 added a warning when this untracking is not done
  properly before tp_free.
  Pull Request resolved: https://github.com/pytorch/pytorch/pull/92351
  Approved by: https://github.com/ezyang
  Author: albanD <desmaison.alban@gmail.com>
  Forwarded: no
  Last update: 2023-01-18

0014-Skip-builtins-while-enumerating-class-methods-91805.patch
  Description: Skip builtins while enumerating class methods (#91805). This
  is needed to support `enum.Enum` derived classes in Python-3.11, which
  adds `_new_member_` to the classdict, see:
  https://github.com/python/cpython/blob/15c44789bb125b93e96815a336ec73423c47508e/Lib/enum.py#L529

  The following snippet illustrates the problem with the previous iteration
  of the code on 3.11:

```python
from enum import Enum
import inspect

class Color(Enum):
    RED = 1
    GREEN = 2

def print_routines(cls):
    print(cls.__name__)
    for name in cls.__dict__:
        fn = getattr(cls, name)
        if inspect.isroutine(fn):
            print(name, fn, f"has_globals: {hasattr(fn, '__globals__')}")

print_routines(Color)
```

  Pull Request resolved: https://github.com/pytorch/pytorch/pull/91805
  Approved by: https://github.com/albanD, https://github.com/suo
  Author: Nikita Shulga <nshulga@meta.com>
  Forwarded: no
  Last update: 2023-01-06
