Output of restic version

restic 0.16.4 compiled with go1.21.6 on linux/amd64
I'm running on Arch Linux.
What backend/service did you use to store the repository?
Local filesystem.
Problem description / Steps to reproduce
restic restore with the `--verify` switch prints `verifying files in ...` followed by a percentage estimate. After the estimate reaches 100%, restic may continue to run for a long time (tens of minutes in the example below).
Expected behavior
restic doesn't:

- run for a long time without some progress update
- present a verification progress indicator at 100% when there are still tens of minutes of run time remaining

restic does:

- always make it clear which phase of operation is executing
Actual behavior

I see output like this at the point where it stops printing updates for tens of minutes:
repository 4ae3f911 opened (version 2, compression level auto)
found 32 old cache directories in /home/me/.cache/restic, run `restic cache --cleanup` to remove them
[0:02] 100.00% 29 / 29 index files loaded
restoring <Snapshot 459c74a3 of [/path/to/restic/repo] at 2024-01-05 20:23:32.433605071 +0000 UTC by root@host> to /path/to/restore/space
Summary: Restored 6187417 files/dirs (223.845 GiB) in 1:33:17
verifying files in /path/to/restore/space
[1:35:17] 100.00% 6187301 files/dirs 223.845 GiB, total 6187417 files/dirs 223.845 GiB
There is still a lot of disk activity during the time when it is no longer printing any updates.
Then, a long time later, it prints:
finished verifying 5156840 files in /path/to/restore/space (took 1h16m22.147s)
Do you have any idea what may have caused this?
No. I'm not sure what restic is doing in this time.
I don't see any other obvious processes, such as tracker or mlocate, that might be reading or writing the directories involved. `lsof` run on the source and target directories doesn't show anything.
Did restic help you today? Did it make you happy in any way?
restic is pleasantly simple to get started with :-)
Summary: Restored 6187417 files/dirs (223.845 GiB) in 1:33:17
verifying files in /path/to/restore/space
[1:35:17] 100.00% 6187301 files/dirs 223.845 GiB, total 6187417 files/dirs 223.845 GiB
Verifying files does not implement a progress bar so far xD. That is, the correct output would have been the following: the 100% progress line belongs to the restore phase, which also explains why 1:33:17 != 1:16:22.
[1:35:17] 100.00% 6187301 files/dirs 223.845 GiB, total 6187417 files/dirs 223.845 GiB
Summary: Restored 6187417 files/dirs (223.845 GiB) in 1:33:17
verifying files in /path/to/restore/space
finished verifying 5156840 files in /path/to/restore/space (took 1h16m22.147s)
The second part is, obviously, a feature request to add a progress bar to the verification phase.