
Backup to NFS target fails with I/O error #5001

Closed
syphernl opened this issue Apr 5, 2024 · 7 comments
Labels
bug network-storage Network storage related bugs stale

Comments

@syphernl

syphernl commented Apr 5, 2024

Describe the issue you are experiencing

I configured my HASS to back up to my OMV instance. While the backup is being made I can see the file being written to the target, but the backup eventually fails and doesn't show up in the list.

  • Backing up each "main category" individually (HA, Folders, Addons) works just fine and the backup shows up in the list
  • A backup made with ha core update --backup --version 2024.4.1 worked fine and shows up in the list (but this seems to be a "HA backup" without addons/folders)
  • Creating a backup via the UI or the CLI results in the same outcome

What type of installation are you running?

Home Assistant OS

Which operating system are you running on?

Home Assistant Operating System

Steps to reproduce the issue

  1. Mount NFS storage
  2. Set NFS as default backup location
  3. Make a full backup
    ...

Anything in the Supervisor logs that might be useful for us?

2024-04-05 17:29:30.523 INFO (MainThread) [supervisor.backups.manager] Backup 66ee2b63 starting stage addon_repositories
2024-04-05 17:29:30.523 INFO (MainThread) [supervisor.backups.manager] Backup 66ee2b63 starting stage docker_config
2024-04-05 17:29:30.523 INFO (MainThread) [supervisor.backups.manager] Creating new full backup with slug 66ee2b63
2024-04-05 17:29:30.579 INFO (MainThread) [supervisor.backups.manager] Backup 66ee2b63 starting stage addons
2024-04-05 17:29:30.609 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on core_ssh
2024-04-05 17:29:30.619 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon core_ssh
2024-04-05 17:29:30.804 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on core_mariadb
2024-04-05 17:31:11.747 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon core_mariadb
2024-04-05 17:31:11.781 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on core_git_pull
2024-04-05 17:31:11.787 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon core_git_pull
2024-04-05 17:31:11.817 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on core_configurator
2024-04-05 17:31:11.823 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon core_configurator
2024-04-05 17:31:11.856 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on a0d7b954_phpmyadmin
2024-04-05 17:31:11.862 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon a0d7b954_phpmyadmin
2024-04-05 17:31:11.896 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on a0d7b954_logviewer
2024-04-05 17:31:11.900 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon a0d7b954_logviewer
2024-04-05 17:31:11.937 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on 402f1039_eufy_security_ws
2024-04-05 17:31:11.944 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon 402f1039_eufy_security_ws
2024-04-05 17:31:11.972 INFO (MainThread) [supervisor.addons.addon] Building backup for add-on 5c53de3b_esphome
2024-04-05 17:31:12.626 INFO (MainThread) [supervisor.addons.addon] Finish backup for addon 5c53de3b_esphome
2024-04-05 17:31:12.626 INFO (MainThread) [supervisor.backups.manager] Backup 66ee2b63 starting stage home_assistant
2024-04-05 17:31:12.665 INFO (MainThread) [supervisor.homeassistant.module] Backing up Home Assistant Core config folder
2024-04-05 17:31:21.075 WARNING (MainThread) [supervisor.addons.options] Unknown option 'host' for MariaDB (core_mariadb)
2024-04-05 17:31:21.075 WARNING (MainThread) [supervisor.addons.options] Unknown option 'host' for MariaDB (core_mariadb)
2024-04-05 17:31:50.965 INFO (MainThread) [supervisor.homeassistant.module] Backup Home Assistant Core config folder done
2024-04-05 17:31:50.972 INFO (MainThread) [supervisor.backups.manager] Backup 66ee2b63 starting stage folders
2024-04-05 17:31:50.973 INFO (SyncWorker_2) [supervisor.backups.backup] Backing up folder share
2024-04-05 17:31:50.979 INFO (SyncWorker_2) [supervisor.backups.backup] Backup folder share done
2024-04-05 17:31:50.981 INFO (SyncWorker_0) [supervisor.backups.backup] Backing up folder addons/local
2024-04-05 17:31:50.984 INFO (SyncWorker_0) [supervisor.backups.backup] Backup folder addons/local done
2024-04-05 17:31:50.985 INFO (SyncWorker_4) [supervisor.backups.backup] Backing up folder ssl
2024-04-05 17:31:50.988 INFO (SyncWorker_4) [supervisor.backups.backup] Backup folder ssl done
2024-04-05 17:31:50.989 INFO (SyncWorker_3) [supervisor.backups.backup] Backing up folder media
2024-04-05 17:31:50.993 INFO (SyncWorker_3) [supervisor.backups.backup] Backup folder media done
2024-04-05 17:31:50.994 INFO (MainThread) [supervisor.backups.manager] Backup 66ee2b63 starting stage finishing_file
2024-04-05 17:32:52.884 ERROR (MainThread) [supervisor.backups.manager] Backup 66ee2b63 error
Traceback (most recent call last):
  File "/usr/src/supervisor/supervisor/backups/manager.py", line 261, in _do_backup
    async with backup:
  File "/usr/src/supervisor/supervisor/backups/backup.py", line 363, in __aexit__
    self._outer_secure_tarfile.__exit__(
  File "/usr/local/lib/python3.12/site-packages/securetar/__init__.py", line 140, in __exit__
    self._tar.close()
  File "/usr/local/lib/python3.12/tarfile.py", line 1975, in close
    self.fileobj.close()
OSError: [Errno 5] I/O error
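The traceback shows the OSError surfaces only in `__aexit__`, when `securetar` closes the underlying tar file object: data written earlier sits in userspace/page-cache buffers, and a write failure on the NFS mount may only be reported once the file is flushed at close. A minimal sketch of that behavior (`FlakyNFSFile` is a made-up stand-in for an NFS-backed file whose final flush fails, not anything from Supervisor):

```python
import io

class FlakyNFSFile(io.RawIOBase):
    """Simulates an NFS-backed file where the kernel reports EIO only
    when buffered data is finally flushed (an assumption, for illustration)."""
    def writable(self):
        return True
    def write(self, b):
        # Every actual write to the backing store fails with EIO.
        raise OSError(5, "I/O error")

buffered = io.BufferedWriter(FlakyNFSFile())
buffered.write(b"backup payload")  # succeeds: data only reaches the buffer
try:
    buffered.close()               # flush happens here and surfaces EIO
except OSError as err:
    print(err.errno)               # -> 5, matching the traceback above
```

This is why the backup can appear to finish writing and then fail at the very end: the error is deferred until close, just like `self.fileobj.close()` in the traceback.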

System Health information

System Information

version core-2024.4.1
installation_type Home Assistant OS
dev false
hassio true
docker true
user root
virtualenv false
python_version 3.12.2
os_name Linux
os_version 6.6.20-haos
arch x86_64
timezone Europe/Amsterdam
config_dir /config
Home Assistant Community Store
GitHub API ok
GitHub Content ok
GitHub Web ok
GitHub API Calls Remaining 4995
Installed Version 1.34.0
Stage running
Available Repositories 1415
Downloaded Repositories 99
HACS Data ok
Home Assistant Cloud
logged_in false
can_reach_cert_server ok
can_reach_cloud_auth ok
can_reach_cloud ok
Home Assistant Supervisor
host_os Home Assistant OS 12.1
update_channel stable
supervisor_version supervisor-2024.03.1
agent_version 1.6.0
docker_version 24.0.7
disk_total 30.8 GB
disk_used 10.6 GB
healthy true
supported true
board ova
supervisor_api ok
version_api ok
installed_addons Terminal & SSH (9.10.0), MariaDB (2.6.1), Git pull (7.14.0), File editor (5.8.0), phpMyAdmin (0.9.1), Log Viewer (0.17.0), eufy-security-ws (1.8.0-2), ESPHome (2024.3.1)
Dashboards
dashboards 17
resources 57
views 60
error /config/ui-lovelace_minimalist.yaml not found
mode yaml
Recorder
oldest_recorder_run April 2, 2024 at 07:41
current_recorder_run April 5, 2024 at 17:06
estimated_db_size 1529.25 MiB
database_engine mysql
database_version 10.6.12
Spotify
api_endpoint_reachable ok

Supervisor diagnostics

No response

Additional information

No response

@syphernl syphernl added the bug label Apr 5, 2024
@agners
Member

agners commented Apr 8, 2024

Hm, is there additional info in the host logs?

Could it be a size issue maybe?

@syphernl
Author

syphernl commented Apr 8, 2024

It does appear to be a size issue, but it's not clear what the cause is. The OMV box with the NFS export has ample disk space available and doesn't impose any limits (that I know of).
The backup itself is 1.3 GB.

@agners
Member

agners commented Apr 8, 2024

Can you check the host logs while taking a backup? Maybe also check the logs on the OMV side for the same time frame. I wonder if there is a more detailed error showing which I/O error NFS is encountering.

@agners agners added the network-storage Network storage related bugs label Apr 8, 2024
@syphernl
Author

syphernl commented Apr 9, 2024

Sadly there is nothing in the host or OMV logs whenever a backup to the NFS target is running.

A few days ago my backup was ~850 MB and it failed as soon as it appeared to have finished writing the file.
Today the backups are 1.3 GB, and again it fails at 1.3 GB.
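One way to isolate whether the failure comes from the NFS mount itself rather than from Supervisor would be to write a file of roughly the failing backup's size directly to the mount and fsync it. A sketch under assumptions (the mount path and file name are hypothetical; adjust to your setup):

```python
import os

def nfs_write_test(mount: str, size_mib: int = 1300) -> None:
    """Write size_mib MiB to `mount`, fsync, and close, mimicking how the
    Supervisor's tar stream only fails at the very end of the write."""
    chunk = b"\0" * (1 << 20)  # 1 MiB of zeros
    path = os.path.join(mount, "nfs-write-test.bin")  # hypothetical name
    try:
        with open(path, "wb") as f:
            for _ in range(size_mib):
                f.write(chunk)
            os.fsync(f.fileno())  # force the data out to the NFS server
        print("write succeeded")
    except OSError as err:
        print(f"write failed: {err}")
    finally:
        if os.path.exists(path):
            os.remove(path)

# nfs_write_test("/path/to/nfs/mount")  # hypothetical mount path
```

If this reproduces the EIO at around the same size, the problem is between the kernel NFS client and the server (e.g. the VPN link), not in the backup code.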

@agners
Member

agners commented Apr 9, 2024

I've just tested backups to NFS storage in my test setup, and things generally do seem to work here. It must be some combination of things causing this problem 😢

Did you try to store the same backup on a different NFS server?

I am a bit out of ideas, honestly. Normally I would expect some kind of error in the kernel logs on either side when an I/O error is reported to user space.

@bdraco do you maybe have an idea what could be going on here?

@syphernl
Author

syphernl commented Apr 9, 2024

I have a feeling the connection to the NFS host may get interrupted for a (split) second, breaking the copy process.
It is, however, rather odd that it only gets cut off after the backup has (seemingly) been written, not somewhere randomly in between.

My OMV box is running offsite (connected via VPN) and none of the other services running over the VPN are experiencing issues like this.

As a test I have set up a different NFS server on a Debian box (locally) and it works fine there. The backup gets written and the backup process ends without an error.


github-actions bot commented May 9, 2024

There hasn't been any activity on this issue recently. Due to the high number of incoming GitHub notifications, we have to clean some of the old issues, as many of them have already been resolved with the latest updates.
Please make sure to update to the latest version and check if that solves the issue. Let us know if that works for you by adding a comment 👍
This issue has now been marked as stale and will be closed if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label May 9, 2024
@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale May 16, 2024