BlockingIOError on Block download #379

Open
Jef-GB opened this issue May 25, 2023 · 2 comments

Jef-GB commented May 25, 2023

The Goal

I am trying to write a file from my Windows device to an embedded controller via CANopen. To transfer the file I am using the block transfer defined in CiA 301, as implemented in this stack. I have created a script that writes to the bus, based on the block download example from the documentation.

import os

FIRMWARE_PATH = './notebooks/test.txt'
FILESIZE = os.path.getsize(FIRMWARE_PATH)
BLOCK_SIZE = 127
print(FILESIZE)  # Size = 20473 bytes

with open(FIRMWARE_PATH, 'rb') as infile, \
        interface._node.sdo.open(index=0x2000, subindex=0x2, mode='wb', buffering=BLOCK_SIZE,
                                 size=FILESIZE, block_transfer=True) as outfile:

    # Iteratively transfer data without having to read all into memory
    while True:
        data = infile.read(BLOCK_SIZE)
        if not data:
            break
        outfile.write(data)

The Problem

Running this code correctly starts a block transfer and produces correct data on the bus (this has been validated using PCAN-View). After transferring one, or sometimes a few, blocks of data, an error is thrown on the Python side, ending the transfer.

INFO:canopen.sdo.client:Initiating block download for 0x2000:2
DEBUG:canopen.sdo.client:Expected size of data is 20473 bytes
DEBUG:can.pcan:Data: bytearray(b'\xc6\x00 \x02\xf9O\x00\x00')
DEBUG:can.pcan:Type: <class 'bytearray'>
DEBUG:canopen.sdo.client:Server requested a block size of 127
...
ERROR:canopen.sdo.client:Block transfer was not finished
DEBUG:canopen.sdo.client:Ending block transfer...
DEBUG:can.pcan:Data: bytearray(b'\xdd\x00\x00\x00\x00\x00\x00\x00')
DEBUG:can.pcan:Type: <class 'bytearray'>
Traceback (most recent call last):
  File "C:\...\notebooks\test.py", line 32, in <module>
    outfile.write(data)
BlockingIOError: [Errno 0] write could not complete without blocking

During handling of the above exception, another exception occurred:

BlockingIOError: [Errno 0] write could not complete without blocking

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\...\notebooks\test.py", line 23, in <module>
    with open(FIRMWARE_PATH, 'rb') as infile, \
  File "C:\...\.env\lib\site-packages\canopen\sdo\client.py", line 790, in close
    raise SdoCommunicationError("Block download unsuccessful")
canopen.sdo.exceptions.SdoCommunicationError: Block download unsuccessful

A BlockingIOError is thrown, causing the transfer to be aborted. Unfortunately I do not have any other CANopen devices supporting block transfers to fully rule out the embedded side, but the responses from the device follow the CANopen specification. Transferring small files (two lines of text) does seem to work; bigger files result in this error (an EDS was used for this test).

Any ideas what the issue could be, what I am doing wrong, or what a solution could be?

Jef-GB commented May 25, 2023

Update

I have found a workaround, but I feel it is not the way to go. By changing BLOCK_SIZE to be equal to FILESIZE, the transfer does complete. This somewhat defeats the purpose of the while loop in the example and, as far as I know, loads everything to be transferred into memory.

import os

FIRMWARE_PATH = './notebooks/firmware.bin'
FILESIZE = os.path.getsize(FIRMWARE_PATH)
BLOCK_SIZE = FILESIZE
print(FILESIZE)  # Size = 20473 bytes

with open(FIRMWARE_PATH, 'rb') as infile, \
        interface._node.sdo.open(index=0x2000, subindex=0x2, mode='wb', buffering=BLOCK_SIZE,
                                 size=FILESIZE, block_transfer=True) as outfile:
    data = infile.read(BLOCK_SIZE)
    if data:
        outfile.write(data)

It does work with my expected file size, but it is not optimal that everything gets loaded into memory.

christiansandberg (Owner) commented

The problem is that the library makes use of the buffering functionality in io.BufferedWriter. If you want full control, I think it is better to set buffering=0 and BLOCK_SIZE = 7. Otherwise, set the buffering argument to something divisible by 7.
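
For reference, a minimal sketch of that suggestion, not tested against hardware and reusing the interface handle and the 0x2000:2 entry from the snippets above (both carried over from the original post as assumptions). With buffering=0 each write() goes directly to the underlying block-transfer stream instead of through io.BufferedWriter:

import os

FIRMWARE_PATH = './notebooks/firmware.bin'  # hypothetical path, as in the workaround above
FILESIZE = os.path.getsize(FIRMWARE_PATH)
BLOCK_SIZE = 7  # one SDO segment carries 7 bytes of payload

with open(FIRMWARE_PATH, 'rb') as infile, \
        interface._node.sdo.open(index=0x2000, subindex=0x2, mode='wb', buffering=0,
                                 size=FILESIZE, block_transfer=True) as outfile:
    # With buffering disabled, each 7-byte chunk is handed straight to the
    # block download stream, avoiding partial writes inside io.BufferedWriter.
    while True:
        data = infile.read(BLOCK_SIZE)
        if not data:
            break
        outfile.write(data)

Alternatively, per the comment above, keeping the original loop and passing a buffering value that is a multiple of 7 should keep the buffered writer's flushes aligned with whole segments.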
