read records stored in AWS S3 #268

Open
entest-hai opened this issue Dec 15, 2020 · 2 comments · May be fixed by #298

Comments

@entest-hai

Can we add a function to read records stored in AWS S3, like pandas does?

@Dubrzr
Collaborator

Dubrzr commented Dec 16, 2020

This issue is linked to #73.

You should be able to open the file on any filesystem (local, remote S3, HDFS...) and read its contents with wfdb-python, rather than implementing connections to these different filesystems directly in wfdb-python :)
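
For anyone looking for a concrete starting point, here is a minimal sketch of that filesystem-agnostic idea using fsspec (the abstraction layer that s3fs plugs into). The bucket name and record are placeholders, not a real public dataset:

import fsspec

# Open the header file on S3 through fsspec; any supported backend
# (local paths, s3://, hdfs://, ...) works the same way. anon=True is
# passed through to the s3fs filesystem for public buckets.
with fsspec.open('s3://BUCKET_NAME/100.hea', mode='rb', anon=True) as f:
    header_lines = f.read().decode('ascii').splitlines()

print(header_lines)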

@Lucas-Mc
Collaborator

Lucas-Mc commented Apr 27, 2021

Hey guys, thanks for the suggestion! I can get this to work with the following, so I'll make a pull request shortly:

>>> import s3fs
>>> fs = s3fs.S3FileSystem(anon=True)
>>> with fs.open('s3://BUCKET_NAME/100.hea') as f: print(f.readlines())
... 
[b'# unnecessary comment\n', b'100 2 360 650000\r\n', b'100.dat 212 200 11 1024 995 -22131 0 MLII\n',
b'\r\n', b'100.dat 212 200 11 1024 1011 20052 0 V5\r\n', b'# 69 M 1085 1629 x1\r\n',
b'# Aldomet, Inderal\r\n']
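
In the meantime, a rough sketch of how a full record (header plus signal file) could be read today: copy the files to a local temporary directory with s3fs, then call wfdb.rdrecord on them. BUCKET_NAME and record 100 are placeholders here:

import tempfile

import s3fs
import wfdb

fs = s3fs.S3FileSystem(anon=True)

with tempfile.TemporaryDirectory() as tmpdir:
    # Copy the header and signal files for record 100 to local disk.
    fs.get('s3://BUCKET_NAME/100.hea', f'{tmpdir}/100.hea')
    fs.get('s3://BUCKET_NAME/100.dat', f'{tmpdir}/100.dat')

    # Read the record through the existing local-file code path.
    record = wfdb.rdrecord(f'{tmpdir}/100')
    print(record.sig_name, record.fs, record.p_signal.shape)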

Lucas-Mc added a commit that referenced this issue Apr 29, 2021
Lucas-Mc linked a pull request Apr 29, 2021 that will close this issue