This repository has been archived by the owner on May 5, 2021. It is now read-only.

I got error "error: resulting document after update is larger than 16777216" #641

Open
nhha1602 opened this issue Jun 19, 2020 · 6 comments


@nhha1602

Hi,

I tried to add data to NLU, but I got the error "error: resulting document after update is larger than 16777216".

Could you please advise on this?
Before this error, I got "error: bsonobj size: 16904485 (0x101f125) is invalid".

Is this a Mongo issue? If so, could you please advise?

Thank you a lot.

@nhha1602
Author

I found more information in the mongo log:

2020-06-20T14:17:44.094+0000 I NETWORK [conn12] end connection 172.18.0.7:34960 (7 connections now open)
2020-06-20T14:41:31.256+0000 E - [conn15] Assertion: BSONObjectTooLarge: BSONObj size: 16939217 (0x10278D1) is invalid. Size must be between 0 and 16793600(16MB) First element: $v: 1 src/mongo/bson/bsonobj.cpp 98
2020-06-20T14:46:20.183+0000 E - [conn8] Assertion: BSONObjectTooLarge: BSONObj size: 16809968 (0x1007FF0) is invalid. Size must be between 0 and 16793600(16MB) First element: $v: 1 src/mongo/bson/bsonobj.cpp 98
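For context, the sizes in the log can be checked against MongoDB's 16 MiB BSON document limit (16777216 bytes; the assertion message quotes a slightly larger internal ceiling of 16793600). A quick sanity check of the logged numbers:

```python
# MongoDB's user-facing BSON document limit is 16 MiB.
BSON_DOC_LIMIT = 16 * 1024 * 1024   # 16777216 bytes
INTERNAL_CEILING = 16793600          # ceiling quoted in the assertions above

# Sizes reported by the failed updates in the log.
logged_sizes = [16904485, 16939217, 16809968]

for size in logged_sizes:
    overshoot = size - BSON_DOC_LIMIT
    print(f"{size} bytes is {overshoot} bytes over the 16 MiB document limit")
```

Every logged size exceeds even the internal ceiling, so each update was rejected before it could be written.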

@znat
Contributor

znat commented Jun 20, 2020 via email

@nhha1602
Author

Hi, it seems your model hit the MongoDB document size limit. How many examples do you have?

Hi,

Yes, there are a lot ... around 100K, and maybe more.
Do you have any solutions for this? I think the data will only grow :) as we write more complicated stories.
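One way around the limit, assuming the model is stored as a single document holding an `examples` array (the field names here are guesses, not Botfront's actual schema), is to store one small document per example instead of one monolithic document. A minimal sketch of that split:

```python
def split_model_document(model_doc):
    """Turn one monolithic model document into many small per-example
    documents, each far below the 16 MiB BSON limit.

    Assumes a layout like {'_id': ..., 'examples': [...]}; the real
    Botfront schema may differ.
    """
    model_id = model_doc["_id"]
    return [
        {"model_id": model_id, "index": i, "example": ex}
        for i, ex in enumerate(model_doc["examples"])
    ]

# Hypothetical usage: 100K examples become 100K small documents that
# could each be inserted individually (e.g. via insert_many).
big_doc = {"_id": "model-1", "examples": [{"text": f"utterance {i}"} for i in range(3)]}
small_docs = split_model_document(big_doc)
print(len(small_docs))  # → 3
```

With this layout the 16 MiB limit applies per example rather than per model, so the collection can grow without hitting it.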

@engahmed1190

Can we use GridFS to break up large documents? @znat @nhha1602

@nhha1602
Author

Can we use GridFS to break up large documents? @znat @nhha1602

@engahmed1190 Do you know how to do it? Please help. Thank you.

@engahmed1190

Hello @nhha1602.

Please refer to this link

@znat, can we add this as an enhancement feature?
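For anyone landing here: GridFS sidesteps the document limit by splitting a large payload into many small chunk documents (255 KiB each by default) plus one metadata document. In pymongo this lives in the `gridfs` module; the chunking principle itself can be sketched in plain Python:

```python
CHUNK_SIZE = 255 * 1024  # GridFS default chunk size (261120 bytes)

def to_chunks(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Split a payload into GridFS-style chunk documents, each well
    under the 16 MiB BSON limit."""
    return [
        {"n": i, "data": data[off:off + chunk_size]}
        for i, off in enumerate(range(0, len(data), chunk_size))
    ]

def from_chunks(chunks):
    """Reassemble the original payload from its ordered chunks."""
    return b"".join(c["data"] for c in sorted(chunks, key=lambda c: c["n"]))

# A 20 MiB payload would overflow a single document, but splits cleanly:
payload = b"x" * (20 * 1024 * 1024)
chunks = to_chunks(payload)
assert all(len(c["data"]) <= CHUNK_SIZE for c in chunks)
assert from_chunks(chunks) == payload
```

With pymongo the real equivalent is `gridfs.GridFS(db).put(data)` and `fs.get(file_id).read()`; whether Botfront's model data fits that blob-style access pattern (GridFS stores opaque bytes, so the examples could no longer be queried individually) is the open question for this enhancement.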
