Concatenate chunked blob data when saving to the database

While extracting a zip and saving its contents to the database, any
file bigger than 65535 bytes gets truncated during save. This happens
because the data is a LimitingReader object whose internal content
iterator is a BlobIterator, and BlobIterator has an internal chunk
limit of 65535 bytes, so a single data.read() returns only the first
chunk.
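
To illustrate the failure mode, here is a minimal sketch with a
hypothetical stand-in for the LimitingReader/BlobIterator pair (the
ChunkedReader class below is illustrative only; the 65535 chunk limit
comes from the description above). A single read() returns just the
first chunk, so everything after it is silently dropped:

CHUNK_LIMIT = 65535

class ChunkedReader(object):
    # Hypothetical stand-in: read() returns at most one chunk per call,
    # mimicking a LimitingReader backed by a chunked content iterator.
    def __init__(self, payload):
        self._chunks = (payload[i:i + CHUNK_LIMIT]
                        for i in range(0, len(payload), CHUNK_LIMIT))

    def read(self):
        return next(self._chunks, b'')

data = ChunkedReader(b'x' * 100000)
assert len(data.read()) == CHUNK_LIMIT  # the remaining 34465 bytes are lost

Looping until read() returns an empty chunk, as the change below does,
recovers the full payload.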

Change-Id: I3220abcd7cbc38db821478989c4c226a6f3366b5
Closes-Bug: #1821866
Author: Kushal Agrawal
Date:   2019-03-27 17:06:33 +05:30
Parent: 26e4184c8d
Commit: 4f7543a2a0

@@ -775,7 +775,13 @@ def save_blob_data_batch(context, blobs, session):
     for blob_data_id, data in blobs:
         blob_data = models.ArtifactBlobData()
         blob_data.id = blob_data_id
-        blob_data.data = data.read()
+        blob_data.data = b''
+        while True:
+            chunk = data.read()
+            if chunk:
+                blob_data.data += chunk
+            else:
+                break
         session.add(blob_data)
         locations.append("sql://" + blob_data.id)
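
A note on the accumulation strategy: blob_data.data += chunk can copy
the buffer accumulated so far on every iteration, which gets expensive
for very large blobs. An equivalent sketch (an alternative, not what
this commit does) collects the chunks in a list and joins once at the
end:

chunks = []
while True:
    chunk = data.read()
    if not chunk:
        break
    chunks.append(chunk)
blob_data.data = b''.join(chunks)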