Binary data archiving library - supports uploading to object storage.

JSON payloads and string:string metadata dicts are stored in local-disk binary files. The binary file format is versioned and tagged to allow for easy extension.
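To illustrate what a versioned, tagged record format makes possible, here is a minimal sketch of one way such a record could be laid out. This is an illustration only, not shoebox's actual on-disk layout: the magic bytes, version field, and length-prefixed blocks below are assumptions.

```python
import struct

# Illustrative sketch -- NOT shoebox's actual file format.
# A record carries magic bytes and a format version up front, then
# length-prefixed metadata and payload blocks, so a reader can detect
# (and refuse) files written by a newer format version.
MAGIC = b"SBOX"
VERSION = 1


def pack_record(metadata_bytes, payload_bytes):
    header = struct.pack("!4sB", MAGIC, VERSION)
    body = struct.pack("!I", len(metadata_bytes)) + metadata_bytes
    body += struct.pack("!I", len(payload_bytes)) + payload_bytes
    return header + body


def unpack_record(blob):
    magic, version = struct.unpack_from("!4sB", blob, 0)
    if magic != MAGIC or version != VERSION:
        raise ValueError("unsupported archive format")
    offset = struct.calcsize("!4sB")
    (mlen,) = struct.unpack_from("!I", blob, offset)
    offset += 4
    metadata = blob[offset:offset + mlen]
    offset += mlen
    (plen,) = struct.unpack_from("!I", blob, offset)
    offset += 4
    payload = blob[offset:offset + plen]
    return metadata, payload
```

Tagging every record with a version up front is what lets the format be extended later without breaking existing readers.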

There are ArchiveReaders and ArchiveWriters, which are managed by the RollManager. “Roll” comes from “roll over”: the RollManager controls when roll-over occurs from one Archive to the next. Only one Archiver is active at a time per RollManager.

The RollManager opens and closes Archivers as needed. “As needed” is determined by the RollChecker that was passed into the RollManager. Archive files can roll over based on file size or elapsed time (for writing). For reading, archive files are only rolled over when the EOF is reached.

Roll Managers also take care of filename creation, compression of completed archives and transfer of archive files to remote storage locations.
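The compression step can be sketched with the standard library. This helper is an illustration of the idea only, not shoebox's actual code; the function name and the delete-after-compress behavior are assumptions.

```python
import gzip
import os
import shutil


def compress_archive(path):
    """Illustrative sketch: gzip a completed archive file in place.

    The archive file itself is gzipped (no tarball), the original is
    removed, and the path of the compressed copy is returned so it can
    be handed off for transfer to remote storage.
    """
    gz_path = path + ".gz"
    with open(path, "rb") as src, gzip.open(gz_path, "wb") as dst:
        shutil.copyfileobj(src, dst)
    os.remove(path)  # keep only the compressed copy
    return gz_path
```
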

The RollCheckers have a reference to the current Archive so they can ask file-related questions (like “how big are you?”).

You can register callbacks with the RollManager for notifications on when new Archive files are opened or closed.

Important Note! The Callback handlers and the RollCheckers take kwargs in the constructor since they can be dynamically loaded as plugins. So, make sure you provide named parameters to the constructors.
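A plugin-style handler with a kwargs constructor might look like the sketch below. The class name, the on_close() hook, and the "container" parameter are hypothetical illustrations; only the kwargs-constructor requirement comes from the note above.

```python
class UploadOnClose(object):
    """Hypothetical callback handler for archive-close notifications."""

    def __init__(self, **kwargs):
        # Dynamically loaded plugins are constructed with keyword
        # arguments only, so named values are read out of kwargs.
        self.container = kwargs.get("container", "archives")

    def on_close(self, filename):
        # A real handler might push the closed archive to object
        # storage here; this sketch just reports what it would do.
        return "upload %s to %s" % (filename, self.container)


# Right: named parameters.
handler = UploadOnClose(container="events")

# Wrong: positional arguments raise TypeError against a **kwargs
# constructor, e.g. UploadOnClose("events").
```

The same rule applies to RollCheckers, which is why the example below passes roll_size_mb by name.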


# Make a roll checker of whatever strategy you choose.
checker = roll_checker.SizeRollChecker(roll_size_mb=100)  # 100mb files

# Make a roll manager for reading or writing. 
# Give the filename template and the checker. 
# (and an optional working directory for new files)

# The %c in the template is expanded per the Python strftime method:
x = roll_manager.WritingRollManager("test_%c.events", checker)

# Write some metadata and payload ...
# WritingRollManager.write(metadata, payload) where
# metadata = string:string dict
# payload = string of data. Most likely a json structure.

# If the archive file grows beyond 100MB the old one
# is automatically closed and a new one created.
for index in range(10):
    x.write({"index": str(index)}, "payload_%d" % index)


For Reading:

# Read from all the event data files using wildcards ...
manager = roll_manager.ReadingRollManager("test_*.events")

# This will keep reading across all files in the archive
# until we reach the end.
while True:
    try:
        metadata, json_payload = manager.read()
    except roll_manager.NoMoreFiles:
        break

Look at test/integration/ for a more complete example.