3LC Python Package Version 2.15

2.15.0

Features
- [15056] Added segmentation support for detectron2:
  - Updated register_coco_instances to take two additional arguments: task, which maps directly to TableFromCoco, and mask_format, which maps directly to the detectron2 config variable INPUT.MASK_FORMAT.
  - Added the argument save_segmentations to BoundingBoxMetricsCollector, which writes predicted masks in 3LC format to metrics tables.
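A minimal sketch of how the updated registration call might look. The import path, dataset name, file paths, and the "segment"/"bitmask" values are illustrative assumptions based on the notes above, not verified API facts; consult the 3LC detectron2 integration documentation for the exact signature and accepted values.

```python
# Hedged sketch: paths and argument values below are placeholders.
try:
    from tlc.integration.detectron2 import register_coco_instances

    register_coco_instances(
        "my_dataset_train",        # dataset name registered with detectron2
        {},                        # detectron2-style metadata dict
        "annotations/train.json",  # COCO annotation file (placeholder path)
        "images/train",            # image root directory (placeholder path)
        task="segment",            # new: forwarded to TableFromCoco
        mask_format="bitmask",     # new: maps to detectron2's INPUT.MASK_FORMAT
    )
    registered = True
except Exception:
    # tlc / detectron2 not installed, or placeholder paths do not exist.
    registered = False
```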
- [14597] Use finer-grained per-project timestamp index files instead of per-scan-URL, which increases cache granularity to the natural level of the project. This means that changes to files in one project will no longer trigger an indexing re-scan of files belonging to other projects under the same scan URL.
- [14931] Made imports required for the Hugging Face integration lazy-loaded when they are first used rather than during the initial tlc import; this avoids a significant up-front delay for imports that may never be used.
- [10880] Made it possible for the Object Service to get/put user settings by scope and key, laying the groundwork for persisting Dashboard settings.
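The lazy-loading approach in [14931] can be approximated with the module-level `__getattr__` hook from PEP 562, which defers an import until the attribute is first accessed. A self-contained sketch; the tlc_sketch and hf_integration names are hypothetical, and json stands in for a heavy dependency:

```python
import importlib
import sys
import types

# Build a module whose heavy submodule is imported only on first access
# (PEP 562 module-level __getattr__). All names here are stand-ins.
pkg = types.ModuleType("tlc_sketch")

def _lazy_getattr(name):
    if name == "hf_integration":
        # Deferred: pay the import cost only when the attribute is used.
        module = importlib.import_module("json")  # stand-in for a heavy dep
        setattr(pkg, name, module)  # cache so __getattr__ runs only once
        return module
    raise AttributeError(name)

pkg.__getattr__ = _lazy_getattr
sys.modules["tlc_sketch"] = pkg

# Nothing heavy has been imported yet; first attribute access triggers it.
heavy = pkg.hf_integration
print(heavy.dumps({"lazy": True}))  # prints: {"lazy": true}
```

After the first access the module is cached in the package's namespace, so subsequent lookups bypass `__getattr__` entirely.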
Enhancements and Fixes
- [15070] Extended TableInfo with 'row_cache_url' and 'type' columns, making diagnostic information available for display in the Dashboard.
- [15145] Output all 3LC config files in use when running the Object Service.
- [15090] Reimplemented the GCS URL adapter to avoid occasional hangs when referencing gs:// URLs, particularly with VertexAI instances.
- [15161] Use explicit filename matching in the indexer where possible; this avoids issues with files automatically created by the OS inside 3LC data directories in some cases (e.g. hidden macOS files).
- [15094] Worked around an occasional crash during 3lc shutdown caused by use of pyarrow and a race condition in the destruction of its backing C++ library.
- [15177] Fixed an issue that made some VisionDatasets used with 3LC unpicklable.
- [15187] Explicitly set the weights_only=True parameter in the torch.load() call in LargeTorchTensor.read_sample_from_buffer(), following the strong recommendation in the PyTorch documentation for loading untrusted data. This change eliminates a potential security vulnerability where malicious model files could execute arbitrary Python code during deserialization, while maintaining identical functionality for legitimate tensor loading.
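The risk that weights_only=True mitigates comes from pickle's ability to invoke arbitrary callables during deserialization. A stdlib-only illustration of the underlying mechanism (not 3LC or PyTorch code):

```python
import pickle

class Malicious:
    # __reduce__ tells pickle how to reconstruct the object; a hostile file
    # can point it at any callable, which then runs at load time.
    def __reduce__(self):
        return (print, ("arbitrary code ran during unpickling!",))

payload = pickle.dumps(Malicious())

# Loading the "data" executes the embedded call (here a harmless print).
result = pickle.loads(payload)  # prints: arbitrary code ran during unpickling!
print(result)  # print() returns None, so the reconstructed "object" is None
```

With weights_only=True, torch.load() uses a restricted unpickler that only permits tensors and a small set of safe types, so payloads like the one above are rejected instead of executed.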