https://github.com/mhatzl/spark_unbound
License: MIT
Version: 0.2.1
Spark_Unbound provides generic unbounded data structures in SPARK.
In addition to proving the absence of runtime errors, heap allocation is performed in a non-SPARK function so that a possible Storage_Error can be caught. This further increases the safety of the library and the confidence with which it can be used.
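The underlying pattern is straightforward: the allocation runs in a body with `SPARK_Mode` turned off, the exception is handled there, and callers only ever see a possibly-null access value. Below is a minimal, hypothetical sketch of that pattern; it is not the actual `Spark_Unbound.Safe_Alloc` API, and all names are placeholders.

```ada
--  Minimal sketch of catching Storage_Error outside of SPARK.
--  All names here are placeholders, not the Spark_Unbound API.
generic
   type T is private;
package Sketch_Safe_Alloc is
   type T_Acc is access T;

   --  Returns null if the allocation fails instead of letting
   --  Storage_Error propagate into proven code.
   function Alloc return T_Acc;
end Sketch_Safe_Alloc;

package body Sketch_Safe_Alloc
  with SPARK_Mode => Off
is
   function Alloc return T_Acc is
   begin
      return new T;   --  may raise Storage_Error on heap exhaustion
   exception
      when Storage_Error =>
         return null; --  callers check for null instead
   end Alloc;
end Sketch_Safe_Alloc;
```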
The following packages are currently available:
- `Spark_Unbound.Safe_Alloc`: provides formally proven safe heap allocation functionality
- `Spark_Unbound.Arrays`: provides a formally proven alternative to `Ada.Containers.Vectors` (see the reference sketch below)
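For readers unfamiliar with the standard container being replaced, the snippet below shows typical `Ada.Containers.Vectors` usage. It is only a point of reference: `Spark_Unbound.Arrays` targets the same kind of growable-array functionality, but its own API may differ, so check the package specification for the actual subprograms.

```ada
--  Reference only: standard Ada.Containers.Vectors usage, the kind of
--  growable array that Spark_Unbound.Arrays offers a proven
--  alternative to. This snippet does not use the Spark_Unbound API.
with Ada.Containers.Vectors;
with Ada.Text_IO;

procedure Vectors_Demo is
   package Int_Vectors is new Ada.Containers.Vectors
     (Index_Type => Positive, Element_Type => Integer);
   use Int_Vectors;

   V : Vector;
begin
   V.Append (1);
   V.Append (2);

   for E of V loop
      Ada.Text_IO.Put_Line (Integer'Image (E));
   end loop;
end Vectors_Demo;
```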
Note: If you use this library, starring the repository on GitHub helps me a lot to see whether it is useful to others.