FAIR data pipeline manual
This is the manual for the upcoming FAIR DataPipeline package.
Managing code runs
DataPipeline.initialise
— Function: initialise(config_file, submission_script)
Reads in the working config.yaml file, generates a new Code Run entry, and returns a DataRegistryHandle containing various metadata.
DataPipeline.finalise
— Function: finalise(handle)
Complete (i.e. finish) the code run.
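A typical code run wraps all pipeline reads and writes between these two calls. A minimal sketch, assuming a working config.yaml and submission script exist at the illustrative paths shown:

```julia
using DataPipeline

# Register a new code run and obtain a handle (paths are illustrative)
handle = initialise("config.yaml", "submit.sh")

# ... read inputs and write outputs via the handle here ...

# Complete the code run, finalising the registry entry
finalise(handle)
```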
Reading data
DataPipeline.read_array
— Function: read_array(handle, data_product[, component])
Read [array] data product.
- Note that it must already have been downloaded from the remote data store using fdp pull.
- The latest version of the data is read unless otherwise specified.
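A hedged usage sketch, assuming the data product has already been pulled with fdp pull; the data product and component names below are placeholders:

```julia
using DataPipeline

handle = initialise("config.yaml", "submit.sh")

# "my_project/grid" and "values" are hypothetical names
arr = read_array(handle, "my_project/grid", "values")

finalise(handle)
```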
DataPipeline.read_table
— Function: read_table(handle, data_product[, component])
Read [table] data product.
- Note that it must already have been downloaded from the remote data store using fdp pull.
- The latest version of the data is read unless otherwise specified.
DataPipeline.read_estimate
— Function: read_estimate(handle, data_product[, component])
Read TOML-based data product.
- Note that it must already have been downloaded from the remote data store using fdp pull.
- The specific version can be specified in the config file (otherwise the latest version is used).
DataPipeline.read_distribution
— Function: read_distribution(handle, data_product[, component])
Read TOML-based data product.
- Note that it must already have been downloaded from the remote data store using fdp pull.
- The specific version can be specified in the config file (otherwise the latest version is used).
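A sketch of reading TOML-based products; the data product and component names are assumptions for illustration:

```julia
using DataPipeline

handle = initialise("config.yaml", "submit.sh")

# Hypothetical parameter products registered in the working config
est  = read_estimate(handle, "parameters/example", "alpha")
dist = read_distribution(handle, "parameters/example", "beta")

finalise(handle)
```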
Writing data
DataPipeline.write_array
— Function: write_array(handle, data, data_product, component)
Write an array as a component to an HDF5 file.
See also: write_table, read_array, read_table
DataPipeline.write_table
— Function: write_table(handle, data, data_product, component)
Write a table as a component to an HDF5 file.
See also: write_array, read_array, read_table
DataPipeline.write_estimate
— Function: write_estimate(handle, value, data_product, component)
Write a point estimate as a component to a TOML file.
DataPipeline.write_distribution
— Function: write_distribution(handle, distribution, parameters, data_product, component)
Write a distribution as a component to a TOML file.
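A sketch combining the write functions above; the product and component names, and the encoding of the distribution and its parameters, are assumptions for illustration:

```julia
using DataPipeline

handle = initialise("config.yaml", "submit.sh")

# HDF5-backed output: write a 3x3 array as a component
write_array(handle, rand(3, 3), "outputs/example", "matrix")

# TOML-backed outputs: a point estimate and a distribution
write_estimate(handle, 0.25, "outputs/params", "alpha")
write_distribution(handle, "gamma", Dict("k" => 2.0, "theta" => 3.0),
                   "outputs/params", "beta")

finalise(handle)
```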
Raising issues
DataPipeline.raise_issue
— Function: raise_issue(handle; ... )
Register an issue with a data product, component, external object, or script.
Pass the object URI as a named parameter, e.g. raise_issue(handle; data_product=dp, component=comp).
Optional parameters
data_product
component
external_object
script