The google-healthcare1 library allows access to all features of the Google Cloud Healthcare service.
This documentation was generated from Cloud Healthcare crate version 6.0.0+20240605, where 20240605 is the exact revision of the healthcare:v1 schema built by the mako code generator v6.0.0.
Everything else about the Cloud Healthcare v1 API can be found at the official documentation site.
Features
Handle the following Resources with ease from the central hub ...
- projects
- locations datasets consent stores attribute definitions create, locations datasets consent stores attribute definitions delete, locations datasets consent stores attribute definitions get, locations datasets consent stores attribute definitions list, locations datasets consent stores attribute definitions patch, locations datasets consent stores check data access, locations datasets consent stores consent artifacts create, locations datasets consent stores consent artifacts delete, locations datasets consent stores consent artifacts get, locations datasets consent stores consent artifacts list, locations datasets consent stores consents activate, locations datasets consent stores consents create, locations datasets consent stores consents delete, locations datasets consent stores consents delete revision, locations datasets consent stores consents get, locations datasets consent stores consents list, locations datasets consent stores consents list revisions, locations datasets consent stores consents patch, locations datasets consent stores consents reject, locations datasets consent stores consents revoke, locations datasets consent stores create, locations datasets consent stores delete, locations datasets consent stores evaluate user consents, locations datasets consent stores get, locations datasets consent stores get iam policy, locations datasets consent stores list, locations datasets consent stores patch, locations datasets consent stores query accessible data, locations datasets consent stores set iam policy, locations datasets consent stores test iam permissions, locations datasets consent stores user data mappings archive, locations datasets consent stores user data mappings create, locations datasets consent stores user data mappings delete, locations datasets consent stores user data mappings get, locations datasets consent stores user data mappings list, locations datasets consent stores user data mappings patch, locations datasets create, locations datasets data mapper workspaces get iam policy, locations datasets data mapper workspaces set iam policy, locations datasets data mapper workspaces test iam permissions, locations datasets deidentify, locations datasets delete, locations datasets dicom stores create, locations datasets dicom stores deidentify, locations datasets dicom stores delete, locations datasets dicom stores dicom web studies get study metrics, locations datasets dicom stores dicom web studies series get series metrics, locations datasets dicom stores export, locations datasets dicom stores get, locations datasets dicom stores get dicom store metrics, locations datasets dicom stores get iam policy, locations datasets dicom stores import, locations datasets dicom stores list, locations datasets dicom stores patch, locations datasets dicom stores search for instances, locations datasets dicom stores search for series, locations datasets dicom stores search for studies, locations datasets dicom stores set iam policy, locations datasets dicom stores store instances, locations datasets dicom stores studies delete, locations datasets dicom stores studies retrieve metadata, locations datasets dicom stores studies retrieve study, locations datasets dicom stores studies search for instances, locations datasets dicom stores studies search for series, locations datasets dicom stores studies series delete, locations datasets dicom stores studies series instances delete, locations datasets dicom stores studies series instances frames retrieve frames, locations datasets 
dicom stores studies series instances frames retrieve rendered, locations datasets dicom stores studies series instances retrieve instance, locations datasets dicom stores studies series instances retrieve metadata, locations datasets dicom stores studies series instances retrieve rendered, locations datasets dicom stores studies series retrieve metadata, locations datasets dicom stores studies series retrieve series, locations datasets dicom stores studies series search for instances, locations datasets dicom stores studies store instances, locations datasets dicom stores test iam permissions, locations datasets fhir stores create, locations datasets fhir stores deidentify, locations datasets fhir stores delete, locations datasets fhir stores export, locations datasets fhir stores fhir patient-everything, locations datasets fhir stores fhir resource-purge, locations datasets fhir stores fhir resource-validate, locations datasets fhir stores fhir capabilities, locations datasets fhir stores fhir conditional delete, locations datasets fhir stores fhir conditional patch, locations datasets fhir stores fhir conditional update, locations datasets fhir stores fhir create, locations datasets fhir stores fhir delete, locations datasets fhir stores fhir execute bundle, locations datasets fhir stores fhir history, locations datasets fhir stores fhir patch, locations datasets fhir stores fhir read, locations datasets fhir stores fhir search, locations datasets fhir stores fhir search-type, locations datasets fhir stores fhir update, locations datasets fhir stores fhir vread, locations datasets fhir stores get, locations datasets fhir stores get fhir store metrics, locations datasets fhir stores get iam policy, locations datasets fhir stores import, locations datasets fhir stores list, locations datasets fhir stores patch, locations datasets fhir stores rollback, locations datasets fhir stores set iam policy, locations datasets fhir stores test iam permissions, locations datasets get, locations datasets get iam policy, locations datasets hl7 v2 stores create, locations datasets hl7 v2 stores delete, locations datasets hl7 v2 stores export, locations datasets hl7 v2 stores get, locations datasets hl7 v2 stores get hl7v2 store metrics, locations datasets hl7 v2 stores get iam policy, locations datasets hl7 v2 stores import, locations datasets hl7 v2 stores list, locations datasets hl7 v2 stores messages create, locations datasets hl7 v2 stores messages delete, locations datasets hl7 v2 stores messages get, locations datasets hl7 v2 stores messages ingest, locations datasets hl7 v2 stores messages list, locations datasets hl7 v2 stores messages patch, locations datasets hl7 v2 stores patch, locations datasets hl7 v2 stores set iam policy, locations datasets hl7 v2 stores test iam permissions, locations datasets list, locations datasets operations cancel, locations datasets operations get, locations datasets operations list, locations datasets patch, locations datasets set iam policy, locations datasets test iam permissions, locations get, locations list and locations services nlp analyze entities
Structure of this Library
The API is structured into the following primary items:
- Hub
- a central object to maintain state and allow accessing all Activities
- creates Method Builders which in turn allow access to individual Call Builders
- Resources
- primary types that you can apply Activities to
- a collection of properties and Parts
- Parts
- a collection of properties
- never directly used in Activities
- Activities
- operations to apply to Resources
All structures are marked with applicable traits to further categorize them and ease browsing.
Generally speaking, you can invoke Activities like this:
let r = hub.resource().activity(...).doit().await
Or specifically ...
let r = hub.projects().locations_datasets_dicom_stores_studies_series_instances_frames_retrieve_frames(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_instances_frames_retrieve_rendered(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_instances_retrieve_instance(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_instances_retrieve_metadata(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_instances_retrieve_rendered(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_retrieve_metadata(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_retrieve_series(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_series_search_for_instances(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_retrieve_metadata(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_retrieve_study(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_search_for_instances(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_search_for_series(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_studies_store_instances(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_search_for_instances(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_search_for_series(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_search_for_studies(...).doit().await
let r = hub.projects().locations_datasets_dicom_stores_store_instances(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir__patient_everything(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir__resource_validate(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_capabilities(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_conditional_patch(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_conditional_update(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_create(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_delete(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_execute_bundle(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_history(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_patch(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_read(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_search(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_search_type(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_update(...).doit().await
let r = hub.projects().locations_datasets_fhir_stores_fhir_vread(...).doit().await
The resource() and activity(...) calls create builders. The second one, dealing with Activities, supports various methods to configure the impending operation (not shown here). It is made such that all required arguments have to be specified right away (i.e. (...)), whereas all optional ones can be built up as desired. The doit() method performs the actual communication with the server and returns the respective result.
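For example, a list call might be built like this (the method name follows the naming shown above; the page_size setter and the parent value are illustrative assumptions, not verified against the generated docs):
let result = hub.projects()
    .locations_datasets_list("projects/my-project/locations/us-central1")  // required argument, given right away
    .page_size(50)                                                         // optional argument, set via the builder (assumed setter)
    .doit()
    .await;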
Usage
Setting up your Project
To use this library, you would put the following lines into your Cargo.toml file:
[dependencies]
google-healthcare1 = "*"
serde = "1"
serde_json = "1"
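The example below is asynchronous; to run it as a stand-alone binary you also need an async runtime. A minimal sketch, assuming tokio (an assumption about your project setup, not something mandated by this documentation):
// requires e.g. tokio = { version = "1", features = ["macros", "rt-multi-thread"] } in Cargo.toml
#[tokio::main]
async fn main() {
    // the complete example below would live inside an async context like this
}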
A complete example
extern crate hyper;
extern crate hyper_rustls;
extern crate google_healthcare1 as healthcare1;
use healthcare1::{Result, Error};
use healthcare1::{CloudHealthcare, FieldMask, hyper_rustls, hyper_util, yup_oauth2};
// Get an ApplicationSecret instance by some means. It contains the `client_id` and
// `client_secret`, among other things.
let secret: yup_oauth2::ApplicationSecret = Default::default();
// Instantiate the authenticator. It will choose a suitable authentication flow for you,
// unless you replace `None` with the desired Flow.
// Provide your own `AuthenticatorDelegate` to adjust the way it operates and get feedback about
// what's going on. You probably want to bring in your own `TokenStorage` to persist tokens and
// retrieve them from storage.
let auth = yup_oauth2::InstalledFlowAuthenticator::builder(
    secret,
    yup_oauth2::InstalledFlowReturnMethod::HTTPRedirect,
).build().await.unwrap();

let client = hyper_util::client::legacy::Client::builder(
    hyper_util::rt::TokioExecutor::new()
)
.build(
    hyper_rustls::HttpsConnectorBuilder::new()
        .with_native_roots()
        .unwrap()
        .https_or_http()
        .enable_http1()
        .build()
);
let mut hub = CloudHealthcare::new(client, auth);
// You can configure optional parameters by calling the respective setters at will, and
// execute the final call using `doit()`.
// Values shown here are possibly random and not representative !
let result = hub.projects().locations_datasets_fhir_stores_fhir__patient_everything("name")
    .start("magna")
    .end("no")
    ._type("ipsum")
    ._since("voluptua.")
    ._page_token("At")
    ._count(-8)
    .doit().await;

match result {
    Err(e) => match e {
        // The Error enum provides details about what exactly happened.
        // You can also just use its `Debug`, `Display` or `Error` traits
        Error::HttpError(_)
        | Error::Io(_)
        | Error::MissingAPIKey
        | Error::MissingToken(_)
        | Error::Cancelled
        | Error::UploadSizeLimitExceeded(_, _)
        | Error::Failure(_)
        | Error::BadRequest(_)
        | Error::FieldClash(_)
        | Error::JsonDecodeError(_, _) => println!("{}", e),
    },
    Ok(res) => println!("Success: {:?}", res),
}
Handling Errors
All errors produced by the system are provided either as the Result enumeration returned by the doit() methods, or handed as possibly intermediate results to either the Hub Delegate or the Authenticator Delegate.
When delegates handle errors or intermediate values, they may have a chance to instruct the system to retry. This makes the system potentially resilient to all kinds of errors.
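Independently of delegates, you can also react at the call site by matching on the returned Result. A rough sketch, reusing only the Error variants listed in the example above; the retry policy, the method, and its argument are illustrative assumptions:
let mut attempts = 0;
let result = loop {
    match hub.projects().locations_datasets_list("projects/my-project/locations/us-central1").doit().await {
        // retry a few times on transport or server-side failures (illustrative policy only)
        Err(Error::HttpError(_)) | Err(Error::Failure(_)) if attempts < 3 => attempts += 1,
        other => break other,
    }
};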
Uploads and Downloads
If a method supports downloads, the response body, which is part of the Result, should be
read by you to obtain the media.
If such a method also supports a Response Result, it will return that by default.
You can see it as meta-data for the actual media. To trigger a media download, you will have to set up the builder by making this call: .param("alt", "media").
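As a sketch of where this parameter fits into the builder chain (the resource()/activity(...) placeholders follow the generic form shown earlier; which methods actually support downloads must be taken from the generated documentation):
let result = hub.resource().activity(...)
    .param("alt", "media")   // request the media itself instead of the meta-data response
    .doit()
    .await;
// On success, read the HTTP response body contained in the result to obtain the media.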
Methods supporting uploads can do so using up to 2 different protocols: simple and resumable. The distinctiveness of each is represented by customized doit(...) methods, which are then named upload(...) and upload_resumable(...) respectively.
Customization and Callbacks
You may alter the way a doit() method is called by providing a delegate to the Method Builder before making the final doit() call.
Respective methods will be called to provide progress information, as well as determine whether the system should
retry on failure.
The delegate trait is default-implemented, allowing you to customize it with minimal effort.
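A minimal sketch of this pattern; the path of the delegate trait (written here as healthcare1::common::Delegate) and the delegate(...) setter are assumptions to be checked against the generated documentation:
// Assumption: the delegate trait is re-exported by this crate and is fully
// default-implemented, so an empty impl is a valid starting point; override
// individual methods to observe progress or request retries.
struct LoggingDelegate;
impl healthcare1::common::Delegate for LoggingDelegate {}

let mut delegate = LoggingDelegate;
let result = hub.projects()
    .locations_datasets_list("projects/my-project/locations/us-central1")
    .delegate(&mut delegate)   // assumed setter on the Method Builder
    .doit()
    .await;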
Optional Parts in Server-Requests
All structures provided by this library are made to be encodable and decodable via JSON. Optionals are used to indicate that partial requests and responses are valid. Most optionals are considered Parts, which are identifiable by name and will be sent to the server to indicate either the set parts of the request or the desired parts in the response.
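To illustrate with a generic serde struct (this is not a type from this crate, just a sketch of how optional parts behave):
use serde::{Deserialize, Serialize};

// Placeholder type, not part of google-healthcare1: optional fields act like
// parts - only the ones you set are serialized and sent to the server.
#[derive(Serialize, Deserialize, Default)]
#[serde(rename_all = "camelCase")]
struct ExamplePart {
    #[serde(skip_serializing_if = "Option::is_none")]
    display_name: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    description: Option<String>,
}

let part = ExamplePart { display_name: Some("my dataset".into()), ..Default::default() };
// Serializes to {"displayName":"my dataset"} - the unset part is omitted.
println!("{}", serde_json::to_string(&part).unwrap());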
Builder Arguments
Using method builders, you are able to prepare an action call by repeatedly calling its methods. These will always take a single argument, for which the following statements are true.
- PODs are handed by copy
- strings are passed as &str
- request values are moved
Arguments will always be copied or cloned into the builder, to make them independent of their original lifetimes (see the sketch below).
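A sketch of these rules in practice; the Dataset type, its module path, and the (request, parent) argument order are assumptions based on the resource list above:
use healthcare1::api::Dataset;   // assumed path of the generated request type

let req = Dataset::default();                                    // request value: will be moved
let parent = String::from("projects/my-project/locations/us-central1");
let result = hub.projects()
    .locations_datasets_create(req, &parent)                     // string handed over as &str, request moved
    .doit()
    .await;
// `req` is gone (moved into the builder); `parent` remains usable because only
// a &str was borrowed and the builder cloned what it needed.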
Cargo Features
- utoipa - Add support for utoipa and derive utoipa::ToSchema on all the types. You'll have to import and register the required types in #[openapi(schemas(...))], otherwise the generated openapi spec would be invalid.
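To turn the feature on, request it in your Cargo.toml (the version spec here is illustrative):
[dependencies]
google-healthcare1 = { version = "*", features = ["utoipa"] }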
License
The healthcare1 library was generated by Sebastian Thiel, and is placed under the MIT license. You can read the full text at the repository's license file.