The google-dataproc1 library allows access to all features of the Google Dataproc service.
This documentation was generated from Dataproc crate version 6.0.0+20240617, where 20240617 is the exact revision of the dataproc:v1 schema built by the mako code generator v6.0.0.
Everything else about the Dataproc v1 API can be found at the official documentation site.
Features
Handle the following Resources with ease from the central hub ...
- projects
- locations autoscaling policies create, locations autoscaling policies delete, locations autoscaling policies get, locations autoscaling policies get iam policy, locations autoscaling policies list, locations autoscaling policies set iam policy, locations autoscaling policies test iam permissions, locations autoscaling policies update, locations batches analyze, locations batches create, locations batches delete, locations batches get, locations batches list, locations operations cancel, locations operations delete, locations operations get, locations operations list, locations session templates create, locations session templates delete, locations session templates get, locations session templates list, locations session templates patch, locations sessions create, locations sessions delete, locations sessions get, locations sessions list, locations sessions terminate, locations workflow templates create, locations workflow templates delete, locations workflow templates get, locations workflow templates get iam policy, locations workflow templates instantiate, locations workflow templates instantiate inline, locations workflow templates list, locations workflow templates set iam policy, locations workflow templates test iam permissions, locations workflow templates update, regions autoscaling policies create, regions autoscaling policies delete, regions autoscaling policies get, regions autoscaling policies get iam policy, regions autoscaling policies list, regions autoscaling policies set iam policy, regions autoscaling policies test iam permissions, regions autoscaling policies update, regions clusters create, regions clusters delete, regions clusters diagnose, regions clusters get, regions clusters get iam policy, regions clusters inject credentials, regions clusters list, regions clusters node groups create, regions clusters node groups get, regions clusters node groups repair, regions clusters node groups resize, regions clusters patch, regions clusters repair, regions clusters set iam policy, regions clusters start, regions clusters stop, regions clusters test iam permissions, regions jobs cancel, regions jobs delete, regions jobs get, regions jobs get iam policy, regions jobs list, regions jobs patch, regions jobs set iam policy, regions jobs submit, regions jobs submit as operation, regions jobs test iam permissions, regions operations cancel, regions operations delete, regions operations get, regions operations get iam policy, regions operations list, regions operations set iam policy, regions operations test iam permissions, regions workflow templates create, regions workflow templates delete, regions workflow templates get, regions workflow templates get iam policy, regions workflow templates instantiate, regions workflow templates instantiate inline, regions workflow templates list, regions workflow templates set iam policy, regions workflow templates test iam permissions and regions workflow templates update
Structure of this Library
The API is structured into the following primary items:
- Hub
- a central object to maintain state and allow accessing all Activities
- creates Method Builders which in turn allow access to individual Call Builders
- Resources
- primary types that you can apply Activities to
- a collection of properties and Parts
- Parts
- a collection of properties
- never directly used in Activities
- Activities
- operations to apply to Resources
All structures are marked with applicable traits to further categorize them and ease browsing.
Generally speaking, you can invoke Activities like this:
let r = hub.resource().activity(...).doit().await
Or specifically ...
let r = hub.projects().locations_batches_analyze(...).doit().await
let r = hub.projects().locations_batches_create(...).doit().await
let r = hub.projects().locations_operations_get(...).doit().await
let r = hub.projects().locations_sessions_create(...).doit().await
let r = hub.projects().locations_sessions_delete(...).doit().await
let r = hub.projects().locations_sessions_terminate(...).doit().await
let r = hub.projects().locations_workflow_templates_instantiate(...).doit().await
let r = hub.projects().locations_workflow_templates_instantiate_inline(...).doit().await
let r = hub.projects().regions_clusters_node_groups_create(...).doit().await
let r = hub.projects().regions_clusters_node_groups_repair(...).doit().await
let r = hub.projects().regions_clusters_node_groups_resize(...).doit().await
let r = hub.projects().regions_clusters_create(...).doit().await
let r = hub.projects().regions_clusters_delete(...).doit().await
let r = hub.projects().regions_clusters_diagnose(...).doit().await
let r = hub.projects().regions_clusters_inject_credentials(...).doit().await
let r = hub.projects().regions_clusters_patch(...).doit().await
let r = hub.projects().regions_clusters_repair(...).doit().await
let r = hub.projects().regions_clusters_start(...).doit().await
let r = hub.projects().regions_clusters_stop(...).doit().await
let r = hub.projects().regions_jobs_submit_as_operation(...).doit().await
let r = hub.projects().regions_operations_get(...).doit().await
let r = hub.projects().regions_workflow_templates_instantiate(...).doit().await
let r = hub.projects().regions_workflow_templates_instantiate_inline(...).doit().await
The resource() and activity(...) calls create builders. The second one, dealing with Activities, supports various methods to configure the impending operation (not shown here). It is designed such that all required arguments have to be specified right away (i.e. (...)), whereas all optional ones can be built up as desired.
The doit() method performs the actual communication with the server and returns the respective result.
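For example, here is a minimal sketch of such a call chain, assuming a hub set up as in the Usage section below. The optional setters shown (page_size, filter) are assumptions based on the jobs.list query parameters; check the generated documentation for the exact builder methods.
let result = hub.projects()
    .regions_jobs_list("my-project", "us-central1") // required arguments, specified right away
    .page_size(50)                                  // optional: assumed paging setter
    .filter("status.state = ACTIVE")                // optional: assumed server-side filter setter
    .doit().await;                                  // performs the actual request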
Usage
Setting up your Project
To use this library, you would put the following lines into your Cargo.toml
file:
[dependencies]
google-dataproc1 = "*"
serde = "1"
serde_json = "1"
A complete example
extern crate google_dataproc1 as dataproc1;
use dataproc1::api::Cluster;
use dataproc1::{Result, Error};
use dataproc1::{Dataproc, FieldMask, hyper_rustls, hyper_util, yup_oauth2};
// Get an ApplicationSecret instance by some means. It contains the `client_id` and
// `client_secret`, among other things.
let secret: yup_oauth2::ApplicationSecret = Default::default();
// Instantiate the authenticator. It will choose a suitable authentication flow for you,
// unless you replace `None` with the desired Flow.
// Provide your own `AuthenticatorDelegate` to adjust the way it operates and get feedback about
// what's going on. You probably want to bring in your own `TokenStorage` to persist tokens and
// retrieve them from storage.
let auth = yup_oauth2::InstalledFlowAuthenticator::builder(
secret,
yup_oauth2::InstalledFlowReturnMethod::HTTPRedirect,
).build().await.unwrap();
let client = hyper_util::client::legacy::Client::builder(
hyper_util::rt::TokioExecutor::new()
)
.build(
hyper_rustls::HttpsConnectorBuilder::new()
.with_native_roots()
.unwrap()
.https_or_http()
.enable_http1()
.build()
);
let mut hub = Dataproc::new(client, auth);
// As the method needs a request, you would usually fill the respective structure
// with the desired information. Some of the parts shown here might not be applicable!
// Values shown here are possibly random and not representative!
let mut req = Cluster::default();
// You can configure optional parameters by calling the respective setters at will, and
// execute the final call using `doit()`.
// Values shown here are possibly random and not representative!
let result = hub.projects().regions_clusters_patch(req, "projectId", "region", "clusterName")
.update_mask(FieldMask::new::<&str>(&[]))
.request_id("ipsum")
.graceful_decommission_timeout(chrono::Duration::seconds(9579437))
.doit().await;
match result {
Err(e) => match e {
// The Error enum provides details about what exactly happened.
// You can also just use its `Debug`, `Display` or `Error` traits
Error::HttpError(_)
|Error::Io(_)
|Error::MissingAPIKey
|Error::MissingToken(_)
|Error::Cancelled
|Error::UploadSizeLimitExceeded(_, _)
|Error::Failure(_)
|Error::BadRequest(_)
|Error::FieldClash(_)
|Error::JsonDecodeError(_, _) => println!("{}", e),
},
Ok(res) => println!("Success: {:?}", res),
}
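Because every call is async, the snippet above has to run inside an async runtime to form a complete program. The following is a minimal sketch, not part of the generated API: it assumes tokio (with the macros and rt-multi-thread features) is added to Cargo.toml next to the dependencies above, plus chrono if you use the graceful_decommission_timeout setter, and the project and region values are placeholders.
extern crate google_dataproc1 as dataproc1;
use dataproc1::{Dataproc, hyper_rustls, hyper_util, yup_oauth2};

#[tokio::main]
async fn main() {
    // Authenticator and HTTP client, built exactly as in the snippet above.
    let secret: yup_oauth2::ApplicationSecret = Default::default();
    let auth = yup_oauth2::InstalledFlowAuthenticator::builder(
        secret,
        yup_oauth2::InstalledFlowReturnMethod::HTTPRedirect,
    ).build().await.unwrap();
    let client = hyper_util::client::legacy::Client::builder(
        hyper_util::rt::TokioExecutor::new()
    ).build(
        hyper_rustls::HttpsConnectorBuilder::new()
            .with_native_roots()
            .unwrap()
            .https_or_http()
            .enable_http1()
            .build()
    );
    let hub = Dataproc::new(client, auth);

    // List clusters in one region; replace the placeholder project and region.
    match hub.projects().regions_clusters_list("my-project", "us-central1").doit().await {
        Ok((_response, clusters)) => println!("clusters: {:?}", clusters),
        Err(e) => eprintln!("request failed: {}", e),
    }
}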
Handling Errors
All errors produced by the system are provided either as the Result enumeration returned by the doit() methods, or handed as possibly intermediate results to either the Hub Delegate or the Authenticator Delegate.
When delegates handle errors or intermediate values, they may have a chance to instruct the system to retry. This makes the system potentially resilient to all kinds of errors.
Uploads and Downloads
If a method supports downloads, the response body, which is part of the Result, should be read by you to obtain the media. If such a method also supports a Response Result, it will return that by default. You can see it as meta-data for the actual media. To trigger a media download, you will have to set up the builder by making this call: .param("alt", "media").
Methods supporting uploads can do so using up to 2 different protocols: simple and resumable. The distinctiveness of each is represented by customized doit(...) methods, which are then named upload(...) and upload_resumable(...) respectively.
Customization and Callbacks
You may alter the way a doit() method is called by providing a delegate to the Method Builder before making the final doit() call.
Respective methods will be called to provide progress information, as well as determine whether the system should
retry on failure.
The delegate trait is default-implemented, allowing you to customize it with minimal effort.
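A sketch of a do-nothing delegate is shown below. It assumes the Delegate trait is re-exported at the crate root (in some versions it may live under dataproc1::common instead; check the crate docs), and the empty impl relies on the default method implementations mentioned above. Override individual methods to log progress or control retries.
use dataproc1::Delegate; // assumed re-export path; see the crate docs

struct NoOpDelegate;
impl Delegate for NoOpDelegate {} // all methods keep their default behaviour

// Attach it to any Call Builder before doit():
// let mut dlg = NoOpDelegate;
// let result = hub.projects()
//     .regions_clusters_list("my-project", "us-central1")
//     .delegate(&mut dlg)
//     .doit().await;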
Optional Parts in Server-Requests
All structures provided by this library are made to be encodable and decodable via JSON. Optionals are used to indicate that partial requests and responses are valid. Most optionals are considered Parts, which are identifiable by name and will be sent to the server to indicate either the set parts of the request or the desired parts in the response.
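As a small sketch, assuming the generated Cluster type follows the usual pattern of snake_case Option fields (e.g. cluster_name, project_id), only the parts you set are serialized into the request:
use dataproc1::api::Cluster;

let mut req = Cluster::default();                   // every part starts out as None
req.cluster_name = Some("my-cluster".to_string());  // set parts are sent to the server
req.project_id = Some("my-project".to_string());
// parts left as None are simply omitted from the JSON request body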
Builder Arguments
Using method builders, you are able to prepare an action call by repeatedly calling its methods. These will always take a single argument, for which the following statements are true.
- PODs are handed by copy
- strings are passed as &str
- request values are moved
Arguments will always be copied or cloned into the builder, to make them independent of their original lifetimes, as illustrated below.
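To make this concrete, here is the regions_clusters_patch call from the complete example above, annotated with which rule applies to each argument (a sketch that assumes the same hub and imports as that example):
let req = Cluster::default();
let result = hub.projects()
    .regions_clusters_patch(req, "my-project", "us-central1", "my-cluster") // request value moved, strings as &str
    .request_id("some-request-id")                                          // &str, cloned into the builder
    .graceful_decommission_timeout(chrono::Duration::seconds(120))          // POD-like value, handed by copy
    .doit().await;
// the builder now owns copies/clones of everything, independent of the original lifetimes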
Cargo Features
- utoipa - Add support for utoipa and derive utoipa::ToSchema on all the types. You'll have to import and register the required types in #[openapi(schemas(...))], otherwise the generated openapi spec would be invalid.
License
The dataproc1 library was generated by Sebastian Thiel, and is placed under the MIT license. You can read the full text at the repository's license file.