AWS S3 Streaming Upload Sink

Provided by: "Apache Software Foundation"

Support Level for this Kamelet is: "Preview"

Upload data to AWS S3 in streaming upload mode.

The access key and secret key are the basic method for authenticating to the AWS S3 service. These parameters are optional because the Kamelet also provides the useDefaultCredentialsProvider option.

When a default credentials provider is used, the S3 client loads the credentials through that provider and does not use the static credentials. This is why the access key and secret key are not mandatory parameters for this Kamelet.
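
For example, in a KameletBinding (complete examples are shown in the Usage section below), the sink properties might configure authentication in one of these two ways. This is a minimal sketch; the bucket, key, and credential values are placeholders:

  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-s3-streaming-upload-sink
    properties:
      bucketNameOrArn: my-bucket        # placeholder
      keyName: my-key                   # placeholder
      region: eu-west-1
      # Option 1: static credentials
      accessKey: MY_ACCESS_KEY          # placeholder
      secretKey: MY_SECRET_KEY          # placeholder
      # Option 2: omit accessKey/secretKey and let the S3 client load
      # credentials from the default credentials provider instead
      # useDefaultCredentialsProvider: true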

Configuration Options

The following list summarizes the configuration options available for the aws-s3-streaming-upload-sink Kamelet:

  • bucketNameOrArn (Bucket Name): Required. The S3 Bucket name or ARN. Type: string.

  • keyName (Key Name): Required. The key name for an element in the bucket, set through the endpoint parameter. In streaming upload mode, with the default configuration, this is the base name for the progressively created files. Type: string.

  • region (AWS Region): Required. The AWS region to connect to. Type: string. Example: eu-west-1.

  • accessKey (Access Key): The access key obtained from AWS. Type: string.

  • autoCreateBucket (Autocreate Bucket): Specifies whether to automatically create the S3 bucket. Type: boolean. Default: false.

  • batchMessageNumber (Batch Message Number): The number of messages composing a batch in streaming upload mode. Type: integer. Default: 10.

  • batchSize (Batch Size): The batch size (in bytes) in streaming upload mode. Type: integer. Default: 1000000.

  • namingStrategy (Naming Strategy): The naming strategy to use in streaming upload mode. Possible values: progressive, random. Type: string. Default: progressive.

  • overrideEndpoint (Endpoint Overwrite): Specifies whether to override the endpoint URI. Use this option in combination with the uriEndpointOverride option. Type: boolean. Default: false.

  • restartingPolicy (Restarting Policy): The restarting policy to use in streaming upload mode. Possible values: override, lastPart. Type: string. Default: lastPart.

  • secretKey (Secret Key): The secret key obtained from AWS. Type: string.

  • streamingUploadMode (Streaming Upload Mode): Enables streaming upload mode. Type: boolean. Default: true.

  • streamingUploadTimeout (Streaming Upload Timeout): When streaming upload mode is enabled, the timeout to complete the upload. Type: long.

  • uriEndpointOverride (Overwrite Endpoint URI): The overriding endpoint URI. Use this option in combination with the overrideEndpoint option. Type: string.

  • useDefaultCredentialsProvider (Default Credentials Provider): Specifies whether the S3 client should load credentials through a default credentials provider or expect static credentials to be passed in. Type: boolean. Default: false.
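
As an illustration of how the streaming upload options above fit together, the following fragment shows the sink properties of a KameletBinding tuned for batching. This is a minimal sketch with placeholder bucket and key names and illustrative values:

    properties:
      bucketNameOrArn: my-bucket        # placeholder
      keyName: data                     # base name for the progressively created files
      region: eu-west-1
      streamingUploadMode: true         # enabled by default
      batchMessageNumber: 25            # messages per batch
      batchSize: 1000000                # batch size in bytes
      namingStrategy: progressive       # or: random
      restartingPolicy: lastPart        # or: override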

Dependencies

At runtime, the aws-s3-streaming-upload-sink Kamelet relies upon the presence of the following dependencies:

  • camel:aws2-s3

  • camel:kamelet

Usage

This section describes how you can use the aws-s3-streaming-upload-sink.

Knative sink

You can use the aws-s3-streaming-upload-sink Kamelet as a Knative sink by binding it to a Knative object.

aws-s3-streaming-upload-sink-binding.yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-s3-streaming-upload-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-s3-streaming-upload-sink
    properties:
      bucketNameOrArn: The Bucket Name
      keyName: The Key Name
      region: eu-west-1

Prerequisite

You have Camel K installed on the cluster.

Procedure for using the cluster CLI

  1. Save the aws-s3-streaming-upload-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.

  2. Run the sink by using the following command:

    kubectl apply -f aws-s3-streaming-upload-sink-binding.yaml

Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind aws-s3-streaming-upload-sink -p "sink.bucketNameOrArn=The Bucket Name" -p "sink.keyName=The Key Name" -p "sink.region=eu-west-1" channel:mychannel

This command creates the KameletBinding in the current namespace on the cluster.
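
Optionally, you can confirm that the binding was created by querying the cluster (this assumes you have kubectl access to the namespace where Camel K is installed):

kubectl get kameletbinding aws-s3-streaming-upload-sink-binding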

Kafka sink

You can use the aws-s3-streaming-upload-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

aws-s3-streaming-upload-sink-binding.yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-s3-streaming-upload-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-s3-streaming-upload-sink
    properties:
      bucketNameOrArn: The Bucket Name
      keyName: The Key Name
      region: eu-west-1

Prerequisites

  • You’ve installed Strimzi.

  • You’ve created a topic named my-topic in the current namespace.

  • You have Camel K installed on the cluster.

Procedure for using the cluster CLI

  1. Save the aws-s3-streaming-upload-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.

  2. Run the sink by using the following command:

    kubectl apply -f aws-s3-streaming-upload-sink-binding.yaml

Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind aws-s3-streaming-upload-sink -p "sink.bucketNameOrArn=The Bucket Name" -p "sink.keyName=The Key Name" -p "sink.region=eu-west-1" kafka.strimzi.io/v1beta1:KafkaTopic:my-topic

This command creates the KameletBinding in the current namespace on the cluster.