AWS DynamoDB Sink
Provided by: "Apache Software Foundation"
Support Level for this Kamelet is: "Preview"
Send data to the AWS DynamoDB service. The sent data inserts, updates, or deletes an item in the given AWS DynamoDB table.
Access Key/Secret Key are the basic method for authenticating to the AWS DynamoDB service. These parameters are optional, because the Kamelet also provides the 'useDefaultCredentialsProvider' option.
When a default credentials provider is used, the AWS DynamoDB client loads the credentials through that provider and does not use static credentials. This is why the access key and secret key are not mandatory parameters for this Kamelet.
This Kamelet expects a JSON object as the body. The mapping between the JSON fields and the table attribute values is done by key, so for the input:
'{"username":"oscerd", "city":"Rome"}'
the Kamelet inserts or updates an item in the given AWS DynamoDB table and sets the attributes 'username' and 'city' respectively. Note that the JSON object must include the primary key values that define the item.
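For instance, if the table's primary key were 'username' (a hypothetical schema, used here only for illustration), the body above already carries the key, whereas a body such as '{"city":"Rome"}' would be rejected by DynamoDB because the key attribute is missing.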
Configuration Options
The following table summarizes the configuration options available for the aws-ddb-sink Kamelet:
| Property | Name | Description | Type | Default | Example |
|---|---|---|---|---|---|
| region | AWS Region | *Required* The AWS region to connect to. | string | | eu-west-1 |
| table | Table | *Required* Name of the DynamoDB table to look at. | string | | |
| accessKey | Access Key | The access key obtained from AWS. | string | | |
| operation | Operation | The operation to perform (one of PutItem, UpdateItem, DeleteItem). | string | PutItem | PutItem |
| overrideEndpoint | Endpoint Overwrite | Set the need for overriding the endpoint URI. This option needs to be used in combination with the uriEndpointOverride setting. | boolean | false | |
| secretKey | Secret Key | The secret key obtained from AWS. | string | | |
| uriEndpointOverride | Overwrite Endpoint URI | Set the overriding endpoint URI. This option needs to be used in combination with the overrideEndpoint option. | string | | |
| useDefaultCredentialsProvider | Default Credentials Provider | Set whether the DynamoDB client should expect to load credentials through a default credentials provider or to expect static credentials to be passed in. | boolean | false | |
| writeCapacity | Write Capacity | The provisioned throughput to reserve for writing resources to your table. | integer | 1 | |
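To show how these options combine in practice, the snippet below is a minimal sketch of a binding's sink configuration that uses static credentials and a non-default endpoint. The table name, credential values, and endpoint URL are placeholders, not values from this documentation; note that uriEndpointOverride only takes effect when overrideEndpoint is set to true.

```yaml
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-ddb-sink
    properties:
      region: eu-west-1
      table: my-table                    # placeholder table name
      operation: PutItem                 # one of PutItem (default), UpdateItem, DeleteItem
      accessKey: my-access-key           # placeholder; omit when useDefaultCredentialsProvider is true
      secretKey: my-secret-key           # placeholder
      overrideEndpoint: true             # must be true for uriEndpointOverride to be applied
      uriEndpointOverride: http://localhost:4566   # assumed local DynamoDB-compatible endpoint
```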
Dependencies
At runtime, the aws-ddb-sink Kamelet relies upon the presence of the following dependencies:
- github:apache.camel-kamelets:camel-kamelets-utils:0.8.2-SNAPSHOT
- camel:core
- camel:jackson
- camel:aws2-ddb
- camel:kamelet
Usage
This section describes how you can use the aws-ddb-sink Kamelet.
Knative sink
You can use the aws-ddb-sink Kamelet as a Knative sink by binding it to a Knative object.
```yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-ddb-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-ddb-sink
    properties:
      region: eu-west-1
      table: The Table
```
Prerequisite
You have Camel K installed on the cluster.
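After you save the binding to a file, for example aws-ddb-sink-binding.yaml (the name used in the example above), you can create it on the cluster with `kubectl apply -f aws-ddb-sink-binding.yaml`.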
Kafka sink
You can use the aws-ddb-sink Kamelet as a Kafka sink by binding it to a Kafka topic.
```yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-ddb-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-ddb-sink
    properties:
      region: eu-west-1
      table: The Table
```
Prerequisites
- You've installed Strimzi.
- You've created a topic named my-topic in the current namespace.
- You have Camel K installed on the cluster.
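If you prefer the kamel CLI over writing the YAML by hand, a command along the lines of `kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic aws-ddb-sink -p "sink.region=eu-west-1" -p "sink.table=The Table"` should create an equivalent binding; the exact source URI syntax can vary with your kamel version, so treat this as a sketch rather than a verbatim recipe.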