DynamoDB

Basic concepts

  1. Table: consistent with the table concept in other databases, i.e., a collection of data.
  2. Item: represents a row of data.
  3. Attribute: an item is composed of one or more attributes; each attribute consists of an attribute name, an attribute value, and an attribute value type.
  4. Partition key: the simplest form of primary key, made up of a single attribute; a table has exactly one partition key. Because DynamoDB hashes the partition key value to distribute items evenly across partitions when storing data, the partition key is also known as the hash key.
  5. Sort key: combined with the partition key to form a composite primary key; within a partition, items are sorted by the sort key. Because DynamoDB stores items that share the same partition key physically close to each other, ordered by the sort key, the sort key is also known as the range key.
  6. Secondary index: so named because it is an index other than the primary key. DynamoDB provides two kinds of secondary index:
    • Global secondary index (GSI): an index whose partition key and sort key differ from those of the primary key; DynamoDB allows at most 20 GSIs per table;
    • Local secondary index (LSI): an index with the same partition key as the primary key but a different sort key; it can only be created together with the table, and DynamoDB allows at most 5 LSIs per table.
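As a conceptual illustration of the hash-key/range-key layout described above, here is a plain-Java sketch (no AWS SDK; the class and all names are made up for this sketch, and this is not DynamoDB's actual implementation): the partition key is hashed to choose a partition, and within a partition items are kept sorted by the sort key.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

// Conceptual sketch only: hash the partition key to pick a partition,
// and keep items within a partition ordered by the sort key.
public class KeyLayoutSketch {
    static final int PARTITIONS = 4;
    // partition number -> ("partitionKey#sortKey" -> item value), sorted
    static final Map<Integer, TreeMap<String, String>> store = new HashMap<>();

    static void put(String partitionKey, String sortKey, String value) {
        int partition = Math.floorMod(partitionKey.hashCode(), PARTITIONS); // "hash key"
        store.computeIfAbsent(partition, p -> new TreeMap<>())
             .put(partitionKey + "#" + sortKey, value);                     // "range key"
    }

    public static void main(String[] args) {
        put("192.168.1.0", "08080", "proxy-a");
        put("192.168.1.0", "09090", "proxy-b");
        int p = Math.floorMod("192.168.1.0".hashCode(), PARTITIONS);
        // Items sharing a partition key land in the same partition,
        // stored adjacently and ordered by sort key.
        System.out.println(store.get(p));
    }
}
```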

Configure the local development environment

Download the DynamoDB local version

Download it from aws - DynamoDB (downloadable version) on your computer, then extract it to a suitable location.

Configuration

> aws configure --profile local
AWS Access Key ID [None]: fake-ak
AWS Secret Access Key [None]: fake-sk
Default region name [None]:
Default output format [None]:

Start up

java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb

CLI operations

Note: when using the local version, append --endpoint-url http://localhost:8000 to each command.

# Create a table
aws dynamodb create-table \
    --table-name ab-debug-proxy \
    --attribute-definitions \
        AttributeName=host,AttributeType=S \
        AttributeName=port,AttributeType=N \
        AttributeName=channel,AttributeType=S \
    --key-schema AttributeName=host,KeyType=HASH AttributeName=port,KeyType=RANGE \
    --global-secondary-indexes IndexName=channel-index,KeySchema=["{AttributeName=channel,KeyType=HASH}"],Projection="{ProjectionType=ALL}",ProvisionedThroughput="{ReadCapacityUnits=1,WriteCapacityUnits=1}" \
    --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1 \
    --endpoint-url http://localhost:8000

# Delete a table
aws dynamodb delete-table \
    --table-name ab-debug-proxy \
    --endpoint-url http://localhost:8000

# List all tables
aws dynamodb list-tables --endpoint-url http://localhost:8000

# Put an item
aws dynamodb put-item \
    --table-name ab-debug-proxy \
    --item '{"host": {"S": "192.168.1.0"},"port": {"N": "9090"},"channel": {"S": "sw"} }' \
    --return-consumed-capacity TOTAL \
    --endpoint-url http://localhost:8000

# Scan items
aws dynamodb scan \
    --table-name ab-debug-proxy \
    --endpoint-url http://localhost:8000

Java operations

Using DynamoDBMapper

AmazonDynamoDB amazonDynamoDB = builder.build();
final String prefix = "project-" + AppConfig.getVariant() + "-";
final DynamoDBMapperConfig.TableNameOverride tableNameOverride =
    DynamoDBMapperConfig.TableNameOverride.withTableNamePrefix(prefix);
DynamoDBMapperConfig dynamoDBMapperConfig = DynamoDBMapperConfig.builder()
    .withTableNameOverride(tableNameOverride)
    .build();
DynamoDBMapper dynamoDBMapper = new DynamoDBMapper(amazonDynamoDB, dynamoDBMapperConfig);

Bulk loading (Batch Load)

List<KeyPair> keyPairList = new ArrayList<>();
for (Proxy proxy : proxyList) {
    KeyPair keyPair = new KeyPair();
    keyPair.setHashKey(proxy.getHost());
    keyPair.setRangeKey(proxy.getPort());
    keyPairList.add(keyPair);
}
Map<Class<?>, List<KeyPair>> keyPairForTable = new HashMap<>();
keyPairForTable.put(Proxy.class, keyPairList);
Map<String, List<Object>> stringListMap = dynamoDBMapper.batchLoad(keyPairForTable);

Troubleshooting

DynamoDBMappingException: [class name]: no mapping for HASH key

Ensure that the primary key field is annotated as follows:

@DynamoDBHashKey

Querying with limit does not return the expected data

The reason is that limit caps the number of items scanned rather than the number of items in the result set, as the comment on withLimit shows:

/**
 * Sets the limit of items to scan and returns a pointer to this object for
 * method-chaining. Please note that this is <b>not</b> the same as the
 * number of items to return from the scan operation -- the operation will
 * cease and return as soon as this many items are scanned, even if no
 * matching results are found.
 *
 * @see DynamoDBScanExpression#getLimit()
 */
public DynamoDBScanExpression withLimit(Integer limit) {
    this.limit = limit;
    return this;
}

So how do we limit the size of the result set? Use the paging interface (queryPage) together with limit, as follows:

// page query
do {
    QueryResultPage<Proxy> proxyQueryResultPage =
        dynamoDBMapper.queryPage(Proxy.class, queryExpression);
    proxies.addAll(proxyQueryResultPage.getResults());
    queryExpression.setExclusiveStartKey(proxyQueryResultPage.getLastEvaluatedKey());
} while (queryExpression.getExclusiveStartKey() != null && proxies.size() < limit);
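The same pagination pattern can be sketched without the AWS SDK (the PaginationSketch class below is invented for illustration; its fetchPage stands in for queryPage): keep requesting pages, carrying the last evaluated key forward as the next exclusive start key, until the source reports no more data or enough items have been collected.

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

// Generic sketch of DynamoDB-style pagination: each page carries a
// "last evaluated key" that becomes the next request's exclusive start
// key; a null key signals the final page.
public class PaginationSketch {
    static final List<String> DATA = Arrays.asList("a", "b", "c", "d", "e");
    static final int PAGE_SIZE = 2;

    // Returns one page starting at startKey (null means start from the beginning),
    // paired with the last evaluated key (null on the final page).
    static Map.Entry<List<String>, Integer> fetchPage(Integer startKey) {
        int from = (startKey == null) ? 0 : startKey;
        int to = Math.min(from + PAGE_SIZE, DATA.size());
        Integer lastEvaluatedKey = (to < DATA.size()) ? to : null;
        return new AbstractMap.SimpleEntry<>(DATA.subList(from, to), lastEvaluatedKey);
    }

    static List<String> collect(int limit) {
        List<String> results = new ArrayList<>();
        Integer exclusiveStartKey = null;
        do {
            Map.Entry<List<String>, Integer> page = fetchPage(exclusiveStartKey);
            results.addAll(page.getKey());
            exclusiveStartKey = page.getValue();
        } while (exclusiveStartKey != null && results.size() < limit);
        return results;
    }
}
```

Note that, just as with queryPage, the last page fetched may push the result past the requested limit; trim the list afterwards if an exact size is required.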

SaveBehavior.UPDATE_SKIP_NULL_ATTRIBUTES is set, but batchSave does not skip null fields

Suppose UPDATE_SKIP_NULL_ATTRIBUTES was set when creating the DynamoDBMapperConfig, as follows:

DynamoDBMapperConfig dynamoDBMapperConfig = DynamoDBMapperConfig.builder()
    .withTableNameOverride(tableNameOverride)
    // set UPDATE_SKIP_NULL_ATTRIBUTES
    .withSaveBehavior(DynamoDBMapperConfig.SaveBehavior.UPDATE_SKIP_NULL_ATTRIBUTES)
    .build();

Looking at the SaveBehavior source code, we find:

/**
* UPDATE_SKIP_NULL_ATTRIBUTES is similar to UPDATE, except that it
* ignores any null value attribute(s) and will NOT remove them from
* that item in DynamoDB. It also guarantees to send only one single
* updateItem request, no matter the object is key-only or not.
*/
UPDATE_SKIP_NULL_ATTRIBUTES,

In other words, this save behavior only applies to single updateItem requests; batch operations such as batchSave do not honor it!

Scan with a filter on a Boolean field returns an empty result

Map<String, AttributeValue> eav = new HashMap<>();
Map<String, String> ean = new HashMap<>();
StringBuilder filterExpressionBuilder = new StringBuilder();
eav.put(":valid", new AttributeValue().withBOOL(proxy.getLocked()));
ean.put("#valid","valid");
filterExpressionBuilder.append("#valid = :valid");
DynamoDBScanExpression scanExpression = new DynamoDBScanExpression().withConsistentRead(false)
    .withExpressionAttributeValues(eav)
    .withExpressionAttributeNames(ean)
    .withFilterExpression(filterExpressionBuilder.toString());
PaginatedScanList<Proxy> result = dynamoDBMapper.scan(Proxy.class, scanExpression);

Here the Boolean field was in fact stored as the Number type, so filtering with a Boolean value matches nothing; use "0" (false) and "1" (true) instead, as follows:

eav.put(":valid", new AttributeValue().withN(proxy.getLocked() ? "1" : "0"));

DynamoDBMappingException

The following error occurred when calling DynamoDB's batchSave interface:

com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted
    at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$Rules$NotSupported.set(StandardModelFactories.java:664)
    at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$Rules$NotSupported.set(StandardModelFactories.java:650)
    at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$AbstractRule.convert(StandardModelFactories.java:709)
    at com.amazonaws.services.dynamodbv2.datamodeling.StandardModelFactories$AbstractRule.convert(StandardModelFactories.java:691)
    at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapperFieldModel.convert(DynamoDBMapperFieldModel.java:138)
    at com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper.batchWrite(DynamoDBMapper.java:1107)
    at com.amazonaws.services.dynamodbv2.datamodeling.AbstractDynamoDBMapper.batchSave(AbstractDynamoDBMapper.java:173)

It turned out that the entity class annotated with @DynamoDBTable uses a field of a custom type, which DynamoDB does not support.

Solution: annotate the custom-type field with @DynamoDBTypeConvertedJson; under the hood it uses Jackson to serialize the field value to JSON before storing it. If you don't want to use Jackson, you can define a custom converter, as follows:

// CustomTypeConverter.java
public class CustomTypeConverter<T> implements DynamoDBTypeConverter<String, T> {

  private Class<T> clazz;

  public CustomTypeConverter(Class<T> clazz) {
    this.clazz = clazz;
  }

  @Override
  public String convert(T object) {
    return Json.toJson(object);
  }

  @Override
  public T unconvert(String json) {
    return Json.fromJson(json, clazz);
  }
}
// JobTreeConverter.java
public class JobTreeConverter extends CustomTypeConverter<JobTree> {
  public JobTreeConverter(Class<JobTree> clazz) {
    super(clazz);
  }
}

// JobGroup.java
@DynamoDBTable(tableName = "job-group")
public class JobGroup implements Serializable {
  @DynamoDBHashKey(attributeName = "id")
  @SerializedName("id")
  private String id;

  @SerializedName("name")
  private String name;

  @DynamoDBTypeConverted(converter = JobTreeConverter.class)
  @DynamoDBAttribute(attributeName = "default_job_tree")
  @SerializedName("default_job_tree")
  private JobTree defaultJobTree;
  //...
}

References

  1. DynamoDB (downloadable version) on your computer - AWS
  2. Introduction - Amazon DynamoDB
  3. DynamoDB API documentation - AWS


Origin www.cnblogs.com/lshare/p/11334293.html