I'm using boto's DynamoDB v2 and I'm writing items to a table in batch. However, I'm unable to prevent DynamoDB from overwriting attributes of existing items; I'd rather have the write fail.
The table has the following schema:
from boto.dynamodb2.fields import HashKey, RangeKey
from boto.dynamodb2.table import Table

# get_connection() is my own helper that returns a DynamoDB connection
conn = get_connection()
t = Table.create(
    'intervals',
    schema=[
        HashKey('id'),
        RangeKey('start'),
    ],
    connection=conn
)
Say I insert one item:
item = {
    'id': '4920',
    'start': '20',
    'stop': '40',
}
t.put_item(data=item)
Now, when I insert new items with batch_write, I want to make sure DynamoDB will not overwrite the existing item. According to the documentation, this should be achieved with the overwrite parameter of the put_item method on the BatchTable class (the class used as a context manager in the example below):
new_items = [{
    'id': '4920',
    'start': '20',
    'stop': '90',
}]

with t.batch_write() as batch:
    for i in new_items:
        batch.put_item(data=i, overwrite=False)
However, it doesn't work: the stop attribute in my example gets the new value 90, so the previous value (40) is overwritten.
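This is roughly how I check the result (a quick sketch, reading the item back after the batch completes):

# Sketch: read the item back by its hash and range key after the batch write.
existing = t.get_item(id='4920', start='20')
print(existing['stop'])  # prints '90' instead of the original '40'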
If I use the table's own put_item method, the overwrite parameter works as expected: setting it to True replaces the stop value, while setting it to False raises a ConditionalCheckFailedException.
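For comparison, here is a minimal sketch of the single-item case that behaves the way I want (same table t and data as above; I import the exception from boto.dynamodb2.exceptions):

from boto.dynamodb2.exceptions import ConditionalCheckFailedException

try:
    # With overwrite=False, Table.put_item only writes if no item with the
    # same hash/range key already exists, otherwise the write is rejected.
    t.put_item(data={'id': '4920', 'start': '20', 'stop': '90'}, overwrite=False)
except ConditionalCheckFailedException:
    # This is the failure I would like to get from batch_write as well.
    print('item already exists; not overwritten')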
How can I get that exception when using batch_write?