Saturday, September 26, 2015

fluentd upload file to s3

I wonder how to use fluentd to monitor a path for newly created files and upload each one to S3 under its original filename. For example, I have logs created as individual files in:

/var/app/myapp/logs/20150926/a.txt
/var/app/myapp/logs/20150926/b.txt
/var/app/myapp/logs/20150926/c.txt
...

I want them uploaded to S3 as:

/bucketname/2015/09/26/a.txt
/bucketname/2015/09/26/b.txt
/bucketname/2015/09/26/c.txt

Currently I have the config below. Data is stored in S3, but I have two issues:

1. I don't know how to get the filename of the tailed file and use it as the S3 object key.
2. File boundaries are not kept; files are concatenated and stored as chunks of text.

<source>
    type tail
    format none
    path /var/app/myapp/logs/%Y%m%d/*
    pos_file /var/log/td-agent/myapplog.pos
    read_from_head true
    tag s3.myapplog
</source>
<match s3.myapplog>
     type s3
     aws_key_id xxx
     aws_sec_key yyy
     s3_bucket zzz
     path /a/b/c
     s3_object_key_format /a/b/c/%{time_slice}_my_log_%{index}.log
     buffer_path /var/log/td-agent/log_myapplog
     store_as text
     append true
     buffer_chunk_limit 8m
     buffer_queue_limit 4096
     time_slice_format %Y/%m/%d/%H
     flush_interval 5m
     utc
</match>
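For context on issue 2: the s3 output plugin buffers records and flushes them as time-sliced chunks, so per-file boundaries are lost by design. If a strict one-to-one mapping of local file to S3 object is the goal, one alternative is to skip fluentd's tail/buffer pipeline and mirror each file directly. A minimal sketch using boto3 — the `s3_key_for` helper and the directory layout it assumes are taken from the example paths above, and the bucket name `zzz` is the placeholder from the config:

```python
import os
import re


def s3_key_for(local_path):
    """Map /var/app/myapp/logs/20150926/a.txt -> 2015/09/26/a.txt.

    Assumes the YYYYMMDD directory layout shown in the post.
    """
    m = re.search(r"/(\d{4})(\d{2})(\d{2})/([^/]+)$", local_path)
    if not m:
        raise ValueError("unexpected path layout: %s" % local_path)
    year, month, day, filename = m.groups()
    return "%s/%s/%s/%s" % (year, month, day, filename)


def upload_logs(log_root, bucket):
    # boto3 is imported lazily so the key-mapping helper can be used
    # without AWS credentials or boto3 installed.
    import boto3
    s3 = boto3.client("s3")
    for dirpath, _dirnames, filenames in os.walk(log_root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            # One S3 object per local file, so boundaries are preserved.
            s3.upload_file(local_path, bucket, s3_key_for(local_path))


# Usage (requires boto3 and AWS credentials configured):
#   upload_logs("/var/app/myapp/logs", "zzz")
```

Run from cron (or triggered by inotify), this keeps each file intact and gives you full control over the object key, at the cost of losing fluentd's buffering and retry handling.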