Support for dynamic routing:
* refactored the codebase into modules for transformers and routers
* created constants to be imported into the modules
* added support for mocha tests
README.md
For more information on DynamoDB Update Streams, please read http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html.
By default, the function will append a newline character to received data so that files delivered to S3 are nicely formatted and easy to load into Amazon Redshift. However, the function also provides a framework to write your own transformers. If you would like to modify the data after it's read from the Stream, but before it's forwarded to Firehose, then you can implement and register a new JavaScript function (ideally in `transformer.js`) with the following interface:
```
function(inputData, callback(err, outputData));

callback: function to be invoked once transformation is completed, with arguments:
    outputData: Buffer instance (typically 'ascii' encoded) which will be forwarded to Firehose
```
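As a sketch of this interface, a transformer that upper-cases each record might look like the following (the function name and the `prefix` setup argument are hypothetical, not part of the module):

```javascript
// Hypothetical transformer matching the interface above: upper-cases each
// record and appends a newline. 'prefix' is an illustrative setup argument.
function upperCaseTransformer(prefix, inputData, callback) {
    // inputData arrives as a Buffer read from the stream
    var transformed = prefix + inputData.toString('ascii').toUpperCase() + '\n';

    // hand the result back as a Buffer to be forwarded to Firehose
    callback(null, Buffer.from(transformed, 'ascii'));
}
```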
You then register this transformer function by assigning it to the exported ```useTransformer``` instance in the header of `index.js`:
```
// var useTransformer = transform.addNewlineTransformer.bind(undefined);
var useTransformer = myTransformerFunction.bind(undefined, <internal setup args>);
```
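The `bind(undefined, ...)` idiom pre-applies your setup arguments, so the framework can later invoke the bound function with just `(inputData, callback)`. A minimal sketch of why this works (the transformer and its prefix argument are hypothetical):

```javascript
// Hypothetical transformer with one setup argument
function prefixTransformer(prefix, inputData, callback) {
    callback(null, Buffer.from(prefix + inputData.toString('ascii'), 'ascii'));
}

// bind fixes 'prefix' now; the framework later supplies the remaining two arguments
var useTransformer = prefixTransformer.bind(undefined, 'audit|');
```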
You can also take advantage of a built-in regex-to-csv transformer, which can be used by un-commenting and configuring the following entry in the function:
Where ```/(myregex) (.*)/``` is the regular expression that uses character classes to capture data from the input stream to export to the CSV, and ```"|"``` is the delimiter.
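To illustrate the idea, a simplified sketch of the behaviour (not the module's actual implementation): the expression's capture groups become the CSV columns, joined by the configured delimiter.

```javascript
// Simplified sketch of regex-to-csv: the regular expression's capture groups
// become CSV columns, joined by the delimiter.
function regexToDelimited(regex, delimiter, line) {
    var match = regex.exec(line);

    // match[0] is the full match; the capture groups start at index 1
    return match ? match.slice(1).join(delimiter) : null;
}
```

For example, `regexToDelimited(/(\w+) (.*)/, '|', 'INFO something happened')` yields `'INFO|something happened'`.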
# Delivery Stream Routing

As stated previously, data will be forwarded on the basis of a Kinesis tag named `ForwardToFirehoseStream`; if this isn't found, then it will fall back to a default delivery stream. DynamoDB update streams are always routed to the delivery stream with the same name as the base table.

In version 1.4.0, we added the ability to do dynamic routing. For example, you might want to route to different destinations on S3 or Redshift on the basis of the actual data being received. You can use this by overriding the default routing and providing a map of how records should be routed. You do this by changing the `router.defaultRouting` method to `router.routeByAttributeMapping`. You then need to provide an 'attribute delivery map', which tells the router which fields to look at in your data and how to route based on their values. You do this with a configuration object - for example, to route by the value of an attribute `binaryValue` that can only be `true` or `false`:

```
var attributeMap = {
	"binaryValue" : {
		"true" : "TestRouting-route-A",
		"false" : "TestRouting-route-B"
	}
};
```

This attribute map is then used to configure the router instance:

```
var useRouter = router.routeByAttributeMapping.bind(undefined, attributeMap);
```
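Under this configuration, the routing decision can be sketched as follows (an assumed simplification, not the actual source of `routeByAttributeMapping`): the router looks up each mapped attribute in the record and returns the delivery stream mapped to its value, falling back to a default.

```javascript
// Assumed sketch of attribute-map routing: look up each configured attribute
// in the record and pick the delivery stream mapped to its value.
function resolveDeliveryStream(attributeMap, record, defaultStream) {
    for (var attribute in attributeMap) {
        var value = record[attribute];

        // values are matched against the map's string keys ("true"/"false" above)
        if (value !== undefined && attributeMap[attribute][String(value)]) {
            return attributeMap[attribute][String(value)];
        }
    }

    // fall back to the default delivery stream when nothing matches
    return defaultStream;
}
```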
# Confirming Successful Execution
When successfully configured, writes to your Stream will be automatically forwarded to the Firehose Delivery Stream, and you'll see data arriving in Amazon S3 and optionally Amazon Redshift. You can also view [CloudWatch Logs](http://docs.aws.amazon.com/AmazonCloudWatch/latest/DeveloperGuide/WhatIsCloudWatchLogs.html) for this Lambda function as it forwards stream data to Firehose.