Amazon Web Services

Urban Airship Connect currently supports two integrations with Amazon Web Services: S3 and Kinesis. This document walks through the process of connecting to each service.

Amazon S3

Amazon S3 is a secure, cloud-based storage service. The process below explains how to route your streaming Connect data into an Amazon S3 bucket.

Setup

AWS

  1. Log in to Amazon Web Services.

  2. Create a bucket by following Amazon’s provided process, making sure to select US Standard for S3 Region. For more information on S3 regions, please refer to Amazon Regions.

  3. Grant permissions via the UI or S3’s JSON-based policy management system:

    Option 1: UI
    In the Permissions menu of your newly created bucket, add a new permission, using the Urban Airship User ID for the Grantee field:
          7e7585ea39ccec40d8297a9038ba7f211b1c4a48994c2c702298aca8732f9f0e
          
    
    Make sure the Upload/Delete box is checked, and save the permission.

    Option 2: Create/Edit policy
    Grant the Urban Airship User ID the s3:PutObject permission in your bucket policy, replacing <bucket_name> with the name of your bucket:
          {
             "Action": "s3:PutObject",
             "Effect": "Allow",
             "Resource": "arn:aws:s3:::<bucket_name>/*",
             "Principal": {
                "CanonicalUser": "7e7585ea39ccec40d8297a9038ba7f211b1c4a48994c2c702298aca8732f9f0e"
             }
          }
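If you manage bucket policies from scripts rather than the console, the statement above can be generated programmatically. The following is a minimal sketch (the helper name and bucket name are illustrative, not part of Connect); the resulting document could then be applied with `aws s3api put-bucket-policy --bucket <bucket_name> --policy file://policy.json`:

```python
import json

# Urban Airship's canonical user ID, from the policy above
UA_CANONICAL_USER = "7e7585ea39ccec40d8297a9038ba7f211b1c4a48994c2c702298aca8732f9f0e"

def build_bucket_policy(bucket_name):
    """Return a bucket policy document granting Connect write access."""
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Action": "s3:PutObject",
            "Effect": "Allow",
            "Resource": "arn:aws:s3:::%s/*" % bucket_name,
            "Principal": {"CanonicalUser": UA_CANONICAL_USER},
        }]
    }, indent=3)

print(build_bucket_policy("my-connect-bucket"))
```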
          
    

Connect Dashboard

  1. From within the Urban Airship dashboard, with your app selected, click Connect from the top navigation. Click the S3 Integration option to begin setup, then name and configure a new Amazon S3 integration:
    • Enter a user-friendly name and description.
    • Enter the S3 Bucket Name for the bucket you created in the AWS setup steps above.
    • Select the Output Format.
    • Optionally check the boxes to Encrypt your data using server-side encryption or export as a compressed gzip file. If you select gzip, the file will have the extension .gz.
    • Choose one or more event types:

      • Opens
      • Closes
      • Custom Events
      • Screen Viewed
      • Location
      • Region
      • Sends
      • Control
      • Tag Changes
      • First Opens
      • Uninstalls
      • Push Bodies
      • Rich Read, Delivery, and Delete Events
      • In-App Message Expiration, Resolution, and Display Events
      • Web Notify Session
      • Web Notify Click

        Tag Changes are not supported in CSV output because their dynamic structure cannot be easily represented in flat rows and columns.

  2. Click the Save button.

Output

Structure and Files

Your S3 bucket’s directory structure and files will be named using the following patterns:

CSV
appKey + "/" + integrationId + "/S3_CSV/" + eventType + "/" + year + "_" + month + "_" + day + "/" + year + "_" + month + "_" + day + "_" + hour + "_" + minute + "_" + second + ".csv"
JSON
appKey + "/" + integrationId + "/S3_JSON/" + year + "_" + month + "_" + day + "/" + year + "_" + month + "_" + day + "_" + hour + "_" + minute + "_" + second + ".json"
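As an illustration, the JSON naming pattern above can be expressed in Python. This is a sketch only; the function and argument names are placeholders, not part of the Connect API:

```python
from datetime import datetime, timezone

def s3_json_key(app_key, integration_id, ts):
    # Follows the JSON naming pattern above: one directory per day,
    # one timestamped file per hour of output.
    day = ts.strftime("%Y_%m_%d")
    stamp = ts.strftime("%Y_%m_%d_%H_%M_%S")
    return "%s/%s/S3_JSON/%s/%s.json" % (app_key, integration_id, day, stamp)

print(s3_json_key("myAppKey", "myIntegration",
                  datetime(2015, 11, 18, 1, 21, 33, tzinfo=timezone.utc)))
# -> myAppKey/myIntegration/S3_JSON/2015_11_18/2015_11_18_01_21_33.json
```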

One file is generated per hour, provided at least one relevant event occurred during that hour.

The maximum file size is 15 GB. If an app's event volume would produce a larger file, the output is split into multiple files.

Sample File

Here are a few lines from a sample CSV file for LOCATION events:

event.id,event.type,event.occurred,device.platform,device.channel_id,device.named_user_id,LOCATION.latitude,LOCATION.longitude,LOCATION.foreground,LOCATION.session_id
00000000-0000-0000-0000-000000000000,LOCATION,2015-11-18T01:21:33.180Z,IOS,90823094-1234-94b2-sb39-099s9018gx55,,14.5224123,-22.1236212,false,
00000000-0000-0000-0000-000000000000,LOCATION,2015-11-18T01:21:33.171Z,IOS,90823094-1234-94b2-sb39-099s9018gx55,,14.5224123,-22.1236212,false,
00000000-0000-0000-0000-000000000000,LOCATION,2015-11-18T01:21:33.162Z,IOS,90823094-1234-94b2-sb39-099s9018gx55,,14.5224123,-22.1236212,false,
...
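Once downloaded, these files parse with any standard CSV library. A minimal Python sketch, using column names as shown in the sample above:

```python
import csv
import io

# Inlined here for illustration; in practice, open a downloaded .csv file.
sample = (
    "event.id,event.type,event.occurred,device.platform,device.channel_id,"
    "device.named_user_id,LOCATION.latitude,LOCATION.longitude,"
    "LOCATION.foreground,LOCATION.session_id\n"
    "00000000-0000-0000-0000-000000000000,LOCATION,2015-11-18T01:21:33.180Z,"
    "IOS,90823094-1234-94b2-sb39-099s9018gx55,,14.5224123,-22.1236212,false,\n"
)

rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    lat = float(row["LOCATION.latitude"])
    lon = float(row["LOCATION.longitude"])
    print(row["event.type"], lat, lon)
```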

If you choose to output your data as JSON, the files will follow the format laid out in the Connect API Reference.
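If you enabled gzip compression, the downloaded objects are gzipped text. A sketch of reading one locally, assuming one JSON event per line (the in-memory blob here stands in for a real downloaded .gz file):

```python
import gzip
import io
import json

# Simulate a small gzipped export in memory; real files come from your bucket.
raw = b'{"type": "UNINSTALL", "device": {"named_user_id": "named-user-id-123"}}\n'
blob = gzip.compress(raw)

# "rt" opens the gzip stream in text mode so we can iterate line by line.
with gzip.open(io.BytesIO(blob), "rt") as fh:
    events = [json.loads(line) for line in fh]

print(events[0]["type"])  # UNINSTALL
```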

Use Cases

Once you have integrated Connect with S3, your streaming Connect data will begin funneling into your bucket. From there, how you use the data is up to you, but some potential ideas are:

  • Output CSV files with user-level send and open information, and import these files into your CRM system.
  • Output your data in JSON format, and use that data along with Amazon Redshift to perform detailed analysis of your users.

Be sure to regularly audit your Connect S3 bucket. Connect outputs large amounts of data, which can lead to expensive AWS bills if not managed appropriately.

Kinesis

Kinesis is AWS’s conduit for streaming data. Once you have integrated Connect with Kinesis, you will be able to load and analyze mobile events in real time.

Setup

AWS

  1. Log in to the AWS Console in the US East (N. Virginia) region, us-east-1.

    You must be in us-east-1 to set up this integration. Verify the region by checking the console URL.

  2. Create a Kinesis Stream by clicking Kinesis in the AWS console and following the provided steps.

  3. Create a user in the Identity and Access Management section, and download the associated credentials.

  4. Grant permissions via an Inline Policy or Managed Policy:

    Option 1: Inline Policy
    Attach the following statement to your user’s policy:
          {
             "Action": ["kinesis:Put*", "cloudwatch:Put*"],
             "Effect": "Allow",
             "Resource": "arn:aws:kinesis:us-east-1:<account_ID>:stream/<stream_name>"
          }
          
    
    Replace <account_ID> and <stream_name> with your account ID and the name of the stream you created in step 2.

    Option 2: Managed Policy
    Attach the AmazonKinesisFullAccess policy to the user.
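If you script your IAM setup, the inline policy from Option 1 can be generated programmatically. A sketch (the helper name and the account/stream values are placeholders); the resulting document could be attached with the AWS CLI's `aws iam put-user-policy` or boto3's `put_user_policy`:

```python
import json

def connect_kinesis_policy(account_id, stream_name):
    # Full policy document wrapping the inline statement from Option 1 above.
    return json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Action": ["kinesis:Put*", "cloudwatch:Put*"],
            "Effect": "Allow",
            "Resource": "arn:aws:kinesis:us-east-1:%s:stream/%s"
                        % (account_id, stream_name),
        }]
    }, indent=3)

print(connect_kinesis_policy("123456789012", "connect-stream"))
```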

Connect Dashboard

  1. From within the Urban Airship dashboard, with your app selected, click Connect from the top navigation. Click the Amazon Web Services / Amazon Kinesis option to begin setup, then name and configure a new Amazon Kinesis integration:
    • Enter a user-friendly name and description.
    • Enter the Access Key, Access Secret, and Kinesis Stream Name from the credentials file downloaded during the AWS setup steps above.
    • Choose one or more event types:

      • Opens
      • Closes
      • Custom Events
      • Screen Viewed
      • Location
      • Region
      • Sends
      • Control
      • Tag Changes
      • First Opens
      • Uninstalls
      • Push Bodies
      • Rich Read, Delivery, and Delete Events
      • In-App Message Expiration, Resolution, and Display Events
      • Web Session
      • Web Click
  2. Click the Save button.

Use Cases

Once you have Connect data running through a functioning Kinesis stream, you will likely want to begin analyzing and processing that data. To that end, AWS Lambda is a backend compute service that can process events in real time. Lambda responds to events that occur on other AWS products, such as a Kinesis stream.

As an example, let’s say you have built out a mapping between Named User IDs and associated email addresses, and you would like to send an email to any user that chooses to uninstall your app. Here’s a typical app uninstall event:

{
   "id" : "ff76bb85-74bc-4511-a3bf-11b6117784db",
   "type": "UNINSTALL",
   "offset": 1235,
   "occurred": "2015-05-03T02:32:12.088Z",
   "processed": "2015-05-03T12:12:43.180Z",
   "device": {"named_user_id": "named-user-id-123"}
}

You would then use Lambda to create a function associated with your Connect Kinesis stream. The function should search for UNINSTALL events that occurred on devices with associated Named User IDs and then send those users an email:

import base64, json

def uninstall_email(event, context):
   # Kinesis delivers records to Lambda in batches, base64-encoded
   for record in event['Records']:
      payload = json.loads(base64.b64decode(record['kinesis']['data']))
      if payload.get('type') == 'UNINSTALL' and payload.get('device', {}).get('named_user_id'):
         send_email(payload['device']['named_user_id'])

The above sketch decodes each record in the batch, finds events that fit the parameters, and uses a send_email function (which you would implement against your email provider) to send an email to the appropriate Named User.

There are thousands of ways to process and analyze your Connect data with Kinesis and Lambda: updating your CRM database when certain events come in, writing streams into Redshift or DynamoDB, monitoring tag changes, and so on. The possibilities are endless.